| id<br>stringlengths 9–104 | author<br>stringlengths 3–36 | task_category<br>stringclasses 32 values | tags<br>sequencelengths 1–4.05k | created_time<br>unknown, 2022-03-02 23:29:04 to 2025-03-18 02:34:30 | last_modified<br>string, 2021-02-13 00:06:56 to 2025-03-18 09:30:19 | downloads<br>int64 0–15.6M | likes<br>int64 0–4.86k | README<br>stringlengths 44–1.01M | matched_bigbio_names<br>sequencelengths 1–8 |
|---|---|---|---|---|---|---|---|---|---|
zhan1993/private_library_phi2_epoch_1 | zhan1993 | null | ["region:us"] | "2024-04-19T14:41:58Z" | 2024-04-19T21:19:24+00:00 | 0 | 0 | ---
{}
---
Number of experts present in the library: 263
| Expert Name | Base Model | Trained on | Adapter Type |
| --- | --- | --- | --- |
| sciq_Multiple_Choice | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice | lora |
| wiki_hop_original_choose_best_object_interrogative_1 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_1 | lora |
| squad_v2_0_3_0_0 | phi-2 | sordonia/flan-10k-flat/squad_v2_0_3_0_0 | lora |
| wiki_qa_exercise | phi-2 | sordonia/flan-10k-flat/wiki_qa_exercise | lora |
| race_high_Taking_a_test | phi-2 | sordonia/flan-10k-flat/race_high_Taking_a_test | lora |
| adversarial_qa_dbert_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_generate_question | lora |
| quoref_Found_Context_Online | phi-2 | sordonia/flan-10k-flat/quoref_Found_Context_Online | lora |
| web_questions_get_the_answer | phi-2 | sordonia/flan-10k-flat/web_questions_get_the_answer | lora |
| duorc_SelfRC_generate_question_by_answer | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_generate_question_by_answer | lora |
| quarel_testing_students | phi-2 | sordonia/flan-10k-flat/quarel_testing_students | lora |
| qasc_qa_with_separated_facts_1 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_1 | lora |
| wiki_qa_Is_This_True_ | phi-2 | sordonia/flan-10k-flat/wiki_qa_Is_This_True_ | lora |
| race_high_Read_the_article_and_answer_the_question_no_option_ | phi-2 | sordonia/flan-10k-flat/race_high_Read_the_article_and_answer_the_question_no_option_ | lora |
| cot_gsm8k_ii | phi-2 | sordonia/flan-10k-flat/cot_gsm8k_ii | lora |
| gem_wiki_lingua_english_en_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_wiki_lingua_english_en_1_1_0 | lora |
| unified_qa_science_inst | phi-2 | sordonia/flan-10k-flat/unified_qa_science_inst | lora |
| quartz_use_info_from_paragraph_question | phi-2 | sordonia/flan-10k-flat/quartz_use_info_from_paragraph_question | lora |
| wiki_hop_original_generate_object | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_object | lora |
| quoref_What_Is_The_Answer | phi-2 | sordonia/flan-10k-flat/quoref_What_Is_The_Answer | lora |
| adversarial_qa_droberta_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_generate_question | lora |
| wiki_bio_comprehension | phi-2 | sordonia/flan-10k-flat/wiki_bio_comprehension | lora |
| adversarial_qa_dbidaf_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_question_context_answer | lora |
| wiki_bio_what_content | phi-2 | sordonia/flan-10k-flat/wiki_bio_what_content | lora |
| web_questions_whats_the_answer | phi-2 | sordonia/flan-10k-flat/web_questions_whats_the_answer | lora |
| wiqa_what_is_the_missing_first_step | phi-2 | sordonia/flan-10k-flat/wiqa_what_is_the_missing_first_step | lora |
| adversarial_qa_droberta_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_question_context_answer | lora |
| ropes_plain_bottom_hint | phi-2 | sordonia/flan-10k-flat/ropes_plain_bottom_hint | lora |
| kilt_tasks_hotpotqa_combining_facts | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_combining_facts | lora |
| cos_e_v1_11_aligned_with_common_sense | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_aligned_with_common_sense | lora |
| gem_web_nlg_en_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_web_nlg_en_1_1_0 | lora |
| web_questions_potential_correct_answer | phi-2 | sordonia/flan-10k-flat/web_questions_potential_correct_answer | lora |
| wiki_qa_found_on_google | phi-2 | sordonia/flan-10k-flat/wiki_qa_found_on_google | lora |
| duorc_ParaphraseRC_extract_answer | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_extract_answer | lora |
| wmt16_translate_de_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_de_en_1_0_0 | lora |
| quail_no_prompt_id | phi-2 | sordonia/flan-10k-flat/quail_no_prompt_id | lora |
| quoref_Guess_Title_For_Context | phi-2 | sordonia/flan-10k-flat/quoref_Guess_Title_For_Context | lora |
| duorc_SelfRC_decide_worth_it | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_decide_worth_it | lora |
| ropes_prompt_mix | phi-2 | sordonia/flan-10k-flat/ropes_prompt_mix | lora |
| adversarial_qa_droberta_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_tell_what_it_is | lora |
| quail_context_question_answer_description_id | phi-2 | sordonia/flan-10k-flat/quail_context_question_answer_description_id | lora |
| gem_common_gen_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_common_gen_1_1_0 | lora |
| duorc_ParaphraseRC_answer_question | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_answer_question | lora |
| super_glue_cb_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_cb_1_0_2 | lora |
| cnn_dailymail_3_4_0 | phi-2 | sordonia/flan-10k-flat/cnn_dailymail_3_4_0 | lora |
| race_high_Write_a_multi_choice_question_options_given_ | phi-2 | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_options_given_ | lora |
| winogrande_1_1_0 | phi-2 | sordonia/flan-10k-flat/winogrande_1_1_0 | lora |
| duorc_SelfRC_extract_answer | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_extract_answer | lora |
| trec_1_0_0 | phi-2 | sordonia/flan-10k-flat/trec_1_0_0 | lora |
| yelp_polarity_reviews_0_2_0 | phi-2 | sordonia/flan-10k-flat/yelp_polarity_reviews_0_2_0 | lora |
| race_high_Select_the_best_answer | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer | lora |
| para_crawl_enes | phi-2 | sordonia/flan-10k-flat/para_crawl_enes | lora |
| qasc_is_correct_1 | phi-2 | sordonia/flan-10k-flat/qasc_is_correct_1 | lora |
| app_reviews_generate_review | phi-2 | sordonia/flan-10k-flat/app_reviews_generate_review | lora |
| ropes_read_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_read_background_situation | lora |
| dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | lora |
| stream_aqua | phi-2 | sordonia/flan-10k-flat/stream_aqua | lora |
| drop_2_0_0 | phi-2 | sordonia/flan-10k-flat/drop_2_0_0 | lora |
| wiki_hop_original_choose_best_object_affirmative_1 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_1 | lora |
| adversarial_qa_dbidaf_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_answer_the_following_q | lora |
| social_i_qa_Generate_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Generate_answer | lora |
| stream_aqua_ii | phi-2 | sordonia/flan-10k-flat/stream_aqua_ii | lora |
| glue_sst2_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_sst2_2_0_0 | lora |
| cot_esnli | phi-2 | sordonia/flan-10k-flat/cot_esnli | lora |
| race_high_Select_the_best_answer_no_instructions_ | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer_no_instructions_ | lora |
| duorc_SelfRC_build_story_around_qa | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_build_story_around_qa | lora |
| cot_esnli_ii | phi-2 | sordonia/flan-10k-flat/cot_esnli_ii | lora |
| quail_no_prompt_text | phi-2 | sordonia/flan-10k-flat/quail_no_prompt_text | lora |
| ropes_given_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_given_background_situation | lora |
| quarel_logic_test | phi-2 | sordonia/flan-10k-flat/quarel_logic_test | lora |
| adversarial_qa_dbidaf_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_based_on | lora |
| super_glue_copa_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_copa_1_0_2 | lora |
| cos_e_v1_11_i_think | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_i_think | lora |
| quail_context_question_description_answer_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_answer_text | lora |
| math_dataset_algebra__linear_1d_1_0_0 | phi-2 | sordonia/flan-10k-flat/math_dataset_algebra__linear_1d_1_0_0 | lora |
| cosmos_qa_1_0_0 | phi-2 | sordonia/flan-10k-flat/cosmos_qa_1_0_0 | lora |
| wiqa_effect_with_label_answer | phi-2 | sordonia/flan-10k-flat/wiqa_effect_with_label_answer | lora |
| app_reviews_convert_to_star_rating | phi-2 | sordonia/flan-10k-flat/app_reviews_convert_to_star_rating | lora |
| qasc_qa_with_separated_facts_2 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_2 | lora |
| race_middle_Select_the_best_answer | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer | lora |
| quartz_having_read_above_passage | phi-2 | sordonia/flan-10k-flat/quartz_having_read_above_passage | lora |
| glue_qqp_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_qqp_2_0_0 | lora |
| cos_e_v1_11_question_description_option_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_id | lora |
| stream_qed_ii | phi-2 | sordonia/flan-10k-flat/stream_qed_ii | lora |
| cos_e_v1_11_question_option_description_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_text | lora |
| imdb_reviews_plain_text_1_0_0 | phi-2 | sordonia/flan-10k-flat/imdb_reviews_plain_text_1_0_0 | lora |
| wiki_hop_original_choose_best_object_affirmative_2 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_2 | lora |
| natural_questions_open_1_0_0 | phi-2 | sordonia/flan-10k-flat/natural_questions_open_1_0_0 | lora |
| wiqa_effect_with_string_answer | phi-2 | sordonia/flan-10k-flat/wiqa_effect_with_string_answer | lora |
| cos_e_v1_11_rationale | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_rationale | lora |
| race_middle_Write_a_multi_choice_question_options_given_ | phi-2 | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_options_given_ | lora |
| wiki_bio_guess_person | phi-2 | sordonia/flan-10k-flat/wiki_bio_guess_person | lora |
| hellaswag_1_1_0 | phi-2 | sordonia/flan-10k-flat/hellaswag_1_1_0 | lora |
| wiqa_does_the_supposed_perturbation_have_an_effect | phi-2 | sordonia/flan-10k-flat/wiqa_does_the_supposed_perturbation_have_an_effect | lora |
| trivia_qa_rc_1_1_0 | phi-2 | sordonia/flan-10k-flat/trivia_qa_rc_1_1_0 | lora |
| lambada_1_0_0 | phi-2 | sordonia/flan-10k-flat/lambada_1_0_0 | lora |
| quoref_Read_And_Extract_ | phi-2 | sordonia/flan-10k-flat/quoref_Read_And_Extract_ | lora |
| quail_context_description_question_answer_id | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_answer_id | lora |
| quail_context_description_question_answer_text | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_answer_text | lora |
| duorc_SelfRC_question_answering | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_question_answering | lora |
| cot_sensemaking_ii | phi-2 | sordonia/flan-10k-flat/cot_sensemaking_ii | lora |
| fix_punct | phi-2 | sordonia/flan-10k-flat/fix_punct | lora |
| squad_v1_1_3_0_0 | phi-2 | sordonia/flan-10k-flat/squad_v1_1_3_0_0 | lora |
| coqa_1_0_0 | phi-2 | sordonia/flan-10k-flat/coqa_1_0_0 | lora |
| glue_qnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_qnli_2_0_0 | lora |
| wiki_qa_Jeopardy_style | phi-2 | sordonia/flan-10k-flat/wiki_qa_Jeopardy_style | lora |
| qasc_qa_with_separated_facts_5 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_5 | lora |
| glue_mnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_mnli_2_0_0 | lora |
| wiki_bio_key_content | phi-2 | sordonia/flan-10k-flat/wiki_bio_key_content | lora |
| dream_generate_first_utterance | phi-2 | sordonia/flan-10k-flat/dream_generate_first_utterance | lora |
| quartz_read_passage_below_choose | phi-2 | sordonia/flan-10k-flat/quartz_read_passage_below_choose | lora |
| web_questions_question_answer | phi-2 | sordonia/flan-10k-flat/web_questions_question_answer | lora |
| glue_stsb_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_stsb_2_0_0 | lora |
| wmt16_translate_tr_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_tr_en_1_0_0 | lora |
| cot_qasc | phi-2 | sordonia/flan-10k-flat/cot_qasc | lora |
| duorc_ParaphraseRC_title_generation | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_title_generation | lora |
| quail_description_context_question_answer_id | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_answer_id | lora |
| wiki_qa_Topic_Prediction_Question_Only | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_Only | lora |
| quoref_Find_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Find_Answer | lora |
| social_i_qa_I_was_wondering | phi-2 | sordonia/flan-10k-flat/social_i_qa_I_was_wondering | lora |
| wiki_hop_original_choose_best_object_affirmative_3 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_3 | lora |
| duorc_ParaphraseRC_build_story_around_qa | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_build_story_around_qa | lora |
| qasc_qa_with_separated_facts_3 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_3 | lora |
| race_middle_Is_this_the_right_answer | phi-2 | sordonia/flan-10k-flat/race_middle_Is_this_the_right_answer | lora |
| paws_wiki_1_1_0 | phi-2 | sordonia/flan-10k-flat/paws_wiki_1_1_0 | lora |
| app_reviews_categorize_rating_using_review | phi-2 | sordonia/flan-10k-flat/app_reviews_categorize_rating_using_review | lora |
| anli_r3_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r3_0_1_0 | lora |
| app_reviews_convert_to_rating | phi-2 | sordonia/flan-10k-flat/app_reviews_convert_to_rating | lora |
| wiqa_what_is_the_final_step_of_the_following_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_is_the_final_step_of_the_following_process | lora |
| adversarial_qa_droberta_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_answer_the_following_q | lora |
| wiki_qa_Decide_good_answer | phi-2 | sordonia/flan-10k-flat/wiki_qa_Decide_good_answer | lora |
| adversarial_qa_dbert_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_answer_the_following_q | lora |
| gem_dart_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_dart_1_1_0 | lora |
| adversarial_qa_dbert_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_tell_what_it_is | lora |
| quarel_choose_between | phi-2 | sordonia/flan-10k-flat/quarel_choose_between | lora |
| duorc_ParaphraseRC_generate_question_by_answer | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question_by_answer | lora |
| wiki_hop_original_generate_subject | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_subject | lora |
| dream_baseline | phi-2 | sordonia/flan-10k-flat/dream_baseline | lora |
| cos_e_v1_11_question_description_option_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_text | lora |
| aeslc_1_0_0 | phi-2 | sordonia/flan-10k-flat/aeslc_1_0_0 | lora |
| anli_r2_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r2_0_1_0 | lora |
| dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | lora |
| quail_context_question_description_answer_id | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_answer_id | lora |
| race_middle_Select_the_best_answer_no_instructions_ | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_no_instructions_ | lora |
| wmt16_translate_ro_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_ro_en_1_0_0 | lora |
| race_high_Is_this_the_right_answer | phi-2 | sordonia/flan-10k-flat/race_high_Is_this_the_right_answer | lora |
| quail_description_context_question_text | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_text | lora |
| sciq_Direct_Question_Closed_Book_ | phi-2 | sordonia/flan-10k-flat/sciq_Direct_Question_Closed_Book_ | lora |
| openbookqa_0_1_0 | phi-2 | sordonia/flan-10k-flat/openbookqa_0_1_0 | lora |
| duorc_SelfRC_title_generation | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_title_generation | lora |
| cot_gsm8k | phi-2 | sordonia/flan-10k-flat/cot_gsm8k | lora |
| quartz_answer_question_below | phi-2 | sordonia/flan-10k-flat/quartz_answer_question_below | lora |
| snli_1_1_0 | phi-2 | sordonia/flan-10k-flat/snli_1_1_0 | lora |
| sciq_Multiple_Choice_Closed_Book_ | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice_Closed_Book_ | lora |
| cot_strategyqa | phi-2 | sordonia/flan-10k-flat/cot_strategyqa | lora |
| qasc_qa_with_separated_facts_4 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_4 | lora |
| ropes_prompt_bottom_no_hint | phi-2 | sordonia/flan-10k-flat/ropes_prompt_bottom_no_hint | lora |
| duorc_SelfRC_generate_question | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_generate_question | lora |
| quartz_given_the_fact_answer_the_q | phi-2 | sordonia/flan-10k-flat/quartz_given_the_fact_answer_the_q | lora |
| anli_r1_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r1_0_1_0 | lora |
| wiki_qa_Topic_Prediction_Question_and_Answer_Pair | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_and_Answer_Pair | lora |
| wiki_qa_Direct_Answer_to_Question | phi-2 | sordonia/flan-10k-flat/wiki_qa_Direct_Answer_to_Question | lora |
| qasc_is_correct_2 | phi-2 | sordonia/flan-10k-flat/qasc_is_correct_2 | lora |
| wiki_hop_original_generate_subject_and_object | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_subject_and_object | lora |
| ai2_arc_ARC_Challenge_1_0_0 | phi-2 | sordonia/flan-10k-flat/ai2_arc_ARC_Challenge_1_0_0 | lora |
| race_middle_Select_the_best_answer_generate_span_ | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_generate_span_ | lora |
| quail_context_question_answer_description_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_answer_description_text | lora |
| quail_context_question_description_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_text | lora |
| wiki_hop_original_choose_best_object_interrogative_2 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_2 | lora |
| duorc_SelfRC_movie_director | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_movie_director | lora |
| quoref_Given_Context_Answer_Question | phi-2 | sordonia/flan-10k-flat/quoref_Given_Context_Answer_Question | lora |
| wiki_hop_original_explain_relation | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_explain_relation | lora |
| super_glue_record_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_record_1_0_2 | lora |
| adversarial_qa_dbidaf_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_tell_what_it_is | lora |
| cot_ecqa_ii | phi-2 | sordonia/flan-10k-flat/cot_ecqa_ii | lora |
| ropes_background_new_situation_answer | phi-2 | sordonia/flan-10k-flat/ropes_background_new_situation_answer | lora |
| web_questions_short_general_knowledge_q | phi-2 | sordonia/flan-10k-flat/web_questions_short_general_knowledge_q | lora |
| wiqa_what_might_be_the_first_step_of_the_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_might_be_the_first_step_of_the_process | lora |
| duorc_SelfRC_answer_question | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_answer_question | lora |
| ag_news_subset_1_0_0 | phi-2 | sordonia/flan-10k-flat/ag_news_subset_1_0_0 | lora |
| race_middle_Write_a_multi_choice_question_for_the_following_article | phi-2 | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_for_the_following_article | lora |
| wmt14_translate_fr_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt14_translate_fr_en_1_0_0 | lora |
| sciq_Direct_Question | phi-2 | sordonia/flan-10k-flat/sciq_Direct_Question | lora |
| super_glue_multirc_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_multirc_1_0_2 | lora |
| dbpedia_14_given_a_choice_of_categories_ | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_a_choice_of_categories_ | lora |
| super_glue_wic_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_wic_1_0_2 | lora |
| social_i_qa_Show_choices_and_generate_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_answer | lora |
| wiqa_what_might_be_the_last_step_of_the_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_might_be_the_last_step_of_the_process | lora |
| quoref_Answer_Question_Given_Context | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Question_Given_Context | lora |
| quoref_Context_Contains_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Context_Contains_Answer | lora |
| cos_e_v1_11_description_question_option_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_text | lora |
| adversarial_qa_dbert_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_based_on | lora |
| multi_news_1_0_0 | phi-2 | sordonia/flan-10k-flat/multi_news_1_0_0 | lora |
| cos_e_v1_11_generate_explanation_given_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_generate_explanation_given_text | lora |
| true_case | phi-2 | sordonia/flan-10k-flat/true_case | lora |
| duorc_ParaphraseRC_movie_director | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_movie_director | lora |
| quartz_answer_question_based_on | phi-2 | sordonia/flan-10k-flat/quartz_answer_question_based_on | lora |
| bool_q_1_0_0 | phi-2 | sordonia/flan-10k-flat/bool_q_1_0_0 | lora |
| quoref_Guess_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Guess_Answer | lora |
| quarel_do_not_use | phi-2 | sordonia/flan-10k-flat/quarel_do_not_use | lora |
| cos_e_v1_11_explain_why_human | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_explain_why_human | lora |
| wiki_qa_Generate_Question_from_Topic | phi-2 | sordonia/flan-10k-flat/wiki_qa_Generate_Question_from_Topic | lora |
| kilt_tasks_hotpotqa_straighforward_qa | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_straighforward_qa | lora |
| adversarial_qa_dbidaf_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_generate_question | lora |
| dbpedia_14_pick_one_category_for_the_following_text | phi-2 | sordonia/flan-10k-flat/dbpedia_14_pick_one_category_for_the_following_text | lora |
| kilt_tasks_hotpotqa_final_exam | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_final_exam | lora |
| quoref_Answer_Friend_Question | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Friend_Question | lora |
| race_high_Write_a_multi_choice_question_for_the_following_article | phi-2 | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_for_the_following_article | lora |
| ropes_prompt_beginning | phi-2 | sordonia/flan-10k-flat/ropes_prompt_beginning | lora |
| adversarial_qa_dbert_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_question_context_answer | lora |
| cot_creak | phi-2 | sordonia/flan-10k-flat/cot_creak | lora |
| gem_e2e_nlg_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_e2e_nlg_1_1_0 | lora |
| cos_e_v1_11_description_question_option_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_id | lora |
| social_i_qa_Generate_the_question_from_the_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Generate_the_question_from_the_answer | lora |
| quarel_heres_a_story | phi-2 | sordonia/flan-10k-flat/quarel_heres_a_story | lora |
| social_i_qa_Check_if_a_random_answer_is_valid_or_not | phi-2 | sordonia/flan-10k-flat/social_i_qa_Check_if_a_random_answer_is_valid_or_not | lora |
| ropes_background_situation_middle | phi-2 | sordonia/flan-10k-flat/ropes_background_situation_middle | lora |
| sciq_Multiple_Choice_Question_First | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice_Question_First | lora |
| cot_strategyqa_ii | phi-2 | sordonia/flan-10k-flat/cot_strategyqa_ii | lora |
| huggingface_xsum | phi-2 | sordonia/flan-10k-flat/huggingface_xsum | lora |
| kilt_tasks_hotpotqa_complex_question | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_complex_question | lora |
| wmt16_translate_fi_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_fi_en_1_0_0 | lora |
| ai2_arc_ARC_Easy_1_0_0 | phi-2 | sordonia/flan-10k-flat/ai2_arc_ARC_Easy_1_0_0 | lora |
| stream_qed | phi-2 | sordonia/flan-10k-flat/stream_qed | lora |
| definite_pronoun_resolution_1_1_0 | phi-2 | sordonia/flan-10k-flat/definite_pronoun_resolution_1_1_0 | lora |
| super_glue_rte_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_rte_1_0_2 | lora |
| ropes_new_situation_background_answer | phi-2 | sordonia/flan-10k-flat/ropes_new_situation_background_answer | lora |
| dream_read_the_following_conversation_and_answer_the_question | phi-2 | sordonia/flan-10k-flat/dream_read_the_following_conversation_and_answer_the_question | lora |
| cot_sensemaking | phi-2 | sordonia/flan-10k-flat/cot_sensemaking | lora |
| wiki_qa_Topic_Prediction_Answer_Only | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Answer_Only | lora |
| duorc_ParaphraseRC_generate_question | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question | lora |
| dream_generate_last_utterance | phi-2 | sordonia/flan-10k-flat/dream_generate_last_utterance | lora |
| race_middle_Taking_a_test | phi-2 | sordonia/flan-10k-flat/race_middle_Taking_a_test | lora |
| piqa_1_0_0 | phi-2 | sordonia/flan-10k-flat/piqa_1_0_0 | lora |
| cot_ecqa | phi-2 | sordonia/flan-10k-flat/cot_ecqa | lora |
| glue_mrpc_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_mrpc_2_0_0 | lora |
| race_middle_Read_the_article_and_answer_the_question_no_option_ | phi-2 | sordonia/flan-10k-flat/race_middle_Read_the_article_and_answer_the_question_no_option_ | lora |
| ropes_plain_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_plain_background_situation | lora |
| quail_description_context_question_answer_text | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_answer_text | lora |
| qasc_qa_with_combined_facts_1 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_combined_facts_1 | lora |
| duorc_ParaphraseRC_decide_worth_it | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_decide_worth_it | lora |
| quoref_Answer_Test | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Test | lora |
| wiki_bio_who | phi-2 | sordonia/flan-10k-flat/wiki_bio_who | lora |
| kilt_tasks_hotpotqa_formulate | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_formulate | lora |
| glue_wnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_wnli_2_0_0 | lora |
| gigaword_1_2_0 | phi-2 | sordonia/flan-10k-flat/gigaword_1_2_0 | lora |
| quail_context_description_question_text | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_text | lora |
| dream_answer_to_dialogue | phi-2 | sordonia/flan-10k-flat/dream_answer_to_dialogue | lora |
| cos_e_v1_11_question_option_description_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_id | lora |
| duorc_ParaphraseRC_question_answering | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_question_answering | lora |
| wiki_qa_automatic_system | phi-2 | sordonia/flan-10k-flat/wiki_qa_automatic_system | lora |
| adversarial_qa_droberta_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_based_on | lora |
| super_glue_wsc_fixed_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_wsc_fixed_1_0_2 | lora |
| word_segment | phi-2 | sordonia/flan-10k-flat/word_segment | lora |
| quac_1_0_0 | phi-2 | sordonia/flan-10k-flat/quac_1_0_0 | lora |
| quartz_paragraph_question_plain_concat | phi-2 | sordonia/flan-10k-flat/quartz_paragraph_question_plain_concat | lora |
| wiqa_which_of_the_following_is_the_supposed_perturbation | phi-2 | sordonia/flan-10k-flat/wiqa_which_of_the_following_is_the_supposed_perturbation | lora |
| quartz_use_info_from_question_paragraph | phi-2 | sordonia/flan-10k-flat/quartz_use_info_from_question_paragraph | lora |
| ropes_plain_no_background | phi-2 | sordonia/flan-10k-flat/ropes_plain_no_background | lora |
| race_high_Select_the_best_answer_generate_span_ | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer_generate_span_ | lora |
| glue_cola_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_cola_2_0_0 | lora |
| social_i_qa_Show_choices_and_generate_index | phi-2 | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_index | lora |
| ropes_prompt_bottom_hint_beginning | phi-2 | sordonia/flan-10k-flat/ropes_prompt_bottom_hint_beginning | lora |
| cot_creak_ii | phi-2 | sordonia/flan-10k-flat/cot_creak_ii | lora |
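
The following usage sketch is not part of the original card and is purely illustrative. It assumes each expert listed above is published as a standalone PEFT/LoRA adapter in a subfolder named after the expert; the actual layout of this repository is not documented here and may instead require the library's own loading tooling.

```python
# Illustrative sketch only: attach one LoRA expert from this library to the
# phi-2 base model. The subfolder layout is an assumption, not documented here.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_MODEL = "microsoft/phi-2"                     # base model named in the table
LIBRARY = "zhan1993/private_library_phi2_epoch_1"  # this repository
EXPERT = "sciq_Multiple_Choice"                    # any "Expert Name" from the table

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
base = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Assumption: the expert's adapter weights live under a subfolder with its name.
model = PeftModel.from_pretrained(base, LIBRARY, subfolder=EXPERT)

prompt = "Question: What is the chemical symbol for gold?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
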
Last updated on: 2024-04-19 21:11:52+00:00
| ["SCIQ"] |
zhan1993/private_library_phi2_epoch_2 | zhan1993 | null | ["region:us"] | "2024-04-19T14:53:26Z" | 2024-04-19T21:35:59+00:00 | 0 | 0 | ---
{}
---
Number of experts present in the library: 263
| Expert Name | Base Model | Trained on | Adapter Type |
| --- | --- | --- | --- |
| sciq_Multiple_Choice | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice | lora |
| wiki_hop_original_choose_best_object_interrogative_1 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_1 | lora |
| squad_v2_0_3_0_0 | phi-2 | sordonia/flan-10k-flat/squad_v2_0_3_0_0 | lora |
| wiki_qa_exercise | phi-2 | sordonia/flan-10k-flat/wiki_qa_exercise | lora |
| race_high_Taking_a_test | phi-2 | sordonia/flan-10k-flat/race_high_Taking_a_test | lora |
| adversarial_qa_dbert_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_generate_question | lora |
| quoref_Found_Context_Online | phi-2 | sordonia/flan-10k-flat/quoref_Found_Context_Online | lora |
| web_questions_get_the_answer | phi-2 | sordonia/flan-10k-flat/web_questions_get_the_answer | lora |
| duorc_SelfRC_generate_question_by_answer | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_generate_question_by_answer | lora |
| quarel_testing_students | phi-2 | sordonia/flan-10k-flat/quarel_testing_students | lora |
| qasc_qa_with_separated_facts_1 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_1 | lora |
| wiki_qa_Is_This_True_ | phi-2 | sordonia/flan-10k-flat/wiki_qa_Is_This_True_ | lora |
| race_high_Read_the_article_and_answer_the_question_no_option_ | phi-2 | sordonia/flan-10k-flat/race_high_Read_the_article_and_answer_the_question_no_option_ | lora |
| cot_gsm8k_ii | phi-2 | sordonia/flan-10k-flat/cot_gsm8k_ii | lora |
| gem_wiki_lingua_english_en_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_wiki_lingua_english_en_1_1_0 | lora |
| unified_qa_science_inst | phi-2 | sordonia/flan-10k-flat/unified_qa_science_inst | lora |
| quartz_use_info_from_paragraph_question | phi-2 | sordonia/flan-10k-flat/quartz_use_info_from_paragraph_question | lora |
| wiki_hop_original_generate_object | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_object | lora |
| quoref_What_Is_The_Answer | phi-2 | sordonia/flan-10k-flat/quoref_What_Is_The_Answer | lora |
| adversarial_qa_droberta_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_generate_question | lora |
| wiki_bio_comprehension | phi-2 | sordonia/flan-10k-flat/wiki_bio_comprehension | lora |
| adversarial_qa_dbidaf_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_question_context_answer | lora |
| wiki_bio_what_content | phi-2 | sordonia/flan-10k-flat/wiki_bio_what_content | lora |
| web_questions_whats_the_answer | phi-2 | sordonia/flan-10k-flat/web_questions_whats_the_answer | lora |
| wiqa_what_is_the_missing_first_step | phi-2 | sordonia/flan-10k-flat/wiqa_what_is_the_missing_first_step | lora |
| adversarial_qa_droberta_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_question_context_answer | lora |
| ropes_plain_bottom_hint | phi-2 | sordonia/flan-10k-flat/ropes_plain_bottom_hint | lora |
| kilt_tasks_hotpotqa_combining_facts | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_combining_facts | lora |
| cos_e_v1_11_aligned_with_common_sense | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_aligned_with_common_sense | lora |
| gem_web_nlg_en_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_web_nlg_en_1_1_0 | lora |
| web_questions_potential_correct_answer | phi-2 | sordonia/flan-10k-flat/web_questions_potential_correct_answer | lora |
| wiki_qa_found_on_google | phi-2 | sordonia/flan-10k-flat/wiki_qa_found_on_google | lora |
| duorc_ParaphraseRC_extract_answer | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_extract_answer | lora |
| wmt16_translate_de_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_de_en_1_0_0 | lora |
| quail_no_prompt_id | phi-2 | sordonia/flan-10k-flat/quail_no_prompt_id | lora |
| quoref_Guess_Title_For_Context | phi-2 | sordonia/flan-10k-flat/quoref_Guess_Title_For_Context | lora |
| duorc_SelfRC_decide_worth_it | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_decide_worth_it | lora |
| ropes_prompt_mix | phi-2 | sordonia/flan-10k-flat/ropes_prompt_mix | lora |
| adversarial_qa_droberta_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_tell_what_it_is | lora |
| quail_context_question_answer_description_id | phi-2 | sordonia/flan-10k-flat/quail_context_question_answer_description_id | lora |
| gem_common_gen_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_common_gen_1_1_0 | lora |
| duorc_ParaphraseRC_answer_question | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_answer_question | lora |
| super_glue_cb_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_cb_1_0_2 | lora |
| cnn_dailymail_3_4_0 | phi-2 | sordonia/flan-10k-flat/cnn_dailymail_3_4_0 | lora |
| race_high_Write_a_multi_choice_question_options_given_ | phi-2 | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_options_given_ | lora |
| winogrande_1_1_0 | phi-2 | sordonia/flan-10k-flat/winogrande_1_1_0 | lora |
| duorc_SelfRC_extract_answer | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_extract_answer | lora |
| trec_1_0_0 | phi-2 | sordonia/flan-10k-flat/trec_1_0_0 | lora |
| yelp_polarity_reviews_0_2_0 | phi-2 | sordonia/flan-10k-flat/yelp_polarity_reviews_0_2_0 | lora |
| race_high_Select_the_best_answer | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer | lora |
| para_crawl_enes | phi-2 | sordonia/flan-10k-flat/para_crawl_enes | lora |
| qasc_is_correct_1 | phi-2 | sordonia/flan-10k-flat/qasc_is_correct_1 | lora |
| app_reviews_generate_review | phi-2 | sordonia/flan-10k-flat/app_reviews_generate_review | lora |
| ropes_read_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_read_background_situation | lora |
| dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | lora |
| stream_aqua | phi-2 | sordonia/flan-10k-flat/stream_aqua | lora |
| drop_2_0_0 | phi-2 | sordonia/flan-10k-flat/drop_2_0_0 | lora |
| wiki_hop_original_choose_best_object_affirmative_1 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_1 | lora |
| adversarial_qa_dbidaf_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_answer_the_following_q | lora |
| social_i_qa_Generate_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Generate_answer | lora |
| stream_aqua_ii | phi-2 | sordonia/flan-10k-flat/stream_aqua_ii | lora |
| glue_sst2_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_sst2_2_0_0 | lora |
| cot_esnli | phi-2 | sordonia/flan-10k-flat/cot_esnli | lora |
| race_high_Select_the_best_answer_no_instructions_ | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer_no_instructions_ | lora |
| duorc_SelfRC_build_story_around_qa | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_build_story_around_qa | lora |
| cot_esnli_ii | phi-2 | sordonia/flan-10k-flat/cot_esnli_ii | lora |
| quail_no_prompt_text | phi-2 | sordonia/flan-10k-flat/quail_no_prompt_text | lora |
| ropes_given_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_given_background_situation | lora |
| quarel_logic_test | phi-2 | sordonia/flan-10k-flat/quarel_logic_test | lora |
| adversarial_qa_dbidaf_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_based_on | lora |
| super_glue_copa_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_copa_1_0_2 | lora |
| cos_e_v1_11_i_think | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_i_think | lora |
| quail_context_question_description_answer_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_answer_text | lora |
| math_dataset_algebra__linear_1d_1_0_0 | phi-2 | sordonia/flan-10k-flat/math_dataset_algebra__linear_1d_1_0_0 | lora |
| cosmos_qa_1_0_0 | phi-2 | sordonia/flan-10k-flat/cosmos_qa_1_0_0 | lora |
| wiqa_effect_with_label_answer | phi-2 | sordonia/flan-10k-flat/wiqa_effect_with_label_answer | lora |
| app_reviews_convert_to_star_rating | phi-2 | sordonia/flan-10k-flat/app_reviews_convert_to_star_rating | lora |
| qasc_qa_with_separated_facts_2 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_2 | lora |
| race_middle_Select_the_best_answer | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer | lora |
| quartz_having_read_above_passage | phi-2 | sordonia/flan-10k-flat/quartz_having_read_above_passage | lora |
| glue_qqp_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_qqp_2_0_0 | lora |
| cos_e_v1_11_question_description_option_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_id | lora |
| stream_qed_ii | phi-2 | sordonia/flan-10k-flat/stream_qed_ii | lora |
| cos_e_v1_11_question_option_description_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_text | lora |
| imdb_reviews_plain_text_1_0_0 | phi-2 | sordonia/flan-10k-flat/imdb_reviews_plain_text_1_0_0 | lora |
| wiki_hop_original_choose_best_object_affirmative_2 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_2 | lora |
| natural_questions_open_1_0_0 | phi-2 | sordonia/flan-10k-flat/natural_questions_open_1_0_0 | lora |
| wiqa_effect_with_string_answer | phi-2 | sordonia/flan-10k-flat/wiqa_effect_with_string_answer | lora |
| cos_e_v1_11_rationale | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_rationale | lora |
| race_middle_Write_a_multi_choice_question_options_given_ | phi-2 | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_options_given_ | lora |
| wiki_bio_guess_person | phi-2 | sordonia/flan-10k-flat/wiki_bio_guess_person | lora |
| hellaswag_1_1_0 | phi-2 | sordonia/flan-10k-flat/hellaswag_1_1_0 | lora |
| wiqa_does_the_supposed_perturbation_have_an_effect | phi-2 | sordonia/flan-10k-flat/wiqa_does_the_supposed_perturbation_have_an_effect | lora |
| trivia_qa_rc_1_1_0 | phi-2 | sordonia/flan-10k-flat/trivia_qa_rc_1_1_0 | lora |
| lambada_1_0_0 | phi-2 | sordonia/flan-10k-flat/lambada_1_0_0 | lora |
| quoref_Read_And_Extract_ | phi-2 | sordonia/flan-10k-flat/quoref_Read_And_Extract_ | lora |
| quail_context_description_question_answer_id | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_answer_id | lora |
| quail_context_description_question_answer_text | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_answer_text | lora |
| duorc_SelfRC_question_answering | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_question_answering | lora |
| cot_sensemaking_ii | phi-2 | sordonia/flan-10k-flat/cot_sensemaking_ii | lora |
| fix_punct | phi-2 | sordonia/flan-10k-flat/fix_punct | lora |
| squad_v1_1_3_0_0 | phi-2 | sordonia/flan-10k-flat/squad_v1_1_3_0_0 | lora |
| coqa_1_0_0 | phi-2 | sordonia/flan-10k-flat/coqa_1_0_0 | lora |
| glue_qnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_qnli_2_0_0 | lora |
| wiki_qa_Jeopardy_style | phi-2 | sordonia/flan-10k-flat/wiki_qa_Jeopardy_style | lora |
| qasc_qa_with_separated_facts_5 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_5 | lora |
| glue_mnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_mnli_2_0_0 | lora |
| wiki_bio_key_content | phi-2 | sordonia/flan-10k-flat/wiki_bio_key_content | lora |
| dream_generate_first_utterance | phi-2 | sordonia/flan-10k-flat/dream_generate_first_utterance | lora |
| quartz_read_passage_below_choose | phi-2 | sordonia/flan-10k-flat/quartz_read_passage_below_choose | lora |
| web_questions_question_answer | phi-2 | sordonia/flan-10k-flat/web_questions_question_answer | lora |
| glue_stsb_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_stsb_2_0_0 | lora |
| wmt16_translate_tr_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_tr_en_1_0_0 | lora |
| cot_qasc | phi-2 | sordonia/flan-10k-flat/cot_qasc | lora |
| duorc_ParaphraseRC_title_generation | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_title_generation | lora |
| quail_description_context_question_answer_id | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_answer_id | lora |
| wiki_qa_Topic_Prediction_Question_Only | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_Only | lora |
| quoref_Find_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Find_Answer | lora |
| social_i_qa_I_was_wondering | phi-2 | sordonia/flan-10k-flat/social_i_qa_I_was_wondering | lora |
| wiki_hop_original_choose_best_object_affirmative_3 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_3 | lora |
| duorc_ParaphraseRC_build_story_around_qa | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_build_story_around_qa | lora |
| qasc_qa_with_separated_facts_3 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_3 | lora |
| race_middle_Is_this_the_right_answer | phi-2 | sordonia/flan-10k-flat/race_middle_Is_this_the_right_answer | lora |
| paws_wiki_1_1_0 | phi-2 | sordonia/flan-10k-flat/paws_wiki_1_1_0 | lora |
| app_reviews_categorize_rating_using_review | phi-2 | sordonia/flan-10k-flat/app_reviews_categorize_rating_using_review | lora |
| anli_r3_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r3_0_1_0 | lora |
| app_reviews_convert_to_rating | phi-2 | sordonia/flan-10k-flat/app_reviews_convert_to_rating | lora |
| wiqa_what_is_the_final_step_of_the_following_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_is_the_final_step_of_the_following_process | lora |
| adversarial_qa_droberta_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_answer_the_following_q | lora |
| wiki_qa_Decide_good_answer | phi-2 | sordonia/flan-10k-flat/wiki_qa_Decide_good_answer | lora |
| adversarial_qa_dbert_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_answer_the_following_q | lora |
| gem_dart_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_dart_1_1_0 | lora |
| adversarial_qa_dbert_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_tell_what_it_is | lora |
| quarel_choose_between | phi-2 | sordonia/flan-10k-flat/quarel_choose_between | lora |
| duorc_ParaphraseRC_generate_question_by_answer | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question_by_answer | lora |
| wiki_hop_original_generate_subject | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_subject | lora |
| dream_baseline | phi-2 | sordonia/flan-10k-flat/dream_baseline | lora |
| cos_e_v1_11_question_description_option_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_text | lora |
| aeslc_1_0_0 | phi-2 | sordonia/flan-10k-flat/aeslc_1_0_0 | lora |
| anli_r2_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r2_0_1_0 | lora |
| dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | lora |
| quail_context_question_description_answer_id | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_answer_id | lora |
| race_middle_Select_the_best_answer_no_instructions_ | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_no_instructions_ | lora |
| wmt16_translate_ro_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_ro_en_1_0_0 | lora |
| race_high_Is_this_the_right_answer | phi-2 | sordonia/flan-10k-flat/race_high_Is_this_the_right_answer | lora |
| quail_description_context_question_text | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_text | lora |
| sciq_Direct_Question_Closed_Book_ | phi-2 | sordonia/flan-10k-flat/sciq_Direct_Question_Closed_Book_ | lora |
| openbookqa_0_1_0 | phi-2 | sordonia/flan-10k-flat/openbookqa_0_1_0 | lora |
| duorc_SelfRC_title_generation | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_title_generation | lora |
| cot_gsm8k | phi-2 | sordonia/flan-10k-flat/cot_gsm8k | lora |
| snli_1_1_0 | phi-2 | sordonia/flan-10k-flat/snli_1_1_0 | lora |
| sciq_Multiple_Choice_Closed_Book_ | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice_Closed_Book_ | lora |
| cot_strategyqa | phi-2 | sordonia/flan-10k-flat/cot_strategyqa | lora |
| qasc_qa_with_separated_facts_4 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_4 | lora |
| ropes_prompt_bottom_no_hint | phi-2 | sordonia/flan-10k-flat/ropes_prompt_bottom_no_hint | lora |
| duorc_SelfRC_generate_question | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_generate_question | lora |
| quartz_given_the_fact_answer_the_q | phi-2 | sordonia/flan-10k-flat/quartz_given_the_fact_answer_the_q | lora |
| anli_r1_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r1_0_1_0 | lora |
| wiki_qa_Topic_Prediction_Question_and_Answer_Pair | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_and_Answer_Pair | lora |
| wiki_qa_Direct_Answer_to_Question | phi-2 | sordonia/flan-10k-flat/wiki_qa_Direct_Answer_to_Question | lora |
| qasc_is_correct_2 | phi-2 | sordonia/flan-10k-flat/qasc_is_correct_2 | lora |
| wiki_hop_original_generate_subject_and_object | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_subject_and_object | lora |
| ai2_arc_ARC_Challenge_1_0_0 | phi-2 | sordonia/flan-10k-flat/ai2_arc_ARC_Challenge_1_0_0 | lora |
| race_middle_Select_the_best_answer_generate_span_ | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_generate_span_ | lora |
| quail_context_question_answer_description_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_answer_description_text | lora |
| quail_context_question_description_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_text | lora |
| wiki_hop_original_choose_best_object_interrogative_2 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_2 | lora |
| duorc_SelfRC_movie_director | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_movie_director | lora |
| quoref_Given_Context_Answer_Question | phi-2 | sordonia/flan-10k-flat/quoref_Given_Context_Answer_Question | lora |
| wiki_hop_original_explain_relation | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_explain_relation | lora |
| super_glue_record_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_record_1_0_2 | lora |
| adversarial_qa_dbidaf_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_tell_what_it_is | lora |
| cot_ecqa_ii | phi-2 | sordonia/flan-10k-flat/cot_ecqa_ii | lora |
| ropes_background_new_situation_answer | phi-2 | sordonia/flan-10k-flat/ropes_background_new_situation_answer | lora |
| web_questions_short_general_knowledge_q | phi-2 | sordonia/flan-10k-flat/web_questions_short_general_knowledge_q | lora |
| wiqa_what_might_be_the_first_step_of_the_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_might_be_the_first_step_of_the_process | lora |
| duorc_SelfRC_answer_question | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_answer_question | lora |
| ag_news_subset_1_0_0 | phi-2 | sordonia/flan-10k-flat/ag_news_subset_1_0_0 | lora |
| race_middle_Write_a_multi_choice_question_for_the_following_article | phi-2 | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_for_the_following_article | lora |
| wmt14_translate_fr_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt14_translate_fr_en_1_0_0 | lora |
| sciq_Direct_Question | phi-2 | sordonia/flan-10k-flat/sciq_Direct_Question | lora |
| super_glue_multirc_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_multirc_1_0_2 | lora |
| dbpedia_14_given_a_choice_of_categories_ | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_a_choice_of_categories_ | lora |
| super_glue_wic_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_wic_1_0_2 | lora |
| social_i_qa_Show_choices_and_generate_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_answer | lora |
| wiqa_what_might_be_the_last_step_of_the_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_might_be_the_last_step_of_the_process | lora |
| quoref_Answer_Question_Given_Context | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Question_Given_Context | lora |
| quoref_Context_Contains_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Context_Contains_Answer | lora |
| cos_e_v1_11_description_question_option_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_text | lora |
| adversarial_qa_dbert_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_based_on | lora |
| multi_news_1_0_0 | phi-2 | sordonia/flan-10k-flat/multi_news_1_0_0 | lora |
| cos_e_v1_11_generate_explanation_given_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_generate_explanation_given_text | lora |
| true_case | phi-2 | sordonia/flan-10k-flat/true_case | lora |
| duorc_ParaphraseRC_movie_director | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_movie_director | lora |
| quartz_answer_question_based_on | phi-2 | sordonia/flan-10k-flat/quartz_answer_question_based_on | lora |
| bool_q_1_0_0 | phi-2 | sordonia/flan-10k-flat/bool_q_1_0_0 | lora |
| quoref_Guess_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Guess_Answer | lora |
| quarel_do_not_use | phi-2 | sordonia/flan-10k-flat/quarel_do_not_use | lora |
| cos_e_v1_11_explain_why_human | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_explain_why_human | lora |
| wiki_qa_Generate_Question_from_Topic | phi-2 | sordonia/flan-10k-flat/wiki_qa_Generate_Question_from_Topic | lora |
| kilt_tasks_hotpotqa_straighforward_qa | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_straighforward_qa | lora |
| adversarial_qa_dbidaf_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_generate_question | lora |
| dbpedia_14_pick_one_category_for_the_following_text | phi-2 | sordonia/flan-10k-flat/dbpedia_14_pick_one_category_for_the_following_text | lora |
| kilt_tasks_hotpotqa_final_exam | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_final_exam | lora |
| quoref_Answer_Friend_Question | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Friend_Question | lora |
| race_high_Write_a_multi_choice_question_for_the_following_article | phi-2 | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_for_the_following_article | lora |
| ropes_prompt_beginning | phi-2 | sordonia/flan-10k-flat/ropes_prompt_beginning | lora |
| adversarial_qa_dbert_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_question_context_answer | lora |
| cot_creak | phi-2 | sordonia/flan-10k-flat/cot_creak | lora |
| gem_e2e_nlg_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_e2e_nlg_1_1_0 | lora |
| cos_e_v1_11_description_question_option_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_id | lora |
| social_i_qa_Generate_the_question_from_the_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Generate_the_question_from_the_answer | lora |
| quarel_heres_a_story | phi-2 | sordonia/flan-10k-flat/quarel_heres_a_story | lora |
| social_i_qa_Check_if_a_random_answer_is_valid_or_not | phi-2 | sordonia/flan-10k-flat/social_i_qa_Check_if_a_random_answer_is_valid_or_not | lora |
| ropes_background_situation_middle | phi-2 | sordonia/flan-10k-flat/ropes_background_situation_middle | lora |
| sciq_Multiple_Choice_Question_First | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice_Question_First | lora |
| cot_strategyqa_ii | phi-2 | sordonia/flan-10k-flat/cot_strategyqa_ii | lora |
| huggingface_xsum | phi-2 | sordonia/flan-10k-flat/huggingface_xsum | lora |
| kilt_tasks_hotpotqa_complex_question | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_complex_question | lora |
| wmt16_translate_fi_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_fi_en_1_0_0 | lora |
| ai2_arc_ARC_Easy_1_0_0 | phi-2 | sordonia/flan-10k-flat/ai2_arc_ARC_Easy_1_0_0 | lora |
| stream_qed | phi-2 | sordonia/flan-10k-flat/stream_qed | lora |
| definite_pronoun_resolution_1_1_0 | phi-2 | sordonia/flan-10k-flat/definite_pronoun_resolution_1_1_0 | lora |
| super_glue_rte_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_rte_1_0_2 | lora |
| ropes_new_situation_background_answer | phi-2 | sordonia/flan-10k-flat/ropes_new_situation_background_answer | lora |
| dream_read_the_following_conversation_and_answer_the_question | phi-2 | sordonia/flan-10k-flat/dream_read_the_following_conversation_and_answer_the_question | lora |
| cot_sensemaking | phi-2 | sordonia/flan-10k-flat/cot_sensemaking | lora |
| wiki_qa_Topic_Prediction_Answer_Only | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Answer_Only | lora |
| duorc_ParaphraseRC_generate_question | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question | lora |
| dream_generate_last_utterance | phi-2 | sordonia/flan-10k-flat/dream_generate_last_utterance | lora |
| race_middle_Taking_a_test | phi-2 | sordonia/flan-10k-flat/race_middle_Taking_a_test | lora |
| piqa_1_0_0 | phi-2 | sordonia/flan-10k-flat/piqa_1_0_0 | lora |
| cot_ecqa | phi-2 | sordonia/flan-10k-flat/cot_ecqa | lora |
| glue_mrpc_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_mrpc_2_0_0 | lora |
| race_middle_Read_the_article_and_answer_the_question_no_option_ | phi-2 | sordonia/flan-10k-flat/race_middle_Read_the_article_and_answer_the_question_no_option_ | lora |
| ropes_plain_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_plain_background_situation | lora |
| quail_description_context_question_answer_text | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_answer_text | lora |
| qasc_qa_with_combined_facts_1 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_combined_facts_1 | lora |
| cot_creak_ii | phi-2 | sordonia/flan-10k-flat/cot_creak_ii | lora |
| duorc_ParaphraseRC_decide_worth_it | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_decide_worth_it | lora |
| quoref_Answer_Test | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Test | lora |
| wiki_bio_who | phi-2 | sordonia/flan-10k-flat/wiki_bio_who | lora |
| kilt_tasks_hotpotqa_formulate | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_formulate | lora |
| glue_wnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_wnli_2_0_0 | lora |
| gigaword_1_2_0 | phi-2 | sordonia/flan-10k-flat/gigaword_1_2_0 | lora |
| quail_context_description_question_text | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_text | lora |
| dream_answer_to_dialogue | phi-2 | sordonia/flan-10k-flat/dream_answer_to_dialogue | lora |
| cos_e_v1_11_question_option_description_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_id | lora |
| duorc_ParaphraseRC_question_answering | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_question_answering | lora |
| wiki_qa_automatic_system | phi-2 | sordonia/flan-10k-flat/wiki_qa_automatic_system | lora |
| adversarial_qa_droberta_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_based_on | lora |
| super_glue_wsc_fixed_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_wsc_fixed_1_0_2 | lora |
| word_segment | phi-2 | sordonia/flan-10k-flat/word_segment | lora |
| quac_1_0_0 | phi-2 | sordonia/flan-10k-flat/quac_1_0_0 | lora |
| quartz_paragraph_question_plain_concat | phi-2 | sordonia/flan-10k-flat/quartz_paragraph_question_plain_concat | lora |
| wiqa_which_of_the_following_is_the_supposed_perturbation | phi-2 | sordonia/flan-10k-flat/wiqa_which_of_the_following_is_the_supposed_perturbation | lora |
| quartz_use_info_from_question_paragraph | phi-2 | sordonia/flan-10k-flat/quartz_use_info_from_question_paragraph | lora |
| ropes_plain_no_background | phi-2 | sordonia/flan-10k-flat/ropes_plain_no_background | lora |
| race_high_Select_the_best_answer_generate_span_ | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer_generate_span_ | lora |
| glue_cola_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_cola_2_0_0 | lora |
| social_i_qa_Show_choices_and_generate_index | phi-2 | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_index | lora |
| ropes_prompt_bottom_hint_beginning | phi-2 | sordonia/flan-10k-flat/ropes_prompt_bottom_hint_beginning | lora |
| quartz_answer_question_below | phi-2 | sordonia/flan-10k-flat/quartz_answer_question_below | lora |
Last updated on: 2024-04-19 21:34:44+00:00
| ["SCIQ"] |
zhan1993/private_library_phi2_epoch_3 | zhan1993 | null | ["region:us"] | "2024-04-19T14:56:41Z" | 2024-04-19T21:26:29+00:00 | 0 | 0 | ---
{}
---
Number of experts present in the library: 263
| Expert Name | Base Model | Trained on | Adapter Type |
| --- | --- | --- | --- |
| sciq_Multiple_Choice | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice | lora |
| wiki_hop_original_choose_best_object_interrogative_1 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_1 | lora |
| squad_v2_0_3_0_0 | phi-2 | sordonia/flan-10k-flat/squad_v2_0_3_0_0 | lora |
| wiki_qa_exercise | phi-2 | sordonia/flan-10k-flat/wiki_qa_exercise | lora |
| race_high_Taking_a_test | phi-2 | sordonia/flan-10k-flat/race_high_Taking_a_test | lora |
| adversarial_qa_dbert_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_generate_question | lora |
| quoref_Found_Context_Online | phi-2 | sordonia/flan-10k-flat/quoref_Found_Context_Online | lora |
| web_questions_get_the_answer | phi-2 | sordonia/flan-10k-flat/web_questions_get_the_answer | lora |
| duorc_SelfRC_generate_question_by_answer | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_generate_question_by_answer | lora |
| quarel_testing_students | phi-2 | sordonia/flan-10k-flat/quarel_testing_students | lora |
| qasc_qa_with_separated_facts_1 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_1 | lora |
| wiki_qa_Is_This_True_ | phi-2 | sordonia/flan-10k-flat/wiki_qa_Is_This_True_ | lora |
| race_high_Read_the_article_and_answer_the_question_no_option_ | phi-2 | sordonia/flan-10k-flat/race_high_Read_the_article_and_answer_the_question_no_option_ | lora |
| cot_gsm8k_ii | phi-2 | sordonia/flan-10k-flat/cot_gsm8k_ii | lora |
| gem_wiki_lingua_english_en_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_wiki_lingua_english_en_1_1_0 | lora |
| unified_qa_science_inst | phi-2 | sordonia/flan-10k-flat/unified_qa_science_inst | lora |
| quartz_use_info_from_paragraph_question | phi-2 | sordonia/flan-10k-flat/quartz_use_info_from_paragraph_question | lora |
| wiki_hop_original_generate_object | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_object | lora |
| quoref_What_Is_The_Answer | phi-2 | sordonia/flan-10k-flat/quoref_What_Is_The_Answer | lora |
| adversarial_qa_droberta_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_generate_question | lora |
| wiki_bio_comprehension | phi-2 | sordonia/flan-10k-flat/wiki_bio_comprehension | lora |
| adversarial_qa_dbidaf_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_question_context_answer | lora |
| wiki_bio_what_content | phi-2 | sordonia/flan-10k-flat/wiki_bio_what_content | lora |
| web_questions_whats_the_answer | phi-2 | sordonia/flan-10k-flat/web_questions_whats_the_answer | lora |
| wiqa_what_is_the_missing_first_step | phi-2 | sordonia/flan-10k-flat/wiqa_what_is_the_missing_first_step | lora |
| adversarial_qa_droberta_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_question_context_answer | lora |
| ropes_plain_bottom_hint | phi-2 | sordonia/flan-10k-flat/ropes_plain_bottom_hint | lora |
| kilt_tasks_hotpotqa_combining_facts | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_combining_facts | lora |
| cos_e_v1_11_aligned_with_common_sense | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_aligned_with_common_sense | lora |
| gem_web_nlg_en_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_web_nlg_en_1_1_0 | lora |
| web_questions_potential_correct_answer | phi-2 | sordonia/flan-10k-flat/web_questions_potential_correct_answer | lora |
| wiki_qa_found_on_google | phi-2 | sordonia/flan-10k-flat/wiki_qa_found_on_google | lora |
| duorc_ParaphraseRC_extract_answer | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_extract_answer | lora |
| wmt16_translate_de_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_de_en_1_0_0 | lora |
| quail_no_prompt_id | phi-2 | sordonia/flan-10k-flat/quail_no_prompt_id | lora |
| quoref_Guess_Title_For_Context | phi-2 | sordonia/flan-10k-flat/quoref_Guess_Title_For_Context | lora |
| duorc_SelfRC_decide_worth_it | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_decide_worth_it | lora |
| ropes_prompt_mix | phi-2 | sordonia/flan-10k-flat/ropes_prompt_mix | lora |
| adversarial_qa_droberta_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_tell_what_it_is | lora |
| quail_context_question_answer_description_id | phi-2 | sordonia/flan-10k-flat/quail_context_question_answer_description_id | lora |
| gem_common_gen_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_common_gen_1_1_0 | lora |
| duorc_ParaphraseRC_answer_question | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_answer_question | lora |
| super_glue_cb_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_cb_1_0_2 | lora |
| cnn_dailymail_3_4_0 | phi-2 | sordonia/flan-10k-flat/cnn_dailymail_3_4_0 | lora |
| race_high_Write_a_multi_choice_question_options_given_ | phi-2 | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_options_given_ | lora |
| winogrande_1_1_0 | phi-2 | sordonia/flan-10k-flat/winogrande_1_1_0 | lora |
| duorc_SelfRC_extract_answer | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_extract_answer | lora |
| trec_1_0_0 | phi-2 | sordonia/flan-10k-flat/trec_1_0_0 | lora |
| yelp_polarity_reviews_0_2_0 | phi-2 | sordonia/flan-10k-flat/yelp_polarity_reviews_0_2_0 | lora |
| race_high_Select_the_best_answer | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer | lora |
| para_crawl_enes | phi-2 | sordonia/flan-10k-flat/para_crawl_enes | lora |
| qasc_is_correct_1 | phi-2 | sordonia/flan-10k-flat/qasc_is_correct_1 | lora |
| app_reviews_generate_review | phi-2 | sordonia/flan-10k-flat/app_reviews_generate_review | lora |
| ropes_read_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_read_background_situation | lora |
| dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | lora |
| stream_aqua | phi-2 | sordonia/flan-10k-flat/stream_aqua | lora |
| drop_2_0_0 | phi-2 | sordonia/flan-10k-flat/drop_2_0_0 | lora |
| wiki_hop_original_choose_best_object_affirmative_1 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_1 | lora |
| adversarial_qa_dbidaf_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_answer_the_following_q | lora |
| social_i_qa_Generate_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Generate_answer | lora |
| stream_aqua_ii | phi-2 | sordonia/flan-10k-flat/stream_aqua_ii | lora |
| glue_sst2_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_sst2_2_0_0 | lora |
| cot_esnli | phi-2 | sordonia/flan-10k-flat/cot_esnli | lora |
| race_high_Select_the_best_answer_no_instructions_ | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer_no_instructions_ | lora |
| duorc_SelfRC_build_story_around_qa | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_build_story_around_qa | lora |
| cot_esnli_ii | phi-2 | sordonia/flan-10k-flat/cot_esnli_ii | lora |
| quail_no_prompt_text | phi-2 | sordonia/flan-10k-flat/quail_no_prompt_text | lora |
| ropes_given_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_given_background_situation | lora |
| quarel_logic_test | phi-2 | sordonia/flan-10k-flat/quarel_logic_test | lora |
| adversarial_qa_dbidaf_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_based_on | lora |
| super_glue_copa_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_copa_1_0_2 | lora |
| cos_e_v1_11_i_think | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_i_think | lora |
| quail_context_question_description_answer_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_answer_text | lora |
| math_dataset_algebra__linear_1d_1_0_0 | phi-2 | sordonia/flan-10k-flat/math_dataset_algebra__linear_1d_1_0_0 | lora |
| cosmos_qa_1_0_0 | phi-2 | sordonia/flan-10k-flat/cosmos_qa_1_0_0 | lora |
| wiqa_effect_with_label_answer | phi-2 | sordonia/flan-10k-flat/wiqa_effect_with_label_answer | lora |
| app_reviews_convert_to_star_rating | phi-2 | sordonia/flan-10k-flat/app_reviews_convert_to_star_rating | lora |
| qasc_qa_with_separated_facts_2 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_2 | lora |
| race_middle_Select_the_best_answer | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer | lora |
| quartz_having_read_above_passage | phi-2 | sordonia/flan-10k-flat/quartz_having_read_above_passage | lora |
| glue_qqp_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_qqp_2_0_0 | lora |
| cos_e_v1_11_question_description_option_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_id | lora |
| stream_qed_ii | phi-2 | sordonia/flan-10k-flat/stream_qed_ii | lora |
| cos_e_v1_11_question_option_description_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_text | lora |
| imdb_reviews_plain_text_1_0_0 | phi-2 | sordonia/flan-10k-flat/imdb_reviews_plain_text_1_0_0 | lora |
| wiki_hop_original_choose_best_object_affirmative_2 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_2 | lora |
| natural_questions_open_1_0_0 | phi-2 | sordonia/flan-10k-flat/natural_questions_open_1_0_0 | lora |
| wiqa_effect_with_string_answer | phi-2 | sordonia/flan-10k-flat/wiqa_effect_with_string_answer | lora |
| cos_e_v1_11_rationale | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_rationale | lora |
| race_middle_Write_a_multi_choice_question_options_given_ | phi-2 | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_options_given_ | lora |
| wiki_bio_guess_person | phi-2 | sordonia/flan-10k-flat/wiki_bio_guess_person | lora |
| hellaswag_1_1_0 | phi-2 | sordonia/flan-10k-flat/hellaswag_1_1_0 | lora |
| wiqa_does_the_supposed_perturbation_have_an_effect | phi-2 | sordonia/flan-10k-flat/wiqa_does_the_supposed_perturbation_have_an_effect | lora |
| trivia_qa_rc_1_1_0 | phi-2 | sordonia/flan-10k-flat/trivia_qa_rc_1_1_0 | lora |
| lambada_1_0_0 | phi-2 | sordonia/flan-10k-flat/lambada_1_0_0 | lora |
| quoref_Read_And_Extract_ | phi-2 | sordonia/flan-10k-flat/quoref_Read_And_Extract_ | lora |
| quail_context_description_question_answer_id | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_answer_id | lora |
| quail_context_description_question_answer_text | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_answer_text | lora |
| duorc_SelfRC_question_answering | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_question_answering | lora |
| cot_sensemaking_ii | phi-2 | sordonia/flan-10k-flat/cot_sensemaking_ii | lora |
| fix_punct | phi-2 | sordonia/flan-10k-flat/fix_punct | lora |
| squad_v1_1_3_0_0 | phi-2 | sordonia/flan-10k-flat/squad_v1_1_3_0_0 | lora |
| coqa_1_0_0 | phi-2 | sordonia/flan-10k-flat/coqa_1_0_0 | lora |
| glue_qnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_qnli_2_0_0 | lora |
| wiki_qa_Jeopardy_style | phi-2 | sordonia/flan-10k-flat/wiki_qa_Jeopardy_style | lora |
| qasc_qa_with_separated_facts_5 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_5 | lora |
| glue_mnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_mnli_2_0_0 | lora |
| wiki_bio_key_content | phi-2 | sordonia/flan-10k-flat/wiki_bio_key_content | lora |
| dream_generate_first_utterance | phi-2 | sordonia/flan-10k-flat/dream_generate_first_utterance | lora |
| quartz_read_passage_below_choose | phi-2 | sordonia/flan-10k-flat/quartz_read_passage_below_choose | lora |
| web_questions_question_answer | phi-2 | sordonia/flan-10k-flat/web_questions_question_answer | lora |
| glue_stsb_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_stsb_2_0_0 | lora |
| wmt16_translate_tr_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_tr_en_1_0_0 | lora |
| cot_qasc | phi-2 | sordonia/flan-10k-flat/cot_qasc | lora |
| duorc_ParaphraseRC_title_generation | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_title_generation | lora |
| quail_description_context_question_answer_id | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_answer_id | lora |
| wiki_qa_Topic_Prediction_Question_Only | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_Only | lora |
| quoref_Find_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Find_Answer | lora |
| social_i_qa_I_was_wondering | phi-2 | sordonia/flan-10k-flat/social_i_qa_I_was_wondering | lora |
| wiki_hop_original_choose_best_object_affirmative_3 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_3 | lora |
| duorc_ParaphraseRC_build_story_around_qa | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_build_story_around_qa | lora |
| qasc_qa_with_separated_facts_3 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_3 | lora |
| race_middle_Is_this_the_right_answer | phi-2 | sordonia/flan-10k-flat/race_middle_Is_this_the_right_answer | lora |
| paws_wiki_1_1_0 | phi-2 | sordonia/flan-10k-flat/paws_wiki_1_1_0 | lora |
| app_reviews_categorize_rating_using_review | phi-2 | sordonia/flan-10k-flat/app_reviews_categorize_rating_using_review | lora |
| anli_r3_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r3_0_1_0 | lora |
| app_reviews_convert_to_rating | phi-2 | sordonia/flan-10k-flat/app_reviews_convert_to_rating | lora |
| wiqa_what_is_the_final_step_of_the_following_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_is_the_final_step_of_the_following_process | lora |
| adversarial_qa_droberta_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_answer_the_following_q | lora |
| wiki_qa_Decide_good_answer | phi-2 | sordonia/flan-10k-flat/wiki_qa_Decide_good_answer | lora |
| adversarial_qa_dbert_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_answer_the_following_q | lora |
| gem_dart_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_dart_1_1_0 | lora |
| adversarial_qa_dbert_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_tell_what_it_is | lora |
| quarel_choose_between | phi-2 | sordonia/flan-10k-flat/quarel_choose_between | lora |
| duorc_ParaphraseRC_generate_question_by_answer | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question_by_answer | lora |
| wiki_hop_original_generate_subject | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_subject | lora |
| dream_baseline | phi-2 | sordonia/flan-10k-flat/dream_baseline | lora |
| cos_e_v1_11_question_description_option_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_text | lora |
| aeslc_1_0_0 | phi-2 | sordonia/flan-10k-flat/aeslc_1_0_0 | lora |
| anli_r2_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r2_0_1_0 | lora |
| dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | lora |
| quail_context_question_description_answer_id | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_answer_id | lora |
| race_middle_Select_the_best_answer_no_instructions_ | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_no_instructions_ | lora |
| wmt16_translate_ro_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_ro_en_1_0_0 | lora |
| race_high_Is_this_the_right_answer | phi-2 | sordonia/flan-10k-flat/race_high_Is_this_the_right_answer | lora |
| quail_description_context_question_text | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_text | lora |
| sciq_Direct_Question_Closed_Book_ | phi-2 | sordonia/flan-10k-flat/sciq_Direct_Question_Closed_Book_ | lora |
| openbookqa_0_1_0 | phi-2 | sordonia/flan-10k-flat/openbookqa_0_1_0 | lora |
| duorc_SelfRC_title_generation | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_title_generation | lora |
| cot_gsm8k | phi-2 | sordonia/flan-10k-flat/cot_gsm8k | lora |
| quartz_answer_question_below | phi-2 | sordonia/flan-10k-flat/quartz_answer_question_below | lora |
| snli_1_1_0 | phi-2 | sordonia/flan-10k-flat/snli_1_1_0 | lora |
| sciq_Multiple_Choice_Closed_Book_ | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice_Closed_Book_ | lora |
| cot_strategyqa | phi-2 | sordonia/flan-10k-flat/cot_strategyqa | lora |
| qasc_qa_with_separated_facts_4 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_4 | lora |
| ropes_prompt_bottom_no_hint | phi-2 | sordonia/flan-10k-flat/ropes_prompt_bottom_no_hint | lora |
| duorc_SelfRC_generate_question | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_generate_question | lora |
| quartz_given_the_fact_answer_the_q | phi-2 | sordonia/flan-10k-flat/quartz_given_the_fact_answer_the_q | lora |
| anli_r1_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r1_0_1_0 | lora |
| wiki_qa_Topic_Prediction_Question_and_Answer_Pair | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_and_Answer_Pair | lora |
| wiki_qa_Direct_Answer_to_Question | phi-2 | sordonia/flan-10k-flat/wiki_qa_Direct_Answer_to_Question | lora |
| qasc_is_correct_2 | phi-2 | sordonia/flan-10k-flat/qasc_is_correct_2 | lora |
| wiki_hop_original_generate_subject_and_object | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_subject_and_object | lora |
| ai2_arc_ARC_Challenge_1_0_0 | phi-2 | sordonia/flan-10k-flat/ai2_arc_ARC_Challenge_1_0_0 | lora |
| race_middle_Select_the_best_answer_generate_span_ | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_generate_span_ | lora |
| quail_context_question_answer_description_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_answer_description_text | lora |
| quail_context_question_description_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_text | lora |
| wiki_hop_original_choose_best_object_interrogative_2 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_2 | lora |
| duorc_SelfRC_movie_director | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_movie_director | lora |
| quoref_Given_Context_Answer_Question | phi-2 | sordonia/flan-10k-flat/quoref_Given_Context_Answer_Question | lora |
| wiki_hop_original_explain_relation | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_explain_relation | lora |
| super_glue_record_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_record_1_0_2 | lora |
| adversarial_qa_dbidaf_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_tell_what_it_is | lora |
| cot_ecqa_ii | phi-2 | sordonia/flan-10k-flat/cot_ecqa_ii | lora |
| ropes_background_new_situation_answer | phi-2 | sordonia/flan-10k-flat/ropes_background_new_situation_answer | lora |
| web_questions_short_general_knowledge_q | phi-2 | sordonia/flan-10k-flat/web_questions_short_general_knowledge_q | lora |
| wiqa_what_might_be_the_first_step_of_the_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_might_be_the_first_step_of_the_process | lora |
| ag_news_subset_1_0_0 | phi-2 | sordonia/flan-10k-flat/ag_news_subset_1_0_0 | lora |
| race_middle_Write_a_multi_choice_question_for_the_following_article | phi-2 | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_for_the_following_article | lora |
| wmt14_translate_fr_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt14_translate_fr_en_1_0_0 | lora |
| sciq_Direct_Question | phi-2 | sordonia/flan-10k-flat/sciq_Direct_Question | lora |
| super_glue_multirc_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_multirc_1_0_2 | lora |
| dbpedia_14_given_a_choice_of_categories_ | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_a_choice_of_categories_ | lora |
| super_glue_wic_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_wic_1_0_2 | lora |
| social_i_qa_Show_choices_and_generate_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_answer | lora |
| wiqa_what_might_be_the_last_step_of_the_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_might_be_the_last_step_of_the_process | lora |
| quoref_Answer_Question_Given_Context | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Question_Given_Context | lora |
| quoref_Context_Contains_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Context_Contains_Answer | lora |
| cos_e_v1_11_description_question_option_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_text | lora |
| adversarial_qa_dbert_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_based_on | lora |
| multi_news_1_0_0 | phi-2 | sordonia/flan-10k-flat/multi_news_1_0_0 | lora |
| cos_e_v1_11_generate_explanation_given_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_generate_explanation_given_text | lora |
| true_case | phi-2 | sordonia/flan-10k-flat/true_case | lora |
| duorc_ParaphraseRC_movie_director | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_movie_director | lora |
| quartz_answer_question_based_on | phi-2 | sordonia/flan-10k-flat/quartz_answer_question_based_on | lora |
| bool_q_1_0_0 | phi-2 | sordonia/flan-10k-flat/bool_q_1_0_0 | lora |
| quoref_Guess_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Guess_Answer | lora |
| quarel_do_not_use | phi-2 | sordonia/flan-10k-flat/quarel_do_not_use | lora |
| cos_e_v1_11_explain_why_human | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_explain_why_human | lora |
| wiki_qa_Generate_Question_from_Topic | phi-2 | sordonia/flan-10k-flat/wiki_qa_Generate_Question_from_Topic | lora |
| kilt_tasks_hotpotqa_straighforward_qa | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_straighforward_qa | lora |
| adversarial_qa_dbidaf_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_generate_question | lora |
| dbpedia_14_pick_one_category_for_the_following_text | phi-2 | sordonia/flan-10k-flat/dbpedia_14_pick_one_category_for_the_following_text | lora |
| kilt_tasks_hotpotqa_final_exam | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_final_exam | lora |
| quoref_Answer_Friend_Question | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Friend_Question | lora |
| race_high_Write_a_multi_choice_question_for_the_following_article | phi-2 | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_for_the_following_article | lora |
| ropes_prompt_beginning | phi-2 | sordonia/flan-10k-flat/ropes_prompt_beginning | lora |
| adversarial_qa_dbert_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_question_context_answer | lora |
| cot_creak | phi-2 | sordonia/flan-10k-flat/cot_creak | lora |
| gem_e2e_nlg_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_e2e_nlg_1_1_0 | lora |
| cos_e_v1_11_description_question_option_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_id | lora |
| social_i_qa_Generate_the_question_from_the_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Generate_the_question_from_the_answer | lora |
| quarel_heres_a_story | phi-2 | sordonia/flan-10k-flat/quarel_heres_a_story | lora |
| social_i_qa_Check_if_a_random_answer_is_valid_or_not | phi-2 | sordonia/flan-10k-flat/social_i_qa_Check_if_a_random_answer_is_valid_or_not | lora |
| ropes_background_situation_middle | phi-2 | sordonia/flan-10k-flat/ropes_background_situation_middle | lora |
| sciq_Multiple_Choice_Question_First | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice_Question_First | lora |
| cot_strategyqa_ii | phi-2 | sordonia/flan-10k-flat/cot_strategyqa_ii | lora |
| huggingface_xsum | phi-2 | sordonia/flan-10k-flat/huggingface_xsum | lora |
| kilt_tasks_hotpotqa_complex_question | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_complex_question | lora |
| wmt16_translate_fi_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_fi_en_1_0_0 | lora |
| ai2_arc_ARC_Easy_1_0_0 | phi-2 | sordonia/flan-10k-flat/ai2_arc_ARC_Easy_1_0_0 | lora |
| stream_qed | phi-2 | sordonia/flan-10k-flat/stream_qed | lora |
| definite_pronoun_resolution_1_1_0 | phi-2 | sordonia/flan-10k-flat/definite_pronoun_resolution_1_1_0 | lora |
| super_glue_rte_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_rte_1_0_2 | lora |
| ropes_new_situation_background_answer | phi-2 | sordonia/flan-10k-flat/ropes_new_situation_background_answer | lora |
| dream_read_the_following_conversation_and_answer_the_question | phi-2 | sordonia/flan-10k-flat/dream_read_the_following_conversation_and_answer_the_question | lora |
| cot_sensemaking | phi-2 | sordonia/flan-10k-flat/cot_sensemaking | lora |
| wiki_qa_Topic_Prediction_Answer_Only | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Answer_Only | lora |
| duorc_ParaphraseRC_generate_question | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question | lora |
| dream_generate_last_utterance | phi-2 | sordonia/flan-10k-flat/dream_generate_last_utterance | lora |
| race_middle_Taking_a_test | phi-2 | sordonia/flan-10k-flat/race_middle_Taking_a_test | lora |
| piqa_1_0_0 | phi-2 | sordonia/flan-10k-flat/piqa_1_0_0 | lora |
| cot_ecqa | phi-2 | sordonia/flan-10k-flat/cot_ecqa | lora |
| glue_mrpc_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_mrpc_2_0_0 | lora |
| race_middle_Read_the_article_and_answer_the_question_no_option_ | phi-2 | sordonia/flan-10k-flat/race_middle_Read_the_article_and_answer_the_question_no_option_ | lora |
| ropes_plain_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_plain_background_situation | lora |
| quail_description_context_question_answer_text | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_answer_text | lora |
| qasc_qa_with_combined_facts_1 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_combined_facts_1 | lora |
| cot_creak_ii | phi-2 | sordonia/flan-10k-flat/cot_creak_ii | lora |
| duorc_ParaphraseRC_decide_worth_it | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_decide_worth_it | lora |
| quoref_Answer_Test | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Test | lora |
| wiki_bio_who | phi-2 | sordonia/flan-10k-flat/wiki_bio_who | lora |
| kilt_tasks_hotpotqa_formulate | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_formulate | lora |
| glue_wnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_wnli_2_0_0 | lora |
| gigaword_1_2_0 | phi-2 | sordonia/flan-10k-flat/gigaword_1_2_0 | lora |
| quail_context_description_question_text | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_text | lora |
| dream_answer_to_dialogue | phi-2 | sordonia/flan-10k-flat/dream_answer_to_dialogue | lora |
| cos_e_v1_11_question_option_description_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_id | lora |
| duorc_ParaphraseRC_question_answering | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_question_answering | lora |
| wiki_qa_automatic_system | phi-2 | sordonia/flan-10k-flat/wiki_qa_automatic_system | lora |
| adversarial_qa_droberta_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_based_on | lora |
| super_glue_wsc_fixed_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_wsc_fixed_1_0_2 | lora |
| word_segment | phi-2 | sordonia/flan-10k-flat/word_segment | lora |
| quac_1_0_0 | phi-2 | sordonia/flan-10k-flat/quac_1_0_0 | lora |
| quartz_paragraph_question_plain_concat | phi-2 | sordonia/flan-10k-flat/quartz_paragraph_question_plain_concat | lora |
| wiqa_which_of_the_following_is_the_supposed_perturbation | phi-2 | sordonia/flan-10k-flat/wiqa_which_of_the_following_is_the_supposed_perturbation | lora |
| quartz_use_info_from_question_paragraph | phi-2 | sordonia/flan-10k-flat/quartz_use_info_from_question_paragraph | lora |
| ropes_plain_no_background | phi-2 | sordonia/flan-10k-flat/ropes_plain_no_background | lora |
| race_high_Select_the_best_answer_generate_span_ | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer_generate_span_ | lora |
| glue_cola_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_cola_2_0_0 | lora |
| social_i_qa_Show_choices_and_generate_index | phi-2 | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_index | lora |
| ropes_prompt_bottom_hint_beginning | phi-2 | sordonia/flan-10k-flat/ropes_prompt_bottom_hint_beginning | lora |
| duorc_SelfRC_answer_question | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_answer_question | lora |
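Each expert above is a LoRA adapter trained on the named `sordonia/flan-10k-flat` split with `phi-2` as the frozen base model. Below is a minimal, hypothetical sketch of how one such expert could be applied with Hugging Face `transformers` + `peft`, assuming the expert were stored as a standard PEFT adapter in a subfolder named after the expert; the actual serialization of this library is not described in this card, so the repository/subfolder layout and the example expert name are assumptions, not a documented API.

```python
# Hypothetical usage sketch (not part of the original card).
# Assumption: each expert is exported as a standard PEFT LoRA adapter in a subfolder
# named after the expert; the real on-disk format of this library may differ.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "microsoft/phi-2"                            # base model listed in the table
library_id = "zhan1993/private_library_phi2_epoch_3"   # this repository
expert_name = "sciq_Multiple_Choice"                   # any expert name from the table

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id)

# Attach the task-specific low-rank weights to the frozen base model
# (the subfolder argument reflects the assumed layout above).
model = PeftModel.from_pretrained(base, library_id, subfolder=expert_name)

prompt = "Question: What is the powerhouse of the cell?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If the library uses a custom serialization instead, the same idea applies: load the frozen base model once and attach the task-specific low-rank weights of the chosen expert before generation.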
Last updated on: 2024-04-19 21:25:05+00:00
| [
"SCIQ"
] |
zhan1993/private_library_phi2_epoch_4 | zhan1993 | null | [
"region:us"
] | "2024-04-19T14:57:43Z" | 2024-05-08T12:48:45+00:00 | 0 | 0 | ---
{}
---
Number of experts present in the library: 263
| Expert Name | Base Model | Trained on | Adapter Type |
| --- | --- | --- | --- |
| sciq_Multiple_Choice | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice | lora |
| wiki_hop_original_choose_best_object_interrogative_1 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_1 | lora |
| squad_v2_0_3_0_0 | phi-2 | sordonia/flan-10k-flat/squad_v2_0_3_0_0 | lora |
| wiki_qa_exercise | phi-2 | sordonia/flan-10k-flat/wiki_qa_exercise | lora |
| race_high_Taking_a_test | phi-2 | sordonia/flan-10k-flat/race_high_Taking_a_test | lora |
| adversarial_qa_dbert_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_generate_question | lora |
| quoref_Found_Context_Online | phi-2 | sordonia/flan-10k-flat/quoref_Found_Context_Online | lora |
| web_questions_get_the_answer | phi-2 | sordonia/flan-10k-flat/web_questions_get_the_answer | lora |
| duorc_SelfRC_generate_question_by_answer | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_generate_question_by_answer | lora |
| quarel_testing_students | phi-2 | sordonia/flan-10k-flat/quarel_testing_students | lora |
| qasc_qa_with_separated_facts_1 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_1 | lora |
| wiki_qa_Is_This_True_ | phi-2 | sordonia/flan-10k-flat/wiki_qa_Is_This_True_ | lora |
| race_high_Read_the_article_and_answer_the_question_no_option_ | phi-2 | sordonia/flan-10k-flat/race_high_Read_the_article_and_answer_the_question_no_option_ | lora |
| cot_gsm8k_ii | phi-2 | sordonia/flan-10k-flat/cot_gsm8k_ii | lora |
| gem_wiki_lingua_english_en_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_wiki_lingua_english_en_1_1_0 | lora |
| unified_qa_science_inst | phi-2 | sordonia/flan-10k-flat/unified_qa_science_inst | lora |
| quartz_use_info_from_paragraph_question | phi-2 | sordonia/flan-10k-flat/quartz_use_info_from_paragraph_question | lora |
| wiki_hop_original_generate_object | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_object | lora |
| quoref_What_Is_The_Answer | phi-2 | sordonia/flan-10k-flat/quoref_What_Is_The_Answer | lora |
| adversarial_qa_droberta_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_generate_question | lora |
| wiki_bio_comprehension | phi-2 | sordonia/flan-10k-flat/wiki_bio_comprehension | lora |
| adversarial_qa_dbidaf_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_question_context_answer | lora |
| wiki_bio_what_content | phi-2 | sordonia/flan-10k-flat/wiki_bio_what_content | lora |
| web_questions_whats_the_answer | phi-2 | sordonia/flan-10k-flat/web_questions_whats_the_answer | lora |
| wiqa_what_is_the_missing_first_step | phi-2 | sordonia/flan-10k-flat/wiqa_what_is_the_missing_first_step | lora |
| adversarial_qa_droberta_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_question_context_answer | lora |
| ropes_plain_bottom_hint | phi-2 | sordonia/flan-10k-flat/ropes_plain_bottom_hint | lora |
| kilt_tasks_hotpotqa_combining_facts | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_combining_facts | lora |
| cos_e_v1_11_aligned_with_common_sense | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_aligned_with_common_sense | lora |
| gem_web_nlg_en_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_web_nlg_en_1_1_0 | lora |
| web_questions_potential_correct_answer | phi-2 | sordonia/flan-10k-flat/web_questions_potential_correct_answer | lora |
| wiki_qa_found_on_google | phi-2 | sordonia/flan-10k-flat/wiki_qa_found_on_google | lora |
| duorc_ParaphraseRC_extract_answer | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_extract_answer | lora |
| wmt16_translate_de_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_de_en_1_0_0 | lora |
| quail_no_prompt_id | phi-2 | sordonia/flan-10k-flat/quail_no_prompt_id | lora |
| quoref_Guess_Title_For_Context | phi-2 | sordonia/flan-10k-flat/quoref_Guess_Title_For_Context | lora |
| duorc_SelfRC_decide_worth_it | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_decide_worth_it | lora |
| ropes_prompt_mix | phi-2 | sordonia/flan-10k-flat/ropes_prompt_mix | lora |
| adversarial_qa_droberta_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_tell_what_it_is | lora |
| quail_context_question_answer_description_id | phi-2 | sordonia/flan-10k-flat/quail_context_question_answer_description_id | lora |
| gem_common_gen_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_common_gen_1_1_0 | lora |
| duorc_ParaphraseRC_answer_question | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_answer_question | lora |
| super_glue_cb_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_cb_1_0_2 | lora |
| cnn_dailymail_3_4_0 | phi-2 | sordonia/flan-10k-flat/cnn_dailymail_3_4_0 | lora |
| race_high_Write_a_multi_choice_question_options_given_ | phi-2 | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_options_given_ | lora |
| winogrande_1_1_0 | phi-2 | sordonia/flan-10k-flat/winogrande_1_1_0 | lora |
| duorc_SelfRC_extract_answer | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_extract_answer | lora |
| trec_1_0_0 | phi-2 | sordonia/flan-10k-flat/trec_1_0_0 | lora |
| yelp_polarity_reviews_0_2_0 | phi-2 | sordonia/flan-10k-flat/yelp_polarity_reviews_0_2_0 | lora |
| race_high_Select_the_best_answer | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer | lora |
| para_crawl_enes | phi-2 | sordonia/flan-10k-flat/para_crawl_enes | lora |
| qasc_is_correct_1 | phi-2 | sordonia/flan-10k-flat/qasc_is_correct_1 | lora |
| app_reviews_generate_review | phi-2 | sordonia/flan-10k-flat/app_reviews_generate_review | lora |
| ropes_read_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_read_background_situation | lora |
| dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | lora |
| stream_aqua | phi-2 | sordonia/flan-10k-flat/stream_aqua | lora |
| drop_2_0_0 | phi-2 | sordonia/flan-10k-flat/drop_2_0_0 | lora |
| wiki_hop_original_choose_best_object_affirmative_1 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_1 | lora |
| adversarial_qa_dbidaf_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_answer_the_following_q | lora |
| social_i_qa_Generate_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Generate_answer | lora |
| stream_aqua_ii | phi-2 | sordonia/flan-10k-flat/stream_aqua_ii | lora |
| glue_sst2_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_sst2_2_0_0 | lora |
| cot_esnli | phi-2 | sordonia/flan-10k-flat/cot_esnli | lora |
| race_high_Select_the_best_answer_no_instructions_ | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer_no_instructions_ | lora |
| duorc_SelfRC_build_story_around_qa | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_build_story_around_qa | lora |
| cot_esnli_ii | phi-2 | sordonia/flan-10k-flat/cot_esnli_ii | lora |
| quail_no_prompt_text | phi-2 | sordonia/flan-10k-flat/quail_no_prompt_text | lora |
| ropes_given_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_given_background_situation | lora |
| quarel_logic_test | phi-2 | sordonia/flan-10k-flat/quarel_logic_test | lora |
| adversarial_qa_dbidaf_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_based_on | lora |
| super_glue_copa_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_copa_1_0_2 | lora |
| cos_e_v1_11_i_think | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_i_think | lora |
| quail_context_question_description_answer_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_answer_text | lora |
| math_dataset_algebra__linear_1d_1_0_0 | phi-2 | sordonia/flan-10k-flat/math_dataset_algebra__linear_1d_1_0_0 | lora |
| cosmos_qa_1_0_0 | phi-2 | sordonia/flan-10k-flat/cosmos_qa_1_0_0 | lora |
| wiqa_effect_with_label_answer | phi-2 | sordonia/flan-10k-flat/wiqa_effect_with_label_answer | lora |
| app_reviews_convert_to_star_rating | phi-2 | sordonia/flan-10k-flat/app_reviews_convert_to_star_rating | lora |
| qasc_qa_with_separated_facts_2 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_2 | lora |
| race_middle_Select_the_best_answer | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer | lora |
| quartz_having_read_above_passage | phi-2 | sordonia/flan-10k-flat/quartz_having_read_above_passage | lora |
| glue_qqp_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_qqp_2_0_0 | lora |
| cos_e_v1_11_question_description_option_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_id | lora |
| stream_qed_ii | phi-2 | sordonia/flan-10k-flat/stream_qed_ii | lora |
| cos_e_v1_11_question_option_description_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_text | lora |
| imdb_reviews_plain_text_1_0_0 | phi-2 | sordonia/flan-10k-flat/imdb_reviews_plain_text_1_0_0 | lora |
| wiki_hop_original_choose_best_object_affirmative_2 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_2 | lora |
| natural_questions_open_1_0_0 | phi-2 | sordonia/flan-10k-flat/natural_questions_open_1_0_0 | lora |
| wiqa_effect_with_string_answer | phi-2 | sordonia/flan-10k-flat/wiqa_effect_with_string_answer | lora |
| cos_e_v1_11_rationale | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_rationale | lora |
| race_middle_Write_a_multi_choice_question_options_given_ | phi-2 | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_options_given_ | lora |
| wiki_bio_guess_person | phi-2 | sordonia/flan-10k-flat/wiki_bio_guess_person | lora |
| hellaswag_1_1_0 | phi-2 | sordonia/flan-10k-flat/hellaswag_1_1_0 | lora |
| wiqa_does_the_supposed_perturbation_have_an_effect | phi-2 | sordonia/flan-10k-flat/wiqa_does_the_supposed_perturbation_have_an_effect | lora |
| trivia_qa_rc_1_1_0 | phi-2 | sordonia/flan-10k-flat/trivia_qa_rc_1_1_0 | lora |
| lambada_1_0_0 | phi-2 | sordonia/flan-10k-flat/lambada_1_0_0 | lora |
| quoref_Read_And_Extract_ | phi-2 | sordonia/flan-10k-flat/quoref_Read_And_Extract_ | lora |
| quail_context_description_question_answer_id | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_answer_id | lora |
| quail_context_description_question_answer_text | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_answer_text | lora |
| duorc_SelfRC_question_answering | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_question_answering | lora |
| cot_sensemaking_ii | phi-2 | sordonia/flan-10k-flat/cot_sensemaking_ii | lora |
| fix_punct | phi-2 | sordonia/flan-10k-flat/fix_punct | lora |
| squad_v1_1_3_0_0 | phi-2 | sordonia/flan-10k-flat/squad_v1_1_3_0_0 | lora |
| coqa_1_0_0 | phi-2 | sordonia/flan-10k-flat/coqa_1_0_0 | lora |
| glue_qnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_qnli_2_0_0 | lora |
| wiki_qa_Jeopardy_style | phi-2 | sordonia/flan-10k-flat/wiki_qa_Jeopardy_style | lora |
| qasc_qa_with_separated_facts_5 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_5 | lora |
| glue_mnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_mnli_2_0_0 | lora |
| wiki_bio_key_content | phi-2 | sordonia/flan-10k-flat/wiki_bio_key_content | lora |
| dream_generate_first_utterance | phi-2 | sordonia/flan-10k-flat/dream_generate_first_utterance | lora |
| quartz_read_passage_below_choose | phi-2 | sordonia/flan-10k-flat/quartz_read_passage_below_choose | lora |
| web_questions_question_answer | phi-2 | sordonia/flan-10k-flat/web_questions_question_answer | lora |
| glue_stsb_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_stsb_2_0_0 | lora |
| wmt16_translate_tr_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_tr_en_1_0_0 | lora |
| cot_qasc | phi-2 | sordonia/flan-10k-flat/cot_qasc | lora |
| duorc_ParaphraseRC_title_generation | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_title_generation | lora |
| quail_description_context_question_answer_id | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_answer_id | lora |
| wiki_qa_Topic_Prediction_Question_Only | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_Only | lora |
| quoref_Find_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Find_Answer | lora |
| social_i_qa_I_was_wondering | phi-2 | sordonia/flan-10k-flat/social_i_qa_I_was_wondering | lora |
| wiki_hop_original_choose_best_object_affirmative_3 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_3 | lora |
| duorc_ParaphraseRC_build_story_around_qa | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_build_story_around_qa | lora |
| qasc_qa_with_separated_facts_3 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_3 | lora |
| race_middle_Is_this_the_right_answer | phi-2 | sordonia/flan-10k-flat/race_middle_Is_this_the_right_answer | lora |
| paws_wiki_1_1_0 | phi-2 | sordonia/flan-10k-flat/paws_wiki_1_1_0 | lora |
| app_reviews_categorize_rating_using_review | phi-2 | sordonia/flan-10k-flat/app_reviews_categorize_rating_using_review | lora |
| anli_r3_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r3_0_1_0 | lora |
| app_reviews_convert_to_rating | phi-2 | sordonia/flan-10k-flat/app_reviews_convert_to_rating | lora |
| wiqa_what_is_the_final_step_of_the_following_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_is_the_final_step_of_the_following_process | lora |
| adversarial_qa_droberta_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_answer_the_following_q | lora |
| wiki_qa_Decide_good_answer | phi-2 | sordonia/flan-10k-flat/wiki_qa_Decide_good_answer | lora |
| adversarial_qa_dbert_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_answer_the_following_q | lora |
| gem_dart_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_dart_1_1_0 | lora |
| adversarial_qa_dbert_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_tell_what_it_is | lora |
| quarel_choose_between | phi-2 | sordonia/flan-10k-flat/quarel_choose_between | lora |
| duorc_ParaphraseRC_generate_question_by_answer | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question_by_answer | lora |
| wiki_hop_original_generate_subject | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_subject | lora |
| dream_baseline | phi-2 | sordonia/flan-10k-flat/dream_baseline | lora |
| cos_e_v1_11_question_description_option_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_text | lora |
| aeslc_1_0_0 | phi-2 | sordonia/flan-10k-flat/aeslc_1_0_0 | lora |
| anli_r2_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r2_0_1_0 | lora |
| dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | lora |
| quail_context_question_description_answer_id | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_answer_id | lora |
| race_middle_Select_the_best_answer_no_instructions_ | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_no_instructions_ | lora |
| wmt16_translate_ro_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_ro_en_1_0_0 | lora |
| race_high_Is_this_the_right_answer | phi-2 | sordonia/flan-10k-flat/race_high_Is_this_the_right_answer | lora |
| quail_description_context_question_text | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_text | lora |
| sciq_Direct_Question_Closed_Book_ | phi-2 | sordonia/flan-10k-flat/sciq_Direct_Question_Closed_Book_ | lora |
| openbookqa_0_1_0 | phi-2 | sordonia/flan-10k-flat/openbookqa_0_1_0 | lora |
| duorc_SelfRC_title_generation | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_title_generation | lora |
| cot_gsm8k | phi-2 | sordonia/flan-10k-flat/cot_gsm8k | lora |
| quartz_answer_question_below | phi-2 | sordonia/flan-10k-flat/quartz_answer_question_below | lora |
| snli_1_1_0 | phi-2 | sordonia/flan-10k-flat/snli_1_1_0 | lora |
| sciq_Multiple_Choice_Closed_Book_ | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice_Closed_Book_ | lora |
| cot_strategyqa | phi-2 | sordonia/flan-10k-flat/cot_strategyqa | lora |
| qasc_qa_with_separated_facts_4 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_4 | lora |
| ropes_prompt_bottom_no_hint | phi-2 | sordonia/flan-10k-flat/ropes_prompt_bottom_no_hint | lora |
| duorc_SelfRC_generate_question | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_generate_question | lora |
| quartz_given_the_fact_answer_the_q | phi-2 | sordonia/flan-10k-flat/quartz_given_the_fact_answer_the_q | lora |
| anli_r1_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r1_0_1_0 | lora |
| wiki_qa_Topic_Prediction_Question_and_Answer_Pair | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_and_Answer_Pair | lora |
| wiki_qa_Direct_Answer_to_Question | phi-2 | sordonia/flan-10k-flat/wiki_qa_Direct_Answer_to_Question | lora |
| qasc_is_correct_2 | phi-2 | sordonia/flan-10k-flat/qasc_is_correct_2 | lora |
| wiki_hop_original_generate_subject_and_object | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_subject_and_object | lora |
| ai2_arc_ARC_Challenge_1_0_0 | phi-2 | sordonia/flan-10k-flat/ai2_arc_ARC_Challenge_1_0_0 | lora |
| race_middle_Select_the_best_answer_generate_span_ | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_generate_span_ | lora |
| quail_context_question_answer_description_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_answer_description_text | lora |
| quail_context_question_description_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_text | lora |
| wiki_hop_original_choose_best_object_interrogative_2 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_2 | lora |
| duorc_SelfRC_movie_director | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_movie_director | lora |
| quoref_Given_Context_Answer_Question | phi-2 | sordonia/flan-10k-flat/quoref_Given_Context_Answer_Question | lora |
| wiki_hop_original_explain_relation | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_explain_relation | lora |
| super_glue_record_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_record_1_0_2 | lora |
| adversarial_qa_dbidaf_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_tell_what_it_is | lora |
| cot_ecqa_ii | phi-2 | sordonia/flan-10k-flat/cot_ecqa_ii | lora |
| ropes_background_new_situation_answer | phi-2 | sordonia/flan-10k-flat/ropes_background_new_situation_answer | lora |
| wiqa_what_might_be_the_first_step_of_the_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_might_be_the_first_step_of_the_process | lora |
| duorc_SelfRC_answer_question | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_answer_question | lora |
| ag_news_subset_1_0_0 | phi-2 | sordonia/flan-10k-flat/ag_news_subset_1_0_0 | lora |
| race_middle_Write_a_multi_choice_question_for_the_following_article | phi-2 | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_for_the_following_article | lora |
| wmt14_translate_fr_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt14_translate_fr_en_1_0_0 | lora |
| sciq_Direct_Question | phi-2 | sordonia/flan-10k-flat/sciq_Direct_Question | lora |
| super_glue_multirc_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_multirc_1_0_2 | lora |
| dbpedia_14_given_a_choice_of_categories_ | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_a_choice_of_categories_ | lora |
| super_glue_wic_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_wic_1_0_2 | lora |
| social_i_qa_Show_choices_and_generate_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_answer | lora |
| wiqa_what_might_be_the_last_step_of_the_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_might_be_the_last_step_of_the_process | lora |
| quoref_Answer_Question_Given_Context | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Question_Given_Context | lora |
| quoref_Context_Contains_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Context_Contains_Answer | lora |
| cos_e_v1_11_description_question_option_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_text | lora |
| adversarial_qa_dbert_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_based_on | lora |
| multi_news_1_0_0 | phi-2 | sordonia/flan-10k-flat/multi_news_1_0_0 | lora |
| cos_e_v1_11_generate_explanation_given_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_generate_explanation_given_text | lora |
| true_case | phi-2 | sordonia/flan-10k-flat/true_case | lora |
| duorc_ParaphraseRC_movie_director | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_movie_director | lora |
| quartz_answer_question_based_on | phi-2 | sordonia/flan-10k-flat/quartz_answer_question_based_on | lora |
| bool_q_1_0_0 | phi-2 | sordonia/flan-10k-flat/bool_q_1_0_0 | lora |
| quoref_Guess_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Guess_Answer | lora |
| quarel_do_not_use | phi-2 | sordonia/flan-10k-flat/quarel_do_not_use | lora |
| cos_e_v1_11_explain_why_human | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_explain_why_human | lora |
| wiki_qa_Generate_Question_from_Topic | phi-2 | sordonia/flan-10k-flat/wiki_qa_Generate_Question_from_Topic | lora |
| kilt_tasks_hotpotqa_straighforward_qa | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_straighforward_qa | lora |
| adversarial_qa_dbidaf_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_generate_question | lora |
| dbpedia_14_pick_one_category_for_the_following_text | phi-2 | sordonia/flan-10k-flat/dbpedia_14_pick_one_category_for_the_following_text | lora |
| kilt_tasks_hotpotqa_final_exam | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_final_exam | lora |
| quoref_Answer_Friend_Question | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Friend_Question | lora |
| race_high_Write_a_multi_choice_question_for_the_following_article | phi-2 | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_for_the_following_article | lora |
| ropes_prompt_beginning | phi-2 | sordonia/flan-10k-flat/ropes_prompt_beginning | lora |
| adversarial_qa_dbert_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_question_context_answer | lora |
| cot_creak | phi-2 | sordonia/flan-10k-flat/cot_creak | lora |
| gem_e2e_nlg_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_e2e_nlg_1_1_0 | lora |
| cos_e_v1_11_description_question_option_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_id | lora |
| social_i_qa_Generate_the_question_from_the_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Generate_the_question_from_the_answer | lora |
| quarel_heres_a_story | phi-2 | sordonia/flan-10k-flat/quarel_heres_a_story | lora |
| social_i_qa_Check_if_a_random_answer_is_valid_or_not | phi-2 | sordonia/flan-10k-flat/social_i_qa_Check_if_a_random_answer_is_valid_or_not | lora |
| ropes_background_situation_middle | phi-2 | sordonia/flan-10k-flat/ropes_background_situation_middle | lora |
| sciq_Multiple_Choice_Question_First | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice_Question_First | lora |
| cot_strategyqa_ii | phi-2 | sordonia/flan-10k-flat/cot_strategyqa_ii | lora |
| huggingface_xsum | phi-2 | sordonia/flan-10k-flat/huggingface_xsum | lora |
| kilt_tasks_hotpotqa_complex_question | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_complex_question | lora |
| wmt16_translate_fi_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_fi_en_1_0_0 | lora |
| ai2_arc_ARC_Easy_1_0_0 | phi-2 | sordonia/flan-10k-flat/ai2_arc_ARC_Easy_1_0_0 | lora |
| stream_qed | phi-2 | sordonia/flan-10k-flat/stream_qed | lora |
| definite_pronoun_resolution_1_1_0 | phi-2 | sordonia/flan-10k-flat/definite_pronoun_resolution_1_1_0 | lora |
| super_glue_rte_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_rte_1_0_2 | lora |
| ropes_new_situation_background_answer | phi-2 | sordonia/flan-10k-flat/ropes_new_situation_background_answer | lora |
| dream_read_the_following_conversation_and_answer_the_question | phi-2 | sordonia/flan-10k-flat/dream_read_the_following_conversation_and_answer_the_question | lora |
| cot_sensemaking | phi-2 | sordonia/flan-10k-flat/cot_sensemaking | lora |
| wiki_qa_Topic_Prediction_Answer_Only | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Answer_Only | lora |
| duorc_ParaphraseRC_generate_question | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question | lora |
| dream_generate_last_utterance | phi-2 | sordonia/flan-10k-flat/dream_generate_last_utterance | lora |
| race_middle_Taking_a_test | phi-2 | sordonia/flan-10k-flat/race_middle_Taking_a_test | lora |
| piqa_1_0_0 | phi-2 | sordonia/flan-10k-flat/piqa_1_0_0 | lora |
| cot_ecqa | phi-2 | sordonia/flan-10k-flat/cot_ecqa | lora |
| glue_mrpc_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_mrpc_2_0_0 | lora |
| race_middle_Read_the_article_and_answer_the_question_no_option_ | phi-2 | sordonia/flan-10k-flat/race_middle_Read_the_article_and_answer_the_question_no_option_ | lora |
| ropes_plain_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_plain_background_situation | lora |
| quail_description_context_question_answer_text | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_answer_text | lora |
| qasc_qa_with_combined_facts_1 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_combined_facts_1 | lora |
| cot_creak_ii | phi-2 | sordonia/flan-10k-flat/cot_creak_ii | lora |
| duorc_ParaphraseRC_decide_worth_it | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_decide_worth_it | lora |
| quoref_Answer_Test | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Test | lora |
| wiki_bio_who | phi-2 | sordonia/flan-10k-flat/wiki_bio_who | lora |
| kilt_tasks_hotpotqa_formulate | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_formulate | lora |
| glue_wnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_wnli_2_0_0 | lora |
| gigaword_1_2_0 | phi-2 | sordonia/flan-10k-flat/gigaword_1_2_0 | lora |
| quail_context_description_question_text | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_text | lora |
| dream_answer_to_dialogue | phi-2 | sordonia/flan-10k-flat/dream_answer_to_dialogue | lora |
| cos_e_v1_11_question_option_description_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_id | lora |
| duorc_ParaphraseRC_question_answering | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_question_answering | lora |
| wiki_qa_automatic_system | phi-2 | sordonia/flan-10k-flat/wiki_qa_automatic_system | lora |
| adversarial_qa_droberta_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_based_on | lora |
| super_glue_wsc_fixed_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_wsc_fixed_1_0_2 | lora |
| word_segment | phi-2 | sordonia/flan-10k-flat/word_segment | lora |
| quac_1_0_0 | phi-2 | sordonia/flan-10k-flat/quac_1_0_0 | lora |
| quartz_paragraph_question_plain_concat | phi-2 | sordonia/flan-10k-flat/quartz_paragraph_question_plain_concat | lora |
| wiqa_which_of_the_following_is_the_supposed_perturbation | phi-2 | sordonia/flan-10k-flat/wiqa_which_of_the_following_is_the_supposed_perturbation | lora |
| quartz_use_info_from_question_paragraph | phi-2 | sordonia/flan-10k-flat/quartz_use_info_from_question_paragraph | lora |
| ropes_plain_no_background | phi-2 | sordonia/flan-10k-flat/ropes_plain_no_background | lora |
| race_high_Select_the_best_answer_generate_span_ | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer_generate_span_ | lora |
| glue_cola_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_cola_2_0_0 | lora |
| social_i_qa_Show_choices_and_generate_index | phi-2 | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_index | lora |
| ropes_prompt_bottom_hint_beginning | phi-2 | sordonia/flan-10k-flat/ropes_prompt_bottom_hint_beginning | lora |
| web_questions_short_general_knowledge_q | phi-2 | sordonia/flan-10k-flat/web_questions_short_general_knowledge_q | lora |
Last updated on: 2024-04-19 18:28:23+00:00
| [
"SCIQ"
] |
espnet/iwslt24_indic_en_bn_bpe_tc4000 | espnet | null | [
"espnet",
"audio",
"speech-translation",
"en",
"bn",
"dataset:iwslt24_indic",
"arxiv:1804.00015",
"license:cc-by-4.0",
"region:us"
] | "2024-04-19T15:25:40Z" | 2024-04-19T17:08:16+00:00 | 0 | 0 | ---
datasets:
- iwslt24_indic
language:
- en
- bn
license: cc-by-4.0
tags:
- espnet
- audio
- speech-translation
---
## ESPnet2 ST model
### `espnet/iwslt24_indic_en_bn_bpe_tc4000`
This model was trained by cromz22 using the iwslt24_indic recipe in [espnet](https://github.com/espnet/espnet/).
### Demo: How to use in ESPnet2
Follow the [ESPnet installation instructions](https://espnet.github.io/espnet/installation.html)
if you haven't done that already.
```bash
cd espnet
# Check out the exact commit this model was built with, for compatibility.
git checkout 3a161c5ac0f74cc593410a4a32001073ed456580
# Install ESPnet in editable mode.
pip install -e .
cd egs2/iwslt24_indic/st1
# Run the recipe with the downloaded pretrained model (training stages are skipped).
./run.sh --skip_data_prep false --skip_train true --download_model espnet/iwslt24_indic_en_bn_bpe_tc4000
```
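For quick single-file inference outside the recipe, a hedged Python sketch is shown below. It assumes the generic ESPnet2 ST inference interface (`espnet2.bin.st_inference.Speech2Text`, resolved through `espnet_model_zoo`) works unchanged for this model tag; the `run.sh` command above is the supported route, so treat this purely as an illustration.

```python
# Illustrative sketch (not from the original card): translate one English wav file to Bengali.
# Assumption: the generic ESPnet2 ST inference interface applies to this model tag and
# espnet_model_zoo is installed so that from_pretrained can download the model.
import soundfile as sf
from espnet2.bin.st_inference import Speech2Text

speech2text = Speech2Text.from_pretrained(
    model_tag="espnet/iwslt24_indic_en_bn_bpe_tc4000",
)

speech, rate = sf.read("sample_en.wav")  # hypothetical 16 kHz mono English recording
nbest = speech2text(speech)
text, *_ = nbest[0]  # assumed (text, tokens, token_ids, hypothesis) n-best entry layout
print(text)          # Bengali translation of the input speech
```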
<!-- Generated by scripts/utils/show_translation_result.sh -->
# RESULTS
## Environments
- date: `Wed Apr 17 02:51:38 JST 2024`
- python version: `3.10.14 (main, Mar 21 2024, 16:24:04) [GCC 11.2.0]`
- espnet version: `espnet 202402`
- pytorch version: `pytorch 2.1.0`
- Git hash: `83c179ab842987cf01642df2db372aaae260df55`
- Commit date: `Wed Apr 17 00:28:29 2024 +0900`
## st_train_st_conformer_raw_en_bn_bpe_tc4000
### BLEU
|dataset|score|verbose_score|
|---|---|---|
|decode_st_conformer_st_model_valid.acc.ave/dev.en-bn|2.1|19.7/3.6/1.0/0.3 (BP = 1.000 ratio = 1.185 hyp_len = 46094 ref_len = 38883)|
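For reference, the reported corpus BLEU follows directly from the verbose score: the four modified n-gram precisions (19.7/3.6/1.0/0.3, in percent) are combined by a geometric mean and scaled by the brevity penalty, which is 1.000 here because the hypothesis/reference length ratio is 1.185 > 1. A worked form of the standard BLEU formula with these numbers:

```
\mathrm{BLEU} = \mathrm{BP}\cdot\exp\Big(\tfrac{1}{4}\sum_{n=1}^{4}\log p_n\Big)
              = 1.000\cdot\exp\Big(\tfrac{\log 0.197+\log 0.036+\log 0.010+\log 0.003}{4}\Big)
              \approx 0.021 \;(= 2.1\%)
```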
## ST config
<details><summary>expand</summary>
```
config: conf/tuning/train_st_conformer.yaml
print_config: false
log_level: INFO
drop_last_iter: false
dry_run: false
iterator_type: sequence
valid_iterator_type: null
output_dir: exp/st_train_st_conformer_raw_en_bn_bpe_tc4000
ngpu: 1
seed: 0
num_workers: 1
num_att_plot: 3
dist_backend: nccl
dist_init_method: env://
dist_world_size: null
dist_rank: null
local_rank: 0
dist_master_addr: null
dist_master_port: null
dist_launcher: null
multiprocessing_distributed: false
unused_parameters: false
sharded_ddp: false
cudnn_enabled: true
cudnn_benchmark: false
cudnn_deterministic: true
collect_stats: false
write_collected_feats: false
max_epoch: 80
patience: null
val_scheduler_criterion:
- valid
- loss
early_stopping_criterion:
- valid
- loss
- min
best_model_criterion:
- - valid
- acc
- max
keep_nbest_models: 10
nbest_averaging_interval: 0
grad_clip: 5.0
grad_clip_type: 2.0
grad_noise: false
accum_grad: 2
no_forward_run: false
resume: true
train_dtype: float32
use_amp: false
log_interval: null
use_matplotlib: true
use_tensorboard: true
create_graph_in_tensorboard: false
use_wandb: false
wandb_project: null
wandb_id: null
wandb_entity: null
wandb_name: null
wandb_model_log_interval: -1
detect_anomaly: false
use_adapter: false
adapter: lora
save_strategy: all
adapter_conf: {}
pretrain_path: null
init_param: []
ignore_init_mismatch: false
freeze_param: []
num_iters_per_epoch: null
batch_size: 20
valid_batch_size: null
batch_bins: 25000000
valid_batch_bins: null
train_shape_file:
- exp/st_stats_raw_en_bn_bpe4000/train/speech_shape
- exp/st_stats_raw_en_bn_bpe4000/train/text_shape.bpe
- exp/st_stats_raw_en_bn_bpe4000/train/src_text_shape.bpe
valid_shape_file:
- exp/st_stats_raw_en_bn_bpe4000/valid/speech_shape
- exp/st_stats_raw_en_bn_bpe4000/valid/text_shape.bpe
- exp/st_stats_raw_en_bn_bpe4000/valid/src_text_shape.bpe
batch_type: numel
valid_batch_type: null
fold_length:
- 80000
- 150
- 150
sort_in_batch: descending
shuffle_within_batch: false
sort_batch: descending
multiple_iterator: false
chunk_length: 500
chunk_shift_ratio: 0.5
num_cache_chunks: 1024
chunk_excluded_key_prefixes: []
chunk_default_fs: null
train_data_path_and_name_and_type:
- - dump/raw/train.en-bn/wav.scp
- speech
- kaldi_ark
- - dump/raw/train.en-bn/text.tc.bn
- text
- text
- - dump/raw/train.en-bn/text.lc.rm.en
- src_text
- text
valid_data_path_and_name_and_type:
- - dump/raw/dev.en-bn/wav.scp
- speech
- kaldi_ark
- - dump/raw/dev.en-bn/text.tc.bn
- text
- text
- - dump/raw/dev.en-bn/text.lc.rm.en
- src_text
- text
allow_variable_data_keys: false
max_cache_size: 0.0
max_cache_fd: 32
allow_multi_rates: false
valid_max_cache_size: null
exclude_weight_decay: false
exclude_weight_decay_conf: {}
optim: adam
optim_conf:
lr: 0.002
weight_decay: 1.0e-06
scheduler: warmuplr
scheduler_conf:
warmup_steps: 25000
token_list:
- <blank>
- <unk>
- ▁
- ্
- ▁়
- য
- র
- ▁প
- ▁এবং
- ের
- ▁য
- কে
- ▁স
- ▁ব
- ▁যে
- ▁একটি
- রা
- ও
- ▁যা
- ▁ে
- ▁করে
- ▁ত
- ▁সম
- ▁করা
- ▁জন
- ▁করতে
- ▁এটি
- ে
- স
- হয
- ▁ক
- ▁দ
- ▁আম
- ▁এই
- তে
- ▁ট
- ড
- িয
- ায
- ই
- ▁আমাদের
- ▁।
- ন
- ▁না
- ত
- ▁ন
- ▁তাদের
- ▁আপনি
- টি
- ▁পারে
- ▁আমি
- ভাবে
- ▁কিন
- ▁তু
- ী
- ▁তা
- ▁গ
- ▁তার
- ▁রয
- ▁র
- ▁তারা
- ার
- ▁েছে
- ▁থেকে
- ▁া
- দের
- ▁বা
- ▁হবে
- ▁সাথে
- ▁পর
- ▁হ
- ▁নির
- িত
- ▁ম
- ▁অন
- ক
- ▁ছিল
- ▁যান
- ▁ার
- ▁তি
- ▁আপনার
- ▁নিয
- ▁মধ
- ষ
- ▁আর
- ▁তৈরি
- ▁অ
- ▁ধ
- ▁বন
- ▁জ
- ▁তাই
- ▁যখন
- ▁এক
- ▁বিশ
- ▁কার
- ▁শ
- গুলি
- ▁কিছু
- ▁দে
- ল
- ঙ
- ▁বাস
- ▁তবে
- ▁ণ
- ▁যদি
- ▁শক
- ুক
- ▁অর
- ঞ
- ▁এমন
- ▁চ
- ▁ভ
- ▁কাজ
- ▁এখন
- ▁থ
- ▁হল
- ▁তর
- ▁অনেক
- ▁বেশি
- ▁হতে
- ▁পরিব
- ি
- ▁আ
- িক
- ▁করি
- ▁েছিল
- ▁এর
- ▁যবহার
- মাত
- ▁কারণ
- উত
- ▁যায
- াড
- ▁তখন
- ▁ড
- ▁মতো
- পূর
- চ
- ▁পারি
- ▁সত
- প
- ▁সেই
- ▁ষ
- ▁আমা
- ▁তন
- ▁নয
- ▁চে
- া
- ▁শুরু
- ▁মনে
- ▁বর
- ব
- ▁কম
- ▁উদ
- ▁ু
- ▁কী
- ▁ছে
- ▁ষা
- ▁আছে
- দ
- ▁রি
- ▁বি
- ▁বের
- ▁যক
- ▁করেন
- ▁বলে
- ▁একজন
- ▁তিনি
- ▁হিসাবে
- জ
- ▁এটা
- ▁যন
- '-'
- ▁নতুন
- ▁সে
- ▁দিকে
- ','
- ▁করেছে
- ▁করেছিল
- ▁রক
- ▁রে
- িষ
- ▁যেতে
- ▁দি
- ▁উপর
- ▁জলব
- ▁শুধু
- ▁থান
- ▁রতি
- ▁কি
- ▁ল
- ▁যাক
- ▁পারেন
- ▁কর
- গ
- ▁কয
- ▁রস
- ▁রথম
- ▁যেখানে
- ▁থাকে
- ▁টা
- ▁কাছে
- ▁কথা
- বচে
- ▁লক
- ▁সং
- শ
- ▁ঘ
- ▁আগে
- ▁োজন
- ▁একই
- ▁বারা
- ▁করছে
- ▁যার
- ▁সালে
- ▁দেখা
- ংখ
- ▁এখানে
- ▁করেছিলেন
- ▁ষণ
- ▁কেন
- যোগ
- ▁আপনা
- ▁ভাল
- ▁মূল
- ট
- ▁তির
- ▁সাহা
- ▁বৈ
- ম
- ান
- ▁খুব
- ▁বছর
- ▁ফ
- ▁সর
- ৃষ
- েন
- ▁দেশ
- ▁রম
- ▁অস
- ▁বড
- ▁কোম
- কি
- ▁ঠিক
- ▁ধরে
- ▁বিজ
- ▁করবে
- ফ
- ▁গুরুত
- ▁থা
- ু
- ▁রুত
- ▁বিদ
- ▁যারা
- ▁দেখতে
- ▁নি
- ▁সাধারণ
- ▁পূর
- ▁করেছি
- াও
- ▁মান
- ▁ভাব
- বিষ
- েষ
- ▁যিই
- রণ
- ▁ছ
- ▁করুন
- ▁ধি
- ▁উচ
- ▁রতিটি
- ▁পদ
- ▁বিক
- হ
- ▁গল
- ▁পরে
- ৃদ
- চিত
- ▁রশ
- ▁উ
- ▁উচিত
- ▁সহ
- মধ
- ▁চিত
- ▁জীবন
- ▁েক
- ▁20
- ▁কখন
- উন
- ▁বেশিরভাগ
- ▁ছোট
- ▁রাস
- ▁দিতে
- ▁যাচ
- ▁ঘুম
- ুদ
- ▁মানুষ
- ▁কোন
- দক
- রূপ
- ▁চাই
- ▁বিষ
- ▁রভাব
- ▁থাকা
- ▁দুর
- ▁এ
- রীক
- ▁উপা
- ▁দুটি
- ▁বিশেষ
- ▁আক
- ▁অক
- ▁বলতে
- ▁আন
- দীর
- ▁ষে
- ▁নেই
- ▁ধন
- ▁ষেত
- ▁বলা
- ▁তী
- ▁রত
- ▁পুনর
- ▁সক
- নিশ
- ▁শেষ
- ▁সিস
- ▁আসলে
- াম
- এ
- ণ
- ▁ছা
- ▁ঘন
- ▁মার
- মাধ
- ▁ভাগ
- ▁সঠিক
- ▁কেউ
- ▁ইতি
- ▁কিভাবে
- ▁শিল
- ▁পার
- ▁উদাহরণ
- িং
- ▁কারণে
- ▁বল
- ▁সুত
- িজ
- ▁রতিক
- ▁ফির
- ▁মানুষের
- ▁লোক
- ▁ভব
- ▁সমাধান
- ▁আসে
- ▁চলে
- িতে
- ▁কেবল
- ▁রী
- ▁ঞানী
- ▁নিজে
- ভিন
- ▁সেখানে
- ▁অবস
- বর
- ▁যত
- ▁খুঁজ
- ▁কঠিন
- ▁হাজার
- ▁জানেন
- ▁জানি
- খ
- ▁সব
- ▁বে
- ▁যমে
- বিশ
- ▁রহ
- ▁ধান
- ▁টার
- ▁জিনিস
- ▁থনীতি
- ▁ধরনের
- ▁সহজ
- ▁তব
- ▁রাজ
- ▁তিত
- ▁গেছে
- পক
- াহলে
- িকল
- ▁আলো
- ▁রহণ
- ▁করবেন
- ▁10
- ▁অবশ
- ং
- ▁পনা
- ▁পে
- কারী
- ▁ধে
- িদ
- তার
- ▁যেমন
- ▁চা
- ▁তাপ
- ▁যাপ
- ▁দিন
- ▁এত
- ▁ছি
- ▁নে
- ▁সাম
- ▁গত
- তা
- ▁অংশ
- ▁রান
- ছন
- ▁বিত
- ▁কোষ
- ▁সরকার
- ▁োগ
- তি
- বার
- ▁বিশাল
- ▁পেতে
- ▁উপ
- ▁চিন
- '2'
- ▁রাখা
- ুর
- ▁জিন
- ▁বৈশ
- ▁পানি
- ▁গমন
- ▁াই
- ▁ভবত
- ▁সন
- ▁অগ
- চুর
- ▁পরিস
- ▁েছি
- ▁তিশালী
- ▁শতাংশ
- ▁ভিত
- ▁বছরের
- াল
- ▁যাকসিন
- ▁যবাদ
- ▁রকৃত
- ▁মত
- ▁থাপন
- ▁রণ
- ▁আজ
- ▁লোকেরা
- ▁লা
- ▁রের
- ▁রিক
- ▁ষতি
- শব
- ▁থাকতে
- ▁বিল
- ▁দেশে
- ▁উভ
- ▁মস
- ▁জু
- ▁রমণ
- ▁ষমতা
- ▁রদান
- ▁যবস
- নের
- রুদ
- ▁করেছেন
- ▁সার
- টিকে
- ▁গাছ
- ▁জীবা
- গত
- ▁মিলি
- ▁ডলার
- াং
- ▁পণ
- ▁রূপা
- ▁ষম
- ▁গা
- ▁কল
- নেও
- ▁যাট
- জন
- ▁যথ
- ▁পুরো
- ▁অনুমান
- ▁রাখতে
- ▁যাস
- এর
- ▁েছিলেন
- ▁লেখ
- ▁পরিষ
- ▁জল
- ▁রঙ
- ▁মাত
- ▁বিনি
- ▁জা
- ▁তারপর
- ▁তুলনা
- ▁পৃথিবী
- ▁খরচ
- ▁বিবেচনা
- ▁চল
- ▁রিত
- ▁টেলি
- ▁েছিলাম
- ▁টেম
- ▁সি
- বদ
- ▁অনুস
- ▁আলাদা
- ▁তৃত
- গুলিতে
- ▁ভর
- ▁রাপ
- ানো
- ▁সুযোগ
- ▁মুহ
- ▁মাথা
- ▁সংক
- ▁ভাবনা
- ▁যাগ
- সাথে
- ▁মী
- ▁যাত
- ▁নীচে
- ▁তোলে
- ▁বাইরে
- তির
- ▁তিনটি
- ▁বুঝ
- ▁চিকি
- ▁কোনও
- ▁হার
- ▁19
- ▁মক
- ▁থিতি
- ▁গবেষণা
- ▁সরবরাহ
- ▁তারপরে
- ▁একক
- ▁মের
- ▁সৌর
- ▁চাল
- ▁মহিলা
- ▁চর
- ▁কোনো
- ▁নীতি
- ▁বস
- ▁CO
- ▁সবুজ
- ▁অবশেষে
- ▁যুৎ
- ▁বেগ
- ▁রাখে
- ▁দুই
- ▁ডে
- ▁চান
- ▁রোগ
- ▁বলি
- ▁রমাণ
- ▁নিজ
- ▁গি
- ▁ভুল
- ংক
- ▁টের
- ▁শহরে
- ▁অত
- ▁যাবে
- মে
- ▁শহর
- ▁কের
- ▁মহা
- েবে
- ▁কোথা
- ▁সাইড
- ▁নের
- ির
- ▁ঠ
- গুলো
- ফর
- ▁তথ
- ▁পানির
- ▁চালি
- ▁ভালো
- ▁ধরণ
- ▁ধারণ
- ▁মাণ
- ▁াল
- ▁বিপ
- ▁ভাষা
- ▁দরকার
- ▁রিট
- ▁কো
- ▁রদা
- ▁মৃত
- ▁ছেন
- ▁যুতিক
- ▁যকর
- ▁লাস
- ▁তমান
- ▁মিশর
- ▁রাম
- ▁দল
- ▁নিজের
- ▁ডার
- থায
- ▁সারা
- েও
- োড
- ▁সা
- ▁রাতে
- ▁বিস
- টা
- ▁ছিলেন
- ▁ফলাফল
- ▁ডাই
- ▁ঞাসা
- ▁মিথ
- ▁নীল
- ▁হিস
- ▁চুল
- ঘ
- ▁যালে
- ▁ষেপ
- ▁বব
- ▁যু
- ▁বাধ
- ▁দেশগুলি
- ▁মানে
- ▁ান
- ৃশ
- ▁াতে
- ▁আশ
- ▁খারাপ
- ▁লাল
- ূর
- ▁ধার
- ▁তুত
- ষম
- ▁পরিচিত
- ▁বক
- ▁ডা
- ▁নাম
- ▁জার
- ▁ছিলাম
- টোক
- ▁তম
- োক
- ▁যবসা
- ▁বার
- ▁পথ
- লম
- ▁ধতি
- ▁অনুভব
- ▁কৌশল
- ▁রসারিত
- ▁আঘাত
- ▁জিনিসগুলি
- িন
- ▁গতি
- ▁অতির
- ▁পাচ
- াকে
- ▁করছেন
- াঙ
- ▁মাই
- ▁পা
- ▁জানা
- ▁নব
- ▁আশা
- ▁ধারণা
- ▁অভ
- ▁সুবিধা
- ▁সবাই
- না
- েতু
- ংস
- মেন
- ▁পাঁচ
- ▁জীব
- ▁নিষ
- ▁হুমকি
- ▁বালানি
- ▁নিরাপদ
- ূন
- ▁বোধ
- ▁যগুলি
- ▁গে
- রক
- ▁চাপ
- ▁রোটিন
- নী
- ▁যোগ
- ▁রাণী
- ▁ভারতে
- ▁টির
- ▁রকৃতি
- ▁মহামারী
- সের
- ▁মে
- ▁15
- ▁থনৈতিক
- ▁ঝুঁকি
- ▁রকাশ
- ▁তিন
- ▁সুস
- ▁রাজা
- ▁ডিস
- ▁এড
- ▁মারা
- ▁টন
- শীল
- ▁নামে
- ▁দু
- জু
- ▁উপাদান
- ▁অপ
- থ
- ুষ
- ▁পরিণত
- ▁তত
- ▁বেঁচে
- ▁বালানী
- ▁অনুম
- ▁বেশ
- ▁যানো
- ▁ধমান
- লে
- ▁এগ
- ▁থন
- ▁আবার
- ▁অসম
- ময
- ▁উপস
- াস
- ▁যমান
- ▁শিক
- রামর
- ▁হাই
- কাল
- ▁থী
- ▁ঞান
- ▁পাদন
- ▁রিকা
- ▁দূরে
- ▁হলে
- ো
- ▁ভিন
- ▁নিক
- ▁রাব
- ৎ
- ▁কোপ
- ▁শী
- লব
- ▁দা
- হত
- ▁দেখেছি
- ▁বোঝা
- ▁টিক
- ▁মরুভূমি
- ▁বৃহ
- তম
- ▁তিগত
- ▁অফ
- ▁ষতা
- ▁ফলে
- ▁সীমা
- িহ
- ▁সেন
- ▁যুদ
- ▁মন
- ▁দশকে
- ▁সেগুলি
- ▁গড
- ▁যো
- ▁রদ
- ▁11
- ▁4
- ▁পরিবার
- ▁ডিজাইন
- ▁রজাতি
- ▁হাসি
- ▁নামক
- ▁খাদ
- ▁তো
- ▁তিক
- েক
- সূর
- ▁ভারত
- ▁ইন
- ▁যাপক
- ▁আসা
- ▁কিনা
- ▁ঠান
- ▁আত
- ▁অব
- ▁কোষে
- ▁পুরুষ
- ▁ডি
- ▁রার
- ▁সংগ
- ▁যাকে
- ▁থাকবে
- ▁বিন
- ▁ইংতাই
- ▁মোমবাতি
- ▁রাকৃতিক
- ▁লোকেদের
- ীকরণ
- ▁রতিশ
- ▁খ
- ▁চারপাশে
- ▁এশ
- ▁খনি
- ▁উপরে
- ▁রুতি
- ▁পরিমাণে
- ▁আসন
- ▁বিভ
- পড
- ▁দূর
- ▁1
- ▁বেড
- ▁রিস
- ▁কোষগুলি
- ▁আগ
- ▁একে
- ▁রাক
- ▁শহরগুলি
- ▁সেট
- েই
- তটা
- ▁শরীর
- ▁পরিমাণ
- ▁খিঁচুনি
- ▁ফেলে
- গায
- ▁জো
- দিনের
- নির
- ▁ইমিউন
- ▁যাল
- ▁আস
- ▁অপর
- ▁বাচ
- ▁কত
- ৈর
- ▁তরে
- ▁রেক
- ▁করছি
- ▁অনু
- ▁করলে
- ▁আল
- ▁আধ
- ▁ভাবন
- ▁এমআরএনএ
- ▁টেকসই
- ▁রোজান
- ▁পরিচালনা
- ▁যুত
- ▁বছরে
- ▁যালি
- ▁ডেটা
- ▁একাধিক
- ▁দর
- ▁পিছনে
- ▁মাটি
- ▁যতা
- ▁বদা
- ডিগ
- ▁এগুলি
- ▁ঞতা
- ▁আচরণ
- লা
- ফোর
- ▁একবার
- ▁সহা
- ▁শন
- টিস
- ▁রতিরোধ
- ▁আরেক
- ▁6
- াক
- কার
- লি
- বা
- ▁সেরা
- ▁বংস
- ▁লি
- ▁বপ
- ▁অপসারণ
- s
- ▁মোকাবেলা
- ▁রবেশ
- ▁ইলেক
- ▁চিকিৎসা
- ▁ভেঙ
- ▁বিপরীত
- ▁রধান
- মূলক
- ▁হত
- ▁পাশা
- ▁পুর
- ▁দাস
- ▁জনস
- ▁মডেল
- নি
- োয
- ▁থক
- ▁আপ
- াচ
- রিদ
- ছিলাম
- ▁মা
- বে
- ▁এলাকা
- ▁দশক
- ▁ঘটনা
- ভূত
- ▁কন
- ▁শতা
- ▁তরা
- ▁ভার
- রবর
- িনি
- ▁খা
- ▁নিজেদের
- রূপে
- ▁মোট
- ▁কাঠামো
- ▁যোগাযোগ
- ▁বীকার
- ▁ভূমিকা
- বু
- ▁ঠী
- ▁ডিক
- ▁জোর
- '20'
- ▁আমেরিকান
- ▁সাল
- ▁েন
- ▁নৈতিক
- ঠা
- শত
- াপী
- ▁সপ
- াতে
- বেক
- ▁ফল
- পত
- ▁জীবনে
- ▁গো
- ▁যাই
- ▁অদ
- ▁নত
- ▁ডাক
- ▁সেস
- কৃত
- ▁টিভ
- ▁জটিল
- হীন
- ▁কঠোর
- ▁চাহিদা
- ▁মুখোমুখি
- ▁রকৌশলী
- ▁রাচীন
- ▁উৎপাদন
- ▁রগতি
- ▁লেষণ
- ▁জাতিগত
- ▁শোষণ
- ▁খাবার
- ▁ধীর
- ▁পারবেন
- ুনিক
- ▁ভিতরে
- ▁ভাইরাস
- ▁দেখি
- তিতে
- ▁দেবে
- কল
- ▁লেট
- ▁েছেন
- ৃত
- ▁াম
- ▁ইস
- ▁নিচে
- ▁চম
- ▁গদ
- ▁ধু
- ▁তুলত
- ▁টেবিলে
- পী
- মা
- ▁আকার
- ▁অণু
- ▁অনুপ
- ▁টে
- ▁নিত
- ▁মূ
- ▁ওষুধ
- ▁কাছাকাছি
- ▁ডিএনএ
- ▁সুপারনোভা
- ▁শুনতে
- ▁গপাল
- ▁অভাব
- ▁যপ
- ▁মাঝ
- নাক
- ▁আটকে
- ▁বিচ
- ▁গভীর
- ▁যজনক
- মি
- ▁200
- টিক
- ▁যেভাবে
- ▁পাশে
- ▁রতিদ
- ▁সেলস
- ▁ফেল
- ▁নতি
- ▁বাধা
- ▁বজ
- ▁মানব
- ছে
- ▁থতা
- াই
- ▁শতাংশের
- ▁শান
- ▁হন
- ▁নিম
- ▁শিকার
- পাশ
- বৃত
- ▁সমব
- ▁5
- েছে
- ▁তেলাপোকা
- ▁ঝ
- ▁বসে
- ▁গুণ
- ▁ণবাদ
- ▁লিপ
- ▁যব
- ▁ঘটে
- তী
- ▁আইন
- ▁জানে
- ▁আশেপাশে
- ▁নাগরিক
- ▁গঠন
- ▁তরীণ
- ▁যাটার
- ▁অভিজ
- ▁সংযোগ
- ▁চরম
- ▁করব
- জেন
- ▁পানিগুলি
- ▁ডিম
- লার
- াফল
- ▁জলে
- বহা
- ▁উজ
- ▁সামনে
- ▁30
- ▁থিত
- াগত
- ▁ঝাঁক
- ▁পগুলি
- উড
- ▁যাম
- ▁কুল
- ▁থাগুলি
- ▁মানসিক
- ▁বাঁচ
- ▁পরব
- ▁ডেন
- ▁থে
- ▁রেস
- ▁ছবি
- ▁কাছ
- ▁সমান
- বন
- ▁পান
- ▁সিম
- ▁2
- ▁সহক
- ▁ঞা
- ▁লিপিড
- ▁অধ
- ▁কোভিড
- ▁অবদান
- ▁যোগফল
- ▁সোনা
- ▁েকটি
- ▁কালো
- ▁কমাতে
- ▁গবেষকরা
- ▁অনন
- ▁দেখে
- মান
- ▁মুখী
- ▁রজনন
- ▁সূচক
- ▁জাত
- টাই
- ▁পরিবেশ
- ▁আদ
- ▁ইউরোপ
- ▁আচ
- ▁পেট
- ▁লাগ
- ▁ছিন
- ▁যাশ
- ▁জি
- ▁চিম
- োষ
- ▁মু
- ▁যটি
- ▁গেলে
- ▁ষিণ
- ▁ভিদ
- ▁বেত
- ▁রেম
- ▁বিপর
- ▁তিদের
- েশন
- লেন
- ভুক
- ▁রোগী
- ▁পাত
- ▁চার
- বসম
- ▁রাণ
- ▁ঘো
- ▁আরো
- ▁এম
- মন
- ুরক
- ▁খেলা
- দিকে
- োজ
- ▁রো
- ▁বাভাবিক
- '0000'
- ▁যবহ
- ▁নিন
- ▁ইতিহাস
- ▁শত
- ▁পরিচ
- ▁রাথমিক
- ▁ভাইপার
- ▁জনগণ
- ▁থাকলে
- ▁শোনা
- ▁ঘুর
- ▁বিয
- ▁লোব
- ▁বাণ
- ▁পরিবহন
- ▁যবান
- ▁সাদা
- ▁ওজন
- ▁কিছুটা
- ▁চাকা
- ▁অপে
- ▁ঠে
- ▁মিলিত
- ▁সেক
- ▁বাকি
- ▁শরীরে
- ▁যেকের
- েট
- মাস
- ইচ
- ▁পালি
- ▁রচ
- দার
- ▁আকাশে
- ▁মুখে
- ারি
- ালন
- ▁রবাহ
- ▁কিলোমিটার
- ▁আকারে
- ▁শে
- ারিদ
- ▁সুন
- ভাগ
- পু
- ▁লোকের
- '50'
- ▁বাবা
- ▁মিত
- সাম
- ছেন
- বি
- ▁যৌন
- ▁রবণ
- মণ
- ▁বাক
- ▁ধেক
- ▁বহু
- ▁অদলবদল
- ▁তেজনা
- ▁বসবাস
- ▁পরিমাপ
- ▁রাজনৈতিক
- ▁আবাস
- ▁সংকেত
- ▁পরিবেশগত
- ▁বিকাশ
- ▁বিগুণ
- ▁যানেল
- ▁যাঁ
- ▁ইংরেজি
- ▁অভি
- ▁মিনিটের
- াধিক
- ▁যিকার
- ▁জানত
- ▁রজন
- ▁অসু
- রকম
- ▁থিক
- ▁রেখে
- ▁জেনে
- ▁3
- ণনা
- ▁নারী
- ▁সংয
- াত
- ▁টেমের
- ▁রেড
- লান
- ▁ানো
- ▁সাহ
- ▁চাচ
- ▁কাজটি
- ▁রিড
- ▁থল
- ▁পন
- ▁রন
- াজার
- ▁রিন
- ▁কোপে
- ▁গন
- ▁সৌ
- পথে
- ▁লুপ
- ▁সূ
- ▁ভাই
- ▁ষিত
- ▁কেল
- ▁ওঠে
- ▁70
- ▁জাহাজ
- ৷
- ▁থেরাপি
- ▁চাকরি
- ▁মৌলিক
- ▁চাঁদ
- ▁রতিফল
- ▁নেতৃ
- ▁শাসন
- ▁খবর
- ▁নাটক
- ▁ঘুমানো
- ▁করছিলাম
- ▁ইতিহাসে
- ▁চালানো
- ▁ষরিক
- ▁ষত
- ▁বীপ
- ▁আমেরিকানদের
- হিত
- ▁করছিল
- লাম
- ▁আউট
- ▁যাটারি
- ▁কথোপকথন
- ▁তোলা
- ▁থানে
- সংশ
- ▁দেন
- ▁ঘট
- ▁বাতাস
- ▁নিউ
- ▁নেট
- ামাজ
- জনক
- ▁দাম
- শক
- ূ
- ▁যাকসিনগুলি
- ▁নম
- হার
- ▁রাসা
- ▁শিশু
- োল
- ালের
- ▁কোর
- ▁16
- ▁রাত
- ▁চালা
- ▁100
- ▁সমাজ
- কেন
- ▁তাহ
- ভূমি
- ▁কমে
- ▁মাস
- াময
- ▁12
- শিত
- ▁পেশী
- মাক
- a
- ▁ফোকাস
- ▁শিখেছি
- ▁তহবিল
- ▁রতিবেশী
- ▁রভু
- ▁উপকূল
- ▁দুধ
- ▁পরিচাল
- ▁আলোক
- ▁বলুন
- ▁সিজেন
- ▁দাবি
- ▁দূষণ
- ▁শতকে
- ▁যতক
- ▁পাঠানো
- ▁রাণিত
- ▁রোগীর
- ▁কাউ
- ▁রাখবে
- ▁বোত
- ▁জানতে
- টিভ
- ▁ঞানিক
- ষণা
- ▁গেম
- ▁পুনরা
- োচ
- ▁মিল
- ▁হৃদয
- ▁করেছিলাম
- ▁মুখ
- ▁পোর
- বিদ
- ▁রহস
- ▁পাবল
- ৃ
- ▁কেরি
- ▁রণে
- ▁আজকে
- ▁অপরি
- ংশ
- ▁মহিলার
- ▁রগুলি
- ালোক
- েমন
- ▁জিত
- ▁ষক
- ▁হাতি
- ▁একা
- ষিক
- ▁হাতে
- ▁াস
- তুর
- ▁কা
- ▁কোণ
- ▁দশকের
- ফিল
- ▁গুরুতর
- ▁বলছি
- ▁পাই
- ▁আমেরিকা
- ▁8
- ▁বাজার
- াদী
- ▁চোখে
- ▁রমে
- '3'
- িপিং
- ▁দাঁ
- ▁তরুণ
- '9'
- ▁নদী
- ▁যাপন
- ▁চলেছে
- ▁পাঠ
- ▁অবকাঠামো
- ▁কবুতর
- ▁টুকরো
- ▁অনুবাদ
- ▁একটু
- ▁জিডিপি
- ▁নমুনা
- ▁দখল
- ▁যমজ
- ▁24
- ▁রোজেন
- ▁যাপচার
- '26'
- ▁শারীরিক
- ▁তুলনামূলক
- ▁কিত
- হাউস
- ▁সফল
- ▁দরজা
- ▁নিরাপ
- ▁যালসি
- ▁গরম
- ▁দেখেন
- ▁রমিক
- ▁টাও
- ▁গম
- ▁তিগুলি
- ▁চারটি
- ▁দেবতা
- ▁েল
- ▁তবতা
- ▁শহ
- ▁বিতা
- ▁দৈ
- ▁মাক
- ▁সংকট
- ▁অনুসার
- গুণ
- ▁ইহুদি
- ▁ণবাদী
- ▁রুটি
- ▁মালি
- ▁বালি
- ▁পুনরু
- াশ
- ▁জনক
- ▁পৌঁছা
- ▁উপাদানগুলি
- ▁80
- ▁ইক
- ▁ষি
- ▁কোনটি
- ▁কুশ
- দুর
- রি
- োগ
- ▁করেনি
- ুল
- নিয
- ▁নিব
- ▁জের
- িকভাবে
- ▁শুক
- ▁বান
- ▁রাণীর
- ▁মগুলি
- ুরে
- ▁তাত
- ▁শিখ
- ▁কক
- ুনি
- ▁রেই
- ▁কাট
- ▁তিকর
- পোস
- ▁খালি
- ▁যাগুলি
- ▁বনাইজ
- ▁ভূ
- ▁যেগুলি
- ▁লাভ
- ▁গেল
- ▁জাতিক
- ▁পরিশ
- ▁উপরের
- কর
- ▁মেশিন
- েল
- ▁ছেলে
- ▁সু
- ছিল
- ▁জাম
- ▁শানবো
- সাশ
- ূত
- ▁থিতিশীল
- ▁বো
- ▁তুলা
- ▁বকে
- ▁অবি
- '00'
- ▁থানগুলি
- ালকা
- ▁লু
- ▁ইউ
- ▁অধিকার
- ▁রাইলোবাইট
- ▁টেরিওটাইপ
- ানদের
- ▁মিটার
- ▁জাতি
- ▁ভালবাসা
- ▁সীমিত
- ▁অনুশীলন
- ▁মোনালিসা
- ▁জীবনযাপন
- ▁আলোচনা
- ▁লোরো
- ▁আগামী
- ▁তেজিত
- ▁রনালী
- ▁2030
- ▁উঠেছে
- ▁আগুন
- ▁নেতিবাচক
- ▁যাকটাস
- ৎকার
- ▁যালাক
- ▁থনীতিবিদ
- ▁বিরল
- ▁লেজ
- ▁পৌঁছানো
- ▁বীকৃত
- ▁পাহা
- ▁চেইন
- ▁যামেরা
- ▁রু
- ▁রেখা
- মস
- ▁দেখানো
- ▁চীন
- ▁জনসাধারণ
- ▁তাব
- ▁রাজি
- েড
- ▁ছদ
- ▁ডিং
- ▁তালিকা
- নো
- ▁পরিবেশে
- ▁ফি
- ▁রাউন
- ▁রোল
- দৌ
- ▁চোখ
- ▁সাক
- ▁রোম
- ▁ফাঁদ
- শন
- ▁ডস
- '5'
- ▁সাই
- াজ
- ▁শেখা
- ▁জিনগুলি
- িণ
- ▁টিকেল
- কোণ
- ▁গান
- ▁সেতু
- ▁সরকারকে
- ▁মাসের
- ▁রাপক
- ▁খাল
- ▁কান
- মিশ
- শি
- দস
- কোনো
- ▁শিবির
- ▁হো
- ▁ছাত
- সরি
- ▁রহে
- ▁পথে
- ▁বলবে
- ▁এন
- যুদ
- ▁ভু
- মনী
- সে
- ▁অংশে
- ▁খেল
- জার
- ▁াট
- ▁দী
- '7'
- ইহ
- ▁সিরি
- ▁লাইন
- ▁মাসে
- ▁াদে
- ▁চক
- ▁ছেদ
- ▁খু
- ▁ডল
- ▁রীক
- ▁বলছে
- ▁এসে
- ▁উপকরণ
- কিং
- ▁ভাইরা
- ▁ঐতিহ
- '&'
- ;
- o
- p
- Ş
-
- ▁চুনাপাথর
- ▁মেলাটোনিন
- ▁রাজধানী
- ▁ষুধা
- ▁সবকিছু
- ▁অপারেটর
- ▁মবেশ
- ▁হরমোন
- ▁গাছপালা
- ▁উপভাষা
- ▁আইডি
- ▁যসেবা
- ▁দেশাবলী
- ▁যানবাহন
- ারের
- ▁হারানো
- ▁তরাধিকার
- ▁পাবেন
- ▁বিকৃত
- ▁ষেপণ
- ▁জেট
- ▁অংশগ
- ▁জমি
- ▁অভিযোজন
- ▁বাণী
- ▁বিবর
- ▁যাধি
- ▁হোম
- ▁যাটি
- ▁চলগুলি
- ▁বলেছিল
- ▁টাকা
- ▁খোলা
- ▁মীদের
- লো
- ▁রচার
- ▁রেণী
- ▁সামর
- ▁রহী
- ▁মানবতা
- ▁রতিদিন
- ▁দেহ
- ▁নিজেদেরকে
- ▁যাপার
- ▁াগুলি
- ▁ভারতকে
- ধিক
- বিরক
- ▁গর
- ▁টান
- ▁দান
- ▁90
- ▁কাজে
- ▁িগুলি
- ▁বাদ
- ▁সাত
- ▁25
- ▁হবেন
- ▁লেখক
- বাদী
- াউন
- াদের
- ▁পেরেছি
- ▁পক
- ▁পাইক
- '1'
- ▁1000
- িস
- ▁অল
- ▁রাশ
- ▁উপন
- ▁শিকারী
- সাধারণ
- ভার
- ▁ষিণে
- ▁বুদ
- ▁পশ
- ▁ভুলে
- ▁সাপ
- ▁রিজ
- াইড
- ▁ভূত
- ▁50
- ▁লাগে
- ▁বারে
- দিন
- ▁দৃ
- তন
- ▁পাদ
- '8'
- ▁আট
- ▁আকাশ
- ▁নিচ
- ▁বিগ
- '6'
- চে
- ▁খুল
- ▁ভূগ
- ▁দাতা
- ▁বলেছি
- ▁সুলতান
- পর
- কুচি
- ▁তনশীল
- ▁এতটা
- ▁মানি
- ▁অথ
- ীন
- তুল
- ▁লাই
- ▁পাখি
- ▁রোধ
- ▁নিদ
- ধ
- ▁বাধীন
- ▁এসেছি
- ঢ
- ▁ঘর
- তিবাচক
- ▁ডিভাইস
- ▁মোটামুটি
- T
- ▁পৃথক
- ▁যালঘু
- ▁সহযোগিতা
- ▁পুনঃ
- ▁আবেগ
- ▁যকলাপ
- ▁ঝিল
- ▁নিঃসরণ
- ▁আংশিক
- ▁চিৎকার
- ▁লিওন
- ▁মনোযোগ
- ▁আবেদন
- ▁বিবেচ
- ▁আছি
- ▁ফসল
- ▁পোরেশনগুলি
- ▁পেরু
- ▁বিতরণ
- ▁লাইট
- ▁কিলো
- ▁এসেছে
- ▁বহি
- ▁ইউনি
- ▁বামী
- ▁কভার
- ুব
- ▁ফলাফলগুলি
- ▁কৃষি
- ▁তাক
- কারক
- ▁যাকশন
- ▁পাঠা
- ▁নেতা
- ▁খে
- ▁সকলের
- ▁তনে
- নাইট
- পুর
- ডাউন
- ▁যৌনতা
- ▁ডান
- রম
- ▁শীত
- ▁চলা
- ▁কানের
- ▁মিং
- ▁মুদ
- ▁শাসক
- ▁গোপন
- ▁তোমা
- ▁কৃতি
- ▁টেক
- ▁রেট
- ▁সকালে
- ▁যাবেন
- ▁জান
- ▁পরিসরে
- ▁ফুল
- ▁হাত
- ▁এভাবে
- াইভ
- পূরণ
- ▁হলেন
- ▁শিশুর
- শীর
- ▁ডানা
- পতি
- ▁মাতা
- ▁শুনে
- ▁কাটা
- ▁ধারণাটি
- ▁যিক
- ছা
- ▁গাছে
- ▁রমা
- ▁সমাধানে
- সম
- ীদের
- ▁মাল
- িড
- আই
- ▁দার
- মার
- ুন
- ▁ভে
- ▁চতা
- ▁400
- ▁বাহ
- ▁ইতাল
- লস
- ▁রাইভ
- ▁এরিক
- ▁থি
- ▁হারি
- মাঝ
- েইনফ
- ▁পেরেছিল
- '4'
- ▁টিকে
- েব
- থাক
- ▁শর
- ▁ডাকা
- ▁রেখেছিল
- ▁তুলে
- ▁অসুবিধা
- ▁নগুলি
- ▁আই
- ▁টু
- ▁শেষে
- ▁জনপ
- খানে
- ▁বহুল
- ▁দেখেছিল
- ▁ঋণ
- ▁রুপ
- ▁দূষ
- ▁মহাকা
- ০
- ▁আরএনএ
- ▁নাৎসি
- ▁সুপারহিরো
- ▁রতিযোগিতা
- ▁পাইলট
- ▁বজনীন
- ▁ঐতিহাসিক
- ▁চিঠি
- ▁মরিসন
- ▁বাজেট
- ▁সুপারিশ
- ▁পুলিশ
- ▁দেখুন
- ▁অভিযান
- ▁রাহক
- ▁যবধান
- ▁রতিযোগী
- ▁ডানদিকে
- ▁পাগল
- ▁খনন
- ▁ঘটছে
- ▁বেষণ
- ▁সংবেদন
- ▁লাগানো
- ▁দেখেছিলেন
- ▁ঢে
- ▁পারবে
- ▁কাশন
- ▁বিকেলে
- ▁শুনেছেন
- ▁এসেছিল
- ▁যাসিড
- ▁নেমে
- ▁3-
- ▁রশংস
- ▁বাহু
- ▁করত
- ▁রঙে
- গদ
- ▁40
- ▁গক
- ▁শোষ
- ▁জোট
- ▁গণনা
- ▁হাঁট
- ▁বেস
- ▁রিলি
- ▁টিং
- ▁দাদা
- ▁সরকারগুলি
- ▁অংশগুলি
- ▁দোষ
- ▁খলা
- ▁করতেন
- ▁জাপান
- ▁আধি
- ▁বাহিনী
- ঘাত
- ▁সরকারী
- ▁থিতিতে
- ▁পারেনি
- ▁যাংক
- াসের
- াইজ
- ▁মেট
- ঃ
- ▁কুলে
- ▁বাচন
- ▁কোড
- ▁মাঝা
- ▁রেমে
- েইন
- রমাণব
- ▁যাগগুলি
- বহন
- বাজারে
- ▁টেবিল
- ▁চারা
- ▁রাখ
- ▁ঠানিক
- ▁বংসা
- ▁ধকারে
- ▁ঝুল
- ▁18
- ▁থাকেন
- ▁কৃষ
- ▁তক
- ▁চি
- বিরোধ
- হন
- ▁নাক
- ▁যাতন
- মিন
- দা
- চার
- ▁গগুলি
- ▁আছেন
- '21'
- ▁ডলে
- ▁তিটি
- পা
- ▁রোত
- ▁রকেট
- ▁তাহে
- ▁পাস
- ুলার
- ▁বাঁচা
- ▁আসেন
- ▁যথায
- ▁কৃতিক
- ▁ধকার
- ▁পরিষেবা
- বিক
- ▁তগুলি
- ▁যাণ
- ▁দেবী
- ▁ষর
- ▁সীমান
- ▁কৃত
- সি
- ছি
- ▁পিতামাতার
- ভান
- ▁মেঘ
- ▁আরি
- ▁ফাঁক
- েজ
- ধি
- ▁পরি
- ▁মেটা
- টো
- পাস
- নে
- তিগ
- োপ
- মুখী
- ▁যদ
- জীবন
- '0'
- ▁অতি
- ফো
- ▁মিনিট
- ▁রিপ
- ▁মিক
- ▁পিছ
- ▁কু
- ▁যানবাহনে
- ▁শো
- ▁নাগা
- বেন
- ▁পোরেশন
- ▁োগকারী
- শালী
- ▁জাতিসংঘ
- ৃৎপ
- ▁ডিজিটাল
- ▁নিখুঁত
- ▁পিতামহ
- ▁মহাসাগর
- ▁রিলোবাইট
- ▁রীতদাস
- ▁রোপচার
- ▁সেনাবাহিনী
- ▁অপারেশন
- ▁জরুরী
- ▁শেলোব
- P
- ▁অনুভূমিক
- ▁যাটেলাইট
- ▁বাছাই
- ▁যকারিতা
- ▁আঠালো
- ▁কেটলি
- ▁সৈন
- ▁ইনজেকশন
- ▁একাকী
- ▁রতিকৃতি
- ▁মালিকানা
- ▁রাকচার
- ▁তুলেছে
- ▁কবিতা
- ▁আসুন
- কোহ
- ▁বুশ
- মলত
- ▁আসছে
- ▁আশাবাদী
- ▁আসবে
- ▁উৎসাহ
- ▁বোতাম
- পোকা
- ▁অধীন
- ▁একমত
- ▁ভেবেছিল
- ▁সুখ
- ▁গঠিত
- ▁নজর
- ▁বিবরণ
- ▁201
- ▁দেখবে
- ▁লিনিক
- ছ
- ৌক
- ▁সুইচ
- ▁পরিণতি
- ▁মোটা
- ▁উৎপ
- ▁লেটগুলি
- ▁পাথর
- ▁ফেলবে
- ▁ফরাস
- ▁হৃদ
- িগার
- ▁মাপ
- ▁ভাঙ
- ফুস
- ▁ধুদের
- ▁বিরতি
- ▁কতা
- ▁লাইস
- ▁দিল
- ▁থাকি
- ▁নীতিগুলি
- ▁আবদ
- ▁রেলি
- ▁পেস
- ▁মাইক
- ▁টেমগুলি
- ▁গু
- ▁টেশন
- ▁গেট
- নশীল
- ▁লুক
- ▁পরাজ
- ▁পাঁচটি
- ▁বতন
- ▁পাবে
- ▁রোমান
- ▁বাপক
- ▁লাইনের
- ▁00
- পোর
- ▁উঠ
- ▁17
- ▁যাতি
- ▁জাল
- বাইন
- ▁ঘটা
- ▁কমান
- ▁ইমে
- ▁দগুলি
- ▁উপয
- ▁হতাশা
- ▁যুতে
- ▁নিষি
- ভ
- ▁সেল
- োর
- ▁ফিল
- ▁সিটি
- ▁ভবন
- ▁দীপ
- ▁194
- ▁ষাগুলি
- ▁যাগে
- ▁আবর
- ▁সকল
- মিড
- ▁টিকেলগুলি
- ▁কারণগুলি
- ▁দিক
- ▁হেল
- ▁বিট
- ▁রেরণা
- ▁কুশি
- ▁ঘোরা
- ▁ধরা
- ▁সী
- ফি
- ▁রবৃ
- ▁রোটিনে
- ▁কাজগুলি
- ▁মহাকাশে
- ামগ
- ▁অনেকের
- ▁পলি
- ফিক
- ▁রহণকারী
- ▁বিধ
- রেস
- ▁লোককে
- ▁মহাদেশ
- ুত
- ▁ণতা
- ▁রপ
- ▁মিশ
- ▁উৎস
- ▁গার
- কেটে
- গো
- মেডি
- ▁লেখা
- ▁ভিদে
- ▁ষী
- ▁দিনে
- বশেষ
- ▁দেশটি
- ▁মেস
- ▁বিচারে
- ৌ
- ▁ডিত
- ▁আব
- ▁মহাকাশ
- ▁রেডি
- ▁36
- ▁22
- ▁10000
- োস
- ▁বুজ
- কেল
- ▁বাতাসে
- েটর
- ীর
- ▁বেল
- ▁বীপে
- দন
- লাইন
- ূপ
- ▁সাহারা
- ▁রমণে
- ▁হাস
- ▁েজ
- ▁বলতা
- ▁জুন
- কোস
- ▁হই
- ▁মজা
- ▁নটি
- ▁করণ
- বিজ
- ▁যেকোন
- াবে
- াদা
- ▁রুট
- তিক
- ▁থের
- ▁সহজে
- ▁তাকা
- ▁গবেষক
- ▁ধর
- ▁রাইড
- ▁এলোমেলো
- ▁উঁচু
- ▁উদযাপন
- ▁কীটনাশক
- ▁রতিনিধি
- ▁শিরোনাম
- ▁শৈশব
- ▁াকলাপ
- ▁এনকোড
- ▁মজুরি
- ▁লাটিপাস
- ফেডারে
- ▁থেরানোস
- ▁মনোনিবেশ
- ▁ইটনভিল
- ▁লুরোস
- ▁জরিপ
- ▁টিউমার
- ▁মনিকা
- ▁সমাবেশ
- ▁বাসনালী
- ▁ইংল
- ▁খাঁচা
- ▁জীবিকা
- ▁গৃহ
- ▁ভিডিও
- ▁বেলারুশ
- ▁অধিকাংশ
- ▁রিগস
- ▁বাভাস
- ▁তুলবে
- ▁ঝাঁপ
- ▁পোশাক
- ▁খলিল
- ▁রতিবাদ
- ▁সাফো
- ▁আসল
- ▁সহিংসতা
- ▁সমাধি
- ▁কমিশন
- ▁বিদেশ
- ▁রেখেছিলেন
- ▁রাইম
- ▁কিং
- ▁ধতিগত
- ▁টাইন
- ▁অংশীদারদের
- ▁অনুভূতি
- থার
- ▁লাইম
- ▁বীজন
- ▁বিমান
- ▁রপাতি
- ▁কোলে
- ▁যানেলগুলি
- ুঁ
- ▁লিপিডগুলি
- িশোধ
- ▁সেগুলো
- ▁শিশুদের
- ▁লাফ
- ▁বেকার
- ▁সরানো
- ভাইরাস
- ▁অনুরোধ
- ▁শনিক
- ▁মালিক
- ▁রিকান
- ▁জমা
- ▁ফাঁ
- ▁অনুমোদন
- ▁করিনি
- ▁আবি
- ▁গণত
- ▁সভ
- ▁কমানো
- ▁দীতে
- ▁তৃতা
- ▁রতিরোধী
- ▁যুট
- ▁টাল
- িচ
- ▁রোপণ
- ▁বিবাহ
- বহুল
- ▁রবণতা
- ▁করলেন
- রিকানদের
- ▁দাঁত
- ▁আপস
- ▁যাকিং
- ▁যবাহ
- ▁জে
- ▁বোঝাতে
- ▁রামী
- ▁রুব
- ▁2000
- ▁মাছ
- ▁ারিং
- ▁জীবাণু
- ▁লিনার
- ▁ফুট
- ▁ধাপ
- চাপ
- আইনি
- ভাল
- গম
- ▁লেগে
- লুপ
- ▁কাপ
- ▁রহটি
- দূর
- শাস
- ▁টিমে
- ▁ঘটনাটি
- ▁কিলোমিটারের
- ▁সংগঠ
- থিত
- ▁অণুগুলি
- ▁বীর
- ▁সবে
- ▁করুক
- ▁লিফটে
- ▁সমাজে
- ▁ারশ
- ▁খরা
- ▁তেল
- ▁আঁক
- ▁চেল
- পশ
- ▁পরিপ
- ▁শহরটি
- ▁লোড
- েকটি
- ▁বিচার
- ▁লাগা
- বল
- ▁লাইটে
- ▁ভূমি
- ▁ফার
- সব
- ▁গণিত
- ▁চির
- ▁পৌঁছে
- লিপি
- ▁ালা
- াপ
- ▁আনা
- ▁পানিটি
- চক
- ▁186
- াংস
- িডা
- ▁একদিন
- ▁7
- ▁হারা
- কারীদের
- ুখ
- িএস
- ▁দশ
- োঁ
- ▁অফিসে
- ▁মুছ
- িশ
- ▁সিং
- ▁াশা
- ▁75
- ▁কাঠ
- ▁সাপে
- '11'
- ▁যদেব
- েম
- ▁ারগুলি
- কোষ
- ▁ফোন
- সেট
- ▁কোট
- ▁দলগুলি
- িটি
- ▁শুরুতে
- বিয
- তীতে
- িঁ
- ▁রেন
- ▁দামে
- করা
- ▁সেটা
- ▁ধিত
- দল
- লিক
- ▁টল
- ▁রোস
- ▁জেনি
- '60'
- ▁তাকান
- ▁যাং
- ▁পাতা
- ▁ো
- ▁পরিক
- ▁একবারে
- ▁কথোপকথনে
- ▁সমতা
- ▁ইউরোপে
- ▁দির
- হো
- শু
- ▁রিডে
- িদর
- ▁জৈব
- ▁জাদু
- ▁যালো
- ▁উৎ
- '15'
- টল
- ▁সুই
- ▁চত
- াবধানে
- ▁অনুমোদ
- ▁এখান
- ▁কিশোর
- ালোচনা
- িছু
- ▁কাগজে
- ▁তরল
- ▁বিরত
- ▁সমীক
- ▁রামক
- ▁অংশীদার
- বাজ
- ▁খামার
- বেদন
- ▁01
- ▁ধাঁধা
- ▁যাথোজেন
- ৫
- ৭
- ▁আনুমানিক
- ▁কমিউনিটি
- ▁করোনাভাইরাস
- ▁চাবিকাঠি
- ▁জরুরি
- ▁তঃসংয
- ▁তাভাবনা
- ▁নকশা
- ▁সহানুভূতি
- ▁অভিনেতা
- ▁ওভাররাইড
- ▁মামালেক
- ▁যামিগডালা
- ▁হতবাক
- ▁পুঁজিবাদ
- ▁মেঝে
- ▁বপুরুষ
- ▁জেগোটা
- ▁1970
- কাহিনী
- ▁বিবৃতি
- ▁বিরোধিতা
- ▁আইনজীবী
- ▁মচারী
- ▁থাপিত
- ▁ঞাপন
- ▁লেবেল
- ▁মামলা
- ▁কোলাহল
- ▁রচারণা
- ▁সোলার
- '99'
- ▁14
- ▁দোলন
- ▁গিগা
- ▁ভীক
- ▁ঘটবে
- ▁আপাত
- ▁ফেলেছিল
- ▁লাগবে
- ▁দেখছেন
- ▁যালসাই
- '35'
- ▁উপভ
- ▁বরাবর
- ▁ঘটেছে
- ▁ভেবেছিলেন
- লিভার
- ▁পেরেছিলাম
- ▁নিউরন
- ▁আমূল
- ▁ইরানে
- ▁সমতল
- ▁ওভার
- ▁আদেশ
- ▁কাঁটা
- ▁ধারনা
- ▁যুবক
- ▁এসেছিলেন
- ▁তানুকি
- ▁খামারগুলি
- ▁ণালী
- োফা
- ▁দুজন
- ▁ছুট
- ▁চৌ
- ▁সিরিজ
- ▁বলেছিলেন
- ▁উপক
- ধকতা
- ▁খুঁজছেন
- ▁জস
- ▁সচেতন
- ▁করছিলেন
- ▁লিটার
- ▁পিটার
- ▁রথা
- ▁ষমা
- ▁নথি
- ▁টোট
- ▁জামগুলি
- ▁কাগজ
- ▁তকরণ
- াবলী
- ▁পেশীগুলি
- ▁ঋণী
- ▁বছরগুলিতে
- ▁কেপ
- ▁নেহ
- ▁সেবা
- ▁তুলো
- সাঁ
- ▁অভিবাসী
- ▁পৌঁছেছে
- ▁চারণ
- ▁হেড
- ▁উঠে
- ▁যাডি
- ▁রাইভার
- ▁বেনি
- ▁আইল
- ▁সৃজনশীলতা
- ুমি
- ▁কোরবা
- ▁পারব
- চিং
- ▁চলেছেন
- ▁জীবনযা
- বসতি
- ▁রিফ
- ▁ওঠেন
- ▁ছবিটি
- ▁টাফ
- ▁সভা
- ▁ঘাম
- জগতে
- ▁রঙগুলি
- ▁বাই
- ▁তাৎ
- ▁পানী
- ▁শুনি
- শে
- ▁টেট
- ▁কারখানার
- ▁থাকবেন
- ▁যানগত
- াইরে
- ▁দো
- ▁কাঁ
- ▁সজ
- ▁থাংশ
- তীত
- ▁জেনিস
- ▁মি
- সিস
- ▁তাকালে
- োত
- পার
- ▁মোহ
- ▁পিট
- ▁টাপো
- গান
- ▁জিও
- ▁যাদা
- ▁হাম
- ▁মানিত
- ▁পাচার
- ▁সাহসী
- ▁মানগুলি
- '16'
- ুনির
- ▁ফটোগ
- ▁টাইম
- ▁পৃ
- ▁বংশ
- ▁রাণু
- ▁লট
- ▁মৃতি
- অপস
- ▁27
- '23'
- টে
- হারে
- নুপাত
- ▁শট
- ▁ফেলা
- ▁পশু
- ▁গেছেন
- ▁জারি
- ▁রমিত
- ▁রোতা
- টিং
- ▁জেনারেল
- ▁সৎ
- ▁লেন
- ▁বাগত
- ▁রমণকারী
- ▁চিতভাবে
- ▁বাসা
- ▁মডেলগুলি
- ▁টেন
- ▁গুর
- াগুলি
- দেবী
- ▁রোড
- দাতা
- ▁পরিবারগুলি
- ▁টানা
- লগ
- ▁রিটাউনে
- কিলোমিটার
- ▁রতা
- লাভ
- বৈ
- ▁কাম
- কন
- ▁বাব
- ▁সুবিধাগুলি
- ▁কগুলি
- ▁থীর
- ▁বিকভাবে
- রিশ
- ▁বই
- লিস
- ▁নগ
- দেশ
- ▁যৎ
- ▁দূরব
- ▁রাইভে
- ▁শিলা
- ▁চুরি
- মোন
- ▁অতীতে
- ▁সির
- ▁দেখাতে
- ▁হাব
- ▁কেলে
- সোস
- ▁ডাকে
- ▁আলোকব
- ▁তান
- ▁ামি
- টক
- ▁দানি
- ▁ডগুলি
- ▁পেরে
- ▁কেনা
- ▁ষণিক
- ▁কুশে
- টার
- ▁তৃপ
- ▁নেন
- ▁চাপা
- ভা
- দান
- ▁বিধা
- ▁যাকেজ
- েলে
- ▁গোল
- গন
- পরি
- ▁যাসে
- ছিলেন
- ▁চালান
- ▁নতা
- ▁যাশন
- ▁নাল
- ▁কোপটি
- িবাসী
- বশ
- িরোধী
- ▁অনুগ
- সিলি
- মত
- ▁মুন
- ▁ঞানে
- কালে
- ▁চিল
- েছিল
- ▁পরিত
- ▁যথা
- ▁যাকর
- োট
- ইনস
- ▁মিলে
- তঃ
- ▁সিএ
- ▁েলস
- শেষে
- ▁লোম
- জা
- ▁দেরি
- ▁রল
- টেক
- ▁সাহস
- ▁এইচ
- ▁মনো
- ▁রেরণ
- ▁পালা
- নিক
- ▁বাঁকা
- ছুক
- াইট
- ▁ফর
- ▁আটক
- ▁দটি
- ▁রাফ
- ▁মিস
- ▁ধা
- ▁পরিবারে
- ▁উঠত
- নুষ
- োম
- োদ
- খানার
- ▁অশ
- িরে
- বিত
- ভিল
- ▁ধুত
- ▁পাব
- ▁রেখেছি
- িটা
- ৈ
- াগন
- ▁কামান
- টাস
- ▁কারখানা
- ▁ধানে
- ▁দিত
- ▁অপরাধ
- ভি
- ালী
- রিকা
- ▁20000
- ▁সংঘ
- ▁সৃজনশীল
- '18'
- ▁অভিবাস
- ▁বলব
- ▁ধারক
- খানা
- রাধিকার
- ▁থাকব
- ▁লিখ
- ▁অমরজ
- ▁রপাত
- ▁উঠবে
- ▁রোমা
- াষী
- ▁দেখেছে
- ▁ডিশনার
- ▁াসে
- ▁নীত
- াগারে
- াফা
- ▁160
- জির
- াব
- '87'
- ▁ইনজেক
- ▁গোলকধাঁধা
- C
- L
- r
- ▁ইঁদুর
- ▁ইউটিলিটি
- ▁ইমিউনোথেরাপি
- ▁এলিভেটর
- ▁কাদামাটি
- ▁কৌতূহল
- ▁চিরতরে
- ▁ধারাবাহিক
- ▁মিসৌরি
- ▁রচারাভিযান
- ▁রাজকুমার
- ▁রেনেসাঁ
- ▁শিথিল
- ▁ষরেখা
- ▁হাসপাতাল
- ▁অবজারভেটরি
- ▁পরিকাঠামো
- ▁ররেখা
- ▁তলদেশে
- ▁শৈল
- ▁মদপুর
- ▁ওলাফ
- ▁গতিশীলতা
- ▁সাসপেন
- ▁ঘেটো
- ▁সংহতি
- ▁আইটেম
- ▁মেরামত
- ▁মৃদু
- ঁচট
- ▁96
- ▁রজেকশন
- ▁কংগ
- ▁রাচীর
- ▁রাজনীতিবিদ
- ▁সমালোচনামূলক
- ঘাট
- ▁রাখুন
- ▁উপনিবেশ
- ▁হিম
- ▁অনুকরণ
- ▁রামবাসী
- ▁দেশিকা
- টেইনার
- ▁ডেনিম
- ▁সাজানো
- রফেস
- ▁ষপাত
- ▁সাগর
- ▁পারতাম
- ▁মোতা
- ▁জিনোম
- ▁2019
- ▁এনেছিল
- ▁লুকানো
- িউএ
- ▁অভিজাত
- ▁রিটিশ
- ▁গুণমান
- ▁অভিনব
- ▁পরিপূরক
- ▁টগুলি
- ▁ষাপটে
- ▁রিলিফ
- ▁টানেল
- ▁জেগ
- ▁সুপার
- কটের
- ▁বৈধ
- ▁সেথেস
- ▁কাঁপ
- ▁জটিলতা
- ▁ফোরণ
- ▁টুকরা
- ▁ভরশীল
- ▁শদাতা
- ▁বালতি
- ▁পালক
- লিথি
- ▁ধরন
- ▁পেশা
- ▁পরিণতিগুলি
- ▁বাগান
- ▁মনোভাব
- ▁অনলাইন
- ▁থাপক
- ▁বলেছে
- ▁সেটিং
- ▁ডিফ
- ▁চোরা
- ▁ভিড
- ▁দেখেছেন
- ▁বোঝানো
- ▁শকুন
- ▁থাপকতা
- রবী
- লানিক
- ▁নীতিগত
- ▁করেননি
- ▁বিভাগে
- ▁দিকটি
- ামী
- ▁ওঠা
- িসিসি
- ▁তাকাতে
- ▁বলেছেন
- ▁পিতৃ
- ▁ফেট
- ▁পাঠক
- নাতক
- ▁দাগ
- ▁পারিনি
- ▁চেতনা
- ▁কফি
- ▁পাঠান
- ▁অবসান
- রোধে
- ▁রতিবার
- ▁মুদি
- ▁মূলধারার
- ▁বাতি
- ▁রাগন
- ▁গাম
- াবস
- ▁শনগুলি
- পোলি
- ▁বাধীনতা
- ▁ভাস
- ▁রাণীগুলি
- ▁আইস
- ▁কিছুর
- ▁জানতেন
- ▁জানু
- ▁রামগুলি
- ▁লোহ
- ▁কেজি
- ▁সাব
- ▁রাইট
- াচল
- ▁ইট
- ▁ছাপ
- বৃ
- ▁বিপদ
- সিভ
- ▁কলে
- ▁অসহ
- ▁টেরল
- ▁খাই
- ▁রমিকরা
- আইভ
- ▁উপাদানটি
- ▁মহামারীটি
- ▁যালোকে
- ▁সমাধানগুলি
- ▁যি
- ▁থিতিশীলতা
- ▁ওটা
- ▁রেখেছে
- ▁আদালতে
- ▁রোচ
- ▁গণ
- ▁দলে
- ভিয
- ▁উপহা
- ডেট
- ▁খালটি
- সুবিধাজনক
- ▁মগ
- ▁লালন
- ▁কণা
- ▁নিষেধ
- ▁১
- েলাই
- াবল
- ▁চেক
- ▁নই
- ▁অভিন
- ▁টেমে
- ▁ভট
- োন
- ▁গভীরতা
- ▁ষণগুলি
- ▁সারি
- ▁বরে
- ▁ধেকের
- ▁যাসী
- ▁দিরে
- ▁দৈন
- কড
- ঁ
- মাদ
- ▁টরের
- ▁কারো
- ▁গী
- ▁ফু
- ▁রাজারা
- জেনি
- কো
- ▁বীপগুলি
- ▁কণ
- ▁বাঁক
- ▁পিতামাতা
- ঠিত
- ▁সবাইকে
- ▁থির
- ▁মিনি
- বাহ
- ▁বাসী
- ▁তনগুলি
- ডো
- ▁থাপনা
- রো
- ▁াটি
- ▁রীর
- ▁নেবে
- ▁বুজে
- ▁রীন
- লুস
- রিটি
- নোর
- ▁500
- ▁এলাকাগুলি
- ▁উই
- ▁রোটিনটি
- তাকা
- ঠ
- শনে
- ▁360
- ▁বনে
- ▁সুয
- ▁ফিউ
- বুন
- ▁13
- ▁সাইটে
- শনার
- লাঙ
- টান
- ▁খোঁজ
- ▁ডাল
- ▁কপি
- ▁তুকি
- ▁ধাত
- জাত
- বেচ
- ▁হব
- ▁ইতালি
- োশ
- ▁জুম
- কক
- রুন
- মূল
- ▁মেইন
- ▁েলসে
- পথগুলি
- নিম
- লজি
- ▁টক
- হারা
- ▁দিই
- ▁দোকানে
- পিং
- সাধ
- চালান
- ▁রতিরোধে
- পেস
- '37'
- ▁নিল
- ▁খুলি
- গল
- ধান
- ▁ফের
- ▁জগুলি
- ▁বেলা
- পথ
- ▁কনস
- ▁শেল
- বিল
- ▁নেভিগে
- ▁জাগ
- জাতিক
- উ
- ▁রবাহে
- ুলে
- ফোন
- আপ
- তারা
- ▁অফিস
- ▁পশম
- ▁যুগে
- ▁যাটিন
- ▁ততটা
- লভ
- ▁মহাদেশে
- বো
- েমের
- ▁উৎসে
- ারবার
- ▁কমলা
- পাল
- ▁চলছ
- ভেন
- লিম
- মুন
- ▁202
- সেপ
- দানি
- মেলা
- ▁লিং
- িবার
- ▁সাইট
- ▁কনসা
- ঝর
- িকেল
- াশি
- ঝ
- ▁জানান
- ▁রমাণবাদ
- নেস
- শহ
- ▁নাচ
- ▁যাব
- ফেরা
- ▁124
- ▁পতন
- '12'
- ▁ভরা
- ▁ঘরে
- ▁বাম
- ▁লিক
- লানো
- ▁বী
- খা
- গোল
- ▁রতার
- ▁টেমটি
- '44'
- ▁জেনারে
- ▁রাশি
- ▁ভূমিক
- থি
- ▁ভাষ
- ▁ঝর
- ▁সুদ
- বাসী
- োজা
- ▁হতাশ
- লিং
- ▁চিনি
- হর
- ▁পারলে
- সাইক
- ▁196
- ▁সবা
- ▁ফুলে
- ▁আচরণে
- ভিউ
- হাই
- মদা
- '56'
- ▁তিরা
- ▁ষেপে
- ▁ধারে
- ▁নাইজ
- ▁300
- ▁অনুর
- ামেলা
- ▁মিউ
- ▁দেখ
- ▁থাম
- ▁অভিযোজ
- ▁হাঁটা
- মিক
- শাপ
- ানা
- ▁যাকটি
- ▁রবাল
- ▁বিতর
- কিউ
- ▁সিট
- ধীন
- ▁150
- ঁজ
- ▁গীত
- ▁থাকত
- াঁচে
- '600'
- ▁শুনেছে
- ▁ফসফোলিপিড
- ▁বাঁধ
- ▁বীজ
- কূল
- ▁খুঁজছে
- ▁রাজনীতি
- ▁রজেক
- ৯
- m
- u
- ğ
- ▁অববাহিকা
- ▁এনজাইম
- ▁এলিজাবেথ
- ▁কাটলফিশ
- ▁কূটনীতি
- ▁গিলগামেশ
- ▁টিরিওটাইপ
- ▁নৌবাহিনী
- ▁ফাংশন
- ▁ফারেনহাইট
- ▁বাংলাদেশ
- ▁ভলিউম
- ▁মসৃণ
- ▁মোকাবিলা
- ▁যসাগর
- ▁যাভিগেশন
- ▁যালগরিদম
- ▁রাঘিমাংশ
- ▁সমঝোতা
- ▁সালতানাত
- ▁সোককেলেটন
- ▁একাডেম
- ▁দেহভাজন
- ▁বংশধর
- ▁মহাকাশচারী
- ▁রজাপতি
- ▁হেঁটে
- ▁এমারসন
- ▁ছাসেবক
- ▁তোরাঁ
- ▁ধবিরতি
- ▁বিনোদন
- ▁রুসেডার
- ▁াশোনা
- ▁রণেতাদের
- ▁লাপনা
- দারুণ
- ▁যযুগ
- ১৯
- ▁নৃশংস
- ▁গৃহীত
- ▁সিনেমা
- ▁নেবুলা
- ▁ইমাল
- ▁শাটার
- ▁মহাকাশযান
- ▁পিঠ
- ▁থাকুন
- ▁ভালোবাস
- ▁লেপটিন
- ▁সহযোগী
- ▁পটভূমি
- ▁অবাধ
- ▁দুঃখজনক
- ▁ঢেউ
- ▁অসীম
- '97'
- ▁উপযোগবাদী
- ▁অতিথি
- ▁একেবারে
- ▁াবেটিস
- ▁কভারেজ
- ▁জোরালো
- ▁মশলা
- ▁শেঠ
- '94'
- ▁লেগেছিল
- '95'
- পোষণ
- ▁হিপ
- ▁তশাসন
- ▁টিপাত
- ▁হাজি
- ▁রবিন
- ▁যাটিপাস
- ▁টারনেট
- ▁1930
- ▁মিছিল
- ▁মাঠ
- ▁অটোম
- ▁লিখেছ
- ▁দেখছিলেন
- ▁হিংস
- ▁তৃণ
- '98'
- ▁মোনা
- ▁াংখী
- ▁উঠছে
- ▁আইকন
- ▁ফেলুন
- ভাটা
- লিডার
- ▁পিউট
- ▁যোগদান
- ▁ফীতি
- ▁মিটিং
- ▁বোমা
- ▁রাইবো
- ▁রণালী
- ▁টোরে
- ▁রতিকূল
- ডিপি
- ▁লোরেন
- ▁টারবাইন
- ▁টিবডিগুলি
- ▁ঢিবি
- ▁নোঙ
- ▁ছাদন
- ▁হেসে
- ▁বিভাজ
- ▁গুজরাট
- ▁োএ
- ▁120
- ▁খুনি
- োলেট
- ▁এসি
- ▁55
- ▁ডিজে
- ▁সিকো
- ▁ভেলা
- ▁সাইটগুলি
- ▁যাকচার
- ▁কণাগুলি
- ▁মতামত
- ▁কারখানাগুলি
- ▁ফুটপ
- ▁রাখছেন
- ▁শোনে
- ▁ষতিকর
- ▁ছাকৃত
- ▁শহরগুলো
- ▁াকরণ
- ▁যাদুঘর
- ▁সাগু
- ▁কেলিং
- ুথ
- োনাইজ
- ▁রগামী
- ▁যাসীদের
- ▁ভীত
- ▁রচলন
- ালো
- ▁টিপস
- ▁মৌ
- ▁যাফো
- ▁উঠবেন
- ▁সংবাদ
- ▁কাঁচ
- ▁চালনা
- ▁রেজার
- ▁রাসাদ
- ▁উপকরণগুলি
- ▁এগুলো
- ▁নীতিগ
- ▁0
- ▁নিকট
- ▁টেরিওটাইপগুলি
- ▁ফোরক
- ▁টোন
- ▁খনিজ
- ▁অবনতি
- ▁বনভূমি
- ▁যাটারিগুলি
- গাল
- ▁ডারগ
- ▁লুপগুলি
- ▁লজ
- ▁রনগুলি
- কিশোরী
- ▁ছেলেদের
- ভাষী
- ▁ডিপ
- ▁জুনাজপুস
- ▁গোলা
- ▁গভ
- ▁অধিক
- ▁মাইলের
- ▁কুই
- ▁সমালোচনা
- ▁যাফোস
- ▁অধিকারী
- ▁যবোধ
- ▁ধারকরা
- বিধি
- ▁ইকো
- ▁রিটেন
- ুভ
- ▁উপযোগ
- ▁নভ
- ▁ঠীগুলি
- ▁ঘটনাগুলি
- ▁মাংস
- ▁বাদাম
- োচন
- ▁লেব
- ▁বলছেন
- ▁চুষ
- ▁ঠানগুলি
- ▁শাক
- ▁কোঁ
- ▁বাভাবিকভাবে
- নুকি
- ▁লাইড
- িবিটি
- ▁যবসাগুলি
- িকে
- ▁যুগুলি
- ▁টিপ
- ▁রেফ
- ▁কাটে
- োলজি
- ঘর
- ▁টিমাই
- ▁গজা
- ▁সুযোগগুলি
- ▁বাজি
- ▁বিজি
- নেকের
- ীমা
- গুঁ
- ▁যাকরণ
- ▁গুন
- ▁বাঘ
- ▁দেহে
- সা
- '79'
- ▁যেকটি
- ▁টারে
- সিফ
- ▁লেপ
- ▁শুনেছিল
- ▁শেড
- ▁সুইড
- ▁াটে
- ▁কলাম
- ▁তেমন
- ▁ামে
- বাইক
- ▁ঢালা
- ▁মুখীতা
- ▁শিশুরা
- ▁বরফ
- ধারা
- ▁পৌ
- ▁কোল
- ▁তালা
- ▁লিন
- ▁খালে
- ুলেট
- ▁টিভি
- ▁রিম
- ▁সেনে
- ▁থামা
- ▁মিটারের
- ▁আসি
- ▁টুল
- ▁ভেজ
- ▁লাশ
- ▁রাগ
- ামাল
- টারের
- ▁রিজটি
- ▁দোর
- ▁যাসটি
- টকে
- ▁চালাবে
- ফিস
- ▁সাজ
- ▁যুব
- েবল
- ▁দিলে
- সিন
- ▁অজ
- ▁শা
- ▁টেজ
- ▁শতাংশে
- ▁ডু
- িজম
- জমে
- সাদ
- ▁অবা
- ▁পুরুষকে
- হাঁ
- ▁লুকো
- ▁মেঘে
- জান
- বক
- ▁যুতি
- ▁শতক
- ▁জিম
- রাণি
- ▁যানু
- সো
- ▁মিলন
- ▁চাইবে
- কৃতির
- ▁রোভ
- ▁মাইল
- '30'
- ▁পরিষেবাগুলি
- ▁আমানি
- ▁ছামত
- '500'
- বোল
- ▁ছবিগুলি
- ▁অরি
- ালি
- ▁নিই
- ▁তেলাপোকার
- কারে
- ▁রামে
- ▁সূচ
- ▁ারো
- ▁যাসি
- ▁টেলিভিশনে
- বুক
- টস
- ▁দেখান
- ুসং
- কু
- ▁আদি
- ণের
- িটাল
- ▁মরি
- রীদের
- বিচ
- ▁ধিম
- ▁রিটে
- ▁চাচা
- ▁গানে
- ▁শিবিরে
- টেন
- ▁দুঃ
- ▁টিকেলে
- ▁কেনে
- '000'
- ▁যুগ
- াশা
- '48'
- ▁কুর
- শান
- জিতে
- ▁খেলে
- ▁পরম
- পির
- ▁আঁ
- ভাব
- ানু
- ▁মাতৃ
- পশম
- ▁ষাত
- াণ
- ৃপ
- ▁চো
- কাঠি
- লন
- টারি
- ফল
- করণ
- টন
- ▁অতীত
- াইজার
- আর
- ▁ঝুলে
- িওল
- খোঁজ
- বোধ
- ▁গাগুলি
- ▁পেল
- বেশি
- ঘুরি
- কী
- ▁যাটা
- 08
- িব
- িৎ
- চিব
- '19'
- লাইট
- নৈতিক
- শুদ
- শম
- ▁সরকারে
- গভীর
- রোটিন
- '80'
- লেট
- ভাষা
- নাইজ
- হাত
- অপ
- ধারণ
- জানা
- ▁ঘটান
- অ
- ▁193
- কাজ
- ▁শুনেছি
- জুন
- িউ
- ▁নদ
- চুরি
- হেল
- ▁শেখান
- দি
- ঁকি
- ▁আসাদ
- লোভন
- ▁রিভে
- োগান
- নিউ
- ▁পৌঁছ
- াগ
- ▁াপথ
- ▁শোক
- ফেল
- মাণ
- ঘন
- তাই
- ▁ভুগছ
- ▁তৃ
- ▁বুঝি
- ▁দেখছি
- বসে
- ▁উঠল
- ▁টিম
- ▁180
- ▁জলা
- চা
- ▁লেগ
- ডিএ
- মাই
- ফিউ
- রিসে
- ▁পারমা
- ▁বেষ
- ▁মিলনে
- ▁110
- াংশের
- েটিক
- ▁800
- জিশন
- ▁ধারণে
- ▁তোম
- োনে
- ▁বলত
- ▁রাচ
- ▁বেগে
- ালদে
- ▁শুন
- ▁যারো
- ▁3000
- ▁1500
- ডেন
- ▁মূলধারা
- সিকতা
- ▁ছু
- ▁তাঁ
- ▁খোঁ
- ▁ভাবি
- ▁জুনাজপু
- ▁চালাব
- ▁পাথ
- গণিত
- ▁থেরাপিউটিক
- ▁মেক
- ▁ইংরেজ
- হীনতা
- ▁সেখান
- াহু
- ▁ফুটে
- হাউ
- ▁একগু
- ▁রাখছে
- ▁চমক
- ▁টিবডি
- ▁রাউ
- ৌরব
- ৎসাহ
- ভাসক
- ▁এসমেরাল
- e
- i
- ঊ
- ৬
- ▁1988
- ▁1990
- ▁অবৈধ
- ▁আকসুম
- ▁আজারবাইজান
- ▁ইসমাইল
- ▁কৌতুক
- ▁জরিমানা
- ▁তকণিকা
- ▁দাবানল
- ▁নিবেদিত
- ▁ফিলিপাইনে
- ▁যাবরেটরি
- ▁শৈবাল
- ▁সাবমেরিন
- ▁সিংহভাগ
- ▁সিংহাসনে
- ▁হাইপোথিসিস
- ▁ঘৃণ
- ▁ণযুগ
- ▁কোঅপারেটিভ
- ▁ঘেরলিন
- ▁জেলালেম
- ▁ঠপোষকতা
- ▁বিছানা
- ▁যাচমেকার
- ▁রাজবংশ
- ▁শীতাতপ
- ▁শোধন
- ▁সিকিউটিভ
- ▁হোমোরোটি
- ঘাঁট
- ▁বিলাসিতা
- ▁লেনদেন
- ▁ফোঁটা
- ▁ভালবাসে
- ▁ভূমিধস
- ▁ডেলিভারি
- ▁কমিউনিকে
- ▁এমবেড
- ▁ইউএস
- ▁ঝাঁঝ
- ▁সপোজার
- েমাট
- ▁উপসংহার
- ▁পিনবল
- ▁টাইফুন
- লিউশন
- ▁রবিবার
- ▁লেডগুলি
- ▁লুমিরা
- ▁চিবানো
- ▁রেগারি
- ▁টাইটান
- ▁কিনেছিলেন
- ▁কেরাটিন
- ▁লাজুক
- ▁শুনুন
- ▁সুসংবাদ
- ▁পহেড
- ▁মানবজাতি
- ▁মৌসুম
- ▁রবাদ
- ▁বদলানো
- এইচও
- ▁খল
- ▁রেণি
- ▁মীবাহিনী
- ▁ইরানী
- কোভ
- ▁মিলিমিটার
- ▁রসারণ
- ▁পরিহাস
- ▁রতারণা
- ▁টেসলা
- ▁014
- ▁খোসা
- ▁3500
- ▁ঘনমিটার
- বিধান
- ▁নিউটন
- ▁নেভিগেশন
- ▁গুণফল
- ▁খাঁ
- ▁কেলটন
- রিডিস
- ▁কনভেন
- ▁টেরিও
- থু
- ▁1450
- ▁টোবর
- ▁188
- ▁1980
- ▁কুকুর
- ▁পরিধি
- ▁দুঃখ
- ▁185
- ▁চাবিটি
- ▁লোরিড
- ▁1940
- ▁ধরবেন
- ▁নিঃশ
- ▁ঝাপ
- ▁তপাত
- ▁গীকার
- ▁শহরবাসী
- ▁ফসিল
- ▁যুভর
- ▁টলেশন
- ▁শুনিনি
- ▁যানজট
- ▁ডেভি
- ▁লেগেছে
- ▁জেলা
- ▁ঘটছিল
- ▁রানজিট
- ▁187
- ▁রণোদ
- ▁33
- ▁াবহ
- ▁গেছি
- '05'
- ▁খেলেছে
- ▁জিরো
- ▁ঝরনা
- ▁উপদেশ
- ▁38
- ▁াংখি
- ▁সারাদিন
- ▁শিম
- ▁আগামীকাল
- ▁বেআইনি
- ▁শিখেছে
- সিল
- ▁বাজানো
- ▁লাগছে
- ▁পালগুলি
- ▁লিউ
- ▁পাননি
- মিশনে
- ▁126
- ▁টিথার
- ▁ডোবা
- ▁বিরাজ
- এনবি
- ▁রোথ
- ▁বলছিলেন
- োনাল
- ▁যাংকিং
- চুপ
- ▁রোপড
- ▁টাইমলাইনে
- ▁যাকট
- ▁বাঁধা
- ▁যোনি
- ▁বোনা
- ▁করোন
- াকাশে
- ▁জেনেনা
- ▁ফসফোলিপিডগুলি
- ▁ওভারহ
- লেক
- ▁এলো
- ▁পিকাবু
- ▁আইনগত
- ▁তনালী
- সোম
- ▁উপকূলরেখাকে
- ▁তেরো
- ▁ফেরি
- '89'
- ▁রতিবেদন
- ▁অনুপাতে
- ▁থিম
- ▁ফলিকল
- ▁নলিখিত
- বিটি
- ▁ডিশনারগুলি
- ▁সহজাত
- ▁গুদাম
- ▁কারাগারে
- ▁গেলাম
- ▁হোমো
- ▁ফোটা
- ▁মানজনক
- ▁ঝু
- ▁অবকা
- ▁পেলেন
- ▁ফিনা
- ঃস
- ▁ঠাতা
- ▁লবণ
- ▁বিলাস
- ▁তিনজন
- ▁রশমন
- লিসা
- ▁পরিপূর
- ▁কিউবা
- ▁মিকা
- বদলে
- ▁জেনো
- পসাগর
- ▁বেসরকার
- ▁সুপ
- ▁যুইটি
- ▁চাইনি
- ▁ধিমূলক
- টিউ
- ▁ফাটল
- ▁সেলগুলি
- িওপি
- ▁নজির
- ▁হামলা
- ▁পুরু
- ▁অমরজত
- ▁তরণটি
- ▁করলাম
- ▁কখনো
- ▁মশালটি
- ▁গকগুলি
- ▁দিকগুলি
- ▁গমনকারী
- ▁দেখাবে
- ▁চাইলে
- নেভি
- ▁সাপগুলি
- ▁নোট
- ▁যানবাহনগুলি
- ▁সোমা
- ▁দেখেনি
- ▁োগকারীদের
- ▁রাইলোবাইটগুলি
- ▁ষণশীল
- ▁সেতুগুলি
- ▁বিবেক
- ▁খোঁজে
- ▁দেশগুলো
- ▁তারকা
- রীস
- ▁ডফিল
- ▁নাগাল
- ▁বোনাইজ
- ▁থেরাপিউটিকস
- ▁জিগ
- ▁যাপট
- ▁যৌগ
- ▁রুপার
- ▁রচল
- ▁যারিস
- ▁সহনশীল
- ▁বিনা
- াখা
- ▁যহীনতা
- ▁ভিজি
- ▁আঠা
- ▁ফাইন
- ▁ডুব
- ▁বইটি
- ▁সংযোগগুলি
- ▁রাফট
- ▁রবালের
- ▁ফে
- াসী
- সূরি
- সেছিলেন
- ▁যাসেল
- ▁গাইড
- ▁তাঁর
- ▁রোট
- ▁পনগুলি
- ▁গীতি
- ▁ধৃত
- োবা
- ▁বাবধান
- ▁সারিতে
- নামূল
- কভাবে
- ▁পৌঁছান
- লিখিত
- ▁তূপ
- ▁শিকারি
- ▁যথাস
- মেজ
- ীকৃত
- নাতনি
- ▁টরে
- ুখী
- চেতন
- ▁যাবলে
- ▁ধারণাগুলি
- ▁জীবগুলি
- ▁কাজিন
- ▁560
- হেলম
- ধমনী
- ▁করুণা
- ▁করেছ
- আ
- ১
- '%'
- ':'
- c
- h
- l
- n
- z
- ü
- ২
- ৪
- ৩
- ঔ
- ঈ
- Ö
- U
- ঋ
- ঐ
- '?'
- O
- ।
- ়
- <sos/eos>
src_token_list:
- <blank>
- <unk>
- s
- ▁the
- ▁to
- ▁and
- ▁of
- ▁a
- ing
- ▁in
- ed
- ▁that
- ▁is
- ▁
- ly
- ▁we
- ▁it
- ▁you
- d
- ▁this
- ▁for
- ▁on
- e
- ▁be
- ▁but
- ▁with
- ▁so
- ▁are
- ▁i
- ▁have
- ▁can
- ▁they
- y
- ▁was
- ▁as
- ▁its
- ▁from
- ▁their
- ▁at
- ▁what
- ▁by
- ▁one
- ▁our
- ▁an
- ▁or
- ▁al
- ▁like
- ▁more
- ▁when
- ▁your
- ▁were
- ▁some
- ▁all
- ▁not
- ▁these
- n
- ▁people
- t
- ▁how
- ▁if
- ▁about
- to
- ▁there
- ▁do
- ▁would
- ▁up
- ▁her
- al
- nt
- ▁which
- ▁just
- er
- ▁them
- ▁us
- ▁he
- ▁other
- ▁now
- es
- ▁know
- ▁has
- ▁s
- ▁make
- ▁my
- ▁most
- ▁because
- ▁could
- ▁world
- ▁than
- ▁work
- ▁need
- ▁will
- ▁had
- ▁out
- a
- r
- so
- ▁over
- ▁who
- o
- ▁time
- ▁new
- ▁even
- ▁get
- ve
- many
- ▁well
- ▁she
- ▁no
- ▁energy
- en
- ion
- ▁where
- ▁climate
- ▁take
- st
- ▁change
- ▁me
- ▁power
- ▁un
- ▁two
- ▁first
- ▁help
- ▁look
- ▁way
- le
- ▁dont
- ▁years
- ▁re
- ▁here
- ▁call
- ▁think
- p
- ▁m
- in
- ▁carbon
- able
- c
- ▁much
- ▁those
- ever
- ▁been
- ▁use
- ▁right
- ▁co
- m
- thing
- ▁go
- ▁want
- ▁through
- l
- ▁see
- ▁same
- ive
- i
- ▁may
- ▁really
- ers
- ▁going
- ▁every
- ▁then
- ▁cells
- ▁life
- ▁his
- ▁different
- ▁should
- out
- u
- ▁around
- ▁im
- ▁things
- ight
- re
- g
- ▁still
- ▁sleep
- ▁any
- ▁say
- 'on'
- ▁high
- ▁while
- ▁back
- ▁long
- each
- ous
- ▁light
- ▁b
- ▁let
- ▁does
- ▁down
- th
- ▁own
- ies
- ▁three
- ▁start
- ▁countries
- ▁before
- ▁made
- ▁show
- ▁system
- ▁come
- very
- ▁large
- ic
- ▁tell
- yre
- ▁percent
- ▁global
- ▁india
- ▁after
- ▁water
- ▁n
- ▁often
- ▁off
- ▁again
- ▁part
- ll
- ▁ma
- ▁company
- ▁country
- twe
- ▁being
- k
- ▁next
- ▁used
- ▁point
- ▁de
- ▁today
- ▁hear
- ready
- ▁day
- ▁actually
- ▁try
- ▁year
- ▁lot
- ▁less
- ar
- ting
- ▁ca
- ▁answer
- ▁keep
- ▁develop
- ▁another
- ment
- ▁end
- ▁d
- ation
- ce
- b
- ▁per
- us
- ▁grow
- ▁enough
- ▁why
- ▁live
- ful
- or
- ▁g
- ▁number
- ▁place
- ▁give
- ▁few
- ▁t
- ▁19
- ▁20
- '2'
- ▁create
- ▁important
- ▁believe
- ▁single
- ▁cities
- ▁10
- uch
- ▁move
- ▁example
- ▁fossil
- ▁mrna
- ▁better
- ther
- lar
- ▁fact
- ▁p
- ▁find
- ▁city
- ▁did
- ▁mean
- ▁emissions
- is
- an
- ur
- w
- ▁research
- ▁kind
- ▁government
- ▁allow
- ▁case
- ▁person
- ▁small
- ▁green
- ▁across
- ▁him
- ▁second
- ▁home
- ▁human
- ▁night
- ▁thank
- ▁air
- ▁best
- ▁future
- ▁space
- ▁together
- ▁become
- ince
- ▁both
- ▁must
- ▁cant
- ▁dur
- ▁seem
- ▁side
- h
- ▁man
- one
- ▁body
- ▁car
- ▁problem
- ate
- ity
- ▁too
- ▁under
- ▁final
- ▁play
- ▁build
- ary
- ▁form
- ▁set
- ▁e
- ▁money
- ▁good
- ▁five
- ▁follow
- ▁last
- ▁economy
- ry
- est
- ▁pay
- ite
- ▁plants
- ▁bas
- be
- ▁doing
- ▁companies
- ▁entire
- ▁great
- ▁yet
- ▁far
- ▁everyone
- ▁story
- ▁times
- ▁away
- ▁ra
- ▁f
- ty
- ▁stop
- most
- ▁force
- ▁scale
- ▁impact
- less
- ▁mass
- ▁sp
- ▁once
- ▁found
- ▁ear
- ▁hair
- ▁possible
- ▁sound
- ▁electric
- ▁states
- it
- ▁red
- ▁produce
- ▁million
- ▁feel
- ▁put
- ▁planet
- ▁means
- ▁later
- ▁o
- ▁value
- ▁telescope
- ▁result
- el
- ▁ask
- ▁cost
- '00'
- ▁turn
- ▁health
- ▁industry
- ▁potential
- ▁clear
- ▁consider
- ▁gold
- ▁didnt
- ▁heart
- ▁brain
- v
- ▁lead
- ▁thing
- ▁plant
- making
- ▁working
- ▁process
- ▁technology
- ▁name
- ▁electricity
- ▁sure
- ally
- x
- age
- ▁15
- ble
- ness
- ▁ever
- ▁mo
- ▁family
- ▁earth
- ▁rule
- ▁r
- ▁hand
- ▁god
- ▁others
- ▁old
- ▁share
- ▁vaccines
- ▁k
- ▁heat
- ▁continue
- ▁talk
- ▁question
- ▁fuel
- sh
- ▁business
- ▁economic
- ▁little
- ▁dioxide
- ▁run
- ▁public
- ▁cause
- ▁reach
- ▁scientists
- ▁com
- ▁land
- ▁seen
- ▁st
- '9'
- ▁understand
- ld
- la
- ▁got
- ▁whe
- ▁big
- ▁century
- ▁concrete
- ▁support
- ▁food
- ▁always
- ▁job
- ▁certain
- ▁act
- ▁design
- ▁head
- ▁resources
- ▁6
- ▁came
- ors
- ▁unit
- ▁c
- ▁control
- ▁level
- ▁test
- ▁1
- ge
- ▁war
- ▁gas
- ▁africa
- ▁hard
- ▁mitochondria
- ▁remain
- ▁along
- ▁nature
- ▁near
- ▁billion
- ▁white
- ▁exactly
- ▁11
- ▁instead
- ▁discover
- ter
- ▁3
- ▁4
- '0'
- ▁creat
- ▁includ
- ▁protein
- ine
- ▁love
- ▁close
- ▁amount
- ▁degree
- ▁desert
- ▁particular
- ▁pandemic
- ▁data
- ▁approach
- ▁renewable
- ▁cement
- ▁blue
- ▁lower
- at
- ▁cover
- ▁bring
- ▁lives
- ▁thousands
- ▁fuels
- li
- ▁protect
- ▁idea
- ▁term
- ▁group
- ▁difficult
- ▁systems
- ▁started
- ck
- ▁sea
- ▁treat
- ▁immun
- ▁platform
- ▁women
- ▁current
- ▁model
- ▁history
- ▁began
- ▁short
- ▁eye
- ▁save
- ▁team
- ▁sun
- ▁moment
- ▁usual
- ▁table
- member
- ▁happen
- ▁provide
- ▁true
- ▁coal
- ▁wanted
- tic
- ▁pre
- he
- ▁trees
- ▁ways
- ▁color
- ▁vaccine
- ▁full
- ought
- ▁response
- ▁themselves
- ▁natural
- ▁prevent
- ▁exist
- ▁information
- ▁took
- ▁buy
- ▁thought
- ▁mind
- per
- ▁though
- ized
- til
- il
- ish
- ▁face
- ▁clean
- ian
- ▁product
- ▁increase
- ▁low
- ▁humans
- ▁da
- ▁species
- ▁local
- ▁rich
- ▁huge
- ▁break
- ▁stories
- ▁order
- ▁blood
- ▁late
- ▁fight
- ▁reduce
- ▁quickly
- ▁seat
- ance
- ▁dis
- ▁du
- ▁real
- ir
- ist
- ▁2
- ▁map
- ▁invest
- am
- ▁bone
- ▁experience
- ▁governments
- ▁built
- ▁communities
- ▁yingtai
- ▁situation
- ▁return
- ▁similar
- ▁zero
- ▁south
- ▁standard
- ▁further
- ▁state
- ▁young
- ▁h
- ▁months
- ▁ago
- ory
- ▁sh
- ▁vari
- ▁liv
- te
- ra
- ▁5
- ▁black
- ▁saving
- ▁reason
- ▁laughter
- ▁market
- ▁children
- ▁desp
- ▁open
- ▁sky
- ▁fall
- ▁word
- ▁inter
- ▁vi
- ul
- ▁action
- ▁groups
- ▁bit
- rs
- ▁factor
- ▁dr
- ▁ex
- z
- ety
- ern
- ▁drive
- ▁u
- ▁chemical
- ▁female
- ▁ground
- ▁hundred
- ▁access
- ▁addition
- ▁least
- ▁sex
- ▁specific
- ▁chance
- ▁death
- ▁walk
- ▁sendler
- ▁half
- ▁anxi
- ▁left
- ▁sometime
- ▁port
- ▁pass
- ▁wind
- ▁arent
- ma
- ▁longer
- ▁free
- ▁toward
- ▁questions
- ▁soon
- cy
- ▁king
- ▁w
- came
- ▁cars
- ▁ga
- ▁poor
- ▁bridge
- ▁evidence
- ▁extreme
- ▁risk
- ▁surface
- ▁wrong
- taking
- ▁common
- ▁probab
- ▁plate
- ▁moon
- ▁either
- ▁language
- ▁sw
- co
- ▁cop
- ▁american
- ▁kilometers
- ily
- ▁am
- ▁inside
- ch
- ▁millions
- ize
- ▁house
- ▁materials
- ▁step
- ▁mi
- ▁con
- '8'
- ▁cockroach
- ▁racial
- ▁study
- ▁woman
- ▁modern
- ▁key
- saw
- ▁dna
- ▁survive
- ▁total
- ▁cell
- ▁op
- ▁line
- ▁pa
- ▁tra
- ▁hours
- ▁simple
- hap
- ▁size
- ▁solutions
- ▁200
- ▁safe
- ▁contain
- ▁require
- ant
- ▁past
- ▁success
- ▁fast
- ▁transform
- ▁fear
- ▁behind
- ▁english
- ▁essential
- ▁exponential
- ▁science
- ▁trilobite
- ▁yourself
- ▁demand
- ▁focus
- ▁multiple
- ▁third
- ▁said
- ▁covid
- ▁income
- ▁pressure
- ▁four
- ▁hold
- ▁lack
- ▁significant
- ▁astronomer
- ▁easy
- ▁disk
- ▁done
- ▁growth
- ▁pair
- ▁fire
- ▁dollars
- ▁asian
- ▁matter
- '5'
- ▁vehicles
- id
- vi
- ▁pro
- ▁simpl
- ▁30
- ▁decades
- ▁fat
- ▁results
- ▁areas
- '4'
- ▁waste
- ▁building
- ▁deliver
- ▁sit
- ▁egypt
- ▁view
- ▁community
- ▁conversation
- ▁pigeon
- ▁suggest
- ▁wavelength
- ▁constant
- ▁course
- ▁dream
- ▁unique
- ▁itself
- ▁send
- ▁absorb
- ▁law
- cient
- ▁universe
- ▁north
- ated
- ▁respond
- ▁sense
- ▁eat
- ▁forward
- ▁school
- ▁places
- ▁tree
- ure
- ▁hit
- tton
- '6'
- ▁goal
- ▁individuals
- ▁days
- ▁sha
- ▁patients
- ling
- ▁shift
- ▁limit
- ted
- ▁generate
- ▁en
- ▁star
- ▁solve
- land
- ▁visit
- ▁8
- ▁12
- ▁sub
- ▁begin
- ▁available
- ▁difference
- ▁greek
- ▁measure
- ▁opportunity
- ▁damage
- ▁increasing
- ▁relative
- ▁typical
- ▁release
- ▁middle
- ▁wont
- ▁center
- ▁weather
- ▁cancer
- ▁rise
- ▁loop
- ▁class
- ▁fix
- ▁role
- ▁policies
- ▁loss
- ▁care
- ▁outside
- ▁image
- ▁lord
- ▁basic
- ▁l
- ▁add
- as
- ng
- ▁pret
- ▁nation
- ▁range
- ▁challenge
- ba
- ▁sum
- ▁di
- pri
- ▁learn
- ▁trap
- ▁appear
- ▁interest
- ▁goals
- ▁anti
- though
- ▁determine
- ▁hope
- ▁virus
- ▁campaign
- ▁efficiency
- ▁knew
- ▁plastic
- ▁pollut
- ▁poverty
- ▁racism
- ▁stereotype
- ▁studie
- ▁behavior
- ▁avoid
- ▁river
- ▁moving
- ▁coast
- ▁instance
- ▁square
- ▁skin
- ▁effective
- ▁reality
- ▁faster
- ▁environmental
- ▁rate
- ▁trade
- ▁plan
- ▁completely
- jar
- ▁speed
- ▁tri
- ▁alone
- ▁sort
- ▁post
- ▁national
- ▁parents
- ations
- ▁store
- ier
- ▁connect
- ▁decid
- ▁levels
- ▁non
- ▁trojan
- ▁relationship
- ▁forest
- ▁material
- ol
- ▁patterns
- ▁refugees
- ▁sign
- ▁costs
- ▁talking
- ▁imagine
- ric
- ions
- ▁kill
- ▁hav
- ▁po
- ▁arm
- na
- ▁canal
- um
- ▁strong
- ▁threat
- ▁fel
- ▁extra
- ▁efficient
- ▁incredibl
- ▁infrastructure
- ▁switch
- ▁milk
- ▁panel
- ▁slavery
- ▁social
- ▁present
- ▁warming
- ▁organization
- ▁dangerous
- ▁else
- ▁deep
- ▁corporation
- ▁divers
- ▁society
- ▁listen
- ▁moral
- ▁signal
- ▁stand
- ▁sever
- ▁janie
- ▁separate
- ical
- ▁taken
- ca
- ▁type
- ▁solution
- ▁friend
- ▁main
- ▁animals
- ver
- side
- f
- ▁rules
- ▁options
- ent
- qu
- ▁bags
- ▁engines
- ▁70
- ▁treatments
- ▁ready
- ▁benefits
- ▁top
- ring
- ac
- ▁depend
- ▁manufactur
- tion
- ▁meet
- while
- ▁problems
- ▁americans
- ▁cur
- ▁works
- ▁homes
- ▁central
- ▁political
- ▁sappho
- ▁technologies
- ▁transition
- ▁escape
- ▁ocean
- ▁reflect
- ▁rough
- ▁unlike
- ▁crisis
- ▁jeans
- ▁eventually
- ▁harm
- ▁spouse
- ▁trans
- ▁2020
- ▁atmosphere
- ▁egyptian
- ▁attack
- ▁safety
- ▁mother
- ▁grid
- ▁stay
- ▁vir
- ▁bank
- ▁changed
- '3'
- ide
- go
- ▁structure
- ▁workers
- ▁organ
- ▁giv
- ▁numbers
- ▁ver
- ▁learned
- ▁cl
- ability
- de
- ▁lie
- ▁hi
- ▁16
- ▁recognize
- ▁game
- ▁benefit
- ▁saying
- ▁led
- ▁leave
- ▁temperatures
- ▁br
- ▁men
- ▁estimate
- ▁100
- ped
- ▁read
- des
- ▁th
- are
- ▁travel
- ▁fund
- ▁lo
- ▁improve
- head
- ▁cut
- ▁needs
- ▁tea
- ▁progress
- q
- ▁perform
- ▁complex
- ▁general
- ▁urban
- ▁advance
- ▁california
- ▁journey
- ▁perspective
- ▁quality
- ▁supply
- ▁affect
- ▁birth
- ▁piece
- ▁slow
- ▁dentist
- ▁spider
- ▁famili
- ▁favor
- ▁fair
- ▁child
- ▁trick
- ▁mona
- ▁everybody
- ▁flight
- ▁cool
- ▁flow
- ▁mali
- ▁track
- ▁male
- ▁manage
- ▁durr
- ▁colors
- ▁lying
- ▁notice
- ▁personal
- ial
- ▁doctor
- ▁weeks
- ▁80
- ments
- op
- '0000'
- yon
- cu
- ▁ensure
- ▁offer
- ▁camp
- ▁projects
- ▁lipid
- ▁molecule
- ▁treatment
- ▁candles
- ▁asked
- ▁points
- ▁engineers
- ▁mine
- ▁types
- ▁mu
- ▁chang
- nd
- ard
- ▁id
- my
- ▁compar
- ze
- king
- ▁agree
- ship
- national
- '1'
- ▁supernova
- ward
- tual
- ph
- ▁machine
- ▁author
- ▁capacity
- ▁celsius
- ▁driving
- ▁grandfather
- ▁oxygen
- ▁picture
- ▁satellite
- ▁scientific
- ▁shanbo
- ▁village
- ▁evolv
- ▁record
- ▁equip
- ▁floor
- ▁numer
- ▁regard
- ▁seven
- ▁among
- ▁neighborhood
- ▁lisa
- ▁normal
- ▁feed
- ▁indeed
- ▁catch
- ▁personality
- ▁kid
- ▁lose
- ▁innovation
- ▁east
- ▁six
- ▁trust
- ▁train
- ▁ne
- ▁sell
- ▁germany
- ▁teach
- ▁inject
- ell
- nese
- ▁ball
- ▁rare
- ▁mammals
- ▁carey
- ▁everything
- ▁products
- ▁eggs
- ▁healthy
- ities
- ia
- ▁bonds
- ▁rest
- norm
- ▁adults
- ▁locusts
- ▁cr
- ▁cu
- ▁area
- ▁cooperatives
- ▁tons
- ▁ph
- ▁die
- ▁le
- ▁figure
- ▁conditions
- ▁environment
- ▁regions
- ▁engine
- ru
- ▁seizure
- ▁pattern
- ▁refugee
- ▁bu
- ▁patient
- ▁bag
- ▁hour
- ▁engineer
- ▁turns
- lands
- lf
- ▁18
- ▁50
- ▁weight
- ▁created
- ▁ab
- ▁tru
- bi
- '7'
- ▁forms
- ▁burn
- fortunate
- ▁age
- ect
- ha
- ating
- ▁kings
- ▁europe
- ▁major
- ▁kush
- ▁rock
- ▁account
- ▁ambition
- ▁apnea
- ▁citizens
- ▁dramatically
- ▁education
- ▁footprint
- ▁ourselves
- ▁predators
- ▁producing
- ▁recipient
- ▁student
- ▁sudden
- ▁tackl
- ▁target
- ▁vulnerable
- ▁block
- ▁extend
- ▁racist
- ▁speech
- ▁daughter
- ▁below
- ▁committ
- ▁push
- ▁service
- ▁tiny
- ▁easier
- ▁causing
- ▁sector
- ▁launch
- ▁quite
- ▁describe
- ▁jew
- ▁capital
- ▁mention
- ▁wait
- ▁dome
- ▁pull
- ▁24
- ▁construction
- ▁born
- ▁financial
- ▁chain
- ▁banda
- ▁ari
- ▁sustainable
- ▁position
- ▁humanity
- ▁race
- ▁twin
- sion
- ▁sent
- ▁enter
- ▁expect
- ▁policy
- ▁rain
- wise
- ▁shipp
- ▁shape
- ▁bigge
- ▁medic
- ▁special
- om
- ▁tend
- ▁waves
- ▁remov
- ake
- ▁201
- utch
- ock
- ▁decarbonize
- ▁news
- ▁door
- ▁populations
- ▁fly
- ▁happening
- ▁realiz
- ▁source
- ▁price
- ▁disease
- ▁minute
- ▁candle
- ▁decade
- ▁molecules
- coming
- ▁fl
- dp
- ▁seeing
- un
- ▁el
- act
- ▁challenges
- ▁pee
- ▁wa
- ▁capture
- ▁meaning
- scar
- special
- ▁complete
- ▁swap
- ▁lines
- ▁pack
- ▁tax
- ▁sultan
- ▁novel
- ▁count
- ▁dark
- ▁civil
- ▁advantage
- ▁multi
- ▁physical
- ▁recycl
- ▁surround
- ş
- ▁accord
- ▁alternative
- ▁attention
- ▁cactus
- ▁culture
- ▁economist
- ▁myself
- ▁negative
- ▁relief
- ▁salesman
- ▁secret
- ▁expand
- ▁finance
- ▁report
- ▁glass
- ▁consequences
- ▁excited
- ▁choice
- ▁guess
- ▁loud
- ▁court
- ▁stick
- ▁address
- ▁calcium
- latoni
- ▁electrons
- ▁network
- ▁adopt
- ▁watch
- ▁translate
- ▁maintain
- ▁industrial
- ▁bad
- ▁distance
- ▁2030
- ▁batteries
- ▁communication
- ▁impossible
- ▁location
- ▁25
- ▁galaxy
- ▁frequent
- ▁camera
- ▁anywhere
- ▁truth
- ▁operation
- ▁associate
- ▁direction
- ▁elevator
- ▁fit
- ▁90
- ▁stable
- ▁inequality
- ▁gap
- ▁worse
- ▁bodies
- ▁wor
- ▁legal
- ▁raise
- da
- ▁ship
- ▁severe
- ▁em
- ▁fr
- ▁muscles
- ▁beat
- ▁proteins
- ging
- ▁tools
- ride
- lines
- ▁leader
- ft
- oke
- ten
- ious
- ▁diseases
- ▁minutes
- ▁accelerate
- ▁emit
- ▁seizures
- mat
- ▁understanding
- ▁road
- ▁project
- ▁effect
- some
- ▁viper
- ▁nanoparticle
- ▁condition
- ▁region
- ▁temperature
- ▁individual
- ▁option
- ▁coat
- ▁turning
- ifi
- ▁friends
- res
- ▁stor
- not
- ▁1000
- ▁la
- lob
- amp
- ap
- ▁combin
- ▁speak
- ang
- body
- izing
- house
- ▁lay
- ▁involve
- ▁america
- ▁val
- ▁abundan
- ▁arrest
- ▁chlorophyll
- ▁cloud
- ▁colleague
- ▁context
- ▁couple
- ▁economies
- ▁effort
- ▁enslaved
- ▁erik
- ▁experiment
- ▁genetic
- ▁hurston
- ▁infection
- ▁length
- ▁limestone
- ▁method
- ▁outbreak
- ▁positive
- ▁predict
- ▁recommend
- ▁urethra
- ▁annual
- ▁conflict
- ▁destroy
- ▁dialect
- ▁flood
- ▁hormone
- ▁narrow
- ▁scatter
- ▁therapy
- ▁thick
- ▁throw
- ▁volume
- ▁continent
- ▁ecosystems
- ▁fail
- ▁girl
- ▁thrive
- ▁field
- ▁swarm
- ▁drug
- ▁vibrat
- ▁average
- ants
- ▁morning
- ▁corner
- ▁stretch
- ▁overall
- ▁subject
- ▁hung
- ▁heav
- ▁orbit
- ▁instructions
- ▁pick
- ▁donor
- ▁responsibility
- ▁search
- ▁list
- ▁labor
- ▁match
- ▁join
- ▁european
- ▁stress
- ▁crucial
- ▁nap
- ▁yes
- vas
- ▁worth
- ▁park
- ▁invent
- mbus
- ▁host
- ▁gather
- ▁sethe
- ▁wage
- ▁operator
- ▁check
- ▁letter
- ▁actions
- ▁spread
- ▁gone
- ▁sexual
- ▁inspired
- ▁boy
- ular
- ▁cross
- ▁tail
- ▁production
- ▁critical
- ▁literally
- ▁surprising
- ▁cycle
- ▁connection
- ▁harder
- ▁storm
- ▁involved
- ▁z
- '000'
- ordina
- ible
- ▁fortunate
- ▁song
- eries
- ▁bin
- ▁mid
- ▁characters
- ▁safely
- ▁inten
- ja
- ▁jun
- ▁0
- ▁400
- kept
- ▁ix
- ▁features
- ▁objects
- ▁needed
- ▁practices
- ▁tens
- ▁meeting
- ▁meters
- ▁west
- ▁learning
- hol
- ▁gra
- ▁achieve
- ▁cutt
- ative
- ▁event
- ▁contribut
- ▁nanoparticles
- ▁hum
- ilitar
- ▁fac
- ▁lipids
- ▁period
- war
- ▁population
- ▁animal
- ▁issue
- ▁regulat
- ▁plans
- way
- ▁island
- ▁te
- ▁site
- lle
- ven
- ▁j
- ▁40
- ▁matters
- ▁corrupt
- ▁expos
- where
- ▁dead
- ▁quick
- ▁bar
- ▁languages
- ▁wide
- ▁polic
- ▁abo
- shed
- ▁german
- ▁twi
- room
- ism
- ▁chi
- ▁react
- ▁transport
- tract
- ▁lab
- ▁correct
- ▁immediate
- ▁pilot
- ▁previous
- ▁smuggl
- ▁altogether
- ▁bottom
- ▁component
- ▁husband
- ▁kenya
- ▁kilogram
- ▁pigments
- ▁platypus
- ▁responsible
- ▁suppose
- ▁temple
- ▁torture
- ▁york
- ▁battle
- ▁concentrat
- ▁status
- ▁witness
- ▁globe
- ▁peace
- ▁radical
- ▁serious
- ▁urine
- ▁tough
- ▁dozen
- ▁nice
- ▁minorit
- ▁beginning
- ▁accurate
- ▁consistent
- ▁street
- ▁running
- ▁round
- ▁message
- ▁spike
- ▁tech
- ▁convert
- ▁define
- ▁draw
- ▁2050
- ▁17
- ▁transportation
- ▁peg
- ▁paper
- ▁members
- ▁rna
- ▁necessary
- ▁nucle
- ▁strategy
- ▁rocket
- ▁wall
- ▁office
- ▁activity
- ▁flee
- ▁recent
- ▁bright
- ▁path
- ▁slight
- time
- ▁miss
- ▁drop
- ▁taxes
- ▁independent
- ▁stronger
- rising
- ▁claim
- ▁front
- ▁receive
- rch
- ▁anyone
- ▁displac
- ▁press
- ▁apply
- ium
- ▁widely
- ▁room
- ▁hopeful
- ▁smaller
- ▁internal
- ▁dry
- ▁pathways
- ▁grown
- orn
- ▁balance
- ▁required
- ▁min
- ull
- ▁carrie
- ▁ruler
- ▁captur
- ▁fully
- ▁ste
- ef
- ▁fa
- ▁terms
- ▁ideas
- ▁brothers
- ▁generations
- ▁towers
- ▁7
- ▁compare
- ▁include
- ▁bi
- ▁wr
- ▁manufacturer
- ▁whos
- ▁nor
- ▁died
- ▁stopp
- ▁happened
- ▁active
- ▁issues
- ▁aid
- ▁effects
- ▁seek
- ▁observat
- '50'
- ence
- ▁emitt
- ▁sett
- ▁environments
- ▁hous
- ▁continu
- fi
- ▁decide
- ▁ok
- ▁son
- bility
- ▁investment
- ese
- led
- inning
- ▁african
- ▁consume
- gs
- cent
- mon
- ▁ill
- tain
- gu
- im
- ▁buildings
- ▁film
- nder
- line
- set
- lem
- ▁direct
- ▁threaten
- loved
- work
- ▁li
- back
- ▁rais
- ▁arrive
- ▁sk
- ▁ban
- ▁paint
- ▁div
- ▁apart
- ▁adapt
- ▁germ
- ▁che
- ▁laugh
- if
- timate
- ▁attempt
- ▁survey
- ▁afternoon
- ▁bladder
- ▁budget
- ▁crazy
- ▁disguise
- ▁empire
- ▁equivalent
- ▁exchange
- ▁expensive
- ▁explain
- ▁feet
- ▁habit
- ▁himself
- ▁institution
- ▁japan
- ▁membrane
- ▁morrison
- ▁nazi
- ▁opportunities
- ▁peruggia
- ▁represent
- ▁revolution
- ▁scenario
- ▁shoot
- ▁strength
- ▁symptoms
- ▁tanuki
- ▁transfer
- ▁victim
- ▁wonder
- ▁application
- ▁murder
- ▁promis
- ▁stuck
- ▁worldwide
- ▁kingdom
- ▁grew
- ▁trillion
- ▁priv
- ▁document
- ▁score
- ▁super
- ▁giant
- ▁finish
- ▁confiden
- ▁acid
- ▁popular
- ▁traits
- ▁italian
- ▁wales
- ▁sand
- ▁analys
- ▁solid
- ▁china
- ▁wild
- ▁approv
- ▁sterilization
- ▁steel
- ▁indicat
- ▁compromise
- ▁collective
- ▁nile
- ature
- ▁kushite
- ▁majority
- ▁andrea
- ▁edge
- ▁syrian
- rench
- ▁chel
- ▁traditional
- ▁fur
- ▁visibl
- ▁fam
- ▁root
- ▁box
- ▁suit
- ▁prepared
- ▁stage
- ▁fine
- ▁mark
- ▁reaction
- ▁removal
- ow
- ▁please
- ▁amazing
- ▁gate
- cular
- ▁cold
- '26'
- ▁fi
- ▁convinc
- mit
- ▁art
- ▁cable
- ez
- ▁liter
- acious
- ▁spe
- ▁mission
- ▁station
- mental
- ▁processes
- ▁concerns
- anic
- ▁bones
- ▁warn
- ▁disc
- ▁maps
- com
- tch
- ▁bare
- ▁net
- ▁scar
- ▁code
- ▁flu
- ▁beliefs
- ▁residents
- ▁exposed
- ▁painting
- ▁soil
- ▁primar
- ▁combined
- ▁farms
- ff
- ▁speaking
- fully
- ▁sold
- ▁town
- ished
- ility
- ological
- ▁lit
- ▁pl
- ▁miner
- ▁officials
- ▁ingredients
- ▁spine
- ▁cleaner
- ▁neighbors
- ▁75
- ▁consum
- ati
- eth
- ▁fan
- ▁regulator
- ▁partners
- ▁tests
- ▁shifting
- ham
- ▁loved
- ▁replace
- ction
- unch
- ▁22
- ▁dut
- ific
- ▁hub
- ▁cha
- ▁roads
- ▁ii
- cc
- cious
- ▁prices
- ami
- ▁ho
- ▁equal
- ▁camps
- ▁produces
- ▁ent
- ▁contribute
- ▁games
- ▁organs
- ▁36
- ber
- ▁tissue
- ▁chamber
- ▁partner
- ▁cat
- ▁route
- ave
- ▁cooperative
- ▁vehicle
- ▁month
- ▁thousand
- ▁relationships
- ▁sums
- mal
- nia
- lic
- ▁app
- ▁structur
- ouri
- onvi
- ▁breath
- ternal
- gi
- ▁ran
- wn
- ets
- gh
- how
- ▁perfect
- ▁sta
- ▁leav
- ▁elevat
- ▁emerg
- ▁bill
- ▁command
- cast
- bti
- ▁spot
- ▁instruct
- ▁respect
- refore
- ▁ambitious
- ▁coronavirus
- ▁cultural
- ▁customers
- ▁disaster
- ▁elephant
- ▁encounter
- ▁horizontal
- ▁hydrogen
- ▁kettle
- ▁mainframe
- ▁messenger
- ▁mirror
- ▁permanent
- ▁photograph
- ▁precise
- ▁principle
- ▁rival
- ▁sahara
- ▁scholar
- ▁tunnel
- ▁choose
- ▁deserve
- ▁distribut
- ▁president
- ▁purpose
- ▁task
- ▁temporar
- aught
- ▁blow
- ▁collaborati
- ▁occur
- ▁resilien
- ▁suffer
- ▁surgeon
- ▁august
- ▁coalition
- ▁protest
- ▁programs
- ▁exciting
- ▁naked
- ▁aspect
- ▁photosynthesis
- ▁weapon
- ▁vertical
- ▁shot
- ▁strauss
- particles
- ▁beautiful
- ▁genes
- ▁disappear
- ▁nerve
- ▁prosper
- ▁function
- romed
- ▁preserv
- ▁encourag
- ▁trigger
- ▁trauma
- ▁indig
- ▁detect
- ▁mainstream
- ▁guide
- ▁iran
- ▁approximat
- ▁reused
- ▁justice
- ▁clinic
- ▁roman
- ▁superheroes
- ▁01
- ▁degrad
- ▁self
- ▁commission
- ▁wealth
- ▁crop
- ▁agenc
- ▁sick
- ▁27
- ▁flat
- ▁epic
- ▁hang
- ▁tale
- ▁ethic
- ▁assembl
- ▁proportion
- ▁broad
- ▁lock
- ▁14
- ▁invite
- ▁workplace
- ▁vary
- marine
- ▁194
- ▁definition
- ▁dig
- ▁transformation
- ▁cheaper
- ▁march
- ▁adaptation
- ▁statement
- ▁jet
- ▁load
- man
- ▁metal
- ▁lost
- ▁carry
- ▁projection
- ▁tension
- ▁fill
- ▁owe
- xic
- ▁curve
- ▁digital
- ▁cannon
- ▁ideal
- ▁fish
- ▁bce
- ▁mom
- ption
- ▁rely
- ▁useful
- etic
- ▁perfectly
- ▁army
- ▁movement
- ▁tear
- ▁eugenics
- ▁successful
- ▁reproducti
- ▁agreed
- ▁careful
- net
- ural
- reliable
- ▁threatened
- ▁coin
- ▁corrupted
- ▁bo
- non
- ▁myth
- ▁increases
- ▁coen
- ort
- ▁solv
- ▁mining
- ▁container
- ▁grand
- spend
- ▁indian
- nit
- ▁met
- mber
- ▁si
- ▁micro
- ▁killed
- ▁wi
- ▁funding
- ▁organisms
- ▁oil
- ▁cont
- ▁limited
- ▁alive
- ▁deal
- men
- ▁routes
- ▁tissues
- ▁chambers
- ▁exc
- ▁pin
- ▁investments
- vert
- ▁pursue
- ▁hands
- ▁periods
- ▁10000
- ▁pai
- ▁argue
- ▁vipers
- ▁observ
- vor
- ▁hot
- ▁basi
- ile
- ▁sil
- ▁sources
- ▁operate
- light
- ▁islands
- ▁ten
- ▁owner
- ▁brains
- asse
- ▁werent
- ▁telescopes
- ▁ha
- ▁conve
- ▁leaves
- lin
- ▁obstructi
- ▁2000
- ▁pe
- ▁par
- broken
- rait
- ▁forever
- ▁win
- ▁trojans
- ▁pi
- ▁disorder
- ▁organism
- ▁neighbor
- ▁practice
- ▁locust
- ational
- ▁remove
- ▁week
- rov
- fu
- ▁meter
- ization
- ▁rid
- ▁schools
- ▁cap
- ▁cure
- istic
- over
- day
- ▁lov
- ▁tim
- ept
- uff
- ▁cho
- ▁accept
- ▁manag
- umb
- stand
- bodies
- face
- ▁deci
- ball
- ▁sens
- ▁surviv
- som
- ▁luck
- ▁cheap
- ▁happ
- ▁publi
- enome
- eign
- ▁shut
- ▁repeat
- ▁consist
- ▁prove
- hood
- ▁spirit
- sequence
- ▁confirm
- ▁persuad
- ▁promot
- precedent
- roach
- ▁amarjot
- ▁australia
- ▁beneath
- ▁bushmaster
- ▁channel
- ▁contrast
- ▁democra
- ▁devastat
- ▁diameter
- ▁discrimination
- ▁fragil
- ▁geothermal
- ▁guarantee
- ▁identify
- ▁innocent
- ▁leonardo
- ▁marriage
- ▁maximiz
- ▁mechanisms
- ▁monica
- ▁monitor
- ▁nutmeg
- ▁participant
- ▁pathogen
- ▁philosopher
- ▁piankhy
- ▁plague
- ▁possess
- ▁resolve
- ▁struggling
- ▁theranos
- ▁thrust
- ▁underground
- ▁vessel
- ▁violence
- ▁yeah
- ▁competitor
- ▁compress
- ▁establish
- ▁fashion
- ▁freetown
- ▁harness
- ▁obvious
- ▁purchas
- ▁random
- ▁smart
- ▁snoring
- ▁struggle
- ▁venom
- ▁video
- ▁healthcare
- ▁plenty
- ▁policymakers
- ▁sugar
- ▁regular
- ▁wife
- ▁earlier
- ▁tumor
- ▁encode
- ▁skill
- ▁dump
- ▁disparit
- ▁technique
- ▁brand
- ▁mouth
- ▁commercial
- ▁decisions
- ▁dense
- ▁details
- ▁calm
- ▁clay
- ▁excellen
- ▁scre
- ▁belarus
- ▁link
- ▁sweat
- ▁influence
- ▁license
- ▁media
- ▁allergi
- ▁tall
- ▁reveal
- ▁insects
- ▁brief
- meral
- ▁burrow
- ▁suspen
- ▁easily
- ▁excess
- ▁civilization
- ▁cloth
- ▁spectac
- ▁deploy
- ▁deny
- ▁wound
- ▁catastrophic
- rena
- ▁metric
- ▁industries
- ▁petrol
- ▁sunlight
- ▁flexible
- ▁flip
- ▁contract
- ▁breed
- ▁186
- ▁poop
- ▁khalil
- ▁divide
- ▁observations
- rope
- ffic
- ▁label
- ▁brig
- ▁agriculture
- ▁landscape
- ▁engage
- ▁trouble
- ▁reproduce
- ▁airway
- ▁intersecti
- ▁worry
- ▁char
- ▁authority
- ▁happiness
- ▁affordable
- ▁warsaw
- ▁gravity
- ▁identity
- aker
- ▁lifetime
- ▁everywhere
- ▁management
- ▁hunt
- ▁stem
- ▁gum
- ▁lone
- ▁squa
- ▁sew
- ▁coral
- ▁bio
- ▁clu
- ▁orient
- ▁honest
- ▁proper
- ▁emergenc
- ▁zoo
- ▁cor
- ▁factories
- ▁wings
- ▁tro
- eek
- ▁universal
- ▁experienc
- ▁circumstances
- ▁firefighters
- ▁cage
- ▁truck
- ▁span
- ▁diet
- ▁mile
- ▁rapidly
- ▁orang
- ▁stream
- ▁emotional
- ▁statistical
- ▁durable
- ▁gav
- ▁trip
- ▁mag
- ▁dispose
- ▁soft
- ▁equally
- ▁strike
- ▁directly
- ▁appli
- ddl
- ▁supernovae
- ender
- ▁writing
- ▁prop
- ference
- raw
- ▁earn
- ▁enable
- ▁foundation
- ely
- ▁devices
- ▁breathing
- ▁hell
- ▁serve
- ▁tool
- ▁resistan
- pr
- ▁magic
- ▁artist
- ▁samples
- ▁snakes
- ▁trials
- ▁poet
- ▁books
- ▁profits
- ▁ce
- take
- ▁assum
- ▁traveling
- ▁inf
- eum
- ▁disorders
- pan
- ably
- oid
- ▁desir
- ament
- ▁union
- ▁africans
- signed
- ▁spin
- ▁ro
- ▁protected
- ify
- pul
- ▁sites
- ▁vo
- ▁included
- ▁accelerat
- ▁birds
- ▁bird
- ▁mus
- ▁pit
- ▁write
- bble
- ▁ai
- ▁replac
- ▁desire
- alt
- ▁double
- ▁je
- umble
- ▁valu
- gd
- '23'
- ▁writ
- ▁book
- ▁sample
- ▁snake
- ▁object
- ▁ingredient
- ▁official
- ▁feature
- ▁brother
- ▁muscle
- ▁tower
- ▁gig
- ▁adult
- ▁generation
- ▁emission
- ▁wave
- '30'
- ▁conf
- ▁trial
- finite
- entle
- j
- gram
- ▁jan
- ike
- ▁fo
- posed
- ▁rem
- ▁leg
- ala
- strict
- think
- sleep
- ioniz
- mission
- ▁emotion
- ▁celebrate
- either
- ▁whi
- ▁compet
- ents
- ▁introduc
- ▁disrupt
- ▁collect
- ▁cit
- orba
- ipping
- ▁capita
- ▁govern
- reak
- ▁batter
- ▁arrang
- ▁belong
- ▁hollow
- ▁referr
- ▁suspect
- craft
- rogen
- ▁1970
- ▁architect
- ▁biontech
- ▁centuries
- ▁commitment
- ▁conduct
- ▁contradict
- ▁counterweight
- ▁dinner
- ▁disagree
- ▁display
- ▁disposal
- ▁distinct
- ▁employee
- ▁equator
- ▁eradicat
- ▁exercise
- ▁exoskeletons
- ▁expectancy
- ▁extrover
- ▁gestapo
- ▁hebrides
- ▁infectious
- ▁introver
- ▁journal
- ▁junajpu
- ▁louvre
- ▁nanostructure
- ▁overwhelm
- ▁phospholipid
- ▁physics
- ▁politician
- ▁practical
- ▁procedure
- ▁profess
- ▁pronouns
- ▁quarter
- ▁regime
- ▁reinforce
- ▁review
- ▁supplement
- ▁tempt
- ▁threshold
- ▁vegeta
- ▁violet
- ▁welcome
- ▁yellow
- ▁zegota
- ▁activat
- ▁ancestor
- ▁article
- ▁conquer
- ▁depart
- ▁email
- ▁herself
- ▁neuron
- ▁peak
- ▁psych
- ▁restoration
- ▁error
- ▁mystery
- ▁revenue
- ▁stunn
- ▁theoretical
- ▁household
- ▁notorious
- ▁scaling
- ▁erode
- ▁firm
- ▁frame
- ▁scene
- ▁evidentialis
- ▁freedom
- ▁title
- ▁desperate
- ▁migrat
- ▁select
- ▁summer
- ▁married
- ▁delta
- ▁shelter
- awatt
- ▁fresh
- ▁hidden
- ▁surgery
- ▁fluorescence
- ▁drain
- ▁kick
- ▁nearby
- ▁related
- ▁evad
- ▁bypass
- ▁colon
- ▁sperm
- ▁flore
- ▁begun
- ▁audience
- ▁rescue
- ▁empower
- proof
- ▁concept
- ▁creatures
- ▁museum
- ▁coordinat
- ▁intensive
- ▁diagnostic
- ▁rush
- ▁consumption
- ▁prohibit
- ▁hatch
- ▁vital
- ▁strand
- ▁persist
- tresse
- ▁missile
- ▁converg
- ▁importance
- ▁exploit
- ▁creativity
- ▁wedding
- ▁narrative
- ▁cliff
- ▁capitalism
- ▁surg
- ▁possibilities
- ▁understood
- foot
- ▁drink
- ▁persecution
- ▁sharp
- ▁flock
- courage
- ▁folk
- ▁refus
- ▁wow
- ▁noise
- ▁imagination
- ▁2021
- federa
- ▁remind
- ▁odd
- ▁extract
- ▁navigation
- ▁astronomy
- ▁illness
- ▁defi
- ▁version
- ▁impressive
- ▁systematic
- tamin
- ▁innovative
- minent
- ▁propos
- ▁resistance
- ▁shop
- ▁strait
- ▁horr
- ▁darkness
- ▁objective
- ▁father
- ound
- ▁restore
- ▁buff
- ▁restrict
- ▁excre
- ▁phone
- ▁mountain
- glad
- ▁pants
- ▁highway
- ▁announc
- ▁interconnected
- ▁hole
- ▁conditioners
- ▁pace
- ▁conscious
- ▁era
- ▁buttons
- ▁gain
- ▁lucky
- ▁barr
- ▁combine
- ▁dim
- ▁confus
- ient
- ▁packag
- ▁medication
- ▁career
- ▁board
- ▁bri
- ▁integrate
- ▁insisted
- arity
- ▁compete
- ▁plann
- immer
- ▁nest
- ▁strict
- ▁lesson
- rote
- ▁asia
- ▁band
- ▁complicated
- ▁constructed
- iment
- ▁hire
- fug
- ▁grat
- ▁closet
- roll
- ▁borders
- ▁monks
- rbag
- ▁protection
- ▁accepted
- ▁fal
- ▁26
- rated
- ▁deadly
- zer
- kay
- ▁investigat
- ▁advise
- eters
- ▁observing
- ski
- ▁prov
- ▁difficulty
- ▁improv
- ▁layer
- col
- ▁filmmak
- mad
- ▁grab
- ▁driver
- ▁meant
- ▁13
- alo
- '37'
- ▁unf
- ▁definit
- ▁burned
- cogni
- weigh
- ob
- lum
- ▁wash
- ▁profit
- obb
- ▁mono
- ▁appeared
- ▁interested
- ▁mess
- ▁comput
- ▁log
- ▁electro
- ▁meal
- ▁hid
- ▁reader
- ▁jo
- ctors
- ▁doubl
- top
- ▁ec
- ▁millennia
- ired
- urd
- gotten
- '80'
- ▁del
- eful
- ▁chris
- atic
- pur
- gar
- ▁pop
- ▁stra
- centri
- ▁spr
- comfortable
- ▁demonstrate
- fo
- ▁variet
- ▁pron
- bb
- ▁fertilize
- ▁figur
- arch
- pped
- ▁ensur
- ▁recogniz
- ls
- '20'
- ▁provid
- ▁explore
- '11'
- ▁achiev
- native
- late
- ried
- rts
- ▁160
- ▁500
- ▁ju
- ▁sali
- ▁character
- ▁35
- eeze
- ▁pen
- ▁assume
- '96'
- putation
- mph
- ▁sna
- ▁farm
- ▁fran
- eight
- ology
- ▁bond
- ▁parent
- ▁kilometer
- hal
- ▁resource
- bra
- ▁20000
- ▁serv
- ▁origin
- ▁advi
- ▁possib
- ▁historic
- place
- ification
- ▁strik
- rish
- ▁reduc
- tail
- ▁construct
- ▁pain
- ▁tip
- ▁complicat
- ▁gran
- ▁prime
- ▁substantial
- tag
- stream
- ▁conv
- front
- ▁rapid
- claim
- expect
- respond
- lord
- usual
- fall
- ▁brit
- point
- allow
- ▁exact
- ▁digit
- board
- tons
- ▁literal
- ▁install
- ▁innovat
- emed
- cti
- ▁lens
- ▁mari
- ▁epi
- ▁absor
- ▁subsidi
- ▁fell
- ▁hypothe
- ▁optim
- ▁constitut
- ▁defeat
- ▁diagonal
- ▁immunotherap
- ▁smooth
- bacteria
- ipples
- ologist
- ğ
- ▁1987
- ▁adjacent
- ▁advocate
- ▁anticipate
- ▁assistance
- ▁baby
- ▁beauty
- ▁calculat
- ▁candidate
- ▁challenging
- ▁champion
- ▁cholesterol
- ▁counterparts
- ▁crusade
- ▁curiosity
- ▁descend
- ▁destination
- ▁destruction
- ▁destructive
- ▁disappointment
- ▁dissolve
- ▁distinguish
- ▁empath
- ▁enjoy
- ▁equitabl
- ▁evaluat
- ▁export
- ▁ghetto
- ▁ghost
- ▁gradual
- ▁hospital
- ▁implement
- ▁inclusiv
- ▁inherit
- ▁intervention
- ▁laboratory
- ▁liquid
- ▁luxury
- ▁mercator
- ▁micah
- ▁miracle
- ▁nebula
- ▁nervous
- ▁neutral
- ▁olaf
- ▁opposite
- ▁participat
- ▁pesticide
- ▁puzzle
- ▁pyrethroid
- ▁rainforest
- ▁rattlesnake
- ▁rebuil
- ▁register
- ▁resolution
- ▁rognvald
- ▁secure
- ▁spectrum
- ▁statue
- ▁television
- ▁therapeutic
- ▁throat
- ▁vulture
- ▁wood
- phobia
- ▁abandon
- ▁accident
- ▁automatic
- ▁bucket
- ▁burden
- ▁competency
- ▁consult
- ▁equity
- ▁evaporat
- ▁interview
- ▁knowledge
- ▁legacy
- ▁legislat
- ▁mathematic
- ▁niger
- ▁plummet
- ▁taste
- ▁technical
- ▁transplant
- itarian
- ▁chronic
- ▁compell
- ▁crowd
- ▁empty
- ▁incarcer
- ▁misfir
- ▁poison
- ▁quantit
- ▁turb
- ▁victor
- ▁election
- ▁priorit
- ▁religio
- ▁snore
- defensi
- ▁bundle
- ▁carousel
- ▁climb
- ▁exhaust
- ▁fractur
- ▁garden
- ▁succeed
- ▁suez
- ▁hdpe
- ▁juice
- aguar
- ▁denim
- ▁dividing
- ▁fallacy
- ▁outcomes
- ▁plot
- ▁blind
- ▁shocked
- ▁bounc
- ▁depth
- incident
- ▁subtle
- ▁pump
- rcia
- ▁initiatives
- ▁spray
- ▁haunt
- ▁traverse
- ▁polish
- ▁hypothesis
- ▁voice
- ▁pledge
- ▁burst
- ▁uncle
- ▁sink
- sturb
- ▁anchor
- ▁gratitude
- ▁pause
- ▁quo
- ▁alert
- ▁vast
- ▁van
- ▁attitudes
- ▁grocer
- ▁countdown
- ▁decrease
- ▁extensi
- ▁invasion
- ▁therapi
- ▁instant
- ▁guy
- ▁forget
- ▁lawyer
- ▁reduction
- ▁strange
- ▁boom
- abul
- ▁season
- ▁begg
- ▁underwater
- ▁strategies
- ▁stimulate
- ▁hurt
- ▁alertness
- ▁utilit
- ▁tomb
- ▁elsewhere
- ▁leap
- ▁patch
- ▁preference
- ▁realistic
- ▁fold
- ▁medit
- ▁stair
- itzer
- ▁embr
- ▁addict
- ▁2015
- ▁percepti
- ▁reign
- ▁painful
- egal
- ▁respi
- ▁depriv
- ▁shutter
- ▁chemistry
- ▁sad
- ▁bias
- ▁boost
- ▁wake
- ▁workforce
- ▁varieties
- ▁repair
- ▁genome
- ▁reject
- ▁124
- slide
- ▁mobility
- ▁shade
- ▁medicine
- ▁vent
- ▁hyp
- ▁melt
- ▁cake
- ▁organized
- ▁novelty
- ▁distan
- ▁france
- ▁suck
- ▁parity
- ▁vision
- ▁voc
- ▁sufficient
- charged
- ▁calcine
- ensity
- ▁dart
- ▁collection
- ▁gun
- ▁rays
- ▁pour
- ▁bitter
- ▁funn
- ▁coff
- ▁fearless
- ▁stance
- ▁inner
- ▁retain
- ▁debt
- ▁chile
- fuse
- ▁partial
- ▁mold
- ▁substan
- ▁survival
- ▁seize
- ▁qui
- ▁installation
- ▁cup
- ruel
- ▁boss
- ▁plug
- ▁apartment
- ▁communicate
- ▁sacrifice
- ▁tapp
- ▁grass
- ▁italy
- ▁roy
- ▁squ
- ▁percentage
- ▁dots
- ▁absolutely
- ▁incentivize
- ▁reserve
- ▁navigate
- ▁creative
- viation
- ▁angle
- ▁deb
- ▁agent
- ▁isolat
- spiration
- ▁ramp
- ▁forgotten
- ▁extin
- ▁celebrated
- diff
- ▁substantially
- ▁viruses
- ▁por
- clos
- ▁comment
- ▁closest
- ▁fatal
- ▁triple
- olk
- ▁eliminate
- ▁facilit
- oster
- ▁geo
- erior
- ▁online
- ▁fung
- ▁insight
- ▁bull
- '79'
- ▁swapp
- ▁wipe
- rrow
- ▁historical
- ▁delivery
- hre
- ntine
- erson
- ▁former
- ▁original
- ▁cri
- ▁accura
- ▁bat
- ▁pave
- reci
- mma
- ▁generat
- rum
- decided
- ▁provider
- cell
- ▁intri
- izab
- neck
- ▁pur
- neu
- ▁stepp
- hoppers
- ▁hu
- ▁dye
- ▁chase
- '21'
- ▁impress
- hu
- ▁broke
- ▁obstruct
- ▁360
- ▁explor
- gue
- rate
- ▁controlle
- roc
- bru
- ecta
- ▁gui
- ▁rec
- qua
- ▁imagin
- ▁operat
- ▁fertiliz
- litar
- ▁hotte
- profitable
- ▁argu
- ▁150
- odes
- tify
- llus
- lets
- ▁terr
- poly
- ▁christ
- ctively
- ▁decarboniz
- scribe
- ▁electr
- ▁immigra
- ▁300
- ▁separat
- ▁hopp
- ▁rang
- employed
- mped
- '98'
- rail
- '97'
- ▁device
- ▁pun
- ▁belief
- ▁resident
- ▁pathway
- ▁egg
- ▁dollar
- ▁scientist
- ▁prim
- ▁reliabl
- igation
- ▁aud
- ▁fun
- maker
- ▁marr
- ▁afford
- ▁gro
- ashes
- urning
- ▁cycl
- ject
- ▁surpris
- ▁eliminat
- ▁disco
- ▁univers
- ▁receiv
- stead
- ▁critic
- mark
- ▁plea
- ▁absolute
- pair
- limited
- water
- truck
- sexual
- spread
- '35'
- bank
- virus
- imagine
- consider
- power
- down
- look
- more
- drive
- ▁communicat
- ▁prepare
- cott
- ▁insist
- fish
- ▁gri
- ▁tap
- ▁incentiv
- ▁distort
- ▁jani
- case
- ▁societ
- nounc
- ▁interact
- ▁syria
- ▁eas
- ▁frequen
- ▁significan
- ▁attac
- ▁populat
- ▁except
- ▁steriliz
- ▁cooperat
- ▁khali
- ▁appro
- ivity
- ▁danger
- ▁inform
- ▁stimul
- ▁quest
- ▁memori
- ▁import
- hibit
- stood
- ▁decre
- ▁influ
- rupt
- cense
- ippi
- ▁photosynthe
- augu
- criminat
- ▁biodivers
- ▁cardio
- ▁ridicul
- occupie
- sophisticated
- ▁absolutis
- ▁accused
- ▁afraid
- ▁algorithm
- ▁aristocra
- ▁assaulted
- ▁association
- ▁assyrian
- ▁atlantic
- ▁autonomy
- ▁availability
- ▁brutal
- ▁byproduct
- ▁ceremon
- ▁circle
- ▁conclusion
- ▁congress
- ▁consensus
- ▁diabetes
- ▁dimensional
- ▁diploma
- ▁disadvantage
- ▁disrespect
- ▁dragonfl
- ▁enzymes
- ▁epidemic
- ▁evolution
- ▁expense
- ▁eyebrows
- ▁fairbnb
- ▁follicle
- ▁fragment
- ▁gatekeeper
- ▁geography
- ▁ghrelin
- ▁gilgamesh
- ▁google
- ▁greece
- ▁gujarat
- ▁harvest
- ▁hurricane
- ▁inevitable
- ▁injustice
- ▁intelligen
- ▁ixbalanke
- ▁jetpack
- ▁judgment
- ▁livelihoods
- ▁longitude
- ▁margin
- ▁minimum
- ▁navy
- ▁necessarily
- ▁passenger
- ▁politics
- ▁prejudice
- ▁prospect
- ▁proximity
- ▁relieve
- ▁replicate
- ▁restaurant
- ▁scotland
- ▁senior
- ▁simultaneously
- ▁slot
- ▁stigma
- ▁supreme
- ▁sustainably
- ▁teenager
- ▁thirteen
- ▁thrill
- ▁tiger
- ▁tomorrow
- ▁toothpaste
- ▁tynwald
- ▁underneath
- ▁utilitarian
- ▁volunteer
- ▁vulnerability
- ▁alternate
- ▁assassinat
- ▁branche
- ▁categor
- ▁commute
- ▁defend
- ▁exclusive
- ▁feather
- ▁graduate
- ▁meticulous
- ▁perpetuat
- ▁resettle
- ▁segregat
- ▁treasur
- ▁violent
- ▁align
- ▁apparent
- ▁blades
- ▁competition
- ▁concert
- ▁counteract
- ▁daunting
- ▁debris
- ▁deficienc
- ▁disperse
- ▁england
- ▁fascinat
- ▁inflation
- ▁inhabit
- ▁irony
- ▁midwest
- ▁occasion
- ▁paddy
- ▁pioneer
- ▁praise
- ▁princes
- ▁resembl
- ▁roof
- ▁sensitive
- ▁territori
- ▁unfair
- rugg
- ▁coworkers
- ▁fruit
- ▁gasoline
- ▁impulse
- ▁lung
- ▁megawatt
- ▁palace
- ▁request
- ▁testimon
- ▁unfolding
- ▁yarn
- ▁bomb
- ▁crack
- ▁drastic
- ▁harsh
- ▁hometown
- ▁infected
- ▁john
- ▁minimize
- ▁properties
- ▁swift
- ▁pillar
- ▁endanger
- ▁flaw
- ▁relax
- ▁turk
- ▁admir
- ▁nuance
- ▁declare
- ▁guard
- ▁reunion
- ▁storytell
- ▁butterfl
- ▁scour
- ▁ribo
- ▁ferry
- ▁hacking
- ▁hydro
- ▁thread
- ▁convention
- ▁text
- ▁split
- ▁congest
- ▁translation
- ▁appreciat
- ratory
- ▁iceland
- ▁jaw
- ▁mistake
- ▁95
- programm
- ▁injure
- ▁explosive
- ▁spiritual
- ▁drill
- ▁typh
- ▁smell
- ▁latin
- ▁poem
- ▁asylum
- ▁crime
- ▁sail
- ▁appeal
- ▁guest
- ▁initial
- ▁peekabo
- ▁outlier
- mog
- ▁proud
- ▁bolt
- ▁spurr
- intuiti
- ▁cantilever
- ▁amani
- ▁genre
- ▁afar
- ▁rub
- ▁moistur
- ▁recover
- ▁items
- ▁optimistic
- ▁slippe
- ▁oversee
- ▁sara
- ▁illegal
- ▁rainwater
- ▁opposition
- ▁overnight
- ▁movie
- ▁explosion
- ▁intensity
- ▁linguistic
- ▁emulsi
- ▁radiation
- ▁violat
- morph
- ▁homo
- ▁spice
- ▁vibran
- ▁intact
- ▁rewards
- ▁exceed
- ▁viewpoint
- ▁heroes
- ▁repeatedly
- ▁confront
- rane
- ▁thre
- ▁squir
- ▁wrap
- ▁godred
- ▁orgy
- ▁sentence
- unci
- ▁memorize
- monia
- holder
- ▁quiet
- rpet
- ▁icon
- ▁spark
- ▁deforestation
- ▁nurs
- ▁1945
- ▁finger
- cade
- ▁efficac
- ▁haz
- ▁motivation
- ▁spotted
- ▁pitch
- ▁subsidize
- ▁intention
- ▁window
- ombi
- ▁swim
- ▁winter
- ▁dynami
- ▁executive
- ▁boil
- ▁assess
- ▁2018
- ▁failure
- ▁horse
- ▁enact
- utter
- ▁circulation
- ▁queen
- ▁distract
- flag
- ▁mentor
- ▁lick
- lank
- ▁ebo
- ▁dirt
- ▁remark
- ▁shake
- ▁entry
- frost
- ▁pear
- ▁bound
- ▁rif
- ▁performance
- ▁exception
- ▁189
- ▁straight
- ▁purp
- imeter
- ▁hills
- ▁chew
- scop
- ▁lamp
- ▁fog
- ▁sweet
- ▁cosm
- ▁mysteri
- rbit
- ▁dying
- ▁argument
- ▁intell
- ▁sultanate
- aire
- ▁tile
- ▁monoc
- ▁machinery
- ▁motion
- ▁infant
- ▁healthier
- ▁continuous
- ▁truce
- ▁undergo
- aboo
- ▁commanders
- ▁qualifi
- ▁55
- ▁anyway
- ▁lenses
- ▁offset
- ▁merg
- quent
- tari
- ▁chim
- ptin
- ▁exit
- ▁dash
- ▁meta
- ▁wish
- ▁poorest
- ▁distortion
- ▁interaction
- ▁proposal
- ▁reven
- ▁trace
- ▁perch
- ▁behav
- ▁disruption
- ▁progressive
- introduce
- ▁gall
- ▁stone
- ▁update
- descent
- ▁dance
- ▁polye
- ▁settle
- fellow
- ▁rob
- ▁stre
- ▁kan
- dominant
- ▁bro
- ▁ev
- ▁purif
- ▁agreement
- ▁dominate
- ▁regulation
- ▁improvement
- hase
- ▁ecolog
- hydr
- pical
- ▁conspi
- ▁inhale
- ▁arriv
- ▁fil
- ▁visitor
- ▁greenland
- phasi
- ▁farmer
- ▁cran
- ▁identifi
- ▁chose
- hau
- grega
- mps
- ▁characteriz
- ▁audi
- ▁oppress
- mination
- aint
- ▁determin
- ▁unemploy
- spire
- ▁giga
- ska
- ▁immigrat
- rank
- sport
- aft
- ▁snap
- emper
- equality
- ▁imp
- ▁terri
- ▁interv
- '19'
- hi
- icated
- ▁demonstrat
- kg
- gible
- ix
- grad
- pression
- '16'
- ▁pursu
- ▁hor
- ▁deli
- ▁spar
- ▁suc
- ▁millenni
- connected
- ▁leon
- ▁inspir
- ▁tho
- ▁faci
- ▁domin
- ▁resist
- ▁mobil
- ▁var
- eval
- ▁interfer
- abilities
- ▁enabl
- ▁border
- ▁forci
- ▁monk
- ▁eugenic
- gae
- ▁concern
- ▁fertil
- ▁mammal
- ▁iri
- ▁merc
- ▁blu
- gger
- ▁statistic
- ▁integr
- compa
- nown
- ▁navigat
- ▁amaz
- ▁reserv
- layer
- escription
- ▁angl
- ▁amplif
- force
- plug
- conscious
- compete
- mind
- leader
- honest
- load
- position
- root
- box
- speak
- flow
- complete
- drop
- check
- sustainable
- friend
- track
- game
- moral
- certain
- green
- world
- people
- life
- what
- about
- human
- wind
- suit
- pay
- minis
- ▁tradition
- ▁bloo
- ▁explo
- ▁strateg
- ▁circu
- ▁gravit
- ▁corporat
- ▁activit
- ▁inequalit
- ▁galax
- ▁calci
- ▁energ
- ▁identit
- ▁locat
- ▁que
- ford
- compromis
- ▁swee
- ▁constr
- imitation
- ▁matte
- zoo
- hwa
- ▁dyna
- ▁flexib
- ▁execut
- ▁renew
- ▁catastroph
- ▁deforest
- rink
- ▁auth
- ▁pub
- ▁marc
- ▁furthe
- ▁diagnos
- ecutive
- titude
- ▁compli
- gressive
- nprofit
- pute
- ▁nano
- oxide
- ▁evident
- ▁surp
- ▁arachn
- ▁hippoc
- nivores
- skeleton
- suppress
- thropo
- ü
- ▁accomplish
- ▁accusation
- ▁acknowledg
- ▁activists
- á
- î
- ç
- ö
- <sos/eos>
init: null
input_size: null
ctc_conf:
dropout_rate: 0.0
ctc_type: builtin
reduce: true
ignore_nan_grad: null
zero_infinity: true
brctc_risk_strategy: exp
brctc_group_strategy: end
brctc_risk_factor: 0.0
st_joint_net_conf: null
model_conf:
asr_weight: 0.3
mt_weight: 0.0
mtlalpha: 0.3
lsm_weight: 0.1
length_normalized_loss: false
use_preprocessor: true
token_type: bpe
src_token_type: bpe
bpemodel: data/en_bn_token_list/tgt_bpe_unigram4000/bpe.model
src_bpemodel: data/en_bn_token_list/src_bpe_unigram4000/bpe.model
non_linguistic_symbols: null
cleaner: null
g2p: null
src_g2p: null
speech_volume_normalize: null
rir_scp: null
rir_apply_prob: 1.0
noise_scp: null
noise_apply_prob: 1.0
noise_db_range: '13_15'
short_noise_thres: 0.5
ctc_sample_rate: 0.0
frontend: default
frontend_conf:
n_fft: 400
hop_length: 160
fs: 16k
specaug: specaug
specaug_conf:
apply_time_warp: true
time_warp_window: 5
time_warp_mode: bicubic
apply_freq_mask: true
freq_mask_width_range:
- 0
- 27
num_freq_mask: 2
apply_time_mask: true
time_mask_width_ratio_range:
- 0.0
- 0.05
num_time_mask: 5
normalize: utterance_mvn
normalize_conf: {}
preencoder: null
preencoder_conf: {}
encoder: conformer
encoder_conf:
output_size: 256
attention_heads: 4
linear_units: 2048
num_blocks: 12
dropout_rate: 0.1
positional_dropout_rate: 0.1
attention_dropout_rate: 0.1
input_layer: conv2d
normalize_before: true
macaron_style: true
rel_pos_type: latest
pos_enc_layer_type: rel_pos
selfattention_layer_type: rel_selfattn
activation_type: swish
use_cnn_module: true
cnn_module_kernel: 31
postencoder: null
postencoder_conf: {}
decoder: transformer
decoder_conf:
attention_heads: 4
linear_units: 2048
num_blocks: 6
dropout_rate: 0.1
positional_dropout_rate: 0.1
self_attention_dropout_rate: 0.1
src_attention_dropout_rate: 0.1
extra_asr_decoder: transformer
extra_asr_decoder_conf:
input_layer: embed
num_blocks: 6
linear_units: 2048
dropout_rate: 0.1
extra_mt_decoder: transformer
extra_mt_decoder_conf:
input_layer: embed
num_blocks: 2
linear_units: 2048
dropout_rate: 0.1
md_encoder: null
md_encoder_conf: {}
hier_encoder: null
hier_encoder_conf: {}
extra_mt_encoder: null
extra_mt_encoder_conf: {}
preprocessor: default
preprocessor_conf: {}
required:
- output_dir
- token_list
version: '202402'
distributed: false
```
</details>
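A note on the `model_conf` weights in the configuration above (`asr_weight: 0.3`, `mt_weight: 0.0`, `mtlalpha: 0.3`): the sketch below shows one plausible way such multi-task weights combine the ST, auxiliary ASR (CTC/attention) and auxiliary MT losses. It is an assumption for illustration only, not code taken from this recipe; the exact formulation is defined inside ESPnet itself.

```python
# Hedged sketch: how the multi-task ST loss *might* combine the weights from
# model_conf above. This mirrors a common ST formulation (main ST loss plus
# auxiliary ASR CTC/attention and auxiliary MT losses), but it is an
# assumption for illustration, not code from this recipe.

def combined_st_loss(loss_st, loss_asr_ctc, loss_asr_att, loss_mt,
                     asr_weight=0.3, mt_weight=0.0, mtlalpha=0.3):
    """Return the scalar training loss from its component losses."""
    # Auxiliary ASR loss: CTC/attention interpolation controlled by mtlalpha.
    loss_asr = mtlalpha * loss_asr_ctc + (1.0 - mtlalpha) * loss_asr_att
    # The main ST branch gets the remaining weight; the MT branch is inactive
    # here because mt_weight is 0.0 in this configuration.
    st_weight = 1.0 - asr_weight - mt_weight
    return st_weight * loss_st + asr_weight * loss_asr + mt_weight * loss_mt


# Example with made-up loss values, just to show the arithmetic:
# 0.7*2.0 + 0.3*(0.3*3.0 + 0.7*2.5) ~= 2.195
print(combined_st_loss(loss_st=2.0, loss_asr_ctc=3.0, loss_asr_att=2.5, loss_mt=0.0))
```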
### Citing ESPnet
```BibTex
@inproceedings{watanabe2018espnet,
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
title={{ESPnet}: End-to-End Speech Processing Toolkit},
year={2018},
booktitle={Proceedings of Interspeech},
pages={2207--2211},
doi={10.21437/Interspeech.2018-1456},
url={http://dx.doi.org/10.21437/Interspeech.2018-1456}
}
```
or arXiv:
```bibtex
@misc{watanabe2018espnet,
title={ESPnet: End-to-End Speech Processing Toolkit},
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
year={2018},
eprint={1804.00015},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| [
"CRAFT"
] |
espnet/iwslt24_indic_en_ta_bpe_tc4000 | espnet | null | [
"espnet",
"audio",
"speech-translation",
"en",
"bn",
"dataset:iwslt24_indic",
"arxiv:1804.00015",
"license:cc-by-4.0",
"region:us"
] | "2024-04-19T17:56:11Z" | 2024-04-19T18:18:31+00:00 | 0 | 0 | ---
datasets:
- iwslt24_indic
language:
- en
- bn
license: cc-by-4.0
tags:
- espnet
- audio
- speech-translation
---
## ESPnet2 ST model
### `espnet/iwslt24_indic_en_ta_bpe_tc4000_use_wandbtrue`
This model was trained by cromz22 using the iwslt24_indic recipe in [espnet](https://github.com/espnet/espnet/).
### Demo: How to use in ESPnet2
Follow the [ESPnet installation instructions](https://espnet.github.io/espnet/installation.html)
if you haven't done that already.
```bash
cd espnet
git checkout 3a161c5ac0f74cc593410a4a32001073ed456580
pip install -e .
cd egs2/iwslt24_indic/st1
./run.sh --skip_data_prep false --skip_train true --download_model espnet/iwslt24_indic_en_ta_bpe_tc4000_use_wandbtrue
```
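For programmatic use, the snippet below is an untested sketch: it assumes ESPnet2 exposes an ST inference class `espnet2.bin.st_inference.Speech2Text` with a `from_pretrained()` helper analogous to the ASR one, and that the Hub tag matches the model name above. The audio path is a placeholder, and 16 kHz mono input is assumed. If the entry point differs in your ESPnet version, fall back to the `run.sh` invocation above.

```python
# Untested sketch of programmatic inference (assumptions noted in the text above).
import soundfile as sf
from espnet2.bin.st_inference import Speech2Text  # assumed ST inference entry point

speech2text = Speech2Text.from_pretrained(
    "espnet/iwslt24_indic_en_ta_bpe_tc4000_use_wandbtrue"
)

# "utt1.wav" is a placeholder path to a 16 kHz mono English recording.
speech, rate = sf.read("utt1.wav")

nbests = speech2text(speech)
# The first element of each n-best entry is assumed to be the translated text.
translation, *_ = nbests[0]
print(translation)
```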
<!-- Generated by scripts/utils/show_translation_result.sh -->
# RESULTS
## Environments
- date: `Wed Apr 17 02:51:38 JST 2024`
- python version: `3.10.14 (main, Mar 21 2024, 16:24:04) [GCC 11.2.0]`
- espnet version: `espnet 202402`
- pytorch version: `pytorch 2.1.0`
- Git hash: `83c179ab842987cf01642df2db372aaae260df55`
- Commit date: `Wed Apr 17 00:28:29 2024 +0900`
## st_train_st_conformer_raw_en_bn_bpe_tc4000
### BLEU
|dataset|score|verbose_score|
|---|---|---|
|decode_st_conformer_st_model_valid.acc.ave/dev.en-bn|2.1|19.7/3.6/1.0/0.3 (BP = 1.000 ratio = 1.185 hyp_len = 46094 ref_len = 38883)|
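The `verbose_score` column lists the 1- to 4-gram precisions followed by the brevity penalty and the hypothesis/reference lengths. The card does not state which scorer produced that line, but sacreBLEU reports the same quantities, so a hedged way to recompute the row from detokenized output and reference files (the file names below are placeholders, not paths produced by this recipe) is:

```python
# Hedged sketch: recomputing corpus BLEU with sacreBLEU from plain-text files
# containing one segment per line. File names are placeholders.
import sacrebleu

with open("dev.en-bn.hyp.txt", encoding="utf-8") as f:
    hyps = [line.strip() for line in f]
with open("dev.en-bn.ref.txt", encoding="utf-8") as f:
    refs = [line.strip() for line in f]

bleu = sacrebleu.corpus_bleu(hyps, [refs])  # single reference set
print(round(bleu.score, 1))                 # corpus BLEU, e.g. 2.1 in the table above
print(bleu.precisions)                      # 1- to 4-gram precisions (19.7/3.6/1.0/0.3)
print(bleu.bp, bleu.sys_len, bleu.ref_len)  # brevity penalty and lengths
```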
## ST config
<details><summary>expand</summary>
```
config: conf/tuning/train_st_conformer.yaml
print_config: false
log_level: INFO
drop_last_iter: false
dry_run: false
iterator_type: sequence
valid_iterator_type: null
output_dir: exp/st_train_st_conformer_raw_en_bn_bpe_tc4000
ngpu: 1
seed: 0
num_workers: 1
num_att_plot: 3
dist_backend: nccl
dist_init_method: env://
dist_world_size: null
dist_rank: null
local_rank: 0
dist_master_addr: null
dist_master_port: null
dist_launcher: null
multiprocessing_distributed: false
unused_parameters: false
sharded_ddp: false
cudnn_enabled: true
cudnn_benchmark: false
cudnn_deterministic: true
collect_stats: false
write_collected_feats: false
max_epoch: 80
patience: null
val_scheduler_criterion:
- valid
- loss
early_stopping_criterion:
- valid
- loss
- min
best_model_criterion:
- - valid
- acc
- max
keep_nbest_models: 10
nbest_averaging_interval: 0
grad_clip: 5.0
grad_clip_type: 2.0
grad_noise: false
accum_grad: 2
no_forward_run: false
resume: true
train_dtype: float32
use_amp: false
log_interval: null
use_matplotlib: true
use_tensorboard: true
create_graph_in_tensorboard: false
use_wandb: false
wandb_project: null
wandb_id: null
wandb_entity: null
wandb_name: null
wandb_model_log_interval: -1
detect_anomaly: false
use_adapter: false
adapter: lora
save_strategy: all
adapter_conf: {}
pretrain_path: null
init_param: []
ignore_init_mismatch: false
freeze_param: []
num_iters_per_epoch: null
batch_size: 20
valid_batch_size: null
batch_bins: 25000000
valid_batch_bins: null
train_shape_file:
- exp/st_stats_raw_en_bn_bpe4000/train/speech_shape
- exp/st_stats_raw_en_bn_bpe4000/train/text_shape.bpe
- exp/st_stats_raw_en_bn_bpe4000/train/src_text_shape.bpe
valid_shape_file:
- exp/st_stats_raw_en_bn_bpe4000/valid/speech_shape
- exp/st_stats_raw_en_bn_bpe4000/valid/text_shape.bpe
- exp/st_stats_raw_en_bn_bpe4000/valid/src_text_shape.bpe
batch_type: numel
valid_batch_type: null
fold_length:
- 80000
- 150
- 150
sort_in_batch: descending
shuffle_within_batch: false
sort_batch: descending
multiple_iterator: false
chunk_length: 500
chunk_shift_ratio: 0.5
num_cache_chunks: 1024
chunk_excluded_key_prefixes: []
chunk_default_fs: null
train_data_path_and_name_and_type:
- - dump/raw/train.en-bn/wav.scp
- speech
- kaldi_ark
- - dump/raw/train.en-bn/text.tc.bn
- text
- text
- - dump/raw/train.en-bn/text.lc.rm.en
- src_text
- text
valid_data_path_and_name_and_type:
- - dump/raw/dev.en-bn/wav.scp
- speech
- kaldi_ark
- - dump/raw/dev.en-bn/text.tc.bn
- text
- text
- - dump/raw/dev.en-bn/text.lc.rm.en
- src_text
- text
allow_variable_data_keys: false
max_cache_size: 0.0
max_cache_fd: 32
allow_multi_rates: false
valid_max_cache_size: null
exclude_weight_decay: false
exclude_weight_decay_conf: {}
optim: adam
optim_conf:
lr: 0.002
weight_decay: 1.0e-06
scheduler: warmuplr
scheduler_conf:
warmup_steps: 25000
token_list:
- <blank>
- <unk>
- ▁
- ্
- ▁়
- য
- র
- ▁প
- ▁এবং
- ের
- ▁য
- কে
- ▁স
- ▁ব
- ▁যে
- ▁একটি
- রা
- ও
- ▁যা
- ▁ে
- ▁করে
- ▁ত
- ▁সম
- ▁করা
- ▁জন
- ▁করতে
- ▁এটি
- ে
- স
- হয
- ▁ক
- ▁দ
- ▁আম
- ▁এই
- তে
- ▁ট
- ড
- িয
- ায
- ই
- ▁আমাদের
- ▁।
- ন
- ▁না
- ত
- ▁ন
- ▁তাদের
- ▁আপনি
- টি
- ▁পারে
- ▁আমি
- ভাবে
- ▁কিন
- ▁তু
- ী
- ▁তা
- ▁গ
- ▁তার
- ▁রয
- ▁র
- ▁তারা
- ার
- ▁েছে
- ▁থেকে
- ▁া
- দের
- ▁বা
- ▁হবে
- ▁সাথে
- ▁পর
- ▁হ
- ▁নির
- িত
- ▁ম
- ▁অন
- ক
- ▁ছিল
- ▁যান
- ▁ার
- ▁তি
- ▁আপনার
- ▁নিয
- ▁মধ
- ষ
- ▁আর
- ▁তৈরি
- ▁অ
- ▁ধ
- ▁বন
- ▁জ
- ▁তাই
- ▁যখন
- ▁এক
- ▁বিশ
- ▁কার
- ▁শ
- গুলি
- ▁কিছু
- ▁দে
- ল
- ঙ
- ▁বাস
- ▁তবে
- ▁ণ
- ▁যদি
- ▁শক
- ুক
- ▁অর
- ঞ
- ▁এমন
- ▁চ
- ▁ভ
- ▁কাজ
- ▁এখন
- ▁থ
- ▁হল
- ▁তর
- ▁অনেক
- ▁বেশি
- ▁হতে
- ▁পরিব
- ি
- ▁আ
- িক
- ▁করি
- ▁েছিল
- ▁এর
- ▁যবহার
- মাত
- ▁কারণ
- উত
- ▁যায
- াড
- ▁তখন
- ▁ড
- ▁মতো
- পূর
- চ
- ▁পারি
- ▁সত
- প
- ▁সেই
- ▁ষ
- ▁আমা
- ▁তন
- ▁নয
- ▁চে
- া
- ▁শুরু
- ▁মনে
- ▁বর
- ব
- ▁কম
- ▁উদ
- ▁ু
- ▁কী
- ▁ছে
- ▁ষা
- ▁আছে
- দ
- ▁রি
- ▁বি
- ▁বের
- ▁যক
- ▁করেন
- ▁বলে
- ▁একজন
- ▁তিনি
- ▁হিসাবে
- জ
- ▁এটা
- ▁যন
- '-'
- ▁নতুন
- ▁সে
- ▁দিকে
- ','
- ▁করেছে
- ▁করেছিল
- ▁রক
- ▁রে
- িষ
- ▁যেতে
- ▁দি
- ▁উপর
- ▁জলব
- ▁শুধু
- ▁থান
- ▁রতি
- ▁কি
- ▁ল
- ▁যাক
- ▁পারেন
- ▁কর
- গ
- ▁কয
- ▁রস
- ▁রথম
- ▁যেখানে
- ▁থাকে
- ▁টা
- ▁কাছে
- ▁কথা
- বচে
- ▁লক
- ▁সং
- শ
- ▁ঘ
- ▁আগে
- ▁োজন
- ▁একই
- ▁বারা
- ▁করছে
- ▁যার
- ▁সালে
- ▁দেখা
- ংখ
- ▁এখানে
- ▁করেছিলেন
- ▁ষণ
- ▁কেন
- যোগ
- ▁আপনা
- ▁ভাল
- ▁মূল
- ট
- ▁তির
- ▁সাহা
- ▁বৈ
- ম
- ান
- ▁খুব
- ▁বছর
- ▁ফ
- ▁সর
- ৃষ
- েন
- ▁দেশ
- ▁রম
- ▁অস
- ▁বড
- ▁কোম
- কি
- ▁ঠিক
- ▁ধরে
- ▁বিজ
- ▁করবে
- ফ
- ▁গুরুত
- ▁থা
- ু
- ▁রুত
- ▁বিদ
- ▁যারা
- ▁দেখতে
- ▁নি
- ▁সাধারণ
- ▁পূর
- ▁করেছি
- াও
- ▁মান
- ▁ভাব
- বিষ
- েষ
- ▁যিই
- রণ
- ▁ছ
- ▁করুন
- ▁ধি
- ▁উচ
- ▁রতিটি
- ▁পদ
- ▁বিক
- হ
- ▁গল
- ▁পরে
- ৃদ
- চিত
- ▁রশ
- ▁উ
- ▁উচিত
- ▁সহ
- মধ
- ▁চিত
- ▁জীবন
- ▁েক
- ▁20
- ▁কখন
- উন
- ▁বেশিরভাগ
- ▁ছোট
- ▁রাস
- ▁দিতে
- ▁যাচ
- ▁ঘুম
- ুদ
- ▁মানুষ
- ▁কোন
- দক
- রূপ
- ▁চাই
- ▁বিষ
- ▁রভাব
- ▁থাকা
- ▁দুর
- ▁এ
- রীক
- ▁উপা
- ▁দুটি
- ▁বিশেষ
- ▁আক
- ▁অক
- ▁বলতে
- ▁আন
- দীর
- ▁ষে
- ▁নেই
- ▁ধন
- ▁ষেত
- ▁বলা
- ▁তী
- ▁রত
- ▁পুনর
- ▁সক
- নিশ
- ▁শেষ
- ▁সিস
- ▁আসলে
- াম
- এ
- ণ
- ▁ছা
- ▁ঘন
- ▁মার
- মাধ
- ▁ভাগ
- ▁সঠিক
- ▁কেউ
- ▁ইতি
- ▁কিভাবে
- ▁শিল
- ▁পার
- ▁উদাহরণ
- িং
- ▁কারণে
- ▁বল
- ▁সুত
- িজ
- ▁রতিক
- ▁ফির
- ▁মানুষের
- ▁লোক
- ▁ভব
- ▁সমাধান
- ▁আসে
- ▁চলে
- িতে
- ▁কেবল
- ▁রী
- ▁ঞানী
- ▁নিজে
- ভিন
- ▁সেখানে
- ▁অবস
- বর
- ▁যত
- ▁খুঁজ
- ▁কঠিন
- ▁হাজার
- ▁জানেন
- ▁জানি
- খ
- ▁সব
- ▁বে
- ▁যমে
- বিশ
- ▁রহ
- ▁ধান
- ▁টার
- ▁জিনিস
- ▁থনীতি
- ▁ধরনের
- ▁সহজ
- ▁তব
- ▁রাজ
- ▁তিত
- ▁গেছে
- পক
- াহলে
- িকল
- ▁আলো
- ▁রহণ
- ▁করবেন
- ▁10
- ▁অবশ
- ং
- ▁পনা
- ▁পে
- কারী
- ▁ধে
- িদ
- তার
- ▁যেমন
- ▁চা
- ▁তাপ
- ▁যাপ
- ▁দিন
- ▁এত
- ▁ছি
- ▁নে
- ▁সাম
- ▁গত
- তা
- ▁অংশ
- ▁রান
- ছন
- ▁বিত
- ▁কোষ
- ▁সরকার
- ▁োগ
- তি
- বার
- ▁বিশাল
- ▁পেতে
- ▁উপ
- ▁চিন
- '2'
- ▁রাখা
- ুর
- ▁জিন
- ▁বৈশ
- ▁পানি
- ▁গমন
- ▁াই
- ▁ভবত
- ▁সন
- ▁অগ
- চুর
- ▁পরিস
- ▁েছি
- ▁তিশালী
- ▁শতাংশ
- ▁ভিত
- ▁বছরের
- াল
- ▁যাকসিন
- ▁যবাদ
- ▁রকৃত
- ▁মত
- ▁থাপন
- ▁রণ
- ▁আজ
- ▁লোকেরা
- ▁লা
- ▁রের
- ▁রিক
- ▁ষতি
- শব
- ▁থাকতে
- ▁বিল
- ▁দেশে
- ▁উভ
- ▁মস
- ▁জু
- ▁রমণ
- ▁ষমতা
- ▁রদান
- ▁যবস
- নের
- রুদ
- ▁করেছেন
- ▁সার
- টিকে
- ▁গাছ
- ▁জীবা
- গত
- ▁মিলি
- ▁ডলার
- াং
- ▁পণ
- ▁রূপা
- ▁ষম
- ▁গা
- ▁কল
- নেও
- ▁যাট
- জন
- ▁যথ
- ▁পুরো
- ▁অনুমান
- ▁রাখতে
- ▁যাস
- এর
- ▁েছিলেন
- ▁লেখ
- ▁পরিষ
- ▁জল
- ▁রঙ
- ▁মাত
- ▁বিনি
- ▁জা
- ▁তারপর
- ▁তুলনা
- ▁পৃথিবী
- ▁খরচ
- ▁বিবেচনা
- ▁চল
- ▁রিত
- ▁টেলি
- ▁েছিলাম
- ▁টেম
- ▁সি
- বদ
- ▁অনুস
- ▁আলাদা
- ▁তৃত
- গুলিতে
- ▁ভর
- ▁রাপ
- ানো
- ▁সুযোগ
- ▁মুহ
- ▁মাথা
- ▁সংক
- ▁ভাবনা
- ▁যাগ
- সাথে
- ▁মী
- ▁যাত
- ▁নীচে
- ▁তোলে
- ▁বাইরে
- তির
- ▁তিনটি
- ▁বুঝ
- ▁চিকি
- ▁কোনও
- ▁হার
- ▁19
- ▁মক
- ▁থিতি
- ▁গবেষণা
- ▁সরবরাহ
- ▁তারপরে
- ▁একক
- ▁মের
- ▁সৌর
- ▁চাল
- ▁মহিলা
- ▁চর
- ▁কোনো
- ▁নীতি
- ▁বস
- ▁CO
- ▁সবুজ
- ▁অবশেষে
- ▁যুৎ
- ▁বেগ
- ▁রাখে
- ▁দুই
- ▁ডে
- ▁চান
- ▁রোগ
- ▁বলি
- ▁রমাণ
- ▁নিজ
- ▁গি
- ▁ভুল
- ংক
- ▁টের
- ▁শহরে
- ▁অত
- ▁যাবে
- মে
- ▁শহর
- ▁কের
- ▁মহা
- েবে
- ▁কোথা
- ▁সাইড
- ▁নের
- ির
- ▁ঠ
- গুলো
- ফর
- ▁তথ
- ▁পানির
- ▁চালি
- ▁ভালো
- ▁ধরণ
- ▁ধারণ
- ▁মাণ
- ▁াল
- ▁বিপ
- ▁ভাষা
- ▁দরকার
- ▁রিট
- ▁কো
- ▁রদা
- ▁মৃত
- ▁ছেন
- ▁যুতিক
- ▁যকর
- ▁লাস
- ▁তমান
- ▁মিশর
- ▁রাম
- ▁দল
- ▁নিজের
- ▁ডার
- থায
- ▁সারা
- েও
- োড
- ▁সা
- ▁রাতে
- ▁বিস
- টা
- ▁ছিলেন
- ▁ফলাফল
- ▁ডাই
- ▁ঞাসা
- ▁মিথ
- ▁নীল
- ▁হিস
- ▁চুল
- ঘ
- ▁যালে
- ▁ষেপ
- ▁বব
- ▁যু
- ▁বাধ
- ▁দেশগুলি
- ▁মানে
- ▁ান
- ৃশ
- ▁াতে
- ▁আশ
- ▁খারাপ
- ▁লাল
- ূর
- ▁ধার
- ▁তুত
- ষম
- ▁পরিচিত
- ▁বক
- ▁ডা
- ▁নাম
- ▁জার
- ▁ছিলাম
- টোক
- ▁তম
- োক
- ▁যবসা
- ▁বার
- ▁পথ
- লম
- ▁ধতি
- ▁অনুভব
- ▁কৌশল
- ▁রসারিত
- ▁আঘাত
- ▁জিনিসগুলি
- িন
- ▁গতি
- ▁অতির
- ▁পাচ
- াকে
- ▁করছেন
- াঙ
- ▁মাই
- ▁পা
- ▁জানা
- ▁নব
- ▁আশা
- ▁ধারণা
- ▁অভ
- ▁সুবিধা
- ▁সবাই
- না
- েতু
- ংস
- মেন
- ▁পাঁচ
- ▁জীব
- ▁নিষ
- ▁হুমকি
- ▁বালানি
- ▁নিরাপদ
- ূন
- ▁বোধ
- ▁যগুলি
- ▁গে
- রক
- ▁চাপ
- ▁রোটিন
- নী
- ▁যোগ
- ▁রাণী
- ▁ভারতে
- ▁টির
- ▁রকৃতি
- ▁মহামারী
- সের
- ▁মে
- ▁15
- ▁থনৈতিক
- ▁ঝুঁকি
- ▁রকাশ
- ▁তিন
- ▁সুস
- ▁রাজা
- ▁ডিস
- ▁এড
- ▁মারা
- ▁টন
- শীল
- ▁নামে
- ▁দু
- জু
- ▁উপাদান
- ▁অপ
- থ
- ুষ
- ▁পরিণত
- ▁তত
- ▁বেঁচে
- ▁বালানী
- ▁অনুম
- ▁বেশ
- ▁যানো
- ▁ধমান
- লে
- ▁এগ
- ▁থন
- ▁আবার
- ▁অসম
- ময
- ▁উপস
- াস
- ▁যমান
- ▁শিক
- রামর
- ▁হাই
- কাল
- ▁থী
- ▁ঞান
- ▁পাদন
- ▁রিকা
- ▁দূরে
- ▁হলে
- ো
- ▁ভিন
- ▁নিক
- ▁রাব
- ৎ
- ▁কোপ
- ▁শী
- লব
- ▁দা
- হত
- ▁দেখেছি
- ▁বোঝা
- ▁টিক
- ▁মরুভূমি
- ▁বৃহ
- তম
- ▁তিগত
- ▁অফ
- ▁ষতা
- ▁ফলে
- ▁সীমা
- িহ
- ▁সেন
- ▁যুদ
- ▁মন
- ▁দশকে
- ▁সেগুলি
- ▁গড
- ▁যো
- ▁রদ
- ▁11
- ▁4
- ▁পরিবার
- ▁ডিজাইন
- ▁রজাতি
- ▁হাসি
- ▁নামক
- ▁খাদ
- ▁তো
- ▁তিক
- েক
- সূর
- ▁ভারত
- ▁ইন
- ▁যাপক
- ▁আসা
- ▁কিনা
- ▁ঠান
- ▁আত
- ▁অব
- ▁কোষে
- ▁পুরুষ
- ▁ডি
- ▁রার
- ▁সংগ
- ▁যাকে
- ▁থাকবে
- ▁বিন
- ▁ইংতাই
- ▁মোমবাতি
- ▁রাকৃতিক
- ▁লোকেদের
- ীকরণ
- ▁রতিশ
- ▁খ
- ▁চারপাশে
- ▁এশ
- ▁খনি
- ▁উপরে
- ▁রুতি
- ▁পরিমাণে
- ▁আসন
- ▁বিভ
- পড
- ▁দূর
- ▁1
- ▁বেড
- ▁রিস
- ▁কোষগুলি
- ▁আগ
- ▁একে
- ▁রাক
- ▁শহরগুলি
- ▁সেট
- েই
- তটা
- ▁শরীর
- ▁পরিমাণ
- ▁খিঁচুনি
- ▁ফেলে
- গায
- ▁জো
- দিনের
- নির
- ▁ইমিউন
- ▁যাল
- ▁আস
- ▁অপর
- ▁বাচ
- ▁কত
- ৈর
- ▁তরে
- ▁রেক
- ▁করছি
- ▁অনু
- ▁করলে
- ▁আল
- ▁আধ
- ▁ভাবন
- ▁এমআরএনএ
- ▁টেকসই
- ▁রোজান
- ▁পরিচালনা
- ▁যুত
- ▁বছরে
- ▁যালি
- ▁ডেটা
- ▁একাধিক
- ▁দর
- ▁পিছনে
- ▁মাটি
- ▁যতা
- ▁বদা
- ডিগ
- ▁এগুলি
- ▁ঞতা
- ▁আচরণ
- লা
- ফোর
- ▁একবার
- ▁সহা
- ▁শন
- টিস
- ▁রতিরোধ
- ▁আরেক
- ▁6
- াক
- কার
- লি
- বা
- ▁সেরা
- ▁বংস
- ▁লি
- ▁বপ
- ▁অপসারণ
- s
- ▁মোকাবেলা
- ▁রবেশ
- ▁ইলেক
- ▁চিকিৎসা
- ▁ভেঙ
- ▁বিপরীত
- ▁রধান
- মূলক
- ▁হত
- ▁পাশা
- ▁পুর
- ▁দাস
- ▁জনস
- ▁মডেল
- নি
- োয
- ▁থক
- ▁আপ
- াচ
- রিদ
- ছিলাম
- ▁মা
- বে
- ▁এলাকা
- ▁দশক
- ▁ঘটনা
- ভূত
- ▁কন
- ▁শতা
- ▁তরা
- ▁ভার
- রবর
- িনি
- ▁খা
- ▁নিজেদের
- রূপে
- ▁মোট
- ▁কাঠামো
- ▁যোগাযোগ
- ▁বীকার
- ▁ভূমিকা
- বু
- ▁ঠী
- ▁ডিক
- ▁জোর
- '20'
- ▁আমেরিকান
- ▁সাল
- ▁েন
- ▁নৈতিক
- ঠা
- শত
- াপী
- ▁সপ
- াতে
- বেক
- ▁ফল
- পত
- ▁জীবনে
- ▁গো
- ▁যাই
- ▁অদ
- ▁নত
- ▁ডাক
- ▁সেস
- কৃত
- ▁টিভ
- ▁জটিল
- হীন
- ▁কঠোর
- ▁চাহিদা
- ▁মুখোমুখি
- ▁রকৌশলী
- ▁রাচীন
- ▁উৎপাদন
- ▁রগতি
- ▁লেষণ
- ▁জাতিগত
- ▁শোষণ
- ▁খাবার
- ▁ধীর
- ▁পারবেন
- ুনিক
- ▁ভিতরে
- ▁ভাইরাস
- ▁দেখি
- তিতে
- ▁দেবে
- কল
- ▁লেট
- ▁েছেন
- ৃত
- ▁াম
- ▁ইস
- ▁নিচে
- ▁চম
- ▁গদ
- ▁ধু
- ▁তুলত
- ▁টেবিলে
- পী
- মা
- ▁আকার
- ▁অণু
- ▁অনুপ
- ▁টে
- ▁নিত
- ▁মূ
- ▁ওষুধ
- ▁কাছাকাছি
- ▁ডিএনএ
- ▁সুপারনোভা
- ▁শুনতে
- ▁গপাল
- ▁অভাব
- ▁যপ
- ▁মাঝ
- নাক
- ▁আটকে
- ▁বিচ
- ▁গভীর
- ▁যজনক
- মি
- ▁200
- টিক
- ▁যেভাবে
- ▁পাশে
- ▁রতিদ
- ▁সেলস
- ▁ফেল
- ▁নতি
- ▁বাধা
- ▁বজ
- ▁মানব
- ছে
- ▁থতা
- াই
- ▁শতাংশের
- ▁শান
- ▁হন
- ▁নিম
- ▁শিকার
- পাশ
- বৃত
- ▁সমব
- ▁5
- েছে
- ▁তেলাপোকা
- ▁ঝ
- ▁বসে
- ▁গুণ
- ▁ণবাদ
- ▁লিপ
- ▁যব
- ▁ঘটে
- তী
- ▁আইন
- ▁জানে
- ▁আশেপাশে
- ▁নাগরিক
- ▁গঠন
- ▁তরীণ
- ▁যাটার
- ▁অভিজ
- ▁সংযোগ
- ▁চরম
- ▁করব
- জেন
- ▁পানিগুলি
- ▁ডিম
- লার
- াফল
- ▁জলে
- বহা
- ▁উজ
- ▁সামনে
- ▁30
- ▁থিত
- াগত
- ▁ঝাঁক
- ▁পগুলি
- উড
- ▁যাম
- ▁কুল
- ▁থাগুলি
- ▁মানসিক
- ▁বাঁচ
- ▁পরব
- ▁ডেন
- ▁থে
- ▁রেস
- ▁ছবি
- ▁কাছ
- ▁সমান
- বন
- ▁পান
- ▁সিম
- ▁2
- ▁সহক
- ▁ঞা
- ▁লিপিড
- ▁অধ
- ▁কোভিড
- ▁অবদান
- ▁যোগফল
- ▁সোনা
- ▁েকটি
- ▁কালো
- ▁কমাতে
- ▁গবেষকরা
- ▁অনন
- ▁দেখে
- মান
- ▁মুখী
- ▁রজনন
- ▁সূচক
- ▁জাত
- টাই
- ▁পরিবেশ
- ▁আদ
- ▁ইউরোপ
- ▁আচ
- ▁পেট
- ▁লাগ
- ▁ছিন
- ▁যাশ
- ▁জি
- ▁চিম
- োষ
- ▁মু
- ▁যটি
- ▁গেলে
- ▁ষিণ
- ▁ভিদ
- ▁বেত
- ▁রেম
- ▁বিপর
- ▁তিদের
- েশন
- লেন
- ভুক
- ▁রোগী
- ▁পাত
- ▁চার
- বসম
- ▁রাণ
- ▁ঘো
- ▁আরো
- ▁এম
- মন
- ুরক
- ▁খেলা
- দিকে
- োজ
- ▁রো
- ▁বাভাবিক
- '0000'
- ▁যবহ
- ▁নিন
- ▁ইতিহাস
- ▁শত
- ▁পরিচ
- ▁রাথমিক
- ▁ভাইপার
- ▁জনগণ
- ▁থাকলে
- ▁শোনা
- ▁ঘুর
- ▁বিয
- ▁লোব
- ▁বাণ
- ▁পরিবহন
- ▁যবান
- ▁সাদা
- ▁ওজন
- ▁কিছুটা
- ▁চাকা
- ▁অপে
- ▁ঠে
- ▁মিলিত
- ▁সেক
- ▁বাকি
- ▁শরীরে
- ▁যেকের
- েট
- মাস
- ইচ
- ▁পালি
- ▁রচ
- দার
- ▁আকাশে
- ▁মুখে
- ারি
- ালন
- ▁রবাহ
- ▁কিলোমিটার
- ▁আকারে
- ▁শে
- ারিদ
- ▁সুন
- ভাগ
- পু
- ▁লোকের
- '50'
- ▁বাবা
- ▁মিত
- সাম
- ছেন
- বি
- ▁যৌন
- ▁রবণ
- মণ
- ▁বাক
- ▁ধেক
- ▁বহু
- ▁অদলবদল
- ▁তেজনা
- ▁বসবাস
- ▁পরিমাপ
- ▁রাজনৈতিক
- ▁আবাস
- ▁সংকেত
- ▁পরিবেশগত
- ▁বিকাশ
- ▁বিগুণ
- ▁যানেল
- ▁যাঁ
- ▁ইংরেজি
- ▁অভি
- ▁মিনিটের
- াধিক
- ▁যিকার
- ▁জানত
- ▁রজন
- ▁অসু
- রকম
- ▁থিক
- ▁রেখে
- ▁জেনে
- ▁3
- ণনা
- ▁নারী
- ▁সংয
- াত
- ▁টেমের
- ▁রেড
- লান
- ▁ানো
- ▁সাহ
- ▁চাচ
- ▁কাজটি
- ▁রিড
- ▁থল
- ▁পন
- ▁রন
- াজার
- ▁রিন
- ▁কোপে
- ▁গন
- ▁সৌ
- পথে
- ▁লুপ
- ▁সূ
- ▁ভাই
- ▁ষিত
- ▁কেল
- ▁ওঠে
- ▁70
- ▁জাহাজ
- ৷
- ▁থেরাপি
- ▁চাকরি
- ▁মৌলিক
- ▁চাঁদ
- ▁রতিফল
- ▁নেতৃ
- ▁শাসন
- ▁খবর
- ▁নাটক
- ▁ঘুমানো
- ▁করছিলাম
- ▁ইতিহাসে
- ▁চালানো
- ▁ষরিক
- ▁ষত
- ▁বীপ
- ▁আমেরিকানদের
- হিত
- ▁করছিল
- লাম
- ▁আউট
- ▁যাটারি
- ▁কথোপকথন
- ▁তোলা
- ▁থানে
- সংশ
- ▁দেন
- ▁ঘট
- ▁বাতাস
- ▁নিউ
- ▁নেট
- ামাজ
- জনক
- ▁দাম
- শক
- ূ
- ▁যাকসিনগুলি
- ▁নম
- হার
- ▁রাসা
- ▁শিশু
- োল
- ালের
- ▁কোর
- ▁16
- ▁রাত
- ▁চালা
- ▁100
- ▁সমাজ
- কেন
- ▁তাহ
- ভূমি
- ▁কমে
- ▁মাস
- াময
- ▁12
- শিত
- ▁পেশী
- মাক
- a
- ▁ফোকাস
- ▁শিখেছি
- ▁তহবিল
- ▁রতিবেশী
- ▁রভু
- ▁উপকূল
- ▁দুধ
- ▁পরিচাল
- ▁আলোক
- ▁বলুন
- ▁সিজেন
- ▁দাবি
- ▁দূষণ
- ▁শতকে
- ▁যতক
- ▁পাঠানো
- ▁রাণিত
- ▁রোগীর
- ▁কাউ
- ▁রাখবে
- ▁বোত
- ▁জানতে
- টিভ
- ▁ঞানিক
- ষণা
- ▁গেম
- ▁পুনরা
- োচ
- ▁মিল
- ▁হৃদয
- ▁করেছিলাম
- ▁মুখ
- ▁পোর
- বিদ
- ▁রহস
- ▁পাবল
- ৃ
- ▁কেরি
- ▁রণে
- ▁আজকে
- ▁অপরি
- ংশ
- ▁মহিলার
- ▁রগুলি
- ালোক
- েমন
- ▁জিত
- ▁ষক
- ▁হাতি
- ▁একা
- ষিক
- ▁হাতে
- ▁াস
- তুর
- ▁কা
- ▁কোণ
- ▁দশকের
- ফিল
- ▁গুরুতর
- ▁বলছি
- ▁পাই
- ▁আমেরিকা
- ▁8
- ▁বাজার
- াদী
- ▁চোখে
- ▁রমে
- '3'
- িপিং
- ▁দাঁ
- ▁তরুণ
- '9'
- ▁নদী
- ▁যাপন
- ▁চলেছে
- ▁পাঠ
- ▁অবকাঠামো
- ▁কবুতর
- ▁টুকরো
- ▁অনুবাদ
- ▁একটু
- ▁জিডিপি
- ▁নমুনা
- ▁দখল
- ▁যমজ
- ▁24
- ▁রোজেন
- ▁যাপচার
- '26'
- ▁শারীরিক
- ▁তুলনামূলক
- ▁কিত
- হাউস
- ▁সফল
- ▁দরজা
- ▁নিরাপ
- ▁যালসি
- ▁গরম
- ▁দেখেন
- ▁রমিক
- ▁টাও
- ▁গম
- ▁তিগুলি
- ▁চারটি
- ▁দেবতা
- ▁েল
- ▁তবতা
- ▁শহ
- ▁বিতা
- ▁দৈ
- ▁মাক
- ▁সংকট
- ▁অনুসার
- গুণ
- ▁ইহুদি
- ▁ণবাদী
- ▁রুটি
- ▁মালি
- ▁বালি
- ▁পুনরু
- াশ
- ▁জনক
- ▁পৌঁছা
- ▁উপাদানগুলি
- ▁80
- ▁ইক
- ▁ষি
- ▁কোনটি
- ▁কুশ
- দুর
- রি
- োগ
- ▁করেনি
- ুল
- নিয
- ▁নিব
- ▁জের
- িকভাবে
- ▁শুক
- ▁বান
- ▁রাণীর
- ▁মগুলি
- ুরে
- ▁তাত
- ▁শিখ
- ▁কক
- ুনি
- ▁রেই
- ▁কাট
- ▁তিকর
- পোস
- ▁খালি
- ▁যাগুলি
- ▁বনাইজ
- ▁ভূ
- ▁যেগুলি
- ▁লাভ
- ▁গেল
- ▁জাতিক
- ▁পরিশ
- ▁উপরের
- কর
- ▁মেশিন
- েল
- ▁ছেলে
- ▁সু
- ছিল
- ▁জাম
- ▁শানবো
- সাশ
- ূত
- ▁থিতিশীল
- ▁বো
- ▁তুলা
- ▁বকে
- ▁অবি
- '00'
- ▁থানগুলি
- ালকা
- ▁লু
- ▁ইউ
- ▁অধিকার
- ▁রাইলোবাইট
- ▁টেরিওটাইপ
- ানদের
- ▁মিটার
- ▁জাতি
- ▁ভালবাসা
- ▁সীমিত
- ▁অনুশীলন
- ▁মোনালিসা
- ▁জীবনযাপন
- ▁আলোচনা
- ▁লোরো
- ▁আগামী
- ▁তেজিত
- ▁রনালী
- ▁2030
- ▁উঠেছে
- ▁আগুন
- ▁নেতিবাচক
- ▁যাকটাস
- ৎকার
- ▁যালাক
- ▁থনীতিবিদ
- ▁বিরল
- ▁লেজ
- ▁পৌঁছানো
- ▁বীকৃত
- ▁পাহা
- ▁চেইন
- ▁যামেরা
- ▁রু
- ▁রেখা
- মস
- ▁দেখানো
- ▁চীন
- ▁জনসাধারণ
- ▁তাব
- ▁রাজি
- েড
- ▁ছদ
- ▁ডিং
- ▁তালিকা
- নো
- ▁পরিবেশে
- ▁ফি
- ▁রাউন
- ▁রোল
- দৌ
- ▁চোখ
- ▁সাক
- ▁রোম
- ▁ফাঁদ
- শন
- ▁ডস
- '5'
- ▁সাই
- াজ
- ▁শেখা
- ▁জিনগুলি
- িণ
- ▁টিকেল
- কোণ
- ▁গান
- ▁সেতু
- ▁সরকারকে
- ▁মাসের
- ▁রাপক
- ▁খাল
- ▁কান
- মিশ
- শি
- দস
- কোনো
- ▁শিবির
- ▁হো
- ▁ছাত
- সরি
- ▁রহে
- ▁পথে
- ▁বলবে
- ▁এন
- যুদ
- ▁ভু
- মনী
- সে
- ▁অংশে
- ▁খেল
- জার
- ▁াট
- ▁দী
- '7'
- ইহ
- ▁সিরি
- ▁লাইন
- ▁মাসে
- ▁াদে
- ▁চক
- ▁ছেদ
- ▁খু
- ▁ডল
- ▁রীক
- ▁বলছে
- ▁এসে
- ▁উপকরণ
- কিং
- ▁ভাইরা
- ▁ঐতিহ
- '&'
- ;
- o
- p
- Ş
-
- ▁চুনাপাথর
- ▁মেলাটোনিন
- ▁রাজধানী
- ▁ষুধা
- ▁সবকিছু
- ▁অপারেটর
- ▁মবেশ
- ▁হরমোন
- ▁গাছপালা
- ▁উপভাষা
- ▁আইডি
- ▁যসেবা
- ▁দেশাবলী
- ▁যানবাহন
- ারের
- ▁হারানো
- ▁তরাধিকার
- ▁পাবেন
- ▁বিকৃত
- ▁ষেপণ
- ▁জেট
- ▁অংশগ
- ▁জমি
- ▁অভিযোজন
- ▁বাণী
- ▁বিবর
- ▁যাধি
- ▁হোম
- ▁যাটি
- ▁চলগুলি
- ▁বলেছিল
- ▁টাকা
- ▁খোলা
- ▁মীদের
- লো
- ▁রচার
- ▁রেণী
- ▁সামর
- ▁রহী
- ▁মানবতা
- ▁রতিদিন
- ▁দেহ
- ▁নিজেদেরকে
- ▁যাপার
- ▁াগুলি
- ▁ভারতকে
- ধিক
- বিরক
- ▁গর
- ▁টান
- ▁দান
- ▁90
- ▁কাজে
- ▁িগুলি
- ▁বাদ
- ▁সাত
- ▁25
- ▁হবেন
- ▁লেখক
- বাদী
- াউন
- াদের
- ▁পেরেছি
- ▁পক
- ▁পাইক
- '1'
- ▁1000
- িস
- ▁অল
- ▁রাশ
- ▁উপন
- ▁শিকারী
- সাধারণ
- ভার
- ▁ষিণে
- ▁বুদ
- ▁পশ
- ▁ভুলে
- ▁সাপ
- ▁রিজ
- াইড
- ▁ভূত
- ▁50
- ▁লাগে
- ▁বারে
- দিন
- ▁দৃ
- তন
- ▁পাদ
- '8'
- ▁আট
- ▁আকাশ
- ▁নিচ
- ▁বিগ
- '6'
- চে
- ▁খুল
- ▁ভূগ
- ▁দাতা
- ▁বলেছি
- ▁সুলতান
- পর
- কুচি
- ▁তনশীল
- ▁এতটা
- ▁মানি
- ▁অথ
- ীন
- তুল
- ▁লাই
- ▁পাখি
- ▁রোধ
- ▁নিদ
- ধ
- ▁বাধীন
- ▁এসেছি
- ঢ
- ▁ঘর
- তিবাচক
- ▁ডিভাইস
- ▁মোটামুটি
- T
- ▁পৃথক
- ▁যালঘু
- ▁সহযোগিতা
- ▁পুনঃ
- ▁আবেগ
- ▁যকলাপ
- ▁ঝিল
- ▁নিঃসরণ
- ▁আংশিক
- ▁চিৎকার
- ▁লিওন
- ▁মনোযোগ
- ▁আবেদন
- ▁বিবেচ
- ▁আছি
- ▁ফসল
- ▁পোরেশনগুলি
- ▁পেরু
- ▁বিতরণ
- ▁লাইট
- ▁কিলো
- ▁এসেছে
- ▁বহি
- ▁ইউনি
- ▁বামী
- ▁কভার
- ুব
- ▁ফলাফলগুলি
- ▁কৃষি
- ▁তাক
- কারক
- ▁যাকশন
- ▁পাঠা
- ▁নেতা
- ▁খে
- ▁সকলের
- ▁তনে
- নাইট
- পুর
- ডাউন
- ▁যৌনতা
- ▁ডান
- রম
- ▁শীত
- ▁চলা
- ▁কানের
- ▁মিং
- ▁মুদ
- ▁শাসক
- ▁গোপন
- ▁তোমা
- ▁কৃতি
- ▁টেক
- ▁রেট
- ▁সকালে
- ▁যাবেন
- ▁জান
- ▁পরিসরে
- ▁ফুল
- ▁হাত
- ▁এভাবে
- াইভ
- পূরণ
- ▁হলেন
- ▁শিশুর
- শীর
- ▁ডানা
- পতি
- ▁মাতা
- ▁শুনে
- ▁কাটা
- ▁ধারণাটি
- ▁যিক
- ছা
- ▁গাছে
- ▁রমা
- ▁সমাধানে
- সম
- ীদের
- ▁মাল
- িড
- আই
- ▁দার
- মার
- ুন
- ▁ভে
- ▁চতা
- ▁400
- ▁বাহ
- ▁ইতাল
- লস
- ▁রাইভ
- ▁এরিক
- ▁থি
- ▁হারি
- মাঝ
- েইনফ
- ▁পেরেছিল
- '4'
- ▁টিকে
- েব
- থাক
- ▁শর
- ▁ডাকা
- ▁রেখেছিল
- ▁তুলে
- ▁অসুবিধা
- ▁নগুলি
- ▁আই
- ▁টু
- ▁শেষে
- ▁জনপ
- খানে
- ▁বহুল
- ▁দেখেছিল
- ▁ঋণ
- ▁রুপ
- ▁দূষ
- ▁মহাকা
- ০
- ▁আরএনএ
- ▁নাৎসি
- ▁সুপারহিরো
- ▁রতিযোগিতা
- ▁পাইলট
- ▁বজনীন
- ▁ঐতিহাসিক
- ▁চিঠি
- ▁মরিসন
- ▁বাজেট
- ▁সুপারিশ
- ▁পুলিশ
- ▁দেখুন
- ▁অভিযান
- ▁রাহক
- ▁যবধান
- ▁রতিযোগী
- ▁ডানদিকে
- ▁পাগল
- ▁খনন
- ▁ঘটছে
- ▁বেষণ
- ▁সংবেদন
- ▁লাগানো
- ▁দেখেছিলেন
- ▁ঢে
- ▁পারবে
- ▁কাশন
- ▁বিকেলে
- ▁শুনেছেন
- ▁এসেছিল
- ▁যাসিড
- ▁নেমে
- ▁3-
- ▁রশংস
- ▁বাহু
- ▁করত
- ▁রঙে
- গদ
- ▁40
- ▁গক
- ▁শোষ
- ▁জোট
- ▁গণনা
- ▁হাঁট
- ▁বেস
- ▁রিলি
- ▁টিং
- ▁দাদা
- ▁সরকারগুলি
- ▁অংশগুলি
- ▁দোষ
- ▁খলা
- ▁করতেন
- ▁জাপান
- ▁আধি
- ▁বাহিনী
- ঘাত
- ▁সরকারী
- ▁থিতিতে
- ▁পারেনি
- ▁যাংক
- াসের
- াইজ
- ▁মেট
- ঃ
- ▁কুলে
- ▁বাচন
- ▁কোড
- ▁মাঝা
- ▁রেমে
- েইন
- রমাণব
- ▁যাগগুলি
- বহন
- বাজারে
- ▁টেবিল
- ▁চারা
- ▁রাখ
- ▁ঠানিক
- ▁বংসা
- ▁ধকারে
- ▁ঝুল
- ▁18
- ▁থাকেন
- ▁কৃষ
- ▁তক
- ▁চি
- বিরোধ
- হন
- ▁নাক
- ▁যাতন
- মিন
- দা
- চার
- ▁গগুলি
- ▁আছেন
- '21'
- ▁ডলে
- ▁তিটি
- পা
- ▁রোত
- ▁রকেট
- ▁তাহে
- ▁পাস
- ুলার
- ▁বাঁচা
- ▁আসেন
- ▁যথায
- ▁কৃতিক
- ▁ধকার
- ▁পরিষেবা
- বিক
- ▁তগুলি
- ▁যাণ
- ▁দেবী
- ▁ষর
- ▁সীমান
- ▁কৃত
- সি
- ছি
- ▁পিতামাতার
- ভান
- ▁মেঘ
- ▁আরি
- ▁ফাঁক
- েজ
- ধি
- ▁পরি
- ▁মেটা
- টো
- পাস
- নে
- তিগ
- োপ
- মুখী
- ▁যদ
- জীবন
- '0'
- ▁অতি
- ফো
- ▁মিনিট
- ▁রিপ
- ▁মিক
- ▁পিছ
- ▁কু
- ▁যানবাহনে
- ▁শো
- ▁নাগা
- বেন
- ▁পোরেশন
- ▁োগকারী
- শালী
- ▁জাতিসংঘ
- ৃৎপ
- ▁ডিজিটাল
- ▁নিখুঁত
- ▁পিতামহ
- ▁মহাসাগর
- ▁রিলোবাইট
- ▁রীতদাস
- ▁রোপচার
- ▁সেনাবাহিনী
- ▁অপারেশন
- ▁জরুরী
- ▁শেলোব
- P
- ▁অনুভূমিক
- ▁যাটেলাইট
- ▁বাছাই
- ▁যকারিতা
- ▁আঠালো
- ▁কেটলি
- ▁সৈন
- ▁ইনজেকশন
- ▁একাকী
- ▁রতিকৃতি
- ▁মালিকানা
- ▁রাকচার
- ▁তুলেছে
- ▁কবিতা
- ▁আসুন
- কোহ
- ▁বুশ
- মলত
- ▁আসছে
- ▁আশাবাদী
- ▁আসবে
- ▁উৎসাহ
- ▁বোতাম
- পোকা
- ▁অধীন
- ▁একমত
- ▁ভেবেছিল
- ▁সুখ
- ▁গঠিত
- ▁নজর
- ▁বিবরণ
- ▁201
- ▁দেখবে
- ▁লিনিক
- ছ
- ৌক
- ▁সুইচ
- ▁পরিণতি
- ▁মোটা
- ▁উৎপ
- ▁লেটগুলি
- ▁পাথর
- ▁ফেলবে
- ▁ফরাস
- ▁হৃদ
- িগার
- ▁মাপ
- ▁ভাঙ
- ফুস
- ▁ধুদের
- ▁বিরতি
- ▁কতা
- ▁লাইস
- ▁দিল
- ▁থাকি
- ▁নীতিগুলি
- ▁আবদ
- ▁রেলি
- ▁পেস
- ▁মাইক
- ▁টেমগুলি
- ▁গু
- ▁টেশন
- ▁গেট
- নশীল
- ▁লুক
- ▁পরাজ
- ▁পাঁচটি
- ▁বতন
- ▁পাবে
- ▁রোমান
- ▁বাপক
- ▁লাইনের
- ▁00
- পোর
- ▁উঠ
- ▁17
- ▁যাতি
- ▁জাল
- বাইন
- ▁ঘটা
- ▁কমান
- ▁ইমে
- ▁দগুলি
- ▁উপয
- ▁হতাশা
- ▁যুতে
- ▁নিষি
- ভ
- ▁সেল
- োর
- ▁ফিল
- ▁সিটি
- ▁ভবন
- ▁দীপ
- ▁194
- ▁ষাগুলি
- ▁যাগে
- ▁আবর
- ▁সকল
- মিড
- ▁টিকেলগুলি
- ▁কারণগুলি
- ▁দিক
- ▁হেল
- ▁বিট
- ▁রেরণা
- ▁কুশি
- ▁ঘোরা
- ▁ধরা
- ▁সী
- ফি
- ▁রবৃ
- ▁রোটিনে
- ▁কাজগুলি
- ▁মহাকাশে
- ামগ
- ▁অনেকের
- ▁পলি
- ফিক
- ▁রহণকারী
- ▁বিধ
- রেস
- ▁লোককে
- ▁মহাদেশ
- ুত
- ▁ণতা
- ▁রপ
- ▁মিশ
- ▁উৎস
- ▁গার
- কেটে
- গো
- মেডি
- ▁লেখা
- ▁ভিদে
- ▁ষী
- ▁দিনে
- বশেষ
- ▁দেশটি
- ▁মেস
- ▁বিচারে
- ৌ
- ▁ডিত
- ▁আব
- ▁মহাকাশ
- ▁রেডি
- ▁36
- ▁22
- ▁10000
- োস
- ▁বুজ
- কেল
- ▁বাতাসে
- েটর
- ীর
- ▁বেল
- ▁বীপে
- দন
- লাইন
- ূপ
- ▁সাহারা
- ▁রমণে
- ▁হাস
- ▁েজ
- ▁বলতা
- ▁জুন
- কোস
- ▁হই
- ▁মজা
- ▁নটি
- ▁করণ
- বিজ
- ▁যেকোন
- াবে
- াদা
- ▁রুট
- তিক
- ▁থের
- ▁সহজে
- ▁তাকা
- ▁গবেষক
- ▁ধর
- ▁রাইড
- ▁এলোমেলো
- ▁উঁচু
- ▁উদযাপন
- ▁কীটনাশক
- ▁রতিনিধি
- ▁শিরোনাম
- ▁শৈশব
- ▁াকলাপ
- ▁এনকোড
- ▁মজুরি
- ▁লাটিপাস
- ফেডারে
- ▁থেরানোস
- ▁মনোনিবেশ
- ▁ইটনভিল
- ▁লুরোস
- ▁জরিপ
- ▁টিউমার
- ▁মনিকা
- ▁সমাবেশ
- ▁বাসনালী
- ▁ইংল
- ▁খাঁচা
- ▁জীবিকা
- ▁গৃহ
- ▁ভিডিও
- ▁বেলারুশ
- ▁অধিকাংশ
- ▁রিগস
- ▁বাভাস
- ▁তুলবে
- ▁ঝাঁপ
- ▁পোশাক
- ▁খলিল
- ▁রতিবাদ
- ▁সাফো
- ▁আসল
- ▁সহিংসতা
- ▁সমাধি
- ▁কমিশন
- ▁বিদেশ
- ▁রেখেছিলেন
- ▁রাইম
- ▁কিং
- ▁ধতিগত
- ▁টাইন
- ▁অংশীদারদের
- ▁অনুভূতি
- থার
- ▁লাইম
- ▁বীজন
- ▁বিমান
- ▁রপাতি
- ▁কোলে
- ▁যানেলগুলি
- ুঁ
- ▁লিপিডগুলি
- িশোধ
- ▁সেগুলো
- ▁শিশুদের
- ▁লাফ
- ▁বেকার
- ▁সরানো
- ভাইরাস
- ▁অনুরোধ
- ▁শনিক
- ▁মালিক
- ▁রিকান
- ▁জমা
- ▁ফাঁ
- ▁অনুমোদন
- ▁করিনি
- ▁আবি
- ▁গণত
- ▁সভ
- ▁কমানো
- ▁দীতে
- ▁তৃতা
- ▁রতিরোধী
- ▁যুট
- ▁টাল
- িচ
- ▁রোপণ
- ▁বিবাহ
- বহুল
- ▁রবণতা
- ▁করলেন
- রিকানদের
- ▁দাঁত
- ▁আপস
- ▁যাকিং
- ▁যবাহ
- ▁জে
- ▁বোঝাতে
- ▁রামী
- ▁রুব
- ▁2000
- ▁মাছ
- ▁ারিং
- ▁জীবাণু
- ▁লিনার
- ▁ফুট
- ▁ধাপ
- চাপ
- আইনি
- ভাল
- গম
- ▁লেগে
- লুপ
- ▁কাপ
- ▁রহটি
- দূর
- শাস
- ▁টিমে
- ▁ঘটনাটি
- ▁কিলোমিটারের
- ▁সংগঠ
- থিত
- ▁অণুগুলি
- ▁বীর
- ▁সবে
- ▁করুক
- ▁লিফটে
- ▁সমাজে
- ▁ারশ
- ▁খরা
- ▁তেল
- ▁আঁক
- ▁চেল
- পশ
- ▁পরিপ
- ▁শহরটি
- ▁লোড
- েকটি
- ▁বিচার
- ▁লাগা
- বল
- ▁লাইটে
- ▁ভূমি
- ▁ফার
- সব
- ▁গণিত
- ▁চির
- ▁পৌঁছে
- লিপি
- ▁ালা
- াপ
- ▁আনা
- ▁পানিটি
- চক
- ▁186
- াংস
- িডা
- ▁একদিন
- ▁7
- ▁হারা
- কারীদের
- ুখ
- িএস
- ▁দশ
- োঁ
- ▁অফিসে
- ▁মুছ
- িশ
- ▁সিং
- ▁াশা
- ▁75
- ▁কাঠ
- ▁সাপে
- '11'
- ▁যদেব
- েম
- ▁ারগুলি
- কোষ
- ▁ফোন
- সেট
- ▁কোট
- ▁দলগুলি
- িটি
- ▁শুরুতে
- বিয
- তীতে
- িঁ
- ▁রেন
- ▁দামে
- করা
- ▁সেটা
- ▁ধিত
- দল
- লিক
- ▁টল
- ▁রোস
- ▁জেনি
- '60'
- ▁তাকান
- ▁যাং
- ▁পাতা
- ▁ো
- ▁পরিক
- ▁একবারে
- ▁কথোপকথনে
- ▁সমতা
- ▁ইউরোপে
- ▁দির
- হো
- শু
- ▁রিডে
- িদর
- ▁জৈব
- ▁জাদু
- ▁যালো
- ▁উৎ
- '15'
- টল
- ▁সুই
- ▁চত
- াবধানে
- ▁অনুমোদ
- ▁এখান
- ▁কিশোর
- ালোচনা
- িছু
- ▁কাগজে
- ▁তরল
- ▁বিরত
- ▁সমীক
- ▁রামক
- ▁অংশীদার
- বাজ
- ▁খামার
- বেদন
- ▁01
- ▁ধাঁধা
- ▁যাথোজেন
- ৫
- ৭
- ▁আনুমানিক
- ▁কমিউনিটি
- ▁করোনাভাইরাস
- ▁চাবিকাঠি
- ▁জরুরি
- ▁তঃসংয
- ▁তাভাবনা
- ▁নকশা
- ▁সহানুভূতি
- ▁অভিনেতা
- ▁ওভাররাইড
- ▁মামালেক
- ▁যামিগডালা
- ▁হতবাক
- ▁পুঁজিবাদ
- ▁মেঝে
- ▁বপুরুষ
- ▁জেগোটা
- ▁1970
- কাহিনী
- ▁বিবৃতি
- ▁বিরোধিতা
- ▁আইনজীবী
- ▁মচারী
- ▁থাপিত
- ▁ঞাপন
- ▁লেবেল
- ▁মামলা
- ▁কোলাহল
- ▁রচারণা
- ▁সোলার
- '99'
- ▁14
- ▁দোলন
- ▁গিগা
- ▁ভীক
- ▁ঘটবে
- ▁আপাত
- ▁ফেলেছিল
- ▁লাগবে
- ▁দেখছেন
- ▁যালসাই
- '35'
- ▁উপভ
- ▁বরাবর
- ▁ঘটেছে
- ▁ভেবেছিলেন
- লিভার
- ▁পেরেছিলাম
- ▁নিউরন
- ▁আমূল
- ▁ইরানে
- ▁সমতল
- ▁ওভার
- ▁আদেশ
- ▁কাঁটা
- ▁ধারনা
- ▁যুবক
- ▁এসেছিলেন
- ▁তানুকি
- ▁খামারগুলি
- ▁ণালী
- োফা
- ▁দুজন
- ▁ছুট
- ▁চৌ
- ▁সিরিজ
- ▁বলেছিলেন
- ▁উপক
- ধকতা
- ▁খুঁজছেন
- ▁জস
- ▁সচেতন
- ▁করছিলেন
- ▁লিটার
- ▁পিটার
- ▁রথা
- ▁ষমা
- ▁নথি
- ▁টোট
- ▁জামগুলি
- ▁কাগজ
- ▁তকরণ
- াবলী
- ▁পেশীগুলি
- ▁ঋণী
- ▁বছরগুলিতে
- ▁কেপ
- ▁নেহ
- ▁সেবা
- ▁তুলো
- সাঁ
- ▁অভিবাসী
- ▁পৌঁছেছে
- ▁চারণ
- ▁হেড
- ▁উঠে
- ▁যাডি
- ▁রাইভার
- ▁বেনি
- ▁আইল
- ▁সৃজনশীলতা
- ুমি
- ▁কোরবা
- ▁পারব
- চিং
- ▁চলেছেন
- ▁জীবনযা
- বসতি
- ▁রিফ
- ▁ওঠেন
- ▁ছবিটি
- ▁টাফ
- ▁সভা
- ▁ঘাম
- জগতে
- ▁রঙগুলি
- ▁বাই
- ▁তাৎ
- ▁পানী
- ▁শুনি
- শে
- ▁টেট
- ▁কারখানার
- ▁থাকবেন
- ▁যানগত
- াইরে
- ▁দো
- ▁কাঁ
- ▁সজ
- ▁থাংশ
- তীত
- ▁জেনিস
- ▁মি
- সিস
- ▁তাকালে
- োত
- পার
- ▁মোহ
- ▁পিট
- ▁টাপো
- গান
- ▁জিও
- ▁যাদা
- ▁হাম
- ▁মানিত
- ▁পাচার
- ▁সাহসী
- ▁মানগুলি
- '16'
- ুনির
- ▁ফটোগ
- ▁টাইম
- ▁পৃ
- ▁বংশ
- ▁রাণু
- ▁লট
- ▁মৃতি
- অপস
- ▁27
- '23'
- টে
- হারে
- নুপাত
- ▁শট
- ▁ফেলা
- ▁পশু
- ▁গেছেন
- ▁জারি
- ▁রমিত
- ▁রোতা
- টিং
- ▁জেনারেল
- ▁সৎ
- ▁লেন
- ▁বাগত
- ▁রমণকারী
- ▁চিতভাবে
- ▁বাসা
- ▁মডেলগুলি
- ▁টেন
- ▁গুর
- াগুলি
- দেবী
- ▁রোড
- দাতা
- ▁পরিবারগুলি
- ▁টানা
- লগ
- ▁রিটাউনে
- কিলোমিটার
- ▁রতা
- লাভ
- বৈ
- ▁কাম
- কন
- ▁বাব
- ▁সুবিধাগুলি
- ▁কগুলি
- ▁থীর
- ▁বিকভাবে
- রিশ
- ▁বই
- লিস
- ▁নগ
- দেশ
- ▁যৎ
- ▁দূরব
- ▁রাইভে
- ▁শিলা
- ▁চুরি
- মোন
- ▁অতীতে
- ▁সির
- ▁দেখাতে
- ▁হাব
- ▁কেলে
- সোস
- ▁ডাকে
- ▁আলোকব
- ▁তান
- ▁ামি
- টক
- ▁দানি
- ▁ডগুলি
- ▁পেরে
- ▁কেনা
- ▁ষণিক
- ▁কুশে
- টার
- ▁তৃপ
- ▁নেন
- ▁চাপা
- ভা
- দান
- ▁বিধা
- ▁যাকেজ
- েলে
- ▁গোল
- গন
- পরি
- ▁যাসে
- ছিলেন
- ▁চালান
- ▁নতা
- ▁যাশন
- ▁নাল
- ▁কোপটি
- িবাসী
- বশ
- িরোধী
- ▁অনুগ
- সিলি
- মত
- ▁মুন
- ▁ঞানে
- কালে
- ▁চিল
- েছিল
- ▁পরিত
- ▁যথা
- ▁যাকর
- োট
- ইনস
- ▁মিলে
- তঃ
- ▁সিএ
- ▁েলস
- শেষে
- ▁লোম
- জা
- ▁দেরি
- ▁রল
- টেক
- ▁সাহস
- ▁এইচ
- ▁মনো
- ▁রেরণ
- ▁পালা
- নিক
- ▁বাঁকা
- ছুক
- াইট
- ▁ফর
- ▁আটক
- ▁দটি
- ▁রাফ
- ▁মিস
- ▁ধা
- ▁পরিবারে
- ▁উঠত
- নুষ
- োম
- োদ
- খানার
- ▁অশ
- িরে
- বিত
- ভিল
- ▁ধুত
- ▁পাব
- ▁রেখেছি
- িটা
- ৈ
- াগন
- ▁কামান
- টাস
- ▁কারখানা
- ▁ধানে
- ▁দিত
- ▁অপরাধ
- ভি
- ালী
- রিকা
- ▁20000
- ▁সংঘ
- ▁সৃজনশীল
- '18'
- ▁অভিবাস
- ▁বলব
- ▁ধারক
- খানা
- রাধিকার
- ▁থাকব
- ▁লিখ
- ▁অমরজ
- ▁রপাত
- ▁উঠবে
- ▁রোমা
- াষী
- ▁দেখেছে
- ▁ডিশনার
- ▁াসে
- ▁নীত
- াগারে
- াফা
- ▁160
- জির
- াব
- '87'
- ▁ইনজেক
- ▁গোলকধাঁধা
- C
- L
- r
- ▁ইঁদুর
- ▁ইউটিলিটি
- ▁ইমিউনোথেরাপি
- ▁এলিভেটর
- ▁কাদামাটি
- ▁কৌতূহল
- ▁চিরতরে
- ▁ধারাবাহিক
- ▁মিসৌরি
- ▁রচারাভিযান
- ▁রাজকুমার
- ▁রেনেসাঁ
- ▁শিথিল
- ▁ষরেখা
- ▁হাসপাতাল
- ▁অবজারভেটরি
- ▁পরিকাঠামো
- ▁ররেখা
- ▁তলদেশে
- ▁শৈল
- ▁মদপুর
- ▁ওলাফ
- ▁গতিশীলতা
- ▁সাসপেন
- ▁ঘেটো
- ▁সংহতি
- ▁আইটেম
- ▁মেরামত
- ▁মৃদু
- ঁচট
- ▁96
- ▁রজেকশন
- ▁কংগ
- ▁রাচীর
- ▁রাজনীতিবিদ
- ▁সমালোচনামূলক
- ঘাট
- ▁রাখুন
- ▁উপনিবেশ
- ▁হিম
- ▁অনুকরণ
- ▁রামবাসী
- ▁দেশিকা
- টেইনার
- ▁ডেনিম
- ▁সাজানো
- রফেস
- ▁ষপাত
- ▁সাগর
- ▁পারতাম
- ▁মোতা
- ▁জিনোম
- ▁2019
- ▁এনেছিল
- ▁লুকানো
- িউএ
- ▁অভিজাত
- ▁রিটিশ
- ▁গুণমান
- ▁অভিনব
- ▁পরিপূরক
- ▁টগুলি
- ▁ষাপটে
- ▁রিলিফ
- ▁টানেল
- ▁জেগ
- ▁সুপার
- কটের
- ▁বৈধ
- ▁সেথেস
- ▁কাঁপ
- ▁জটিলতা
- ▁ফোরণ
- ▁টুকরা
- ▁ভরশীল
- ▁শদাতা
- ▁বালতি
- ▁পালক
- লিথি
- ▁ধরন
- ▁পেশা
- ▁পরিণতিগুলি
- ▁বাগান
- ▁মনোভাব
- ▁অনলাইন
- ▁থাপক
- ▁বলেছে
- ▁সেটিং
- ▁ডিফ
- ▁চোরা
- ▁ভিড
- ▁দেখেছেন
- ▁বোঝানো
- ▁শকুন
- ▁থাপকতা
- রবী
- লানিক
- ▁নীতিগত
- ▁করেননি
- ▁বিভাগে
- ▁দিকটি
- ামী
- ▁ওঠা
- িসিসি
- ▁তাকাতে
- ▁বলেছেন
- ▁পিতৃ
- ▁ফেট
- ▁পাঠক
- নাতক
- ▁দাগ
- ▁পারিনি
- ▁চেতনা
- ▁কফি
- ▁পাঠান
- ▁অবসান
- রোধে
- ▁রতিবার
- ▁মুদি
- ▁মূলধারার
- ▁বাতি
- ▁রাগন
- ▁গাম
- াবস
- ▁শনগুলি
- পোলি
- ▁বাধীনতা
- ▁ভাস
- ▁রাণীগুলি
- ▁আইস
- ▁কিছুর
- ▁জানতেন
- ▁জানু
- ▁রামগুলি
- ▁লোহ
- ▁কেজি
- ▁সাব
- ▁রাইট
- াচল
- ▁ইট
- ▁ছাপ
- বৃ
- ▁বিপদ
- সিভ
- ▁কলে
- ▁অসহ
- ▁টেরল
- ▁খাই
- ▁রমিকরা
- আইভ
- ▁উপাদানটি
- ▁মহামারীটি
- ▁যালোকে
- ▁সমাধানগুলি
- ▁যি
- ▁থিতিশীলতা
- ▁ওটা
- ▁রেখেছে
- ▁আদালতে
- ▁রোচ
- ▁গণ
- ▁দলে
- ভিয
- ▁উপহা
- ডেট
- ▁খালটি
- সুবিধাজনক
- ▁মগ
- ▁লালন
- ▁কণা
- ▁নিষেধ
- ▁১
- েলাই
- াবল
- ▁চেক
- ▁নই
- ▁অভিন
- ▁টেমে
- ▁ভট
- োন
- ▁গভীরতা
- ▁ষণগুলি
- ▁সারি
- ▁বরে
- ▁ধেকের
- ▁যাসী
- ▁দিরে
- ▁দৈন
- কড
- ঁ
- মাদ
- ▁টরের
- ▁কারো
- ▁গী
- ▁ফু
- ▁রাজারা
- জেনি
- কো
- ▁বীপগুলি
- ▁কণ
- ▁বাঁক
- ▁পিতামাতা
- ঠিত
- ▁সবাইকে
- ▁থির
- ▁মিনি
- বাহ
- ▁বাসী
- ▁তনগুলি
- ডো
- ▁থাপনা
- রো
- ▁াটি
- ▁রীর
- ▁নেবে
- ▁বুজে
- ▁রীন
- লুস
- রিটি
- নোর
- ▁500
- ▁এলাকাগুলি
- ▁উই
- ▁রোটিনটি
- তাকা
- ঠ
- শনে
- ▁360
- ▁বনে
- ▁সুয
- ▁ফিউ
- বুন
- ▁13
- ▁সাইটে
- শনার
- লাঙ
- টান
- ▁খোঁজ
- ▁ডাল
- ▁কপি
- ▁তুকি
- ▁ধাত
- জাত
- বেচ
- ▁হব
- ▁ইতালি
- োশ
- ▁জুম
- কক
- রুন
- মূল
- ▁মেইন
- ▁েলসে
- পথগুলি
- নিম
- লজি
- ▁টক
- হারা
- ▁দিই
- ▁দোকানে
- পিং
- সাধ
- চালান
- ▁রতিরোধে
- পেস
- '37'
- ▁নিল
- ▁খুলি
- গল
- ধান
- ▁ফের
- ▁জগুলি
- ▁বেলা
- পথ
- ▁কনস
- ▁শেল
- বিল
- ▁নেভিগে
- ▁জাগ
- জাতিক
- উ
- ▁রবাহে
- ুলে
- ফোন
- আপ
- তারা
- ▁অফিস
- ▁পশম
- ▁যুগে
- ▁যাটিন
- ▁ততটা
- লভ
- ▁মহাদেশে
- বো
- েমের
- ▁উৎসে
- ারবার
- ▁কমলা
- পাল
- ▁চলছ
- ভেন
- লিম
- মুন
- ▁202
- সেপ
- দানি
- মেলা
- ▁লিং
- িবার
- ▁সাইট
- ▁কনসা
- ঝর
- িকেল
- াশি
- ঝ
- ▁জানান
- ▁রমাণবাদ
- নেস
- শহ
- ▁নাচ
- ▁যাব
- ফেরা
- ▁124
- ▁পতন
- '12'
- ▁ভরা
- ▁ঘরে
- ▁বাম
- ▁লিক
- লানো
- ▁বী
- খা
- গোল
- ▁রতার
- ▁টেমটি
- '44'
- ▁জেনারে
- ▁রাশি
- ▁ভূমিক
- থি
- ▁ভাষ
- ▁ঝর
- ▁সুদ
- বাসী
- োজা
- ▁হতাশ
- লিং
- ▁চিনি
- হর
- ▁পারলে
- সাইক
- ▁196
- ▁সবা
- ▁ফুলে
- ▁আচরণে
- ভিউ
- হাই
- মদা
- '56'
- ▁তিরা
- ▁ষেপে
- ▁ধারে
- ▁নাইজ
- ▁300
- ▁অনুর
- ামেলা
- ▁মিউ
- ▁দেখ
- ▁থাম
- ▁অভিযোজ
- ▁হাঁটা
- মিক
- শাপ
- ানা
- ▁যাকটি
- ▁রবাল
- ▁বিতর
- কিউ
- ▁সিট
- ধীন
- ▁150
- ঁজ
- ▁গীত
- ▁থাকত
- াঁচে
- '600'
- ▁শুনেছে
- ▁ফসফোলিপিড
- ▁বাঁধ
- ▁বীজ
- কূল
- ▁খুঁজছে
- ▁রাজনীতি
- ▁রজেক
- ৯
- m
- u
- ğ
- ▁অববাহিকা
- ▁এনজাইম
- ▁এলিজাবেথ
- ▁কাটলফিশ
- ▁কূটনীতি
- ▁গিলগামেশ
- ▁টিরিওটাইপ
- ▁নৌবাহিনী
- ▁ফাংশন
- ▁ফারেনহাইট
- ▁বাংলাদেশ
- ▁ভলিউম
- ▁মসৃণ
- ▁মোকাবিলা
- ▁যসাগর
- ▁যাভিগেশন
- ▁যালগরিদম
- ▁রাঘিমাংশ
- ▁সমঝোতা
- ▁সালতানাত
- ▁সোককেলেটন
- ▁একাডেম
- ▁দেহভাজন
- ▁বংশধর
- ▁মহাকাশচারী
- ▁রজাপতি
- ▁হেঁটে
- ▁এমারসন
- ▁ছাসেবক
- ▁তোরাঁ
- ▁ধবিরতি
- ▁বিনোদন
- ▁রুসেডার
- ▁াশোনা
- ▁রণেতাদের
- ▁লাপনা
- দারুণ
- ▁যযুগ
- ১৯
- ▁নৃশংস
- ▁গৃহীত
- ▁সিনেমা
- ▁নেবুলা
- ▁ইমাল
- ▁শাটার
- ▁মহাকাশযান
- ▁পিঠ
- ▁থাকুন
- ▁ভালোবাস
- ▁লেপটিন
- ▁সহযোগী
- ▁পটভূমি
- ▁অবাধ
- ▁দুঃখজনক
- ▁ঢেউ
- ▁অসীম
- '97'
- ▁উপযোগবাদী
- ▁অতিথি
- ▁একেবারে
- ▁াবেটিস
- ▁কভারেজ
- ▁জোরালো
- ▁মশলা
- ▁শেঠ
- '94'
- ▁লেগেছিল
- '95'
- পোষণ
- ▁হিপ
- ▁তশাসন
- ▁টিপাত
- ▁হাজি
- ▁রবিন
- ▁যাটিপাস
- ▁টারনেট
- ▁1930
- ▁মিছিল
- ▁মাঠ
- ▁অটোম
- ▁লিখেছ
- ▁দেখছিলেন
- ▁হিংস
- ▁তৃণ
- '98'
- ▁মোনা
- ▁াংখী
- ▁উঠছে
- ▁আইকন
- ▁ফেলুন
- ভাটা
- লিডার
- ▁পিউট
- ▁যোগদান
- ▁ফীতি
- ▁মিটিং
- ▁বোমা
- ▁রাইবো
- ▁রণালী
- ▁টোরে
- ▁রতিকূল
- ডিপি
- ▁লোরেন
- ▁টারবাইন
- ▁টিবডিগুলি
- ▁ঢিবি
- ▁নোঙ
- ▁ছাদন
- ▁হেসে
- ▁বিভাজ
- ▁গুজরাট
- ▁োএ
- ▁120
- ▁খুনি
- োলেট
- ▁এসি
- ▁55
- ▁ডিজে
- ▁সিকো
- ▁ভেলা
- ▁সাইটগুলি
- ▁যাকচার
- ▁কণাগুলি
- ▁মতামত
- ▁কারখানাগুলি
- ▁ফুটপ
- ▁রাখছেন
- ▁শোনে
- ▁ষতিকর
- ▁ছাকৃত
- ▁শহরগুলো
- ▁াকরণ
- ▁যাদুঘর
- ▁সাগু
- ▁কেলিং
- ুথ
- োনাইজ
- ▁রগামী
- ▁যাসীদের
- ▁ভীত
- ▁রচলন
- ালো
- ▁টিপস
- ▁মৌ
- ▁যাফো
- ▁উঠবেন
- ▁সংবাদ
- ▁কাঁচ
- ▁চালনা
- ▁রেজার
- ▁রাসাদ
- ▁উপকরণগুলি
- ▁এগুলো
- ▁নীতিগ
- ▁0
- ▁নিকট
- ▁টেরিওটাইপগুলি
- ▁ফোরক
- ▁টোন
- ▁খনিজ
- ▁অবনতি
- ▁বনভূমি
- ▁যাটারিগুলি
- গাল
- ▁ডারগ
- ▁লুপগুলি
- ▁লজ
- ▁রনগুলি
- কিশোরী
- ▁ছেলেদের
- ভাষী
- ▁ডিপ
- ▁জুনাজপুস
- ▁গোলা
- ▁গভ
- ▁অধিক
- ▁মাইলের
- ▁কুই
- ▁সমালোচনা
- ▁যাফোস
- ▁অধিকারী
- ▁যবোধ
- ▁ধারকরা
- বিধি
- ▁ইকো
- ▁রিটেন
- ুভ
- ▁উপযোগ
- ▁নভ
- ▁ঠীগুলি
- ▁ঘটনাগুলি
- ▁মাংস
- ▁বাদাম
- োচন
- ▁লেব
- ▁বলছেন
- ▁চুষ
- ▁ঠানগুলি
- ▁শাক
- ▁কোঁ
- ▁বাভাবিকভাবে
- নুকি
- ▁লাইড
- িবিটি
- ▁যবসাগুলি
- িকে
- ▁যুগুলি
- ▁টিপ
- ▁রেফ
- ▁কাটে
- োলজি
- ঘর
- ▁টিমাই
- ▁গজা
- ▁সুযোগগুলি
- ▁বাজি
- ▁বিজি
- নেকের
- ীমা
- গুঁ
- ▁যাকরণ
- ▁গুন
- ▁বাঘ
- ▁দেহে
- সা
- '79'
- ▁যেকটি
- ▁টারে
- সিফ
- ▁লেপ
- ▁শুনেছিল
- ▁শেড
- ▁সুইড
- ▁াটে
- ▁কলাম
- ▁তেমন
- ▁ামে
- বাইক
- ▁ঢালা
- ▁মুখীতা
- ▁শিশুরা
- ▁বরফ
- ধারা
- ▁পৌ
- ▁কোল
- ▁তালা
- ▁লিন
- ▁খালে
- ুলেট
- ▁টিভি
- ▁রিম
- ▁সেনে
- ▁থামা
- ▁মিটারের
- ▁আসি
- ▁টুল
- ▁ভেজ
- ▁লাশ
- ▁রাগ
- ামাল
- টারের
- ▁রিজটি
- ▁দোর
- ▁যাসটি
- টকে
- ▁চালাবে
- ফিস
- ▁সাজ
- ▁যুব
- েবল
- ▁দিলে
- সিন
- ▁অজ
- ▁শা
- ▁টেজ
- ▁শতাংশে
- ▁ডু
- িজম
- জমে
- সাদ
- ▁অবা
- ▁পুরুষকে
- হাঁ
- ▁লুকো
- ▁মেঘে
- জান
- বক
- ▁যুতি
- ▁শতক
- ▁জিম
- রাণি
- ▁যানু
- সো
- ▁মিলন
- ▁চাইবে
- কৃতির
- ▁রোভ
- ▁মাইল
- '30'
- ▁পরিষেবাগুলি
- ▁আমানি
- ▁ছামত
- '500'
- বোল
- ▁ছবিগুলি
- ▁অরি
- ালি
- ▁নিই
- ▁তেলাপোকার
- কারে
- ▁রামে
- ▁সূচ
- ▁ারো
- ▁যাসি
- ▁টেলিভিশনে
- বুক
- টস
- ▁দেখান
- ুসং
- কু
- ▁আদি
- ণের
- িটাল
- ▁মরি
- রীদের
- বিচ
- ▁ধিম
- ▁রিটে
- ▁চাচা
- ▁গানে
- ▁শিবিরে
- টেন
- ▁দুঃ
- ▁টিকেলে
- ▁কেনে
- '000'
- ▁যুগ
- াশা
- '48'
- ▁কুর
- শান
- জিতে
- ▁খেলে
- ▁পরম
- পির
- ▁আঁ
- ভাব
- ানু
- ▁মাতৃ
- পশম
- ▁ষাত
- াণ
- ৃপ
- ▁চো
- কাঠি
- লন
- টারি
- ফল
- করণ
- টন
- ▁অতীত
- াইজার
- আর
- ▁ঝুলে
- িওল
- খোঁজ
- বোধ
- ▁গাগুলি
- ▁পেল
- বেশি
- ঘুরি
- কী
- ▁যাটা
- 08
- িব
- িৎ
- চিব
- '19'
- লাইট
- নৈতিক
- শুদ
- শম
- ▁সরকারে
- গভীর
- রোটিন
- '80'
- লেট
- ভাষা
- নাইজ
- হাত
- অপ
- ধারণ
- জানা
- ▁ঘটান
- অ
- ▁193
- কাজ
- ▁শুনেছি
- জুন
- িউ
- ▁নদ
- চুরি
- হেল
- ▁শেখান
- দি
- ঁকি
- ▁আসাদ
- লোভন
- ▁রিভে
- োগান
- নিউ
- ▁পৌঁছ
- াগ
- ▁াপথ
- ▁শোক
- ফেল
- মাণ
- ঘন
- তাই
- ▁ভুগছ
- ▁তৃ
- ▁বুঝি
- ▁দেখছি
- বসে
- ▁উঠল
- ▁টিম
- ▁180
- ▁জলা
- চা
- ▁লেগ
- ডিএ
- মাই
- ফিউ
- রিসে
- ▁পারমা
- ▁বেষ
- ▁মিলনে
- ▁110
- াংশের
- েটিক
- ▁800
- জিশন
- ▁ধারণে
- ▁তোম
- োনে
- ▁বলত
- ▁রাচ
- ▁বেগে
- ালদে
- ▁শুন
- ▁যারো
- ▁3000
- ▁1500
- ডেন
- ▁মূলধারা
- সিকতা
- ▁ছু
- ▁তাঁ
- ▁খোঁ
- ▁ভাবি
- ▁জুনাজপু
- ▁চালাব
- ▁পাথ
- গণিত
- ▁থেরাপিউটিক
- ▁মেক
- ▁ইংরেজ
- হীনতা
- ▁সেখান
- াহু
- ▁ফুটে
- হাউ
- ▁একগু
- ▁রাখছে
- ▁চমক
- ▁টিবডি
- ▁রাউ
- ৌরব
- ৎসাহ
- ভাসক
- ▁এসমেরাল
- e
- i
- ঊ
- ৬
- ▁1988
- ▁1990
- ▁অবৈধ
- ▁আকসুম
- ▁আজারবাইজান
- ▁ইসমাইল
- ▁কৌতুক
- ▁জরিমানা
- ▁তকণিকা
- ▁দাবানল
- ▁নিবেদিত
- ▁ফিলিপাইনে
- ▁যাবরেটরি
- ▁শৈবাল
- ▁সাবমেরিন
- ▁সিংহভাগ
- ▁সিংহাসনে
- ▁হাইপোথিসিস
- ▁ঘৃণ
- ▁ণযুগ
- ▁কোঅপারেটিভ
- ▁ঘেরলিন
- ▁জেলালেম
- ▁ঠপোষকতা
- ▁বিছানা
- ▁যাচমেকার
- ▁রাজবংশ
- ▁শীতাতপ
- ▁শোধন
- ▁সিকিউটিভ
- ▁হোমোরোটি
- ঘাঁট
- ▁বিলাসিতা
- ▁লেনদেন
- ▁ফোঁটা
- ▁ভালবাসে
- ▁ভূমিধস
- ▁ডেলিভারি
- ▁কমিউনিকে
- ▁এমবেড
- ▁ইউএস
- ▁ঝাঁঝ
- ▁সপোজার
- েমাট
- ▁উপসংহার
- ▁পিনবল
- ▁টাইফুন
- লিউশন
- ▁রবিবার
- ▁লেডগুলি
- ▁লুমিরা
- ▁চিবানো
- ▁রেগারি
- ▁টাইটান
- ▁কিনেছিলেন
- ▁কেরাটিন
- ▁লাজুক
- ▁শুনুন
- ▁সুসংবাদ
- ▁পহেড
- ▁মানবজাতি
- ▁মৌসুম
- ▁রবাদ
- ▁বদলানো
- এইচও
- ▁খল
- ▁রেণি
- ▁মীবাহিনী
- ▁ইরানী
- কোভ
- ▁মিলিমিটার
- ▁রসারণ
- ▁পরিহাস
- ▁রতারণা
- ▁টেসলা
- ▁014
- ▁খোসা
- ▁3500
- ▁ঘনমিটার
- বিধান
- ▁নিউটন
- ▁নেভিগেশন
- ▁গুণফল
- ▁খাঁ
- ▁কেলটন
- রিডিস
- ▁কনভেন
- ▁টেরিও
- থু
- ▁1450
- ▁টোবর
- ▁188
- ▁1980
- ▁কুকুর
- ▁পরিধি
- ▁দুঃখ
- ▁185
- ▁চাবিটি
- ▁লোরিড
- ▁1940
- ▁ধরবেন
- ▁নিঃশ
- ▁ঝাপ
- ▁তপাত
- ▁গীকার
- ▁শহরবাসী
- ▁ফসিল
- ▁যুভর
- ▁টলেশন
- ▁শুনিনি
- ▁যানজট
- ▁ডেভি
- ▁লেগেছে
- ▁জেলা
- ▁ঘটছিল
- ▁রানজিট
- ▁187
- ▁রণোদ
- ▁33
- ▁াবহ
- ▁গেছি
- '05'
- ▁খেলেছে
- ▁জিরো
- ▁ঝরনা
- ▁উপদেশ
- ▁38
- ▁াংখি
- ▁সারাদিন
- ▁শিম
- ▁আগামীকাল
- ▁বেআইনি
- ▁শিখেছে
- সিল
- ▁বাজানো
- ▁লাগছে
- ▁পালগুলি
- ▁লিউ
- ▁পাননি
- মিশনে
- ▁126
- ▁টিথার
- ▁ডোবা
- ▁বিরাজ
- এনবি
- ▁রোথ
- ▁বলছিলেন
- োনাল
- ▁যাংকিং
- চুপ
- ▁রোপড
- ▁টাইমলাইনে
- ▁যাকট
- ▁বাঁধা
- ▁যোনি
- ▁বোনা
- ▁করোন
- াকাশে
- ▁জেনেনা
- ▁ফসফোলিপিডগুলি
- ▁ওভারহ
- লেক
- ▁এলো
- ▁পিকাবু
- ▁আইনগত
- ▁তনালী
- সোম
- ▁উপকূলরেখাকে
- ▁তেরো
- ▁ফেরি
- '89'
- ▁রতিবেদন
- ▁অনুপাতে
- ▁থিম
- ▁ফলিকল
- ▁নলিখিত
- বিটি
- ▁ডিশনারগুলি
- ▁সহজাত
- ▁গুদাম
- ▁কারাগারে
- ▁গেলাম
- ▁হোমো
- ▁ফোটা
- ▁মানজনক
- ▁ঝু
- ▁অবকা
- ▁পেলেন
- ▁ফিনা
- ঃস
- ▁ঠাতা
- ▁লবণ
- ▁বিলাস
- ▁তিনজন
- ▁রশমন
- লিসা
- ▁পরিপূর
- ▁কিউবা
- ▁মিকা
- বদলে
- ▁জেনো
- পসাগর
- ▁বেসরকার
- ▁সুপ
- ▁যুইটি
- ▁চাইনি
- ▁ধিমূলক
- টিউ
- ▁ফাটল
- ▁সেলগুলি
- িওপি
- ▁নজির
- ▁হামলা
- ▁পুরু
- ▁অমরজত
- ▁তরণটি
- ▁করলাম
- ▁কখনো
- ▁মশালটি
- ▁গকগুলি
- ▁দিকগুলি
- ▁গমনকারী
- ▁দেখাবে
- ▁চাইলে
- নেভি
- ▁সাপগুলি
- ▁নোট
- ▁যানবাহনগুলি
- ▁সোমা
- ▁দেখেনি
- ▁োগকারীদের
- ▁রাইলোবাইটগুলি
- ▁ষণশীল
- ▁সেতুগুলি
- ▁বিবেক
- ▁খোঁজে
- ▁দেশগুলো
- ▁তারকা
- রীস
- ▁ডফিল
- ▁নাগাল
- ▁বোনাইজ
- ▁থেরাপিউটিকস
- ▁জিগ
- ▁যাপট
- ▁যৌগ
- ▁রুপার
- ▁রচল
- ▁যারিস
- ▁সহনশীল
- ▁বিনা
- াখা
- ▁যহীনতা
- ▁ভিজি
- ▁আঠা
- ▁ফাইন
- ▁ডুব
- ▁বইটি
- ▁সংযোগগুলি
- ▁রাফট
- ▁রবালের
- ▁ফে
- াসী
- সূরি
- সেছিলেন
- ▁যাসেল
- ▁গাইড
- ▁তাঁর
- ▁রোট
- ▁পনগুলি
- ▁গীতি
- ▁ধৃত
- োবা
- ▁বাবধান
- ▁সারিতে
- নামূল
- কভাবে
- ▁পৌঁছান
- লিখিত
- ▁তূপ
- ▁শিকারি
- ▁যথাস
- মেজ
- ীকৃত
- নাতনি
- ▁টরে
- ুখী
- চেতন
- ▁যাবলে
- ▁ধারণাগুলি
- ▁জীবগুলি
- ▁কাজিন
- ▁560
- হেলম
- ধমনী
- ▁করুণা
- ▁করেছ
- আ
- ১
- '%'
- ':'
- c
- h
- l
- n
- z
- ü
- ২
- ৪
- ৩
- ঔ
- ঈ
- Ö
- U
- ঋ
- ঐ
- '?'
- O
- ।
- ়
- <sos/eos>
src_token_list:
- <blank>
- <unk>
- s
- ▁the
- ▁to
- ▁and
- ▁of
- ▁a
- ing
- ▁in
- ed
- ▁that
- ▁is
- ▁
- ly
- ▁we
- ▁it
- ▁you
- d
- ▁this
- ▁for
- ▁on
- e
- ▁be
- ▁but
- ▁with
- ▁so
- ▁are
- ▁i
- ▁have
- ▁can
- ▁they
- y
- ▁was
- ▁as
- ▁its
- ▁from
- ▁their
- ▁at
- ▁what
- ▁by
- ▁one
- ▁our
- ▁an
- ▁or
- ▁al
- ▁like
- ▁more
- ▁when
- ▁your
- ▁were
- ▁some
- ▁all
- ▁not
- ▁these
- n
- ▁people
- t
- ▁how
- ▁if
- ▁about
- to
- ▁there
- ▁do
- ▁would
- ▁up
- ▁her
- al
- nt
- ▁which
- ▁just
- er
- ▁them
- ▁us
- ▁he
- ▁other
- ▁now
- es
- ▁know
- ▁has
- ▁s
- ▁make
- ▁my
- ▁most
- ▁because
- ▁could
- ▁world
- ▁than
- ▁work
- ▁need
- ▁will
- ▁had
- ▁out
- a
- r
- so
- ▁over
- ▁who
- o
- ▁time
- ▁new
- ▁even
- ▁get
- ve
- many
- ▁well
- ▁she
- ▁no
- ▁energy
- en
- ion
- ▁where
- ▁climate
- ▁take
- st
- ▁change
- ▁me
- ▁power
- ▁un
- ▁two
- ▁first
- ▁help
- ▁look
- ▁way
- le
- ▁dont
- ▁years
- ▁re
- ▁here
- ▁call
- ▁think
- p
- ▁m
- in
- ▁carbon
- able
- c
- ▁much
- ▁those
- ever
- ▁been
- ▁use
- ▁right
- ▁co
- m
- thing
- ▁go
- ▁want
- ▁through
- l
- ▁see
- ▁same
- ive
- i
- ▁may
- ▁really
- ers
- ▁going
- ▁every
- ▁then
- ▁cells
- ▁life
- ▁his
- ▁different
- ▁should
- out
- u
- ▁around
- ▁im
- ▁things
- ight
- re
- g
- ▁still
- ▁sleep
- ▁any
- ▁say
- 'on'
- ▁high
- ▁while
- ▁back
- ▁long
- each
- ous
- ▁light
- ▁b
- ▁let
- ▁does
- ▁down
- th
- ▁own
- ies
- ▁three
- ▁start
- ▁countries
- ▁before
- ▁made
- ▁show
- ▁system
- ▁come
- very
- ▁large
- ic
- ▁tell
- yre
- ▁percent
- ▁global
- ▁india
- ▁after
- ▁water
- ▁n
- ▁often
- ▁off
- ▁again
- ▁part
- ll
- ▁ma
- ▁company
- ▁country
- twe
- ▁being
- k
- ▁next
- ▁used
- ▁point
- ▁de
- ▁today
- ▁hear
- ready
- ▁day
- ▁actually
- ▁try
- ▁year
- ▁lot
- ▁less
- ar
- ting
- ▁ca
- ▁answer
- ▁keep
- ▁develop
- ▁another
- ment
- ▁end
- ▁d
- ation
- ce
- b
- ▁per
- us
- ▁grow
- ▁enough
- ▁why
- ▁live
- ful
- or
- ▁g
- ▁number
- ▁place
- ▁give
- ▁few
- ▁t
- ▁19
- ▁20
- '2'
- ▁create
- ▁important
- ▁believe
- ▁single
- ▁cities
- ▁10
- uch
- ▁move
- ▁example
- ▁fossil
- ▁mrna
- ▁better
- ther
- lar
- ▁fact
- ▁p
- ▁find
- ▁city
- ▁did
- ▁mean
- ▁emissions
- is
- an
- ur
- w
- ▁research
- ▁kind
- ▁government
- ▁allow
- ▁case
- ▁person
- ▁small
- ▁green
- ▁across
- ▁him
- ▁second
- ▁home
- ▁human
- ▁night
- ▁thank
- ▁air
- ▁best
- ▁future
- ▁space
- ▁together
- ▁become
- ince
- ▁both
- ▁must
- ▁cant
- ▁dur
- ▁seem
- ▁side
- h
- ▁man
- one
- ▁body
- ▁car
- ▁problem
- ate
- ity
- ▁too
- ▁under
- ▁final
- ▁play
- ▁build
- ary
- ▁form
- ▁set
- ▁e
- ▁money
- ▁good
- ▁five
- ▁follow
- ▁last
- ▁economy
- ry
- est
- ▁pay
- ite
- ▁plants
- ▁bas
- be
- ▁doing
- ▁companies
- ▁entire
- ▁great
- ▁yet
- ▁far
- ▁everyone
- ▁story
- ▁times
- ▁away
- ▁ra
- ▁f
- ty
- ▁stop
- most
- ▁force
- ▁scale
- ▁impact
- less
- ▁mass
- ▁sp
- ▁once
- ▁found
- ▁ear
- ▁hair
- ▁possible
- ▁sound
- ▁electric
- ▁states
- it
- ▁red
- ▁produce
- ▁million
- ▁feel
- ▁put
- ▁planet
- ▁means
- ▁later
- ▁o
- ▁value
- ▁telescope
- ▁result
- el
- ▁ask
- ▁cost
- '00'
- ▁turn
- ▁health
- ▁industry
- ▁potential
- ▁clear
- ▁consider
- ▁gold
- ▁didnt
- ▁heart
- ▁brain
- v
- ▁lead
- ▁thing
- ▁plant
- making
- ▁working
- ▁process
- ▁technology
- ▁name
- ▁electricity
- ▁sure
- ally
- x
- age
- ▁15
- ble
- ness
- ▁ever
- ▁mo
- ▁family
- ▁earth
- ▁rule
- ▁r
- ▁hand
- ▁god
- ▁others
- ▁old
- ▁share
- ▁vaccines
- ▁k
- ▁heat
- ▁continue
- ▁talk
- ▁question
- ▁fuel
- sh
- ▁business
- ▁economic
- ▁little
- ▁dioxide
- ▁run
- ▁public
- ▁cause
- ▁reach
- ▁scientists
- ▁com
- ▁land
- ▁seen
- ▁st
- '9'
- ▁understand
- ld
- la
- ▁got
- ▁whe
- ▁big
- ▁century
- ▁concrete
- ▁support
- ▁food
- ▁always
- ▁job
- ▁certain
- ▁act
- ▁design
- ▁head
- ▁resources
- ▁6
- ▁came
- ors
- ▁unit
- ▁c
- ▁control
- ▁level
- ▁test
- ▁1
- ge
- ▁war
- ▁gas
- ▁africa
- ▁hard
- ▁mitochondria
- ▁remain
- ▁along
- ▁nature
- ▁near
- ▁billion
- ▁white
- ▁exactly
- ▁11
- ▁instead
- ▁discover
- ter
- ▁3
- ▁4
- '0'
- ▁creat
- ▁includ
- ▁protein
- ine
- ▁love
- ▁close
- ▁amount
- ▁degree
- ▁desert
- ▁particular
- ▁pandemic
- ▁data
- ▁approach
- ▁renewable
- ▁cement
- ▁blue
- ▁lower
- at
- ▁cover
- ▁bring
- ▁lives
- ▁thousands
- ▁fuels
- li
- ▁protect
- ▁idea
- ▁term
- ▁group
- ▁difficult
- ▁systems
- ▁started
- ck
- ▁sea
- ▁treat
- ▁immun
- ▁platform
- ▁women
- ▁current
- ▁model
- ▁history
- ▁began
- ▁short
- ▁eye
- ▁save
- ▁team
- ▁sun
- ▁moment
- ▁usual
- ▁table
- member
- ▁happen
- ▁provide
- ▁true
- ▁coal
- ▁wanted
- tic
- ▁pre
- he
- ▁trees
- ▁ways
- ▁color
- ▁vaccine
- ▁full
- ought
- ▁response
- ▁themselves
- ▁natural
- ▁prevent
- ▁exist
- ▁information
- ▁took
- ▁buy
- ▁thought
- ▁mind
- per
- ▁though
- ized
- til
- il
- ish
- ▁face
- ▁clean
- ian
- ▁product
- ▁increase
- ▁low
- ▁humans
- ▁da
- ▁species
- ▁local
- ▁rich
- ▁huge
- ▁break
- ▁stories
- ▁order
- ▁blood
- ▁late
- ▁fight
- ▁reduce
- ▁quickly
- ▁seat
- ance
- ▁dis
- ▁du
- ▁real
- ir
- ist
- ▁2
- ▁map
- ▁invest
- am
- ▁bone
- ▁experience
- ▁governments
- ▁built
- ▁communities
- ▁yingtai
- ▁situation
- ▁return
- ▁similar
- ▁zero
- ▁south
- ▁standard
- ▁further
- ▁state
- ▁young
- ▁h
- ▁months
- ▁ago
- ory
- ▁sh
- ▁vari
- ▁liv
- te
- ra
- ▁5
- ▁black
- ▁saving
- ▁reason
- ▁laughter
- ▁market
- ▁children
- ▁desp
- ▁open
- ▁sky
- ▁fall
- ▁word
- ▁inter
- ▁vi
- ul
- ▁action
- ▁groups
- ▁bit
- rs
- ▁factor
- ▁dr
- ▁ex
- z
- ety
- ern
- ▁drive
- ▁u
- ▁chemical
- ▁female
- ▁ground
- ▁hundred
- ▁access
- ▁addition
- ▁least
- ▁sex
- ▁specific
- ▁chance
- ▁death
- ▁walk
- ▁sendler
- ▁half
- ▁anxi
- ▁left
- ▁sometime
- ▁port
- ▁pass
- ▁wind
- ▁arent
- ma
- ▁longer
- ▁free
- ▁toward
- ▁questions
- ▁soon
- cy
- ▁king
- ▁w
- came
- ▁cars
- ▁ga
- ▁poor
- ▁bridge
- ▁evidence
- ▁extreme
- ▁risk
- ▁surface
- ▁wrong
- taking
- ▁common
- ▁probab
- ▁plate
- ▁moon
- ▁either
- ▁language
- ▁sw
- co
- ▁cop
- ▁american
- ▁kilometers
- ily
- ▁am
- ▁inside
- ch
- ▁millions
- ize
- ▁house
- ▁materials
- ▁step
- ▁mi
- ▁con
- '8'
- ▁cockroach
- ▁racial
- ▁study
- ▁woman
- ▁modern
- ▁key
- saw
- ▁dna
- ▁survive
- ▁total
- ▁cell
- ▁op
- ▁line
- ▁pa
- ▁tra
- ▁hours
- ▁simple
- hap
- ▁size
- ▁solutions
- ▁200
- ▁safe
- ▁contain
- ▁require
- ant
- ▁past
- ▁success
- ▁fast
- ▁transform
- ▁fear
- ▁behind
- ▁english
- ▁essential
- ▁exponential
- ▁science
- ▁trilobite
- ▁yourself
- ▁demand
- ▁focus
- ▁multiple
- ▁third
- ▁said
- ▁covid
- ▁income
- ▁pressure
- ▁four
- ▁hold
- ▁lack
- ▁significant
- ▁astronomer
- ▁easy
- ▁disk
- ▁done
- ▁growth
- ▁pair
- ▁fire
- ▁dollars
- ▁asian
- ▁matter
- '5'
- ▁vehicles
- id
- vi
- ▁pro
- ▁simpl
- ▁30
- ▁decades
- ▁fat
- ▁results
- ▁areas
- '4'
- ▁waste
- ▁building
- ▁deliver
- ▁sit
- ▁egypt
- ▁view
- ▁community
- ▁conversation
- ▁pigeon
- ▁suggest
- ▁wavelength
- ▁constant
- ▁course
- ▁dream
- ▁unique
- ▁itself
- ▁send
- ▁absorb
- ▁law
- cient
- ▁universe
- ▁north
- ated
- ▁respond
- ▁sense
- ▁eat
- ▁forward
- ▁school
- ▁places
- ▁tree
- ure
- ▁hit
- tton
- '6'
- ▁goal
- ▁individuals
- ▁days
- ▁sha
- ▁patients
- ling
- ▁shift
- ▁limit
- ted
- ▁generate
- ▁en
- ▁star
- ▁solve
- land
- ▁visit
- ▁8
- ▁12
- ▁sub
- ▁begin
- ▁available
- ▁difference
- ▁greek
- ▁measure
- ▁opportunity
- ▁damage
- ▁increasing
- ▁relative
- ▁typical
- ▁release
- ▁middle
- ▁wont
- ▁center
- ▁weather
- ▁cancer
- ▁rise
- ▁loop
- ▁class
- ▁fix
- ▁role
- ▁policies
- ▁loss
- ▁care
- ▁outside
- ▁image
- ▁lord
- ▁basic
- ▁l
- ▁add
- as
- ng
- ▁pret
- ▁nation
- ▁range
- ▁challenge
- ba
- ▁sum
- ▁di
- pri
- ▁learn
- ▁trap
- ▁appear
- ▁interest
- ▁goals
- ▁anti
- though
- ▁determine
- ▁hope
- ▁virus
- ▁campaign
- ▁efficiency
- ▁knew
- ▁plastic
- ▁pollut
- ▁poverty
- ▁racism
- ▁stereotype
- ▁studie
- ▁behavior
- ▁avoid
- ▁river
- ▁moving
- ▁coast
- ▁instance
- ▁square
- ▁skin
- ▁effective
- ▁reality
- ▁faster
- ▁environmental
- ▁rate
- ▁trade
- ▁plan
- ▁completely
- jar
- ▁speed
- ▁tri
- ▁alone
- ▁sort
- ▁post
- ▁national
- ▁parents
- ations
- ▁store
- ier
- ▁connect
- ▁decid
- ▁levels
- ▁non
- ▁trojan
- ▁relationship
- ▁forest
- ▁material
- ol
- ▁patterns
- ▁refugees
- ▁sign
- ▁costs
- ▁talking
- ▁imagine
- ric
- ions
- ▁kill
- ▁hav
- ▁po
- ▁arm
- na
- ▁canal
- um
- ▁strong
- ▁threat
- ▁fel
- ▁extra
- ▁efficient
- ▁incredibl
- ▁infrastructure
- ▁switch
- ▁milk
- ▁panel
- ▁slavery
- ▁social
- ▁present
- ▁warming
- ▁organization
- ▁dangerous
- ▁else
- ▁deep
- ▁corporation
- ▁divers
- ▁society
- ▁listen
- ▁moral
- ▁signal
- ▁stand
- ▁sever
- ▁janie
- ▁separate
- ical
- ▁taken
- ca
- ▁type
- ▁solution
- ▁friend
- ▁main
- ▁animals
- ver
- side
- f
- ▁rules
- ▁options
- ent
- qu
- ▁bags
- ▁engines
- ▁70
- ▁treatments
- ▁ready
- ▁benefits
- ▁top
- ring
- ac
- ▁depend
- ▁manufactur
- tion
- ▁meet
- while
- ▁problems
- ▁americans
- ▁cur
- ▁works
- ▁homes
- ▁central
- ▁political
- ▁sappho
- ▁technologies
- ▁transition
- ▁escape
- ▁ocean
- ▁reflect
- ▁rough
- ▁unlike
- ▁crisis
- ▁jeans
- ▁eventually
- ▁harm
- ▁spouse
- ▁trans
- ▁2020
- ▁atmosphere
- ▁egyptian
- ▁attack
- ▁safety
- ▁mother
- ▁grid
- ▁stay
- ▁vir
- ▁bank
- ▁changed
- '3'
- ide
- go
- ▁structure
- ▁workers
- ▁organ
- ▁giv
- ▁numbers
- ▁ver
- ▁learned
- ▁cl
- ability
- de
- ▁lie
- ▁hi
- ▁16
- ▁recognize
- ▁game
- ▁benefit
- ▁saying
- ▁led
- ▁leave
- ▁temperatures
- ▁br
- ▁men
- ▁estimate
- ▁100
- ped
- ▁read
- des
- ▁th
- are
- ▁travel
- ▁fund
- ▁lo
- ▁improve
- head
- ▁cut
- ▁needs
- ▁tea
- ▁progress
- q
- ▁perform
- ▁complex
- ▁general
- ▁urban
- ▁advance
- ▁california
- ▁journey
- ▁perspective
- ▁quality
- ▁supply
- ▁affect
- ▁birth
- ▁piece
- ▁slow
- ▁dentist
- ▁spider
- ▁famili
- ▁favor
- ▁fair
- ▁child
- ▁trick
- ▁mona
- ▁everybody
- ▁flight
- ▁cool
- ▁flow
- ▁mali
- ▁track
- ▁male
- ▁manage
- ▁durr
- ▁colors
- ▁lying
- ▁notice
- ▁personal
- ial
- ▁doctor
- ▁weeks
- ▁80
- ments
- op
- '0000'
- yon
- cu
- ▁ensure
- ▁offer
- ▁camp
- ▁projects
- ▁lipid
- ▁molecule
- ▁treatment
- ▁candles
- ▁asked
- ▁points
- ▁engineers
- ▁mine
- ▁types
- ▁mu
- ▁chang
- nd
- ard
- ▁id
- my
- ▁compar
- ze
- king
- ▁agree
- ship
- national
- '1'
- ▁supernova
- ward
- tual
- ph
- ▁machine
- ▁author
- ▁capacity
- ▁celsius
- ▁driving
- ▁grandfather
- ▁oxygen
- ▁picture
- ▁satellite
- ▁scientific
- ▁shanbo
- ▁village
- ▁evolv
- ▁record
- ▁equip
- ▁floor
- ▁numer
- ▁regard
- ▁seven
- ▁among
- ▁neighborhood
- ▁lisa
- ▁normal
- ▁feed
- ▁indeed
- ▁catch
- ▁personality
- ▁kid
- ▁lose
- ▁innovation
- ▁east
- ▁six
- ▁trust
- ▁train
- ▁ne
- ▁sell
- ▁germany
- ▁teach
- ▁inject
- ell
- nese
- ▁ball
- ▁rare
- ▁mammals
- ▁carey
- ▁everything
- ▁products
- ▁eggs
- ▁healthy
- ities
- ia
- ▁bonds
- ▁rest
- norm
- ▁adults
- ▁locusts
- ▁cr
- ▁cu
- ▁area
- ▁cooperatives
- ▁tons
- ▁ph
- ▁die
- ▁le
- ▁figure
- ▁conditions
- ▁environment
- ▁regions
- ▁engine
- ru
- ▁seizure
- ▁pattern
- ▁refugee
- ▁bu
- ▁patient
- ▁bag
- ▁hour
- ▁engineer
- ▁turns
- lands
- lf
- ▁18
- ▁50
- ▁weight
- ▁created
- ▁ab
- ▁tru
- bi
- '7'
- ▁forms
- ▁burn
- fortunate
- ▁age
- ect
- ha
- ating
- ▁kings
- ▁europe
- ▁major
- ▁kush
- ▁rock
- ▁account
- ▁ambition
- ▁apnea
- ▁citizens
- ▁dramatically
- ▁education
- ▁footprint
- ▁ourselves
- ▁predators
- ▁producing
- ▁recipient
- ▁student
- ▁sudden
- ▁tackl
- ▁target
- ▁vulnerable
- ▁block
- ▁extend
- ▁racist
- ▁speech
- ▁daughter
- ▁below
- ▁committ
- ▁push
- ▁service
- ▁tiny
- ▁easier
- ▁causing
- ▁sector
- ▁launch
- ▁quite
- ▁describe
- ▁jew
- ▁capital
- ▁mention
- ▁wait
- ▁dome
- ▁pull
- ▁24
- ▁construction
- ▁born
- ▁financial
- ▁chain
- ▁banda
- ▁ari
- ▁sustainable
- ▁position
- ▁humanity
- ▁race
- ▁twin
- sion
- ▁sent
- ▁enter
- ▁expect
- ▁policy
- ▁rain
- wise
- ▁shipp
- ▁shape
- ▁bigge
- ▁medic
- ▁special
- om
- ▁tend
- ▁waves
- ▁remov
- ake
- ▁201
- utch
- ock
- ▁decarbonize
- ▁news
- ▁door
- ▁populations
- ▁fly
- ▁happening
- ▁realiz
- ▁source
- ▁price
- ▁disease
- ▁minute
- ▁candle
- ▁decade
- ▁molecules
- coming
- ▁fl
- dp
- ▁seeing
- un
- ▁el
- act
- ▁challenges
- ▁pee
- ▁wa
- ▁capture
- ▁meaning
- scar
- special
- ▁complete
- ▁swap
- ▁lines
- ▁pack
- ▁tax
- ▁sultan
- ▁novel
- ▁count
- ▁dark
- ▁civil
- ▁advantage
- ▁multi
- ▁physical
- ▁recycl
- ▁surround
- ş
- ▁accord
- ▁alternative
- ▁attention
- ▁cactus
- ▁culture
- ▁economist
- ▁myself
- ▁negative
- ▁relief
- ▁salesman
- ▁secret
- ▁expand
- ▁finance
- ▁report
- ▁glass
- ▁consequences
- ▁excited
- ▁choice
- ▁guess
- ▁loud
- ▁court
- ▁stick
- ▁address
- ▁calcium
- latoni
- ▁electrons
- ▁network
- ▁adopt
- ▁watch
- ▁translate
- ▁maintain
- ▁industrial
- ▁bad
- ▁distance
- ▁2030
- ▁batteries
- ▁communication
- ▁impossible
- ▁location
- ▁25
- ▁galaxy
- ▁frequent
- ▁camera
- ▁anywhere
- ▁truth
- ▁operation
- ▁associate
- ▁direction
- ▁elevator
- ▁fit
- ▁90
- ▁stable
- ▁inequality
- ▁gap
- ▁worse
- ▁bodies
- ▁wor
- ▁legal
- ▁raise
- da
- ▁ship
- ▁severe
- ▁em
- ▁fr
- ▁muscles
- ▁beat
- ▁proteins
- ging
- ▁tools
- ride
- lines
- ▁leader
- ft
- oke
- ten
- ious
- ▁diseases
- ▁minutes
- ▁accelerate
- ▁emit
- ▁seizures
- mat
- ▁understanding
- ▁road
- ▁project
- ▁effect
- some
- ▁viper
- ▁nanoparticle
- ▁condition
- ▁region
- ▁temperature
- ▁individual
- ▁option
- ▁coat
- ▁turning
- ifi
- ▁friends
- res
- ▁stor
- not
- ▁1000
- ▁la
- lob
- amp
- ap
- ▁combin
- ▁speak
- ang
- body
- izing
- house
- ▁lay
- ▁involve
- ▁america
- ▁val
- ▁abundan
- ▁arrest
- ▁chlorophyll
- ▁cloud
- ▁colleague
- ▁context
- ▁couple
- ▁economies
- ▁effort
- ▁enslaved
- ▁erik
- ▁experiment
- ▁genetic
- ▁hurston
- ▁infection
- ▁length
- ▁limestone
- ▁method
- ▁outbreak
- ▁positive
- ▁predict
- ▁recommend
- ▁urethra
- ▁annual
- ▁conflict
- ▁destroy
- ▁dialect
- ▁flood
- ▁hormone
- ▁narrow
- ▁scatter
- ▁therapy
- ▁thick
- ▁throw
- ▁volume
- ▁continent
- ▁ecosystems
- ▁fail
- ▁girl
- ▁thrive
- ▁field
- ▁swarm
- ▁drug
- ▁vibrat
- ▁average
- ants
- ▁morning
- ▁corner
- ▁stretch
- ▁overall
- ▁subject
- ▁hung
- ▁heav
- ▁orbit
- ▁instructions
- ▁pick
- ▁donor
- ▁responsibility
- ▁search
- ▁list
- ▁labor
- ▁match
- ▁join
- ▁european
- ▁stress
- ▁crucial
- ▁nap
- ▁yes
- vas
- ▁worth
- ▁park
- ▁invent
- mbus
- ▁host
- ▁gather
- ▁sethe
- ▁wage
- ▁operator
- ▁check
- ▁letter
- ▁actions
- ▁spread
- ▁gone
- ▁sexual
- ▁inspired
- ▁boy
- ular
- ▁cross
- ▁tail
- ▁production
- ▁critical
- ▁literally
- ▁surprising
- ▁cycle
- ▁connection
- ▁harder
- ▁storm
- ▁involved
- ▁z
- '000'
- ordina
- ible
- ▁fortunate
- ▁song
- eries
- ▁bin
- ▁mid
- ▁characters
- ▁safely
- ▁inten
- ja
- ▁jun
- ▁0
- ▁400
- kept
- ▁ix
- ▁features
- ▁objects
- ▁needed
- ▁practices
- ▁tens
- ▁meeting
- ▁meters
- ▁west
- ▁learning
- hol
- ▁gra
- ▁achieve
- ▁cutt
- ative
- ▁event
- ▁contribut
- ▁nanoparticles
- ▁hum
- ilitar
- ▁fac
- ▁lipids
- ▁period
- war
- ▁population
- ▁animal
- ▁issue
- ▁regulat
- ▁plans
- way
- ▁island
- ▁te
- ▁site
- lle
- ven
- ▁j
- ▁40
- ▁matters
- ▁corrupt
- ▁expos
- where
- ▁dead
- ▁quick
- ▁bar
- ▁languages
- ▁wide
- ▁polic
- ▁abo
- shed
- ▁german
- ▁twi
- room
- ism
- ▁chi
- ▁react
- ▁transport
- tract
- ▁lab
- ▁correct
- ▁immediate
- ▁pilot
- ▁previous
- ▁smuggl
- ▁altogether
- ▁bottom
- ▁component
- ▁husband
- ▁kenya
- ▁kilogram
- ▁pigments
- ▁platypus
- ▁responsible
- ▁suppose
- ▁temple
- ▁torture
- ▁york
- ▁battle
- ▁concentrat
- ▁status
- ▁witness
- ▁globe
- ▁peace
- ▁radical
- ▁serious
- ▁urine
- ▁tough
- ▁dozen
- ▁nice
- ▁minorit
- ▁beginning
- ▁accurate
- ▁consistent
- ▁street
- ▁running
- ▁round
- ▁message
- ▁spike
- ▁tech
- ▁convert
- ▁define
- ▁draw
- ▁2050
- ▁17
- ▁transportation
- ▁peg
- ▁paper
- ▁members
- ▁rna
- ▁necessary
- ▁nucle
- ▁strategy
- ▁rocket
- ▁wall
- ▁office
- ▁activity
- ▁flee
- ▁recent
- ▁bright
- ▁path
- ▁slight
- time
- ▁miss
- ▁drop
- ▁taxes
- ▁independent
- ▁stronger
- rising
- ▁claim
- ▁front
- ▁receive
- rch
- ▁anyone
- ▁displac
- ▁press
- ▁apply
- ium
- ▁widely
- ▁room
- ▁hopeful
- ▁smaller
- ▁internal
- ▁dry
- ▁pathways
- ▁grown
- orn
- ▁balance
- ▁required
- ▁min
- ull
- ▁carrie
- ▁ruler
- ▁captur
- ▁fully
- ▁ste
- ef
- ▁fa
- ▁terms
- ▁ideas
- ▁brothers
- ▁generations
- ▁towers
- ▁7
- ▁compare
- ▁include
- ▁bi
- ▁wr
- ▁manufacturer
- ▁whos
- ▁nor
- ▁died
- ▁stopp
- ▁happened
- ▁active
- ▁issues
- ▁aid
- ▁effects
- ▁seek
- ▁observat
- '50'
- ence
- ▁emitt
- ▁sett
- ▁environments
- ▁hous
- ▁continu
- fi
- ▁decide
- ▁ok
- ▁son
- bility
- ▁investment
- ese
- led
- inning
- ▁african
- ▁consume
- gs
- cent
- mon
- ▁ill
- tain
- gu
- im
- ▁buildings
- ▁film
- nder
- line
- set
- lem
- ▁direct
- ▁threaten
- loved
- work
- ▁li
- back
- ▁rais
- ▁arrive
- ▁sk
- ▁ban
- ▁paint
- ▁div
- ▁apart
- ▁adapt
- ▁germ
- ▁che
- ▁laugh
- if
- timate
- ▁attempt
- ▁survey
- ▁afternoon
- ▁bladder
- ▁budget
- ▁crazy
- ▁disguise
- ▁empire
- ▁equivalent
- ▁exchange
- ▁expensive
- ▁explain
- ▁feet
- ▁habit
- ▁himself
- ▁institution
- ▁japan
- ▁membrane
- ▁morrison
- ▁nazi
- ▁opportunities
- ▁peruggia
- ▁represent
- ▁revolution
- ▁scenario
- ▁shoot
- ▁strength
- ▁symptoms
- ▁tanuki
- ▁transfer
- ▁victim
- ▁wonder
- ▁application
- ▁murder
- ▁promis
- ▁stuck
- ▁worldwide
- ▁kingdom
- ▁grew
- ▁trillion
- ▁priv
- ▁document
- ▁score
- ▁super
- ▁giant
- ▁finish
- ▁confiden
- ▁acid
- ▁popular
- ▁traits
- ▁italian
- ▁wales
- ▁sand
- ▁analys
- ▁solid
- ▁china
- ▁wild
- ▁approv
- ▁sterilization
- ▁steel
- ▁indicat
- ▁compromise
- ▁collective
- ▁nile
- ature
- ▁kushite
- ▁majority
- ▁andrea
- ▁edge
- ▁syrian
- rench
- ▁chel
- ▁traditional
- ▁fur
- ▁visibl
- ▁fam
- ▁root
- ▁box
- ▁suit
- ▁prepared
- ▁stage
- ▁fine
- ▁mark
- ▁reaction
- ▁removal
- ow
- ▁please
- ▁amazing
- ▁gate
- cular
- ▁cold
- '26'
- ▁fi
- ▁convinc
- mit
- ▁art
- ▁cable
- ez
- ▁liter
- acious
- ▁spe
- ▁mission
- ▁station
- mental
- ▁processes
- ▁concerns
- anic
- ▁bones
- ▁warn
- ▁disc
- ▁maps
- com
- tch
- ▁bare
- ▁net
- ▁scar
- ▁code
- ▁flu
- ▁beliefs
- ▁residents
- ▁exposed
- ▁painting
- ▁soil
- ▁primar
- ▁combined
- ▁farms
- ff
- ▁speaking
- fully
- ▁sold
- ▁town
- ished
- ility
- ological
- ▁lit
- ▁pl
- ▁miner
- ▁officials
- ▁ingredients
- ▁spine
- ▁cleaner
- ▁neighbors
- ▁75
- ▁consum
- ati
- eth
- ▁fan
- ▁regulator
- ▁partners
- ▁tests
- ▁shifting
- ham
- ▁loved
- ▁replace
- ction
- unch
- ▁22
- ▁dut
- ific
- ▁hub
- ▁cha
- ▁roads
- ▁ii
- cc
- cious
- ▁prices
- ami
- ▁ho
- ▁equal
- ▁camps
- ▁produces
- ▁ent
- ▁contribute
- ▁games
- ▁organs
- ▁36
- ber
- ▁tissue
- ▁chamber
- ▁partner
- ▁cat
- ▁route
- ave
- ▁cooperative
- ▁vehicle
- ▁month
- ▁thousand
- ▁relationships
- ▁sums
- mal
- nia
- lic
- ▁app
- ▁structur
- ouri
- onvi
- ▁breath
- ternal
- gi
- ▁ran
- wn
- ets
- gh
- how
- ▁perfect
- ▁sta
- ▁leav
- ▁elevat
- ▁emerg
- ▁bill
- ▁command
- cast
- bti
- ▁spot
- ▁instruct
- ▁respect
- refore
- ▁ambitious
- ▁coronavirus
- ▁cultural
- ▁customers
- ▁disaster
- ▁elephant
- ▁encounter
- ▁horizontal
- ▁hydrogen
- ▁kettle
- ▁mainframe
- ▁messenger
- ▁mirror
- ▁permanent
- ▁photograph
- ▁precise
- ▁principle
- ▁rival
- ▁sahara
- ▁scholar
- ▁tunnel
- ▁choose
- ▁deserve
- ▁distribut
- ▁president
- ▁purpose
- ▁task
- ▁temporar
- aught
- ▁blow
- ▁collaborati
- ▁occur
- ▁resilien
- ▁suffer
- ▁surgeon
- ▁august
- ▁coalition
- ▁protest
- ▁programs
- ▁exciting
- ▁naked
- ▁aspect
- ▁photosynthesis
- ▁weapon
- ▁vertical
- ▁shot
- ▁strauss
- particles
- ▁beautiful
- ▁genes
- ▁disappear
- ▁nerve
- ▁prosper
- ▁function
- romed
- ▁preserv
- ▁encourag
- ▁trigger
- ▁trauma
- ▁indig
- ▁detect
- ▁mainstream
- ▁guide
- ▁iran
- ▁approximat
- ▁reused
- ▁justice
- ▁clinic
- ▁roman
- ▁superheroes
- ▁01
- ▁degrad
- ▁self
- ▁commission
- ▁wealth
- ▁crop
- ▁agenc
- ▁sick
- ▁27
- ▁flat
- ▁epic
- ▁hang
- ▁tale
- ▁ethic
- ▁assembl
- ▁proportion
- ▁broad
- ▁lock
- ▁14
- ▁invite
- ▁workplace
- ▁vary
- marine
- ▁194
- ▁definition
- ▁dig
- ▁transformation
- ▁cheaper
- ▁march
- ▁adaptation
- ▁statement
- ▁jet
- ▁load
- man
- ▁metal
- ▁lost
- ▁carry
- ▁projection
- ▁tension
- ▁fill
- ▁owe
- xic
- ▁curve
- ▁digital
- ▁cannon
- ▁ideal
- ▁fish
- ▁bce
- ▁mom
- ption
- ▁rely
- ▁useful
- etic
- ▁perfectly
- ▁army
- ▁movement
- ▁tear
- ▁eugenics
- ▁successful
- ▁reproducti
- ▁agreed
- ▁careful
- net
- ural
- reliable
- ▁threatened
- ▁coin
- ▁corrupted
- ▁bo
- non
- ▁myth
- ▁increases
- ▁coen
- ort
- ▁solv
- ▁mining
- ▁container
- ▁grand
- spend
- ▁indian
- nit
- ▁met
- mber
- ▁si
- ▁micro
- ▁killed
- ▁wi
- ▁funding
- ▁organisms
- ▁oil
- ▁cont
- ▁limited
- ▁alive
- ▁deal
- men
- ▁routes
- ▁tissues
- ▁chambers
- ▁exc
- ▁pin
- ▁investments
- vert
- ▁pursue
- ▁hands
- ▁periods
- ▁10000
- ▁pai
- ▁argue
- ▁vipers
- ▁observ
- vor
- ▁hot
- ▁basi
- ile
- ▁sil
- ▁sources
- ▁operate
- light
- ▁islands
- ▁ten
- ▁owner
- ▁brains
- asse
- ▁werent
- ▁telescopes
- ▁ha
- ▁conve
- ▁leaves
- lin
- ▁obstructi
- ▁2000
- ▁pe
- ▁par
- broken
- rait
- ▁forever
- ▁win
- ▁trojans
- ▁pi
- ▁disorder
- ▁organism
- ▁neighbor
- ▁practice
- ▁locust
- ational
- ▁remove
- ▁week
- rov
- fu
- ▁meter
- ization
- ▁rid
- ▁schools
- ▁cap
- ▁cure
- istic
- over
- day
- ▁lov
- ▁tim
- ept
- uff
- ▁cho
- ▁accept
- ▁manag
- umb
- stand
- bodies
- face
- ▁deci
- ball
- ▁sens
- ▁surviv
- som
- ▁luck
- ▁cheap
- ▁happ
- ▁publi
- enome
- eign
- ▁shut
- ▁repeat
- ▁consist
- ▁prove
- hood
- ▁spirit
- sequence
- ▁confirm
- ▁persuad
- ▁promot
- precedent
- roach
- ▁amarjot
- ▁australia
- ▁beneath
- ▁bushmaster
- ▁channel
- ▁contrast
- ▁democra
- ▁devastat
- ▁diameter
- ▁discrimination
- ▁fragil
- ▁geothermal
- ▁guarantee
- ▁identify
- ▁innocent
- ▁leonardo
- ▁marriage
- ▁maximiz
- ▁mechanisms
- ▁monica
- ▁monitor
- ▁nutmeg
- ▁participant
- ▁pathogen
- ▁philosopher
- ▁piankhy
- ▁plague
- ▁possess
- ▁resolve
- ▁struggling
- ▁theranos
- ▁thrust
- ▁underground
- ▁vessel
- ▁violence
- ▁yeah
- ▁competitor
- ▁compress
- ▁establish
- ▁fashion
- ▁freetown
- ▁harness
- ▁obvious
- ▁purchas
- ▁random
- ▁smart
- ▁snoring
- ▁struggle
- ▁venom
- ▁video
- ▁healthcare
- ▁plenty
- ▁policymakers
- ▁sugar
- ▁regular
- ▁wife
- ▁earlier
- ▁tumor
- ▁encode
- ▁skill
- ▁dump
- ▁disparit
- ▁technique
- ▁brand
- ▁mouth
- ▁commercial
- ▁decisions
- ▁dense
- ▁details
- ▁calm
- ▁clay
- ▁excellen
- ▁scre
- ▁belarus
- ▁link
- ▁sweat
- ▁influence
- ▁license
- ▁media
- ▁allergi
- ▁tall
- ▁reveal
- ▁insects
- ▁brief
- meral
- ▁burrow
- ▁suspen
- ▁easily
- ▁excess
- ▁civilization
- ▁cloth
- ▁spectac
- ▁deploy
- ▁deny
- ▁wound
- ▁catastrophic
- rena
- ▁metric
- ▁industries
- ▁petrol
- ▁sunlight
- ▁flexible
- ▁flip
- ▁contract
- ▁breed
- ▁186
- ▁poop
- ▁khalil
- ▁divide
- ▁observations
- rope
- ffic
- ▁label
- ▁brig
- ▁agriculture
- ▁landscape
- ▁engage
- ▁trouble
- ▁reproduce
- ▁airway
- ▁intersecti
- ▁worry
- ▁char
- ▁authority
- ▁happiness
- ▁affordable
- ▁warsaw
- ▁gravity
- ▁identity
- aker
- ▁lifetime
- ▁everywhere
- ▁management
- ▁hunt
- ▁stem
- ▁gum
- ▁lone
- ▁squa
- ▁sew
- ▁coral
- ▁bio
- ▁clu
- ▁orient
- ▁honest
- ▁proper
- ▁emergenc
- ▁zoo
- ▁cor
- ▁factories
- ▁wings
- ▁tro
- eek
- ▁universal
- ▁experienc
- ▁circumstances
- ▁firefighters
- ▁cage
- ▁truck
- ▁span
- ▁diet
- ▁mile
- ▁rapidly
- ▁orang
- ▁stream
- ▁emotional
- ▁statistical
- ▁durable
- ▁gav
- ▁trip
- ▁mag
- ▁dispose
- ▁soft
- ▁equally
- ▁strike
- ▁directly
- ▁appli
- ddl
- ▁supernovae
- ender
- ▁writing
- ▁prop
- ference
- raw
- ▁earn
- ▁enable
- ▁foundation
- ely
- ▁devices
- ▁breathing
- ▁hell
- ▁serve
- ▁tool
- ▁resistan
- pr
- ▁magic
- ▁artist
- ▁samples
- ▁snakes
- ▁trials
- ▁poet
- ▁books
- ▁profits
- ▁ce
- take
- ▁assum
- ▁traveling
- ▁inf
- eum
- ▁disorders
- pan
- ably
- oid
- ▁desir
- ament
- ▁union
- ▁africans
- signed
- ▁spin
- ▁ro
- ▁protected
- ify
- pul
- ▁sites
- ▁vo
- ▁included
- ▁accelerat
- ▁birds
- ▁bird
- ▁mus
- ▁pit
- ▁write
- bble
- ▁ai
- ▁replac
- ▁desire
- alt
- ▁double
- ▁je
- umble
- ▁valu
- gd
- '23'
- ▁writ
- ▁book
- ▁sample
- ▁snake
- ▁object
- ▁ingredient
- ▁official
- ▁feature
- ▁brother
- ▁muscle
- ▁tower
- ▁gig
- ▁adult
- ▁generation
- ▁emission
- ▁wave
- '30'
- ▁conf
- ▁trial
- finite
- entle
- j
- gram
- ▁jan
- ike
- ▁fo
- posed
- ▁rem
- ▁leg
- ala
- strict
- think
- sleep
- ioniz
- mission
- ▁emotion
- ▁celebrate
- either
- ▁whi
- ▁compet
- ents
- ▁introduc
- ▁disrupt
- ▁collect
- ▁cit
- orba
- ipping
- ▁capita
- ▁govern
- reak
- ▁batter
- ▁arrang
- ▁belong
- ▁hollow
- ▁referr
- ▁suspect
- craft
- rogen
- ▁1970
- ▁architect
- ▁biontech
- ▁centuries
- ▁commitment
- ▁conduct
- ▁contradict
- ▁counterweight
- ▁dinner
- ▁disagree
- ▁display
- ▁disposal
- ▁distinct
- ▁employee
- ▁equator
- ▁eradicat
- ▁exercise
- ▁exoskeletons
- ▁expectancy
- ▁extrover
- ▁gestapo
- ▁hebrides
- ▁infectious
- ▁introver
- ▁journal
- ▁junajpu
- ▁louvre
- ▁nanostructure
- ▁overwhelm
- ▁phospholipid
- ▁physics
- ▁politician
- ▁practical
- ▁procedure
- ▁profess
- ▁pronouns
- ▁quarter
- ▁regime
- ▁reinforce
- ▁review
- ▁supplement
- ▁tempt
- ▁threshold
- ▁vegeta
- ▁violet
- ▁welcome
- ▁yellow
- ▁zegota
- ▁activat
- ▁ancestor
- ▁article
- ▁conquer
- ▁depart
- ▁email
- ▁herself
- ▁neuron
- ▁peak
- ▁psych
- ▁restoration
- ▁error
- ▁mystery
- ▁revenue
- ▁stunn
- ▁theoretical
- ▁household
- ▁notorious
- ▁scaling
- ▁erode
- ▁firm
- ▁frame
- ▁scene
- ▁evidentialis
- ▁freedom
- ▁title
- ▁desperate
- ▁migrat
- ▁select
- ▁summer
- ▁married
- ▁delta
- ▁shelter
- awatt
- ▁fresh
- ▁hidden
- ▁surgery
- ▁fluorescence
- ▁drain
- ▁kick
- ▁nearby
- ▁related
- ▁evad
- ▁bypass
- ▁colon
- ▁sperm
- ▁flore
- ▁begun
- ▁audience
- ▁rescue
- ▁empower
- proof
- ▁concept
- ▁creatures
- ▁museum
- ▁coordinat
- ▁intensive
- ▁diagnostic
- ▁rush
- ▁consumption
- ▁prohibit
- ▁hatch
- ▁vital
- ▁strand
- ▁persist
- tresse
- ▁missile
- ▁converg
- ▁importance
- ▁exploit
- ▁creativity
- ▁wedding
- ▁narrative
- ▁cliff
- ▁capitalism
- ▁surg
- ▁possibilities
- ▁understood
- foot
- ▁drink
- ▁persecution
- ▁sharp
- ▁flock
- courage
- ▁folk
- ▁refus
- ▁wow
- ▁noise
- ▁imagination
- ▁2021
- federa
- ▁remind
- ▁odd
- ▁extract
- ▁navigation
- ▁astronomy
- ▁illness
- ▁defi
- ▁version
- ▁impressive
- ▁systematic
- tamin
- ▁innovative
- minent
- ▁propos
- ▁resistance
- ▁shop
- ▁strait
- ▁horr
- ▁darkness
- ▁objective
- ▁father
- ound
- ▁restore
- ▁buff
- ▁restrict
- ▁excre
- ▁phone
- ▁mountain
- glad
- ▁pants
- ▁highway
- ▁announc
- ▁interconnected
- ▁hole
- ▁conditioners
- ▁pace
- ▁conscious
- ▁era
- ▁buttons
- ▁gain
- ▁lucky
- ▁barr
- ▁combine
- ▁dim
- ▁confus
- ient
- ▁packag
- ▁medication
- ▁career
- ▁board
- ▁bri
- ▁integrate
- ▁insisted
- arity
- ▁compete
- ▁plann
- immer
- ▁nest
- ▁strict
- ▁lesson
- rote
- ▁asia
- ▁band
- ▁complicated
- ▁constructed
- iment
- ▁hire
- fug
- ▁grat
- ▁closet
- roll
- ▁borders
- ▁monks
- rbag
- ▁protection
- ▁accepted
- ▁fal
- ▁26
- rated
- ▁deadly
- zer
- kay
- ▁investigat
- ▁advise
- eters
- ▁observing
- ski
- ▁prov
- ▁difficulty
- ▁improv
- ▁layer
- col
- ▁filmmak
- mad
- ▁grab
- ▁driver
- ▁meant
- ▁13
- alo
- '37'
- ▁unf
- ▁definit
- ▁burned
- cogni
- weigh
- ob
- lum
- ▁wash
- ▁profit
- obb
- ▁mono
- ▁appeared
- ▁interested
- ▁mess
- ▁comput
- ▁log
- ▁electro
- ▁meal
- ▁hid
- ▁reader
- ▁jo
- ctors
- ▁doubl
- top
- ▁ec
- ▁millennia
- ired
- urd
- gotten
- '80'
- ▁del
- eful
- ▁chris
- atic
- pur
- gar
- ▁pop
- ▁stra
- centri
- ▁spr
- comfortable
- ▁demonstrate
- fo
- ▁variet
- ▁pron
- bb
- ▁fertilize
- ▁figur
- arch
- pped
- ▁ensur
- ▁recogniz
- ls
- '20'
- ▁provid
- ▁explore
- '11'
- ▁achiev
- native
- late
- ried
- rts
- ▁160
- ▁500
- ▁ju
- ▁sali
- ▁character
- ▁35
- eeze
- ▁pen
- ▁assume
- '96'
- putation
- mph
- ▁sna
- ▁farm
- ▁fran
- eight
- ology
- ▁bond
- ▁parent
- ▁kilometer
- hal
- ▁resource
- bra
- ▁20000
- ▁serv
- ▁origin
- ▁advi
- ▁possib
- ▁historic
- place
- ification
- ▁strik
- rish
- ▁reduc
- tail
- ▁construct
- ▁pain
- ▁tip
- ▁complicat
- ▁gran
- ▁prime
- ▁substantial
- tag
- stream
- ▁conv
- front
- ▁rapid
- claim
- expect
- respond
- lord
- usual
- fall
- ▁brit
- point
- allow
- ▁exact
- ▁digit
- board
- tons
- ▁literal
- ▁install
- ▁innovat
- emed
- cti
- ▁lens
- ▁mari
- ▁epi
- ▁absor
- ▁subsidi
- ▁fell
- ▁hypothe
- ▁optim
- ▁constitut
- ▁defeat
- ▁diagonal
- ▁immunotherap
- ▁smooth
- bacteria
- ipples
- ologist
- ğ
- ▁1987
- ▁adjacent
- ▁advocate
- ▁anticipate
- ▁assistance
- ▁baby
- ▁beauty
- ▁calculat
- ▁candidate
- ▁challenging
- ▁champion
- ▁cholesterol
- ▁counterparts
- ▁crusade
- ▁curiosity
- ▁descend
- ▁destination
- ▁destruction
- ▁destructive
- ▁disappointment
- ▁dissolve
- ▁distinguish
- ▁empath
- ▁enjoy
- ▁equitabl
- ▁evaluat
- ▁export
- ▁ghetto
- ▁ghost
- ▁gradual
- ▁hospital
- ▁implement
- ▁inclusiv
- ▁inherit
- ▁intervention
- ▁laboratory
- ▁liquid
- ▁luxury
- ▁mercator
- ▁micah
- ▁miracle
- ▁nebula
- ▁nervous
- ▁neutral
- ▁olaf
- ▁opposite
- ▁participat
- ▁pesticide
- ▁puzzle
- ▁pyrethroid
- ▁rainforest
- ▁rattlesnake
- ▁rebuil
- ▁register
- ▁resolution
- ▁rognvald
- ▁secure
- ▁spectrum
- ▁statue
- ▁television
- ▁therapeutic
- ▁throat
- ▁vulture
- ▁wood
- phobia
- ▁abandon
- ▁accident
- ▁automatic
- ▁bucket
- ▁burden
- ▁competency
- ▁consult
- ▁equity
- ▁evaporat
- ▁interview
- ▁knowledge
- ▁legacy
- ▁legislat
- ▁mathematic
- ▁niger
- ▁plummet
- ▁taste
- ▁technical
- ▁transplant
- itarian
- ▁chronic
- ▁compell
- ▁crowd
- ▁empty
- ▁incarcer
- ▁misfir
- ▁poison
- ▁quantit
- ▁turb
- ▁victor
- ▁election
- ▁priorit
- ▁religio
- ▁snore
- defensi
- ▁bundle
- ▁carousel
- ▁climb
- ▁exhaust
- ▁fractur
- ▁garden
- ▁succeed
- ▁suez
- ▁hdpe
- ▁juice
- aguar
- ▁denim
- ▁dividing
- ▁fallacy
- ▁outcomes
- ▁plot
- ▁blind
- ▁shocked
- ▁bounc
- ▁depth
- incident
- ▁subtle
- ▁pump
- rcia
- ▁initiatives
- ▁spray
- ▁haunt
- ▁traverse
- ▁polish
- ▁hypothesis
- ▁voice
- ▁pledge
- ▁burst
- ▁uncle
- ▁sink
- sturb
- ▁anchor
- ▁gratitude
- ▁pause
- ▁quo
- ▁alert
- ▁vast
- ▁van
- ▁attitudes
- ▁grocer
- ▁countdown
- ▁decrease
- ▁extensi
- ▁invasion
- ▁therapi
- ▁instant
- ▁guy
- ▁forget
- ▁lawyer
- ▁reduction
- ▁strange
- ▁boom
- abul
- ▁season
- ▁begg
- ▁underwater
- ▁strategies
- ▁stimulate
- ▁hurt
- ▁alertness
- ▁utilit
- ▁tomb
- ▁elsewhere
- ▁leap
- ▁patch
- ▁preference
- ▁realistic
- ▁fold
- ▁medit
- ▁stair
- itzer
- ▁embr
- ▁addict
- ▁2015
- ▁percepti
- ▁reign
- ▁painful
- egal
- ▁respi
- ▁depriv
- ▁shutter
- ▁chemistry
- ▁sad
- ▁bias
- ▁boost
- ▁wake
- ▁workforce
- ▁varieties
- ▁repair
- ▁genome
- ▁reject
- ▁124
- slide
- ▁mobility
- ▁shade
- ▁medicine
- ▁vent
- ▁hyp
- ▁melt
- ▁cake
- ▁organized
- ▁novelty
- ▁distan
- ▁france
- ▁suck
- ▁parity
- ▁vision
- ▁voc
- ▁sufficient
- charged
- ▁calcine
- ensity
- ▁dart
- ▁collection
- ▁gun
- ▁rays
- ▁pour
- ▁bitter
- ▁funn
- ▁coff
- ▁fearless
- ▁stance
- ▁inner
- ▁retain
- ▁debt
- ▁chile
- fuse
- ▁partial
- ▁mold
- ▁substan
- ▁survival
- ▁seize
- ▁qui
- ▁installation
- ▁cup
- ruel
- ▁boss
- ▁plug
- ▁apartment
- ▁communicate
- ▁sacrifice
- ▁tapp
- ▁grass
- ▁italy
- ▁roy
- ▁squ
- ▁percentage
- ▁dots
- ▁absolutely
- ▁incentivize
- ▁reserve
- ▁navigate
- ▁creative
- viation
- ▁angle
- ▁deb
- ▁agent
- ▁isolat
- spiration
- ▁ramp
- ▁forgotten
- ▁extin
- ▁celebrated
- diff
- ▁substantially
- ▁viruses
- ▁por
- clos
- ▁comment
- ▁closest
- ▁fatal
- ▁triple
- olk
- ▁eliminate
- ▁facilit
- oster
- ▁geo
- erior
- ▁online
- ▁fung
- ▁insight
- ▁bull
- '79'
- ▁swapp
- ▁wipe
- rrow
- ▁historical
- ▁delivery
- hre
- ntine
- erson
- ▁former
- ▁original
- ▁cri
- ▁accura
- ▁bat
- ▁pave
- reci
- mma
- ▁generat
- rum
- decided
- ▁provider
- cell
- ▁intri
- izab
- neck
- ▁pur
- neu
- ▁stepp
- hoppers
- ▁hu
- ▁dye
- ▁chase
- '21'
- ▁impress
- hu
- ▁broke
- ▁obstruct
- ▁360
- ▁explor
- gue
- rate
- ▁controlle
- roc
- bru
- ecta
- ▁gui
- ▁rec
- qua
- ▁imagin
- ▁operat
- ▁fertiliz
- litar
- ▁hotte
- profitable
- ▁argu
- ▁150
- odes
- tify
- llus
- lets
- ▁terr
- poly
- ▁christ
- ctively
- ▁decarboniz
- scribe
- ▁electr
- ▁immigra
- ▁300
- ▁separat
- ▁hopp
- ▁rang
- employed
- mped
- '98'
- rail
- '97'
- ▁device
- ▁pun
- ▁belief
- ▁resident
- ▁pathway
- ▁egg
- ▁dollar
- ▁scientist
- ▁prim
- ▁reliabl
- igation
- ▁aud
- ▁fun
- maker
- ▁marr
- ▁afford
- ▁gro
- ashes
- urning
- ▁cycl
- ject
- ▁surpris
- ▁eliminat
- ▁disco
- ▁univers
- ▁receiv
- stead
- ▁critic
- mark
- ▁plea
- ▁absolute
- pair
- limited
- water
- truck
- sexual
- spread
- '35'
- bank
- virus
- imagine
- consider
- power
- down
- look
- more
- drive
- ▁communicat
- ▁prepare
- cott
- ▁insist
- fish
- ▁gri
- ▁tap
- ▁incentiv
- ▁distort
- ▁jani
- case
- ▁societ
- nounc
- ▁interact
- ▁syria
- ▁eas
- ▁frequen
- ▁significan
- ▁attac
- ▁populat
- ▁except
- ▁steriliz
- ▁cooperat
- ▁khali
- ▁appro
- ivity
- ▁danger
- ▁inform
- ▁stimul
- ▁quest
- ▁memori
- ▁import
- hibit
- stood
- ▁decre
- ▁influ
- rupt
- cense
- ippi
- ▁photosynthe
- augu
- criminat
- ▁biodivers
- ▁cardio
- ▁ridicul
- occupie
- sophisticated
- ▁absolutis
- ▁accused
- ▁afraid
- ▁algorithm
- ▁aristocra
- ▁assaulted
- ▁association
- ▁assyrian
- ▁atlantic
- ▁autonomy
- ▁availability
- ▁brutal
- ▁byproduct
- ▁ceremon
- ▁circle
- ▁conclusion
- ▁congress
- ▁consensus
- ▁diabetes
- ▁dimensional
- ▁diploma
- ▁disadvantage
- ▁disrespect
- ▁dragonfl
- ▁enzymes
- ▁epidemic
- ▁evolution
- ▁expense
- ▁eyebrows
- ▁fairbnb
- ▁follicle
- ▁fragment
- ▁gatekeeper
- ▁geography
- ▁ghrelin
- ▁gilgamesh
- ▁google
- ▁greece
- ▁gujarat
- ▁harvest
- ▁hurricane
- ▁inevitable
- ▁injustice
- ▁intelligen
- ▁ixbalanke
- ▁jetpack
- ▁judgment
- ▁livelihoods
- ▁longitude
- ▁margin
- ▁minimum
- ▁navy
- ▁necessarily
- ▁passenger
- ▁politics
- ▁prejudice
- ▁prospect
- ▁proximity
- ▁relieve
- ▁replicate
- ▁restaurant
- ▁scotland
- ▁senior
- ▁simultaneously
- ▁slot
- ▁stigma
- ▁supreme
- ▁sustainably
- ▁teenager
- ▁thirteen
- ▁thrill
- ▁tiger
- ▁tomorrow
- ▁toothpaste
- ▁tynwald
- ▁underneath
- ▁utilitarian
- ▁volunteer
- ▁vulnerability
- ▁alternate
- ▁assassinat
- ▁branche
- ▁categor
- ▁commute
- ▁defend
- ▁exclusive
- ▁feather
- ▁graduate
- ▁meticulous
- ▁perpetuat
- ▁resettle
- ▁segregat
- ▁treasur
- ▁violent
- ▁align
- ▁apparent
- ▁blades
- ▁competition
- ▁concert
- ▁counteract
- ▁daunting
- ▁debris
- ▁deficienc
- ▁disperse
- ▁england
- ▁fascinat
- ▁inflation
- ▁inhabit
- ▁irony
- ▁midwest
- ▁occasion
- ▁paddy
- ▁pioneer
- ▁praise
- ▁princes
- ▁resembl
- ▁roof
- ▁sensitive
- ▁territori
- ▁unfair
- rugg
- ▁coworkers
- ▁fruit
- ▁gasoline
- ▁impulse
- ▁lung
- ▁megawatt
- ▁palace
- ▁request
- ▁testimon
- ▁unfolding
- ▁yarn
- ▁bomb
- ▁crack
- ▁drastic
- ▁harsh
- ▁hometown
- ▁infected
- ▁john
- ▁minimize
- ▁properties
- ▁swift
- ▁pillar
- ▁endanger
- ▁flaw
- ▁relax
- ▁turk
- ▁admir
- ▁nuance
- ▁declare
- ▁guard
- ▁reunion
- ▁storytell
- ▁butterfl
- ▁scour
- ▁ribo
- ▁ferry
- ▁hacking
- ▁hydro
- ▁thread
- ▁convention
- ▁text
- ▁split
- ▁congest
- ▁translation
- ▁appreciat
- ratory
- ▁iceland
- ▁jaw
- ▁mistake
- ▁95
- programm
- ▁injure
- ▁explosive
- ▁spiritual
- ▁drill
- ▁typh
- ▁smell
- ▁latin
- ▁poem
- ▁asylum
- ▁crime
- ▁sail
- ▁appeal
- ▁guest
- ▁initial
- ▁peekabo
- ▁outlier
- mog
- ▁proud
- ▁bolt
- ▁spurr
- intuiti
- ▁cantilever
- ▁amani
- ▁genre
- ▁afar
- ▁rub
- ▁moistur
- ▁recover
- ▁items
- ▁optimistic
- ▁slippe
- ▁oversee
- ▁sara
- ▁illegal
- ▁rainwater
- ▁opposition
- ▁overnight
- ▁movie
- ▁explosion
- ▁intensity
- ▁linguistic
- ▁emulsi
- ▁radiation
- ▁violat
- morph
- ▁homo
- ▁spice
- ▁vibran
- ▁intact
- ▁rewards
- ▁exceed
- ▁viewpoint
- ▁heroes
- ▁repeatedly
- ▁confront
- rane
- ▁thre
- ▁squir
- ▁wrap
- ▁godred
- ▁orgy
- ▁sentence
- unci
- ▁memorize
- monia
- holder
- ▁quiet
- rpet
- ▁icon
- ▁spark
- ▁deforestation
- ▁nurs
- ▁1945
- ▁finger
- cade
- ▁efficac
- ▁haz
- ▁motivation
- ▁spotted
- ▁pitch
- ▁subsidize
- ▁intention
- ▁window
- ombi
- ▁swim
- ▁winter
- ▁dynami
- ▁executive
- ▁boil
- ▁assess
- ▁2018
- ▁failure
- ▁horse
- ▁enact
- utter
- ▁circulation
- ▁queen
- ▁distract
- flag
- ▁mentor
- ▁lick
- lank
- ▁ebo
- ▁dirt
- ▁remark
- ▁shake
- ▁entry
- frost
- ▁pear
- ▁bound
- ▁rif
- ▁performance
- ▁exception
- ▁189
- ▁straight
- ▁purp
- imeter
- ▁hills
- ▁chew
- scop
- ▁lamp
- ▁fog
- ▁sweet
- ▁cosm
- ▁mysteri
- rbit
- ▁dying
- ▁argument
- ▁intell
- ▁sultanate
- aire
- ▁tile
- ▁monoc
- ▁machinery
- ▁motion
- ▁infant
- ▁healthier
- ▁continuous
- ▁truce
- ▁undergo
- aboo
- ▁commanders
- ▁qualifi
- ▁55
- ▁anyway
- ▁lenses
- ▁offset
- ▁merg
- quent
- tari
- ▁chim
- ptin
- ▁exit
- ▁dash
- ▁meta
- ▁wish
- ▁poorest
- ▁distortion
- ▁interaction
- ▁proposal
- ▁reven
- ▁trace
- ▁perch
- ▁behav
- ▁disruption
- ▁progressive
- introduce
- ▁gall
- ▁stone
- ▁update
- descent
- ▁dance
- ▁polye
- ▁settle
- fellow
- ▁rob
- ▁stre
- ▁kan
- dominant
- ▁bro
- ▁ev
- ▁purif
- ▁agreement
- ▁dominate
- ▁regulation
- ▁improvement
- hase
- ▁ecolog
- hydr
- pical
- ▁conspi
- ▁inhale
- ▁arriv
- ▁fil
- ▁visitor
- ▁greenland
- phasi
- ▁farmer
- ▁cran
- ▁identifi
- ▁chose
- hau
- grega
- mps
- ▁characteriz
- ▁audi
- ▁oppress
- mination
- aint
- ▁determin
- ▁unemploy
- spire
- ▁giga
- ska
- ▁immigrat
- rank
- sport
- aft
- ▁snap
- emper
- equality
- ▁imp
- ▁terri
- ▁interv
- '19'
- hi
- icated
- ▁demonstrat
- kg
- gible
- ix
- grad
- pression
- '16'
- ▁pursu
- ▁hor
- ▁deli
- ▁spar
- ▁suc
- ▁millenni
- connected
- ▁leon
- ▁inspir
- ▁tho
- ▁faci
- ▁domin
- ▁resist
- ▁mobil
- ▁var
- eval
- ▁interfer
- abilities
- ▁enabl
- ▁border
- ▁forci
- ▁monk
- ▁eugenic
- gae
- ▁concern
- ▁fertil
- ▁mammal
- ▁iri
- ▁merc
- ▁blu
- gger
- ▁statistic
- ▁integr
- compa
- nown
- ▁navigat
- ▁amaz
- ▁reserv
- layer
- escription
- ▁angl
- ▁amplif
- force
- plug
- conscious
- compete
- mind
- leader
- honest
- load
- position
- root
- box
- speak
- flow
- complete
- drop
- check
- sustainable
- friend
- track
- game
- moral
- certain
- green
- world
- people
- life
- what
- about
- human
- wind
- suit
- pay
- minis
- ▁tradition
- ▁bloo
- ▁explo
- ▁strateg
- ▁circu
- ▁gravit
- ▁corporat
- ▁activit
- ▁inequalit
- ▁galax
- ▁calci
- ▁energ
- ▁identit
- ▁locat
- ▁que
- ford
- compromis
- ▁swee
- ▁constr
- imitation
- ▁matte
- zoo
- hwa
- ▁dyna
- ▁flexib
- ▁execut
- ▁renew
- ▁catastroph
- ▁deforest
- rink
- ▁auth
- ▁pub
- ▁marc
- ▁furthe
- ▁diagnos
- ecutive
- titude
- ▁compli
- gressive
- nprofit
- pute
- ▁nano
- oxide
- ▁evident
- ▁surp
- ▁arachn
- ▁hippoc
- nivores
- skeleton
- suppress
- thropo
- ü
- ▁accomplish
- ▁accusation
- ▁acknowledg
- ▁activists
- á
- î
- ç
- ö
- <sos/eos>
init: null
input_size: null
ctc_conf:
dropout_rate: 0.0
ctc_type: builtin
reduce: true
ignore_nan_grad: null
zero_infinity: true
brctc_risk_strategy: exp
brctc_group_strategy: end
brctc_risk_factor: 0.0
st_joint_net_conf: null
model_conf:
asr_weight: 0.3
mt_weight: 0.0
mtlalpha: 0.3
lsm_weight: 0.1
length_normalized_loss: false
use_preprocessor: true
token_type: bpe
src_token_type: bpe
bpemodel: data/en_bn_token_list/tgt_bpe_unigram4000/bpe.model
src_bpemodel: data/en_bn_token_list/src_bpe_unigram4000/bpe.model
non_linguistic_symbols: null
cleaner: null
g2p: null
src_g2p: null
speech_volume_normalize: null
rir_scp: null
rir_apply_prob: 1.0
noise_scp: null
noise_apply_prob: 1.0
noise_db_range: '13_15'
short_noise_thres: 0.5
ctc_sample_rate: 0.0
frontend: default
frontend_conf:
n_fft: 400
hop_length: 160
fs: 16k
specaug: specaug
specaug_conf:
apply_time_warp: true
time_warp_window: 5
time_warp_mode: bicubic
apply_freq_mask: true
freq_mask_width_range:
- 0
- 27
num_freq_mask: 2
apply_time_mask: true
time_mask_width_ratio_range:
- 0.0
- 0.05
num_time_mask: 5
normalize: utterance_mvn
normalize_conf: {}
preencoder: null
preencoder_conf: {}
encoder: conformer
encoder_conf:
output_size: 256
attention_heads: 4
linear_units: 2048
num_blocks: 12
dropout_rate: 0.1
positional_dropout_rate: 0.1
attention_dropout_rate: 0.1
input_layer: conv2d
normalize_before: true
macaron_style: true
rel_pos_type: latest
pos_enc_layer_type: rel_pos
selfattention_layer_type: rel_selfattn
activation_type: swish
use_cnn_module: true
cnn_module_kernel: 31
postencoder: null
postencoder_conf: {}
decoder: transformer
decoder_conf:
attention_heads: 4
linear_units: 2048
num_blocks: 6
dropout_rate: 0.1
positional_dropout_rate: 0.1
self_attention_dropout_rate: 0.1
src_attention_dropout_rate: 0.1
extra_asr_decoder: transformer
extra_asr_decoder_conf:
input_layer: embed
num_blocks: 6
linear_units: 2048
dropout_rate: 0.1
extra_mt_decoder: transformer
extra_mt_decoder_conf:
input_layer: embed
num_blocks: 2
linear_units: 2048
dropout_rate: 0.1
md_encoder: null
md_encoder_conf: {}
hier_encoder: null
hier_encoder_conf: {}
extra_mt_encoder: null
extra_mt_encoder_conf: {}
preprocessor: default
preprocessor_conf: {}
required:
- output_dir
- token_list
version: '202402'
distributed: false
```
</details>
### Citing ESPnet
```BibTex
@inproceedings{watanabe2018espnet,
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
title={{ESPnet}: End-to-End Speech Processing Toolkit},
year={2018},
booktitle={Proceedings of Interspeech},
pages={2207--2211},
doi={10.21437/Interspeech.2018-1456},
url={http://dx.doi.org/10.21437/Interspeech.2018-1456}
}
```
or arXiv:
```bibtex
@misc{watanabe2018espnet,
title={ESPnet: End-to-End Speech Processing Toolkit},
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
year={2018},
eprint={1804.00015},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| [
"CRAFT"
] |
DehydratedWater42/SeELLama | DehydratedWater42 | text-generation | [
"transformers",
"safetensors",
"llama",
"text-generation",
"math",
"semantic",
"extraction",
"graph",
"relations",
"science",
"synthetic",
"en",
"dataset:DehydratedWater42/semantic_relations_extraction",
"license:llama2",
"autotrain_compatible",
"text-generation-inference",
"4-bit",
"bitsandbytes",
"region:us"
] | "2024-04-20T21:41:41Z" | 2024-04-27T22:24:45+00:00 | 0 | 1 | ---
datasets:
- DehydratedWater42/semantic_relations_extraction
language:
- en
library_name: transformers
license: llama2
pipeline_tag: text-generation
tags:
- math
- semantic
- extraction
- graph
- relations
- science
- synthetic
inference: false
---
# SeELLama (Semantic Extraction LLama)
The model is based on LLama2-7b and fine-tuned with the `DehydratedWater42/semantic_relations_extraction` dataset.
The purpose of this model is to extract semantic relations from text in a structured way.
#### Simplified Example:
- **Initial Text**: "While there is beautiful weather outside the building, from the window we can see a car. And what's the most annoying, pigeons love to sit on that car."
- **Entities**: ["pigeon", "car", "building"]
- **Relations between entities**: {"pigeon -> car": "pigeon sits on the car", "car -> building": "car is parked outside the building"}
**Note:** The text example above is **too short** for the actual model; please use **at least 500-token text** segments for extraction to avoid hallucinations.
### Other versions:
- **Get SeELLama in GGUF format:** [DehydratedWater42/SeELLama-GGUF](https://huggingface.co/DehydratedWater42/SeELLama-GGUF) (a loading sketch follows this list)
- **Get SeELLama as adapter:** [DehydratedWater42/SeELLama-qlora-adapter](https://huggingface.co/DehydratedWater42/SeELLama-qlora-adapter)
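If you prefer the GGUF build linked above, one common way to run it is with `llama-cpp-python`. This is only a rough sketch, assuming you have downloaded a quantized `.gguf` file from that repository; the file name below is a placeholder, not an actual file name:

```python
from llama_cpp import Llama

# Placeholder path: substitute the actual .gguf file you downloaded
# from the SeELLama-GGUF repository.
llm = Llama(model_path="./seellama.Q4_K_M.gguf", n_ctx=2560)

prompt = "..."  # build the extraction prompt from the template shown below

output = llm(prompt, max_tokens=1024, temperature=0.7, top_p=0.6)
print(output["choices"][0]["text"])
```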
***
## How to use it:
### Template:
Use the **prompt template** provided below to extract relations from text. Replace `<<your_text_for_extraction>>` with your selected text, ideally between 500 and 1500 tokens,
with an **optimal range** of about **800-1000 tokens**. You can adjust the **temperature** between 0.3 and 1.0; a good starting point is **between 0.6 and 0.7**.
Temperatures below 0.3 may lead to a never-ending `section_description`. The higher the temperature, the more the model will fill in the gaps in the provided text.
The model was **fine-tuned on scientific articles**, so it will supplement missing information with general knowledge.
The model was trained with a 2560-token context length, where 1000-1500 tokens were used as input text.
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
<<your_text_for_extraction>>
### Extracted Relations:
{
"section_description":
```
The `JSON` opening is not necessary, but it improves stability. Remember to use double `{{` instead of a single `{` if you are using LangChain prompts with f-string formatting.
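For illustration, here is one way to fill the template with a plain Python f-string; the doubled `{{` produces the literal `{` of the JSON opening. The `your_text` variable is just a placeholder, and the template wording is kept verbatim from the card:

```python
your_text = "<<your_text_for_extraction>>"  # ideally 500-1500 tokens of source text

# Template wording kept verbatim; the doubled {{ escapes the literal brace.
prompt = f"""Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.

### Text Part to Extract From:
{your_text}

### Extracted Relations:
{{
    "section_description":
"""
```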
### Optimal settings:
For the Transformers loader in [text-generation-webui](https://github.com/oobabooga/text-generation-webui):
- load-in-4bit=True
- use_double_quant=True
- trust-remote-code=True
- compute_dtype=bfloat16
- quant_type=nf4
For `text-generation-webui`, you must add the `--trust-remote-code` flag to use the model:
`./start_linux.sh --trust-remote-code`
This is a PeftModel created by merging a 4-bit QLoRA adapter (fine-tuned with SFTTrainer and BitsAndBytes) into the quantized LLama2-7b base.
The model was trained with a 2560-token context length, where 1000-1500 tokens were used as input text.
Some results may require post-processing to close truncated (out-of-context) JSON outputs.
You can also get the unquantized version in the `/SeELLama-unquantized-f16` folder of this repository.
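The same 4-bit options can be expressed directly in `transformers` through `BitsAndBytesConfig` when you are not using text-generation-webui. This is a minimal sketch, assuming the `bitsandbytes` backend and a CUDA GPU are available:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Mirrors the settings listed above: 4-bit NF4 with double quantization
# and bfloat16 compute.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "DehydratedWater42/SeELLama",
    quantization_config=bnb_config,
    trust_remote_code=True,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(
    "DehydratedWater42/SeELLama", trust_remote_code=True
)
```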
### Starter code
#### Run generation with Python
```python
from transformers import LlamaForCausalLM, LlamaTokenizerFast, BitsAndBytesConfig
model_name = "DehydratedWater42/SeELLama"
prompt = """Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Players in Survival, Adventure, or Spectator have access to the 2×2 crafting grid from their inventory screen. Small crafting recipes that are at most 2×2 can be made there. These include wooden planks, sticks, crafting tables, torches and some shapeless recipes. To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it. This brings up an interface with a larger, 3×3 crafting grid, which the player can use to craft any crafting recipe in the game. There is also a recipe book where the player can access all the crafting recipes.
Some recipes do not require their ingredients to be arranged in a specific way on the crafting grid. These are commonly known as shapeless recipes. For example, players may craft a fermented spider eye by placing its ingredients anywhere within the grid or like a flint and steel can be crafted in any orientation.
On the contrary, many recipes must have their ingredients placed in the correct relative positions on the crafting grid. These are commonly known as shaped recipes. Ingredients in shaped recipes can be ‘moved’ up, down, left, or right. They can also be flipped side-ways. For example, a 3×1 recipe, such as bread, can be made using the top, middle, or bottom row of the 3×3 grid, and a bow may be made with the strings placed on the left instead of on the right.
There are recipes that may not be moved or mirrored in this way. These are commonly known as fixed recipes. For example, dyes in banner recipes - only available in Bedrock Edition - must be specifically placed to achieve the desired pattern. Fixed recipes can be added by data packs, add-ons, or mods.
### Extracted Relations:
{
"section_description":
"""
model = LlamaForCausalLM.from_pretrained(model_name, trust_remote_code=True, device_map="auto")
tokenizer = LlamaTokenizerFast.from_pretrained(model_name, trust_remote_code=True, device_map="auto")
model.eval()
inputs = tokenizer(prompt, return_tensors="pt").to('cuda')
outputs = model.generate(
inputs['input_ids'],
max_length=2500,
temperature=0.7,
top_p=0.6,
top_k=0,
repetition_penalty=1,
no_repeat_ngram_size=0,
do_sample=True,
num_return_sequences=1
)
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(generated_text)
```
#### Extract the generated JSON
```python
import json
json_start = """{
"section_description":
"""
to_parse = json_start + generated_text.split(json_start)[1]
print(json.loads(to_parse))
```
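Because generation can occasionally stop in the middle of the JSON structure, a rough way to salvage a truncated output is to balance the unclosed quotes and brackets before parsing, reusing `to_parse` from the snippet above. This is only a heuristic sketch, not a utility shipped with the model; a dangling key or trailing comma can still make the result unparsable:

```python
import json

def close_truncated_json(text: str) -> str:
    """Append the closing quote/brackets a cut-off generation is missing."""
    stack, in_string, escape = [], False, False
    for ch in text:
        if in_string:
            if escape:
                escape = False
            elif ch == "\\":
                escape = True
            elif ch == '"':
                in_string = False
        elif ch == '"':
            in_string = True
        elif ch in "{[":
            stack.append(ch)
        elif ch in "}]" and stack:
            stack.pop()
    suffix = '"' if in_string else ""
    suffix += "".join("}" if ch == "{" else "]" for ch in reversed(stack))
    return text + suffix

# `to_parse` is the string built in the snippet above.
try:
    parsed = json.loads(to_parse)
except json.JSONDecodeError:
    parsed = json.loads(close_truncated_json(to_parse.rstrip().rstrip(",")))
print(parsed)
```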
### Example:
Extracting information from the Minecraft Wiki.
#### Initial template
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Players in Survival, Adventure, or Spectator have access to the 2×2 crafting grid from their inventory screen. Small crafting recipes that are at most 2×2 can be made there. These include wooden planks, sticks, crafting tables, torches and some shapeless recipes. To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it. This brings up an interface with a larger, 3×3 crafting grid, which the player can use to craft any crafting recipe in the game. There is also a recipe book where the player can access all the crafting recipes.
Some recipes do not require their ingredients to be arranged in a specific way on the crafting grid. These are commonly known as shapeless recipes. For example, players may craft a fermented spider eye by placing its ingredients anywhere within the grid or like a flint and steel can be crafted in any orientation.
On the contrary, many recipes must have their ingredients placed in the correct relative positions on the crafting grid. These are commonly known as shaped recipes. Ingredients in shaped recipes can be ‘moved’ up, down, left, or right. They can also be flipped side-ways. For example, a 3×1 recipe, such as bread, can be made using the top, middle, or bottom row of the 3×3 grid, and a bow may be made with the strings placed on the left instead of on the right.
There are recipes that may not be moved or mirrored in this way. These are commonly known as fixed recipes. For example, dyes in banner recipes - only available in Bedrock Edition - must be specifically placed to achieve the desired pattern. Fixed recipes can be added by data packs, add-ons, or mods.
### Extracted Relations:
{
"section_description":
```
#### Result (about game):
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Players in Survival, Adventure, or Spectator have access to the 2×2 crafting grid from their inventory screen. Small crafting recipes that are at most 2×2 can be made there. These include wooden planks, sticks, crafting tables, torches and some shapeless recipes. To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it. This brings up an interface with a larger, 3×3 crafting grid, which the player can use to craft any crafting recipe in the game. There is also a recipe book where the player can access all the crafting recipes.
Some recipes do not require their ingredients to be arranged in a specific way on the crafting grid. These are commonly known as shapeless recipes. For example, players may craft a fermented spider eye by placing its ingredients anywhere within the grid or like a flint and steel can be crafted in any orientation.
On the contrary, many recipes must have their ingredients placed in the correct relative positions on the crafting grid. These are commonly known as shaped recipes. Ingredients in shaped recipes can be ‘moved’ up, down, left, or right. They can also be flipped side-ways. For example, a 3×1 recipe, such as bread, can be made using the top, middle, or bottom row of the 3×3 grid, and a bow may be made with the strings placed on the left instead of on the right.
There are recipes that may not be moved or mirrored in this way. These are commonly known as fixed recipes. For example, dyes in banner recipes - only available in Bedrock Edition - must be specifically placed to achieve the desired pattern. Fixed recipes can be added by data packs, add-ons, or mods.
### Extracted Relations:
{
"section_description": "In Minecraft, players can craft items using the 2x2 crafting grid. There are some fixed recipes that require the player to place their ingredients in a specific position.",
"list_of_entities": [
"crafting recipes",
"player",
"fixed recipes",
"2x2 crafting grid",
"survival, adventure, or spectator players",
"wooden planks",
"torches",
"shaped recipes",
"small crafting recipes",
"shapeless recipes",
"wooden planks, sticks, crafting tables, torches",
"flint and steel"
],
"relations": [
{
"description": "Players have access to the 2x2 crafting grid from their inventory screen.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"player"
],
"strength": "strong"
},
{
"description": "Small crafting recipes that are at most 2x2 can be made using the 2x2 crafting grid.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"survival, adventure, or spectator players"
],
"strength": "strong"
},
{
"description": "To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"wooden planks"
],
"strength": "strong"
},
{
"description": "To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"crafting tables"
],
"strength": "strong"
},
{
"description": "Crafting tables allow players to make bigger crafting recipes that are not possible with the 2x2 crafting grid.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"torches",
"flint and steel",
"wooden planks"
],
"strength": "strong"
},
{
"description": "Crafting tables are used to craft items using the 2x2 crafting grid.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"wooden planks, sticks, crafting tables, torches"
],
"strength": "strong"
},
{
"description": "There are some fixed recipes that require the player to place their ingredients in a specific position.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"fixed recipes"
],
"strength": "strong"
},
{
"description": "The recipes do not require their ingredients to be arranged in a specific way on the crafting grid.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"shapeless recipes"
],
"strength": "strong"
},
{
"description": "Ingredients in shapeless recipes can be ‘moved’ up, down, left, or right.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"shaped recipes"
],
"strength": "strong"
},
{
"description": "Ingredients in shaped recipes can be flipped side-ways.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"fixed recipes"
],
"strength": "strong"
},
{
"description": "There are some fixed recipes that require the player to place their ingredients in a specific position.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"fixed recipes"
],
"strength": "strong"
},
{
"description": "The recipes can be added by data packs, add-ons, or mods.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"fixed recipes"
],
"strength": "strong"
}
]
}
```
### Smaller example (About crafting)
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Players in Survival, Adventure, or Spectator have access to the 2×2 crafting grid from their inventory screen. Small crafting recipes that are at most 2×2 can be made there. These include wooden planks, sticks, crafting tables, torches and some shapeless recipes. To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it. This brings up an interface with a larger, 3×3 crafting grid, which the player can use to craft any crafting recipe in the game. There is also a recipe book where the player can access all the crafting recipes.
### Extracted Relations:
{
"section_description": "Survival, Adventure, and Spectator players have access to the 2×2 crafting grid. However, to craft items using a bigger grid, you need to create a crafting table with wooden planks, place it in the world, and press use while facing it.",
"list_of_entities": [
"Survival",
"wooden planks",
"Spectator",
"Adventure",
"crafting grid"
],
"relations": [
{
"description": "Survival players can access the 2×2 crafting grid.",
"source_entities": [
"Survival"
],
"target_entities": [
"crafting grid"
],
"strength": "strong"
},
{
"description": "Adventure and Spectator players can also access the 2×2 crafting grid.",
"source_entities": [
"Adventure"
],
"target_entities": [
"crafting grid"
],
"strength": "strong"
},
{
"description": "To craft items using a bigger grid, you need to create a crafting table with wooden planks.",
"source_entities": [
"Spectator"
],
"target_entities": [
"crafting grid"
],
"strength": "strong"
}
]
}
```
### Dopamine example
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Dopamine is synthesized in a restricted set of cell types, mainly neurons and cells in the medulla of the adrenal glands.[23] The primary and minor metabolic pathways respectively are:
Primary: L-Phenylalanine → L-Tyrosine → L-DOPA → Dopamine[20][21]
Minor: L-Phenylalanine → L-Tyrosine → p-Tyramine → Dopamine[20][21][22]
Minor: L-Phenylalanine → m-Tyrosine → m-Tyramine → Dopamine[22][24][25]
The direct precursor of dopamine, L-DOPA, can be synthesized indirectly from the essential amino acid phenylalanine or directly from the non-essential amino acid tyrosine.[26] These amino acids are found in nearly every protein and so are readily available in food, with tyrosine being the most common. Although dopamine is also found in many types of food, it is incapable of crossing the blood–brain barrier that surrounds and protects the brain.[27] It must therefore be synthesized inside the brain to perform its neuronal activity.[27]
L-Phenylalanine is converted into L-tyrosine by the enzyme phenylalanine hydroxylase, with molecular oxygen (O2) and tetrahydrobiopterin as cofactors. L-Tyrosine is converted into L-DOPA by the enzyme tyrosine hydroxylase, with tetrahydrobiopterin, O2, and iron (Fe2+) as cofactors.[26] L-DOPA is converted into dopamine by the enzyme aromatic L-amino acid decarboxylase (also known as DOPA decarboxylase), with pyridoxal phosphate as the cofactor.[26]
Dopamine itself is used as precursor in the synthesis of the neurotransmitters norepinephrine and epinephrine.[26] Dopamine is converted into norepinephrine by the enzyme dopamine β-hydroxylase, with O2 and L-ascorbic acid as cofactors.[26] Norepinephrine is converted into epinephrine by the enzyme phenylethanolamine N-methyltransferase with S-adenosyl-L-methionine as the cofactor.[26]
Some of the cofactors also require their own synthesis.[26] Deficiency in any required amino acid or cofactor can impair the synthesis of dopamine, norepinephrine, and epinephrine.[26]
Degradation
Dopamine is broken down into inactive metabolites by a set of enzymes—monoamine oxidase (MAO), catechol-O-methyl transferase (COMT), and aldehyde dehydrogenase (ALDH), acting in sequence.[28] Both isoforms of monoamine oxidase, MAO-A and MAO-B, effectively metabolize dopamine.[26] Different breakdown pathways exist but the main end-product is homovanillic acid (HVA), which has no known biological activity.[28] From the bloodstream, homovanillic acid is filtered out by the kidneys and then excreted in the urine.[28] The two primary metabolic routes that convert dopamine into HVA are:[29]
Dopamine → DOPAL → DOPAC → HVA – catalyzed by MAO, ALDH, and COMT respectively
Dopamine → 3-Methoxytyramine → HVA – catalyzed by COMT and MAO+ALDH respectively
In clinical research on schizophrenia, measurements of homovanillic acid in plasma have been used to estimate levels of dopamine activity in the brain. A difficulty in this approach however, is separating the high level of plasma homovanillic acid contributed by the metabolism of norepinephrine.[30][31]
Although dopamine is normally broken down by an oxidoreductase enzyme, it is also susceptible to oxidation by direct reaction with oxygen, yielding quinones plus various free radicals as products.[32] The rate of oxidation can be increased by the presence of ferric iron or other factors. Quinones and free radicals produced by autoxidation of dopamine can poison cells, and there is evidence that this mechanism may contribute to the cell loss that occurs in Parkinson's disease and other conditions.[33]
### Extracted Relations:
{
"section_description": "Dopamine is a neurotransmitter that plays a crucial role in regulating movement, motivation, and reward processing. It is produced in several regions of the brain, including the substantia nigra, ventral tegmental area, and hypothalamus. Dopamine is involved in several physiological functions such as motor control, cognition, and emotional responses. Low levels of dopamine have been linked to several neurological disorders, including Parkinson's disease, schizophrenia, and addiction. Dopamine is also involved in the reward system, which is responsible for motivation and pleasure. The brain's reward system is activated by the release of dopamine in response to pleasurable experiences, such as eating, sex, and drugs. Dopamine is synthesized from the amino acid L-tyrosine, which is derived from dietary sources. The primary precursor of dopamine is L-DOPA, which is synthesized from L-tyrosine by the enzyme tyrosine hydroxylase. Dopamine is then converted into norepinephrine and epinephrine by the enzyme dopamine β-hydroxylase. Dopamine is broken down into inactive metabolites by a set of enzymes, including monoamine oxidase (MAO), catechol-O-methyl transferase (COMT), and aldehyde dehydrogenase (ALDH). The two primary metabolic routes that convert dopamine into HVA are: Dopamine → DOPAL → DOPAC → HVA - catalyzed by MAO, ALDH, and COMT respectively Dopamine → 3-Methoxytyramine → HVA - catalyzed by COMT and MAO+ALDH respectively In clinical research on schizophrenia, measurements of homovanillic acid in plasma have been used to estimate levels of dopamine activity in the brain. A difficulty in this approach however, is separating the high level of plasma homovanillic acid contributed by the metabolism of norepinephrine. Dopamine is also susceptible to oxidation by direct reaction with oxygen, yielding quinones plus various free radicals as products. The rate of oxidation can be increased by the presence of ferric iron or other factors. Quinones and free radicals produced by autoxidation of dopamine can poison cells, and there is evidence that this mechanism may contribute to the cell loss that occurs in Parkinson's disease and other conditions. ",
"list_of_entities": [
"motivation",
"Parkinson's disease",
"cognition",
"pleasure",
"dopamine",
"L-tyrosine",
"schizophrenia",
"emotional responses",
"L-DOPA",
"dopamine β-hydroxylase",
"dopamine β-hydroxylase",
"L-DOPA",
"dopamine",
"L-tyrosine",
"dopamine β-hydroxylase",
"L-DOPA",
"L-tyrosine",
"L-DOPA",
"dopamine",
"L-DOPA",
"dopamine"
],
"relations": [
{
"description": "Dopamine is synthesized from the amino acid L-tyrosine, which is derived from dietary sources.",
"source_entities": [
"dopamine"
],
"target_entities": [
"L-tyrosine"
]
},
{
"description": "The primary precursor of dopamine is L-DOPA, which is synthesized from L-tyrosine by the enzyme tyrosine hydroxylase.",
"source_entities": [
"L-DOPA"
],
"target_entities": [
"dopamine"
]
},
{
"description": "Dopamine is then converted into norepinephrine and epinephrine by the enzyme dopamine β-hydroxylase.",
"source_entities": [
"dopamine"
],
"target_entities": [
"dopamine β-hydroxylase"
]
},
{
"description": "Dopamine is broken down into inactive metabolites by a set of enzymes, including monoamine oxidase (MAO), catechol-O-methyl transferase (COMT), and aldehyde dehydrogenase (ALDH).",
"source_entities": [
"dopamine"
],
"target_entities": [
"monoamine oxidase (MAO)",
"catechol-O-methyl transferase (COMT)",
"aldehyde dehydrogenase (ALDH)"
]
},
{
"description": "The two primary metabolic routes that convert dopamine into HVA are: Dopamine → DOPAL → DOPAC → HVA - catalyzed by MAO, ALDH, and COMT respectively Dopamine → 3-Methoxytyramine → HVA - catalyzed by COMT and MAO+ALDH respectively",
"source_entities": [
"dopamine"
],
"target_entities": [
"HVA",
"MAO",
"ALDH",
"COMT"
]
},
{
"description": "In clinical research on schizophrenia, measurements of homovanillic acid in plasma have been used to estimate levels of dopamine activity in the brain.",
"source_entities": [
"dopamine"
],
"target_entities": [
"homovanillic acid"
]
},
{
"description": "A difficulty in this approach however, is separating the high level of plasma homovanillic acid contributed by the metabolism of norepinephrine.",
"source_entities": [
"homovanillic acid"
],
"target_entities": [
"norepinephrine"
]
},
{
"description": "Dopamine is also susceptible to oxidation by direct reaction with oxygen, yielding quinones plus various free radicals as products.",
"source_entities": [
"dopamine"
],
"target_entities": [
"oxidation"
]
},
{
"description": "The rate of oxidation can be increased by the presence of ferric iron or other factors.",
"source_entities": [
"dopamine"
],
"target_entities": [
"ferric iron"
]
},
{
"description": "Quinones and free radicals produced by autoxidation of dopamine can poison cells, and there is evidence that this mechanism may contribute to the cell loss that occurs in Parkinson's disease and other conditions.",
"source_entities": [
"dopamine"
],
"target_entities": [
"cell loss"
]
}
]
}
```
### Longer example (dirt block)
```text
Below is a part of a larger text. Your task is to extract information about entities and relations into the JSON format.
### Text Part to Extract From:
Usage
Using bone meal on grass blocks causes short grass, tall grass,[Java Edition only] ferns,[Bedrock Edition only] and flowers to grow.
Passive mobs tend to wander toward grass blocks. They also wander toward light.
When a sheep eats a grass block, the block becomes dirt, and a sheared sheep regrows its wool. Baby sheep graze grass much more often than adults and mature 1 minute faster when grazing.
Tilling a grass block with a hoe converts it to a farmland block.
Dirt paths can be created by using any type of shovel on the side or top of a grass block with air above it. The shovel loses 1 durability for each dirt path block created.
Death
Grass dies and changes to dirt a random time (when a random tick lands on the block) after it has been covered by any opaque blocks. Transparent blocks can kill grass in a similar manner if they cause the light level above the grass block to be four or below (like water does), and the surrounding area is not otherwise sufficiently illuminated.
Spawning
In Java Edition, animals occasionally spawn on grass blocks that have light level 9 or brighter in the space directly above. This is quite rare and requires that there be few other animals nearby or in the spawn chunks. Most animals are instead created along with the terrain.
In Bedrock Edition, most animals require surface grass blocks (grass blocks with no solid block anywhere above them at the same x, z coordinate) to spawn. Animals regularly spawn in most Overworld biomes on surface grass blocks that have light level 7 or brighter in the space directly above.
Snowy grass block
A snowy grass block with snow on top of it.
Grass blocks turn into a snowy grass block if snow, powder snow or a snow block is placed on top of the block. Removing the snow layer turns the block into a normal grass block.
Composting
In Bedrock Edition, placing grass blocks into a composter has a 30% chance of raising the compost level by 1.
### Extracted Relations:
{
"section_description": "The article discusses the various ways in which grass blocks can be used in Minecraft, including as food, fuel, and protection. It also covers how grass blocks can be used to tame animals, and the dangers of being exposed to grass. The article concludes by highlighting the importance of knowing how to use and defend against grass blocks.",
"list_of_entities": [
"dirt",
"Minecraft",
"light level",
"animal",
"surface grass blocks",
"Bedrock Edition",
"Java Edition",
"dirt path",
"grass",
"snow",
"snowy grass block",
"opaque blocks",
"sheep",
"composter",
"transparent blocks"
],
"relations": [
{
"description": "Grass blocks are a renewable resource that can be used as a food item.",
"source_entities": [
"Minecraft",
"grass"
],
"target_entities": [
"animal",
"surface grass blocks"
],
"strength": "strong"
},
{
"description": "Grass blocks can be used to tame animals.",
"source_entities": [
"Minecraft",
"grass"
],
"target_entities": [
"animal",
"surface grass blocks"
],
"strength": "strong"
},
{
"description": "Exposure to grass blocks can cause damage to the player's health.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"grass",
"player"
],
"strength": "strong"
},
{
"description": "Placing grass blocks in a composter has a 30% chance of raising the compost level by 1.",
"source_entities": [
"Minecraft",
"grass"
],
"target_entities": [
"composter",
"grass"
],
"strength": "strong"
},
{
"description": "Surface grass blocks are the only ones that can be used to spawn animals.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"surface grass blocks",
"animal"
],
"strength": "strong"
},
{
"description": "Animals regularly spawn in most Overworld biomes on surface grass blocks that have light level 7 or brighter in the space directly above.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"surface grass blocks",
"animal"
],
"strength": "strong"
},
{
"description": "If you walk into the grass without first getting rid of any animals or monsters that are there, they will attack you.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"grass",
"player"
],
"strength": "moderate"
},
{
"description": "Placing grass blocks with snow on top of them turns them into snowy grass blocks.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"snow",
"grass"
],
"strength": "strong"
},
{
"description": "Removing the snow layer turns the block into a normal grass block.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"snowy grass block",
"grass"
],
"strength": "strong"
},
{
"description": "Dirt path blocks can be created by using any type of shovel on the side or top of a grass block with air above it.",
"source_entities": [
"Minecraft",
"grass"
],
"target_entities": [
"dirt path",
"shovel"
],
"strength": "strong"
},
{
"description": "The shovel loses 1 durability for each dirt path block created.",
"source_entities": [
"Minecraft",
"shovel"
],
"target_entities": [
"dirt path",
"shovel"
],
"strength": "moderate"
},
{
"description": "Death grass block dies and changes to dirt a random time (when a random tick lands on the block)",
"source_entities": [
"Minecraft"
],
"target_entities": [
"death grass block",
"dirt"
],
"strength": "strong"
},
{
"description": "Grass can be used to create dirt paths",
"source_entities": [
"Minecraft"
],
"target_entities": [
"grass",
"dirt path"
],
"strength": "strong"
}
]
}
``` | [
"CRAFT"
] |
realpsninja/Cartoon_Logo_for_SDXL | realpsninja | text-to-image | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"template:sd-lora",
"base_model:stabilityai/sdxl-turbo",
"base_model:adapter:stabilityai/sdxl-turbo",
"license:mit",
"region:us"
] | "2024-04-21T02:19:03Z" | 2024-04-21T02:19:03+00:00 | 0 | 0 | ---
base_model: stabilityai/sdxl-turbo
license: mit
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
- template:sd-lora
widget:
- text: '-'
output:
url: images/myFile_20_5.0_016.jpeg
- text: '-'
output:
url: images/myFile_20_5.0_012.jpeg
---
# Cartoon_Logo_SDXL
<Gallery />
## Model description
Introducing Lora, a versatile AI model designed to craft personalized cartoon logos for small businesses. Whether you're a quaint coffee shop or a buzzing tattoo studio, Lora can bring your brand to life with a touch of whimsy and a dash of character. With a simple prompt structure of "text [COMPANY_NAME], cartoon logo,..." Lora navigates the complexities of visual creativity to deliver logos that are not only eye-catching but also resonate with your brand's identity.
Lora isn't limited to visuals alone; it can adeptly integrate text into your logo, ensuring that your company's name stands out in a unique and memorable way. The best results are achieved with a weight setting around 1, which balances the elements of design and typography to perfection.
Whether you're looking to capture the nostalgia of pop culture, the humor of a well-loved meme, or the iconic essence of a global franchise like Star Wars, Lora is equipped to translate your vision into a logo that speaks volumes. So, give Lora your company name and watch as it crafts a one-of-a-kind logo that's tailored just for you—a logo that's not just a symbol, but a story waiting to be told.
## Download model
Weights for this model are available in Safetensors format.
[Download](/realpsninja/Cartoon_Logo_for_SDXL/tree/main) them in the Files & versions tab.
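As a rough starting point, the LoRA can be applied on top of the SDXL-Turbo base with `diffusers`. The snippet below is a minimal sketch, not an official example: the weight filename and the sample prompt are assumptions, so check the Files & versions tab for the actual `.safetensors` name.
```python
# Minimal sketch: applying this LoRA to SDXL-Turbo with diffusers.
# ASSUMPTIONS: the weight filename and the example prompt below are placeholders;
# use the actual .safetensors filename from the Files & versions tab.
import torch
from diffusers import AutoPipelineForText2Image

pipe = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/sdxl-turbo", torch_dtype=torch.float16
).to("cuda")

pipe.load_lora_weights(
    "realpsninja/Cartoon_Logo_for_SDXL",
    weight_name="cartoon_logo.safetensors",  # hypothetical filename
)
pipe.fuse_lora(lora_scale=1.0)  # a weight around 1 is recommended above

prompt = 'text "Bean There", cartoon logo, cozy coffee shop, warm colors'
image = pipe(prompt=prompt, num_inference_steps=4, guidance_scale=0.0).images[0]
image.save("cartoon_logo.png")
```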
| [
"CRAFT"
] |
zhichen/Llama3-Chinese-Lora | zhichen | null | [
"safetensors",
"arxiv:2402.09353",
"arxiv:2402.12354",
"region:us"
] | "2024-04-21T05:59:46Z" | 2024-04-23T09:55:58+00:00 | 0 | 4 | ---
{}
---
<p align="left">
<a href="README_CN.md">中文</a>  |  English
</p>
<br><br>
<p align="center">
<a href='https://huggingface.co/spaces/zhichen'>
<img src='./images/logo.png'>
</a>
</p>
<div align="center">
<p align="center">
<h3> Llama3-Chinese </h3>
<p align="center">
<a href='https://huggingface.co/zhichen'>
<img src='https://img.shields.io/badge/%F0%9F%A4%97%20HuggingFace-Llama3%20Chinese-yellow'>
</a>
<a href='https://modelscope.cn/profile/seanzhang'>
<img src='https://img.shields.io/badge/🤖 ModelScope-Llama3%20Chinese-blue'>
</a>
<br>
<a href="https://github.com/seanzhang-zhichen/llama3-chinese/stargazers">
<img src="https://img.shields.io/github/stars/seanzhang-zhichen/llama3-chinese?color=ccf">
</a>
<a href="https://github.com/seanzhang-zhichen/llama3-chinese/blob/main/LICENSE">
<img alt="GitHub Contributors" src="https://img.shields.io/badge/license-Apache%202.0-blue.svg" />
</a>
</p>
</div>
## Introduce
**Llama3-Chinese** is a large model trained on 500k high-quality Chinese multi-turn SFT examples, 100k English multi-turn SFT examples, and 2k single-turn self-cognition examples, using the [DoRA](https://arxiv.org/pdf/2402.09353.pdf) and [LoRA+](https://arxiv.org/pdf/2402.12354.pdf) training methods, with **Meta-Llama-3-8B** as the base model.
**Github:** [https://github.com/seanzhang-zhichen/llama3-chinese](https://github.com/seanzhang-zhichen/llama3-chinese)

## Download Model
| Model | Download |
|:-------------------:|:-----------:|
| Meta-Llama-3-8B |[ 🤗 HuggingFace](https://huggingface.co/meta-llama/Meta-Llama-3-8B) [ 🤖 ModelScope](https://modelscope.cn/models/LLM-Research/Meta-Llama-3-8B)|
| Llama3-Chinese-Lora |[ 🤗 HuggingFace](https://huggingface.co/zhichen/Llama3-Chinese-Lora) [ 🤖 ModelScope](https://modelscope.cn/models/seanzhang/Llama3-Chinese-Lora)|
| Llama3-Chinese (merged model) |[ 🤗 HuggingFace](https://huggingface.co/zhichen/Llama3-Chinese) [ 🤖 ModelScope](https://modelscope.cn/models/seanzhang/Llama3-Chinese)|
## Merge LORA Model (Skippable)
1、Download [Meta-Llama-3-8B](https://modelscope.cn/models/LLM-Research/Meta-Llama-3-8B)
```bash
git clone https://www.modelscope.cn/LLM-Research/Meta-Llama-3-8B.git
```
2、Download [Llama3-Chinese-Lora](https://www.modelscope.cn/models/seanzhang/Llama3-Chinese-Lora)
**From ModelScope**
```bash
git lfs install
git clone https://www.modelscope.cn/seanzhang/Llama3-Chinese-Lora.git
```
**From HuggingFace**
```bash
git lfs install
git clone https://huggingface.co/zhichen/Llama3-Chinese-Lora
```
3、Merge Model
```bash
python merge_lora.py \
--base_model path/to/Meta-Llama-3-8B \
--lora_model path/to/lora/Llama3-Chinese-Lora \
--output_dir ./Llama3-Chinese
```
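The `merge_lora.py` script itself is not reproduced here. As a rough sketch of what the merge step amounts to (using `transformers` and `peft`; argument parsing and dtype handling omitted):
```python
# Rough sketch of the LoRA merge step -- NOT the repository's actual merge_lora.py.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_path = "path/to/Meta-Llama-3-8B"
lora_model_path = "path/to/lora/Llama3-Chinese-Lora"
output_dir = "./Llama3-Chinese"

base = AutoModelForCausalLM.from_pretrained(base_model_path, torch_dtype="auto")
model = PeftModel.from_pretrained(base, lora_model_path)
model = model.merge_and_unload()  # fold the LoRA deltas into the base weights

model.save_pretrained(output_dir)
AutoTokenizer.from_pretrained(base_model_path).save_pretrained(output_dir)
```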
## Download Llama3-Chinese (Merged Model)
**From ModelScope**
```bash
git lfs install
git clone https://www.modelscope.cn/seanzhang/Llama3-Chinese.git
```
**From HuggingFace**
```bash
git lfs install
git clone https://huggingface.co/zhichen/Llama3-Chinese
```
## Inference
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
model_id = "zhichen/Llama3-Chinese"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")
messages = [
{"role": "system", "content": "You are a helpful assistant."},
{"role": "user", "content": "你好"},
]
input_ids = tokenizer.apply_chat_template(
messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(
input_ids,
max_new_tokens=2048,
do_sample=True,
temperature=0.7,
top_p=0.95,
)
response = outputs[0][input_ids.shape[-1]:]
print(tokenizer.decode(response, skip_special_tokens=True))
```
## CLI DEMO
```bash
python cli_demo.py --model_path zhichen/Llama3-Chinese
```
## WEB DEMO
```bash
python web_demo.py --model_path zhichen/Llama3-Chinese
```
## VLLM WEB DEMO
1、Use [vllm](https://github.com/vllm-project/vllm) to deploy the model
```bash
python -m vllm.entrypoints.openai.api_server --served-model-name Llama3-Chinese --model ./Llama3-Chinese  # replace ./Llama3-Chinese with your own merged model path
```
2、Run this command on the CLI
```bash
python vllm_web_demo.py --model Llama3-Chinese
```
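Once the vLLM server from step 1 is running, you can also query it directly through its OpenAI-compatible API. This is a minimal sketch assuming the default host and port (`localhost:8000`) and the served model name used above:
```python
# Minimal sketch: querying the vLLM OpenAI-compatible server started in step 1.
# Assumes the default endpoint http://localhost:8000/v1 and served model name "Llama3-Chinese".
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="Llama3-Chinese",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "你好"},
    ],
    temperature=0.7,
)
print(response.choices[0].message.content)
```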
## Train Dataset
[deepctrl-sft-data](https://modelscope.cn/datasets/deepctrl/deepctrl-sft-data)
## LICENSE
This project may only be used for research purposes, and the project developers accept no liability for any harm or loss caused by use of this project (including but not limited to data, models, code, etc.). For details, please refer to the [DISCLAIMER](https://github.com/seanzhang-zhichen/Llama3-Chinese/blob/main/DISCLAIMER).
The Llama3-Chinese project code is licensed under the [Apache License 2.0](./LICENSE). The code is free for commercial use, while the model weights and data may only be used for research purposes. Please include a link to Llama3-Chinese and the license agreement in your product description.
## Citation
If you used Llama3-Chinese in your research, cite it in the following format:
```latex
@misc{Llama3-Chinese,
title={Llama3-Chinese},
author={Zhichen Zhang, Xin LU, Long Chen},
year={2024},
howpublished={\url{https://github.com/seanzhang-zhichen/llama3-chinese}},
}
```
## Acknowledgement
[meta-llama/llama3](https://github.com/meta-llama/llama3)
<br>
[hiyouga/LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory)
## Star History
[](https://star-history.com/#seanzhang-zhichen/Llama3-Chinese&Date)
| [
"BEAR"
] |
TensorStack/SDXL-Lightning-onnx | TensorStack | text-to-image | [
"onnx",
"text-to-image",
"region:us"
] | "2024-04-22T01:28:16Z" | 2024-05-27T01:24:32+00:00 | 0 | 2 | ---
pipeline_tag: text-to-image
---
# SDXL Lightning - Onnx Olive DirectML Optimized
## Original Model
https://huggingface.co/ByteDance/SDXL-Lightning
## C# Inference Demo
https://github.com/saddam213/OnnxStack
```csharp
// Create Pipeline
var pipeline = StableDiffusionXLPipeline.CreatePipeline("D:\\Repositories\\SDXL-Lightning-onnx");
// Prompt
var promptOptions = new PromptOptions
{
Prompt = "Craft an image of a galaxy far, far away, with swirling nebulae and distant stars painting the cosmic canvas"
};
// Scheduler Options
var schedulerOptions = pipeline.DefaultSchedulerOptions with
{
SchedulerType = SchedulerType.Euler,
InferenceSteps = 8,
GuidanceScale = 0,
BetaSchedule = BetaScheduleType.Linear
};
// Run pipeline
var result = await pipeline.GenerateImageAsync(promptOptions, schedulerOptions);
// Save Image Result
await result.SaveAsync("Result.png");
```
## Inference Result
 | [
"CRAFT"
] |
LiteLLMs/OpenELM-3B-Instruct-GGUF | LiteLLMs | null | [
"GGUF",
"arxiv:2404.14619",
"license:other",
"region:us"
] | "2024-04-25T11:16:16Z" | 2024-04-25T11:16:25+00:00 | 0 | 4 | ---
license: other
license_name: apple-sample-code-license
license_link: LICENSE
tags:
- GGUF
quantized_by: andrijdavid
---
# OpenELM-3B-Instruct-GGUF
- Original model: [OpenELM-3B-Instruct](https://huggingface.co/apple/OpenELM-3B-Instruct)
<!-- description start -->
## Description
This repo contains GGUF format model files for [OpenELM-3B-Instruct](https://huggingface.co/apple/OpenELM-3B-Instruct).
<!-- description end -->
<!-- README_GGUF.md-about-gguf start -->
### About GGUF
GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* [llama.cpp](https://github.com/ggerganov/llama.cpp). This is the source project for GGUF, providing both a Command Line Interface (CLI) and a server option.
* [text-generation-webui](https://github.com/oobabooga/text-generation-webui), Known as the most widely used web UI, this project boasts numerous features and powerful extensions, and supports GPU acceleration.
* [Ollama](https://github.com/jmorganca/ollama) Ollama is a lightweight and extensible framework designed for building and running language models locally. It features a simple API for creating, managing, and executing models, along with a library of pre-built models for use in various applications
* [KoboldCpp](https://github.com/LostRuins/koboldcpp), A comprehensive web UI offering GPU acceleration across all platforms and architectures, particularly renowned for storytelling.
* [GPT4All](https://gpt4all.io), This is a free and open source GUI that runs locally, supporting Windows, Linux, and macOS with full GPU acceleration.
* [LM Studio](https://lmstudio.ai/) An intuitive and powerful local GUI for Windows and macOS (Silicon), featuring GPU acceleration.
* [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui). A notable web UI with a variety of unique features, including a comprehensive model library for easy model selection.
* [Faraday.dev](https://faraday.dev/), An attractive, user-friendly character-based chat GUI for Windows and macOS (both Silicon and Intel), also offering GPU acceleration.
* [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), A Python library equipped with GPU acceleration, LangChain support, and an OpenAI-compatible API server.
* [candle](https://github.com/huggingface/candle), A Rust-based ML framework focusing on performance, including GPU support, and designed for ease of use.
* [ctransformers](https://github.com/marella/ctransformers), A Python library featuring GPU acceleration, LangChain support, and an OpenAI-compatible AI server.
* [localGPT](https://github.com/PromtEngineer/localGPT) An open-source initiative enabling private conversations with documents.
<!-- README_GGUF.md-about-gguf end -->
<!-- compatibility_gguf start -->
## Explanation of quantisation methods
<details>
<summary>Click to see details</summary>
The new methods available are:
* GGML_TYPE_Q2_K - "type-1" 2-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Block scales and mins are quantized with 4 bits. This ends up effectively using 2.5625 bits per weight (bpw)
* GGML_TYPE_Q3_K - "type-0" 3-bit quantization in super-blocks containing 16 blocks, each block having 16 weights. Scales are quantized with 6 bits. This ends up using 3.4375 bpw.
* GGML_TYPE_Q4_K - "type-1" 4-bit quantization in super-blocks containing 8 blocks, each block having 32 weights. Scales and mins are quantized with 6 bits. This ends up using 4.5 bpw.
* GGML_TYPE_Q5_K - "type-1" 5-bit quantization. Same super-block structure as GGML_TYPE_Q4_K resulting in 5.5 bpw
* GGML_TYPE_Q6_K - "type-0" 6-bit quantization. Super-blocks with 16 blocks, each block having 16 weights. Scales are quantized with 8 bits. This ends up using 6.5625 bpw.
</details>
<!-- compatibility_gguf end -->
<!-- README_GGUF.md-how-to-download start -->
## How to download GGUF files
**Note for manual downloaders:** You almost never want to clone the entire repo! Multiple different quantisation formats are provided, and most users only want to pick and download a single folder.
The following clients/libraries will automatically download models for you, providing a list of available models to choose from:
* LM Studio
* LoLLMS Web UI
* Faraday.dev
### In `text-generation-webui`
Under Download Model, you can enter the model repo: LiteLLMs/OpenELM-3B-Instruct-GGUF and below it, a specific filename to download, such as: Q4_0/Q4_0-00001-of-00009.gguf.
Then click Download.
### On the command line, including multiple files at once
I recommend using the `huggingface-hub` Python library:
```shell
pip3 install huggingface-hub
```
Then you can download any individual model file to the current directory, at high speed, with a command like this:
```shell
huggingface-cli download LiteLLMs/OpenELM-3B-Instruct-GGUF Q4_0/Q4_0-00001-of-00009.gguf --local-dir . --local-dir-use-symlinks False
```
<details>
<summary>More advanced huggingface-cli download usage (click to read)</summary>
You can also download multiple files at once with a pattern:
```shell
huggingface-cli download LiteLLMs/OpenELM-3B-Instruct-GGUF --local-dir . --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
For more documentation on downloading with `huggingface-cli`, please see: [HF -> Hub Python Library -> Download files -> Download from the CLI](https://huggingface.co/docs/huggingface_hub/guides/download#download-from-the-cli).
To accelerate downloads on fast connections (1Gbit/s or higher), install `hf_transfer`:
```shell
pip3 install huggingface_hub[hf_transfer]
```
And set environment variable `HF_HUB_ENABLE_HF_TRANSFER` to `1`:
```shell
HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download LiteLLMs/OpenELM-3B-Instruct-GGUF Q4_0/Q4_0-00001-of-00009.gguf --local-dir . --local-dir-use-symlinks False
```
Windows Command Line users: You can set the environment variable by running `set HF_HUB_ENABLE_HF_TRANSFER=1` before the download command.
</details>
<!-- README_GGUF.md-how-to-download end -->
<!-- README_GGUF.md-how-to-run start -->
## Example `llama.cpp` command
Make sure you are using `llama.cpp` from commit [d0cee0d](https://github.com/ggerganov/llama.cpp/commit/d0cee0d36d5be95a0d9088b674dbb27354107221) or later.
```shell
./main -ngl 35 -m Q4_0/Q4_0-00001-of-00009.gguf --color -c 8192 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<PROMPT>"
```
Change `-ngl 35` to the number of layers to offload to GPU. Remove it if you don't have GPU acceleration.
Change `-c 8192` to the desired sequence length. For extended sequence models - eg 8K, 16K, 32K - the necessary RoPE scaling parameters are read from the GGUF file and set by llama.cpp automatically. Note that longer sequence lengths require much more resources, so you may need to reduce this value.
If you want to have a chat-style conversation, replace the `-p <PROMPT>` argument with `-i -ins`
For other parameters and how to use them, please refer to [the llama.cpp documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md)
## How to run in `text-generation-webui`
Further instructions can be found in the text-generation-webui documentation, here: [text-generation-webui/docs/04 ‐ Model Tab.md](https://github.com/oobabooga/text-generation-webui/blob/main/docs/04%20%E2%80%90%20Model%20Tab.md#llamacpp).
## How to run from Python code
You can use GGUF models from Python using the [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) or [ctransformers](https://github.com/marella/ctransformers) libraries. Note that at the time of writing (Nov 27th 2023), ctransformers has not been updated for some time and is not compatible with some recent models. Therefore I recommend you use llama-cpp-python.
### How to load this model in Python code, using llama-cpp-python
For full documentation, please see: [llama-cpp-python docs](https://abetlen.github.io/llama-cpp-python/).
#### First install the package
Run one of the following commands, according to your system:
```shell
# Base llama-cpp-python with no GPU acceleration
pip install llama-cpp-python
# With NVidia CUDA acceleration
CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python
# Or with OpenBLAS acceleration
CMAKE_ARGS="-DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS" pip install llama-cpp-python
# Or with CLBLast acceleration
CMAKE_ARGS="-DLLAMA_CLBLAST=on" pip install llama-cpp-python
# Or with AMD ROCm GPU acceleration (Linux only)
CMAKE_ARGS="-DLLAMA_HIPBLAS=on" pip install llama-cpp-python
# Or with Metal GPU acceleration for macOS systems only
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python
# On Windows, to set the CMAKE_ARGS variable in PowerShell, follow this format; e.g. for NVidia CUDA:
$env:CMAKE_ARGS = "-DLLAMA_CUBLAS=on"
pip install llama-cpp-python
```
#### Simple llama-cpp-python example code
```python
from llama_cpp import Llama
# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = Llama(
model_path="./Q4_0/Q4_0-00001-of-00009.gguf", # Download the model file first
n_ctx=32768, # The max sequence length to use - note that longer sequence lengths require much more resources
n_threads=8, # The number of CPU threads to use, tailor to your system and the resulting performance
n_gpu_layers=35 # The number of layers to offload to GPU, if you have GPU acceleration available
)
# Simple inference example
output = llm(
"<PROMPT>", # Prompt
max_tokens=512, # Generate up to 512 tokens
stop=["</s>"], # Example stop token - not necessarily correct for this specific model! Please check before using.
echo=True # Whether to echo the prompt
)
# Chat Completion API
llm = Llama(model_path="./Q4_0/Q4_0-00001-of-00009.gguf", chat_format="llama-2") # Set chat_format according to the model you are using
llm.create_chat_completion(
messages = [
{"role": "system", "content": "You are a story writing assistant."},
{
"role": "user",
"content": "Write a story about llamas."
}
]
)
```
## How to use with LangChain
Here are guides on using llama-cpp-python and ctransformers with LangChain:
* [LangChain + llama-cpp-python](https://python.langchain.com/docs/integrations/llms/llamacpp)
* [LangChain + ctransformers](https://python.langchain.com/docs/integrations/providers/ctransformers)
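Following the first guide, a minimal sketch of the LangChain route looks like the following; the model path is whichever quant file you downloaded earlier, and the parameters mirror the `llama-cpp-python` example above:
```python
# Minimal sketch: using this GGUF model through LangChain's LlamaCpp wrapper.
# The model_path is an assumption -- point it at the quant file you downloaded.
from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="./Q4_0/Q4_0-00001-of-00009.gguf",
    n_gpu_layers=35,  # set to 0 if you have no GPU acceleration
    n_ctx=8192,       # context length; longer contexts need more memory
    temperature=0.7,
)

print(llm.invoke("Explain what the GGUF format is in one paragraph."))
```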
<!-- README_GGUF.md-how-to-run end -->
<!-- footer end -->
<!-- original-model-card start -->
# Original model card: OpenELM-3B-Instruct
# OpenELM
*Sachin Mehta, Mohammad Hossein Sekhavat, Qingqing Cao, Maxwell Horton, Yanzi Jin, Chenfan Sun, Iman Mirzadeh, Mahyar Najibi, Dmitry Belenko, Peter Zatloukal, Mohammad Rastegari*
We introduce **OpenELM**, a family of **Open**-source **E**fficient **L**anguage **M**odels. OpenELM uses a layer-wise scaling strategy to efficiently allocate parameters within each layer of the transformer model, leading to enhanced accuracy. We pretrained OpenELM models using the [CoreNet](https://github.com/apple/corenet) library. We release both pretrained and instruction tuned models with 270M, 450M, 1.1B and 3B parameters.
Our pre-training dataset contains RefinedWeb, deduplicated PILE, a subset of RedPajama, and a subset of Dolma v1.6, totaling approximately 1.8 trillion tokens. Please check license agreements and terms of these datasets before using them.
## Usage
We have provided an example function to generate output from OpenELM models loaded via [HuggingFace Hub](https://huggingface.co/docs/hub/) in `generate_openelm.py`.
You can try the model by running the following command:
```
python generate_openelm.py --model apple/OpenELM-3B-Instruct --hf_access_token [HF_ACCESS_TOKEN] --prompt 'Once upon a time there was' --generate_kwargs repetition_penalty=1.2
```
Please refer to [this link](https://huggingface.co/docs/hub/security-tokens) to obtain your hugging face access token.
Additional arguments to the hugging face generate function can be passed via `generate_kwargs`. As an example, to speedup the inference, you can try [lookup token speculative generation](https://huggingface.co/docs/transformers/generation_strategies) by passing the `prompt_lookup_num_tokens` argument as follows:
```
python generate_openelm.py --model apple/OpenELM-3B-Instruct --hf_access_token [HF_ACCESS_TOKEN] --prompt 'Once upon a time there was' --generate_kwargs repetition_penalty=1.2 prompt_lookup_num_tokens=10
```
Alternatively, try model-wise speculative generation with an [assistive model](https://huggingface.co/blog/assisted-generation) by passing a smaller model through the `assistant_model` argument, for example:
```
python generate_openelm.py --model apple/OpenELM-3B-Instruct --hf_access_token [HF_ACCESS_TOKEN] --prompt 'Once upon a time there was' --generate_kwargs repetition_penalty=1.2 --assistant_model [SMALLER_MODEL]
```
## Main Results
### OpenLLM Leaderboard
| **Model Size** | **ARC-c** | **CrowS-Pairs** | **HellaSwag** | **MMLU** | **PIQA** | **RACE** | **TruthfulQA** | **WinoGrande** | **Average** |
| :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- |
| [OpenELM-270M](https://huggingface.co/apple/OpenELM-270M) | 27.65 | **66.79** | 47.15 | 25.72 | 69.75 | 30.91 | **39.24** | **53.83** | 45.13 |
| [OpenELM-270M-Instruct](https://huggingface.co/apple/OpenELM-270M-Instruct) | **32.51** | 66.01 | **51.58** | **26.70** | **70.78** | 33.78 | 38.72 | 53.20 | **46.66** |
| [OpenELM-450M](https://huggingface.co/apple/OpenELM-450M) | 30.20 | **68.63** | 53.86 | **26.01** | 72.31 | 33.11 | 40.18 | 57.22 | 47.69 |
| [OpenELM-450M-Instruct](https://huggingface.co/apple/OpenELM-450M-Instruct) | **33.53** | 67.44 | **59.31** | 25.41 | **72.63** | **36.84** | **40.48** | **58.33** | **49.25** |
| [OpenELM-1_1B](https://huggingface.co/apple/OpenELM-1_1B) | 36.69 | **71.74** | 65.71 | **27.05** | **75.57** | 36.46 | 36.98 | 63.22 | 51.68 |
| [OpenELM-1_1B-Instruct](https://huggingface.co/apple/OpenELM-1_1B-Instruct) | **41.55** | 71.02 | **71.83** | 25.65 | 75.03 | **39.43** | **45.95** | **64.72** | **54.40** |
| [OpenELM-3B](https://huggingface.co/apple/OpenELM-3B) | 42.24 | **73.29** | 73.28 | **26.76** | 78.24 | **38.76** | 34.98 | 67.25 | 54.35 |
| [OpenELM-3B-Instruct](https://huggingface.co/apple/OpenELM-3B-Instruct) | **47.70** | 72.33 | **76.87** | 24.80 | **79.00** | 38.47 | **38.76** | **67.96** | **55.73** |
See the technical report for more results and comparison.
## Evaluation
### Setup
Install the following dependencies:
```bash
# install public lm-eval-harness
harness_repo="public-lm-eval-harness"
git clone https://github.com/EleutherAI/lm-evaluation-harness ${harness_repo}
cd ${harness_repo}
# use main branch on 03-15-2024, SHA is dc90fec
git checkout dc90fec
pip install -e .
cd ..
# 66d6242 is the main branch on 2024-04-01
pip install datasets@git+https://github.com/huggingface/datasets.git@66d6242
pip install tokenizers>=0.15.2 transformers>=4.38.2 sentencepiece>=0.2.0
```
### Evaluate OpenELM
```bash
# OpenELM-3B-Instruct
hf_model=OpenELM-3B-Instruct
# this flag is needed because lm-eval-harness sets add_bos_token to False by default, but OpenELM uses the LLaMA tokenizer, which requires add_bos_token to be True
tokenizer=meta-llama/Llama-2-7b-hf
add_bos_token=True
batch_size=1
mkdir lm_eval_output
shot=0
task=arc_challenge,arc_easy,boolq,hellaswag,piqa,race,winogrande,sciq,truthfulqa_mc2
lm_eval --model hf \
--model_args pretrained=${hf_model},trust_remote_code=True,add_bos_token=${add_bos_token},tokenizer=${tokenizer} \
--tasks ${task} \
--device cuda:0 \
--num_fewshot ${shot} \
--output_path ./lm_eval_output/${hf_model//\//_}_${task//,/_}-${shot}shot \
--batch_size ${batch_size} 2>&1 | tee ./lm_eval_output/eval-${hf_model//\//_}_${task//,/_}-${shot}shot.log
shot=5
task=mmlu,winogrande
lm_eval --model hf \
--model_args pretrained=${hf_model},trust_remote_code=True,add_bos_token=${add_bos_token},tokenizer=${tokenizer} \
--tasks ${task} \
--device cuda:0 \
--num_fewshot ${shot} \
--output_path ./lm_eval_output/${hf_model//\//_}_${task//,/_}-${shot}shot \
--batch_size ${batch_size} 2>&1 | tee ./lm_eval_output/eval-${hf_model//\//_}_${task//,/_}-${shot}shot.log
shot=25
task=arc_challenge,crows_pairs_english
lm_eval --model hf \
--model_args pretrained=${hf_model},trust_remote_code=True,add_bos_token=${add_bos_token},tokenizer=${tokenizer} \
--tasks ${task} \
--device cuda:0 \
--num_fewshot ${shot} \
--output_path ./lm_eval_output/${hf_model//\//_}_${task//,/_}-${shot}shot \
--batch_size ${batch_size} 2>&1 | tee ./lm_eval_output/eval-${hf_model//\//_}_${task//,/_}-${shot}shot.log
shot=10
task=hellaswag
lm_eval --model hf \
--model_args pretrained=${hf_model},trust_remote_code=True,add_bos_token=${add_bos_token},tokenizer=${tokenizer} \
--tasks ${task} \
--device cuda:0 \
--num_fewshot ${shot} \
--output_path ./lm_eval_output/${hf_model//\//_}_${task//,/_}-${shot}shot \
--batch_size ${batch_size} 2>&1 | tee ./lm_eval_output/eval-${hf_model//\//_}_${task//,/_}-${shot}shot.log
```
## Bias, Risks, and Limitations
The release of OpenELM models aims to empower and enrich the open research community by providing access to state-of-the-art language models. Trained on publicly available datasets, these models are made available without any safety guarantees. Consequently, there exists the possibility of these models producing outputs that are inaccurate, harmful, biased, or objectionable in response to user prompts. Thus, it is imperative for users and developers to undertake thorough safety testing and implement appropriate filtering mechanisms tailored to their specific requirements.
## Citation
If you find our work useful, please cite:
```BibTex
@article{mehtaOpenELMEfficientLanguage2024,
title = {{OpenELM}: {An} {Efficient} {Language} {Model} {Family} with {Open}-source {Training} and {Inference} {Framework}},
shorttitle = {{OpenELM}},
url = {https://arxiv.org/abs/2404.14619v1},
language = {en},
urldate = {2024-04-24},
journal = {arXiv.org},
author = {Mehta, Sachin and Sekhavat, Mohammad Hossein and Cao, Qingqing and Horton, Maxwell and Jin, Yanzi and Sun, Chenfan and Mirzadeh, Iman and Najibi, Mahyar and Belenko, Dmitry and Zatloukal, Peter and Rastegari, Mohammad},
month = apr,
year = {2024},
}
@inproceedings{mehta2022cvnets,
author = {Mehta, Sachin and Abdolhosseini, Farzad and Rastegari, Mohammad},
title = {CVNets: High Performance Library for Computer Vision},
year = {2022},
booktitle = {Proceedings of the 30th ACM International Conference on Multimedia},
series = {MM '22}
}
```
<!-- original-model-card end -->
| [
"SCIQ"
] |
tomaarsen/glove-mean-pooling-sts | tomaarsen | sentence-similarity | [
"sentence-transformers",
"sentence-similarity",
"feature-extraction",
"loss:CosineSimilarityLoss",
"en",
"arxiv:1908.10084",
"model-index",
"co2_eq_emissions",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | "2024-04-25T15:39:01Z" | 2024-04-25T15:39:32+00:00 | 0 | 0 | ---
language:
- en
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- loss:CosineSimilarityLoss
widget:
- source_sentence: Women are running.
sentences:
- Women are running.
- The cougar is chasing the bear.
- NATO soldier killed in Afghan attack
- source_sentence: A woman is reading.
sentences:
- A woman is writing something.
- A person is drawing a picture.
- A dog laying in the snow.
- source_sentence: A plane in the sky.
sentences:
- Two airplanes in the sky.
- A man is playing an instrument.
- Bangladesh executes opposition leader
- source_sentence: A man jumping rope
sentences:
- A man is climbing a rope.
- The girl is playing the guitar.
- A chef prepared a meal.
- source_sentence: A baby is laughing.
sentences:
- The baby laughed in his car seat.
- A person is combing a cat hair.
- A man is riding a horse in the desert.
co2_eq_emissions:
emissions: 0.04787408159843385
energy_consumed: 0.00012316397033828962
source: codecarbon
training_type: fine-tuning
on_cloud: false
cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
ram_total_size: 31.777088165283203
hours_used: 0.002
hardware_used: 1 x NVIDIA GeForce RTX 3090
model-index:
- name: SentenceTransformer
results:
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts dev
type: sts-dev
metrics:
- type: pearson_cosine
value: 0.7683803418925228
name: Pearson Cosine
- type: spearman_cosine
value: 0.7632727671822109
name: Spearman Cosine
- type: pearson_manhattan
value: 0.7167343000545916
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.7284225373129679
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.7177127625426643
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.729676171689153
name: Spearman Euclidean
- type: pearson_dot
value: 0.561565806742925
name: Pearson Dot
- type: spearman_dot
value: 0.6116263753232491
name: Spearman Dot
- type: pearson_max
value: 0.7683803418925228
name: Pearson Max
- type: spearman_max
value: 0.7632727671822109
name: Spearman Max
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts test
type: sts-test
metrics:
- type: pearson_cosine
value: 0.6783055201030597
name: Pearson Cosine
- type: spearman_cosine
value: 0.6549170846046467
name: Spearman Cosine
- type: pearson_manhattan
value: 0.6064971288495867
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.6169187673598634
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.6073075425801093
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.6178537671183167
name: Spearman Euclidean
- type: pearson_dot
value: 0.45009881124802237
name: Pearson Dot
- type: spearman_dot
value: 0.47227603379856636
name: Spearman Dot
- type: pearson_max
value: 0.6783055201030597
name: Pearson Max
- type: spearman_max
value: 0.6549170846046467
name: Spearman Max
---
# SentenceTransformer
This is a [sentence-transformers](https://www.SBERT.net) model trained on the [sentence-transformers/stsb](https://huggingface.co/datasets/sentence-transformers/stsb) dataset. It maps sentences & paragraphs to a 300-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
<!-- - **Base model:** [Unknown](https://huggingface.co/unknown) -->
- **Maximum Sequence Length:** 1000000 tokens
- **Output Dimensionality:** 300 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- [sentence-transformers/stsb](https://huggingface.co/datasets/sentence-transformers/stsb)
- **Language:** en
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): WordEmbeddings(
(emb_layer): Embedding(400001, 300)
)
(1): Pooling({'word_embedding_dimension': 300, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Dense({'in_features': 300, 'out_features': 300, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
(3): Dense({'in_features': 300, 'out_features': 300, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("tomaarsen/glove-mean-pooling-sts")
# Run inference
sentences = [
'A baby is laughing.',
'The baby laughed in his car seat.',
'A person is combing a cat hair.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 300]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Semantic Similarity
* Dataset: `sts-dev`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.7684 |
| **spearman_cosine** | **0.7633** |
| pearson_manhattan | 0.7167 |
| spearman_manhattan | 0.7284 |
| pearson_euclidean | 0.7177 |
| spearman_euclidean | 0.7297 |
| pearson_dot | 0.5616 |
| spearman_dot | 0.6116 |
| pearson_max | 0.7684 |
| spearman_max | 0.7633 |
#### Semantic Similarity
* Dataset: `sts-test`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.6783 |
| **spearman_cosine** | **0.6549** |
| pearson_manhattan | 0.6065 |
| spearman_manhattan | 0.6169 |
| pearson_euclidean | 0.6073 |
| spearman_euclidean | 0.6179 |
| pearson_dot | 0.4501 |
| spearman_dot | 0.4723 |
| pearson_max | 0.6783 |
| spearman_max | 0.6549 |
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### sentence-transformers/stsb
* Dataset: [sentence-transformers/stsb](https://huggingface.co/datasets/sentence-transformers/stsb) at [d999f12](https://huggingface.co/datasets/sentence-transformers/stsb/tree/d999f12281623b0925506817d9bd85e88289218a)
* Size: 5,749 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 | score |
|:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------|
| type | string | string | float |
| details | <ul><li>min: 1 tokens</li><li>mean: 3.38 tokens</li><li>max: 11 tokens</li></ul> | <ul><li>min: 1 tokens</li><li>mean: 3.39 tokens</li><li>max: 10 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.54</li><li>max: 1.0</li></ul> |
* Samples:
| sentence1 | sentence2 | score |
|:-----------------------------------------------------------|:----------------------------------------------------------------------|:------------------|
| <code>A plane is taking off.</code> | <code>An air plane is taking off.</code> | <code>1.0</code> |
| <code>A man is playing a large flute.</code> | <code>A man is playing a flute.</code> | <code>0.76</code> |
| <code>A man is spreading shreded cheese on a pizza.</code> | <code>A man is spreading shredded cheese on an uncooked pizza.</code> | <code>0.76</code> |
* Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/losses.html#cosinesimilarityloss) with these parameters:
```json
{
"loss_fct": "torch.nn.modules.loss.MSELoss"
}
```
### Evaluation Dataset
#### sentence-transformers/stsb
* Dataset: [sentence-transformers/stsb](https://huggingface.co/datasets/sentence-transformers/stsb) at [d999f12](https://huggingface.co/datasets/sentence-transformers/stsb/tree/d999f12281623b0925506817d9bd85e88289218a)
* Size: 1,500 evaluation samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>score</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 | score |
|:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------|
| type | string | string | float |
| details | <ul><li>min: 1 tokens</li><li>mean: 5.17 tokens</li><li>max: 12 tokens</li></ul> | <ul><li>min: 1 tokens</li><li>mean: 5.08 tokens</li><li>max: 15 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.47</li><li>max: 1.0</li></ul> |
* Samples:
| sentence1 | sentence2 | score |
|:--------------------------------------------------|:------------------------------------------------------|:------------------|
| <code>A man with a hard hat is dancing.</code> | <code>A man wearing a hard hat is dancing.</code> | <code>1.0</code> |
| <code>A young child is riding a horse.</code> | <code>A child is riding a horse.</code> | <code>0.95</code> |
| <code>A man is feeding a mouse to a snake.</code> | <code>The man is feeding a mouse to the snake.</code> | <code>1.0</code> |
* Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/losses.html#cosinesimilarityloss) with these parameters:
```json
{
"loss_fct": "torch.nn.modules.loss.MSELoss"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 32
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `fp16`: True
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: False
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 32
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: None
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss | loss | sts-dev_spearman_cosine | sts-test_spearman_cosine |
|:------:|:----:|:-------------:|:------:|:-----------------------:|:------------------------:|
| 0.5556 | 100 | 0.0908 | 0.0577 | 0.7633 | - |
| 1.0 | 180 | - | - | - | 0.6549 |
### Environmental Impact
Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon).
- **Energy Consumed**: 0.000 kWh
- **Carbon Emitted**: 0.000 kg of CO2
- **Hours Used**: 0.002 hours
### Training Hardware
- **On Cloud**: No
- **GPU Model**: 1 x NVIDIA GeForce RTX 3090
- **CPU Model**: 13th Gen Intel(R) Core(TM) i7-13700K
- **RAM Size**: 31.78 GB
### Framework Versions
- Python: 3.11.6
- Sentence Transformers: 3.0.0.dev0
- Transformers: 4.41.0.dev0
- PyTorch: 2.3.0+cu121
- Accelerate: 0.26.1
- Datasets: 2.18.0
- Tokenizers: 0.19.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"BEAR"
] |
Trelis/Phi-3-mini-128k-instruct-function-calling | Trelis | text-generation | [
"transformers",
"safetensors",
"phi3",
"text-generation",
"nlp",
"code",
"phi-3",
"function-calling",
"conversational",
"custom_code",
"en",
"dataset:Trelis/function_calling_v3",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | "2024-04-27T09:12:46Z" | 2024-04-27T09:39:40+00:00 | 0 | 13 | ---
datasets:
- Trelis/function_calling_v3
language:
- en
pipeline_tag: text-generation
tags:
- nlp
- code
- phi-3
- function-calling
widget:
- messages:
- role: user
content: Can you provide ways to eat combinations of bananas and dragonfruits?
extra_gated_prompt: Purchase access to this repo [HERE](https://buy.stripe.com/00g14Q7BX2HxaMU3dM)!
---
# Function Calling Fine-tuned Phi 3 Instruct
This model is fine-tuned for function calling.
- The model is suitable for commercial use.
Check out other fine-tuned function calling models [here](https://huggingface.co/collections/Trelis/function-calling-v3-657199ecbe378693925c7915).
## Quick Server Setup
Runpod one-click TGI template [here](https://runpod.io/console/deploy?template=h9pnbylvph&ref=jmfkcdio). [AWAITING [THIS FIX](https://github.com/huggingface/text-generation-inference/issues/1807)]
- See this [YouTube Video](https://www.youtube.com/watch?v=hHn_cV5WUDI) for guidance on inference with this model.
Runpod Affiliate [Link](https://runpod.io?ref=jmfkcdio) (helps support the Trelis channel).
## Inference Scripts
See below for sample prompt format.
Complete inference scripts are available for purchase [here](https://trelis.com/enterprise-server-api-and-inference-guide/):
- Support for TGI, vLLM and Llama.cpp
- Automate catching, handling and chaining of function calls.
## Prompt Format
### Using tokenizer.apply_chat_template
For an easier application of the prompt, you can set up as follows (note that the conversation below is complete; to generate a new response, remove the final assistant message before feeding the conversation to the model):
Set up `messages`:
```
[
{
"role": "function_metadata",
"content": "FUNCTION_METADATA"
},
{
"role": "user",
"content": "What is the current weather in London?"
},
{
"role": "function_call",
"content": "{\n \"name\": \"get_current_weather\",\n \"arguments\": {\n \"city\": \"London\"\n }\n}"
},
{
"role": "function_response",
"content": "{\n \"temperature\": \"15 C\",\n \"condition\": \"Cloudy\"\n}"
},
{
"role": "assistant",
"content": "The current weather in London is Cloudy with a temperature of 15 Celsius"
}
]
```
with `FUNCTION_METADATA` as:
```
[
{
"type": "function",
"function": {
"name": "get_current_weather",
"description": "This function gets the current weather in a given city",
"parameters": {
"type": "object",
"properties": {
"city": {
"type": "string",
"description": "The city, e.g., San Francisco"
},
"format": {
"type": "string",
"enum": ["celsius", "fahrenheit"],
"description": "The temperature unit to use."
}
},
"required": ["city"]
}
}
},
{
"type": "function",
"function": {
"name": "get_clothes",
"description": "This function provides a suggestion of clothes to wear based on the current weather",
"parameters": {
"type": "object",
"properties": {
"temperature": {
"type": "string",
"description": "The temperature, e.g., 15 C or 59 F"
},
"condition": {
"type": "string",
"description": "The weather condition, e.g., 'Cloudy', 'Sunny', 'Rainy'"
}
},
"required": ["temperature", "condition"]
}
}
}
]
```
and then apply the chat template to get a formatted prompt:
```
tokenizer = AutoTokenizer.from_pretrained('Trelis/Phi-3-mini-128k-instruct-function-calling', trust_remote_code=True)
prompt = tokenizer.apply_chat_template(prompt, tokenize=False)
```
If you are using a gated model, you need to first run:
```
pip install huggingface_hub
huggingface-cli login
```
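With the formatted prompt in hand, a minimal generation sketch follows. This is illustrative rather than the purchasable inference scripts: greedy decoding is used for reproducibility, and `messages` is the list built above (drop the trailing assistant turn when you want the model to produce a new response).
```python
# Minimal sketch: generating a function call from the chat-templated prompt above.
# NOT the official inference script; settings here are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Trelis/Phi-3-mini-128k-instruct-function-calling"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto", trust_remote_code=True
)

# `messages` is the list described above; the template already prepends <s>,
# so special tokens are not added again when tokenizing.
prompt = tokenizer.apply_chat_template(messages, tokenize=False)
inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).to(model.device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Decode only the newly generated tokens (the function call or assistant reply).
response = outputs[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(response, skip_special_tokens=True))
```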
### Manual Prompt:
```
<s><|function_metadata|>
[
{
"type": "function",
"function": {
"name": "get_stock_price",
"description": "Get the stock price of an array of stocks",
"parameters": {
"type": "object",
"properties": {
"names": {
"type": "array",
"items": {
"type": "string"
},
"description": "An array of stocks"
}
},
"required": [
"names"
]
}
}
},
{
"type": "function",
"function": {
"name": "get_big_stocks",
"description": "Get the names of the largest N stocks by market cap",
"parameters": {
"type": "object",
"properties": {
"number": {
"type": "integer",
"description": "The number of largest stocks to get the names of, e.g. 25"
},
"region": {
"type": "string",
"description": "The region to consider, can be \"US\" or \"World\"."
}
},
"required": [
"number"
]
}
}
}
]<|end|>
<|user|>
Get the names of the five largest stocks by market cap<|end|>
<|assistant|>
Correct Response:
{
"name": "get_big_stocks",
"arguments": {
"number": "5"
}
}
Generated Response:
```json
{
"function": "get_big_stocks",
"parameters": {
"number": 5,
"region": "World"
}
}
```<|end|><|endoftext|>
```
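If the model is served with TGI (for example via the Runpod template mentioned earlier), a manually formatted prompt like the one above can be sent with `huggingface_hub`'s `InferenceClient`. This is only a sketch; the endpoint URL is a placeholder:
```python
from huggingface_hub import InferenceClient

client = InferenceClient("http://localhost:8080")  # replace with your TGI endpoint

manual_prompt = "..."  # the full prompt shown above, ending with "<|assistant|>\n"

response = client.text_generation(
    manual_prompt,
    max_new_tokens=200,
    stop_sequences=["<|end|>"],
)
print(response)
```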
# Dataset
See [Trelis/function_calling_v3](https://huggingface.co/datasets/Trelis/function_calling_v3).
~~~
The original repo card follows below.
~~~
## Model Summary
The Phi-3-Mini-128K-Instruct is a 3.8 billion-parameter, lightweight, state-of-the-art open model trained using the Phi-3 datasets.
This dataset includes both synthetic data and filtered publicly available website data, with an emphasis on high-quality and reasoning-dense properties.
The model belongs to the Phi-3 family with the Mini version in two variants [4K](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) and [128K](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct) which is the context length (in tokens) that it can support.
After initial training, the model underwent a post-training process that involved supervised fine-tuning and direct preference optimization to enhance its ability to follow instructions and adhere to safety measures.
When evaluated against benchmarks that test common sense, language understanding, mathematics, coding, long-term context, and logical reasoning, the Phi-3 Mini-128K-Instruct demonstrated robust and state-of-the-art performance among models with fewer than 13 billion parameters.
Resources and Technical Documentation:
+ [Phi-3 Microsoft Blog](https://aka.ms/phi3blog-april)
+ [Phi-3 Technical Report](https://aka.ms/phi3-tech-report)
+ [Phi-3 on Azure AI Studio](https://aka.ms/phi3-azure-ai)
+ Phi-3 ONNX: [128K](https://aka.ms/Phi3-mini-128k-instruct-onnx)
## Intended Uses
**Primary use cases**
The model is intended for commercial and research use in English. It is suited for applications that require:
1) Memory/compute constrained environments
2) Latency bound scenarios
3) Strong reasoning (especially code, math and logic)
Our model is designed to accelerate research on language and multimodal models, for use as a building block for generative AI powered features.
**Use case considerations**
Our models are not specifically designed or evaluated for all downstream purposes. Developers should consider common limitations of language models as they select use cases, and evaluate and mitigate for accuracy, safety, and fairness before using within a specific downstream use case, particularly for high risk scenarios. Developers should be aware of and adhere to applicable laws or regulations (including privacy, trade compliance laws, etc.) that are relevant to their use case.
Nothing contained in this Model Card should be interpreted as or deemed a restriction or modification to the license the model is released under.
## How to Use
Phi-3 Mini-128K-Instruct has been integrated in the development version (4.40.0) of `transformers`. Until the official version is released through `pip`, ensure that you are doing one of the following:
* When loading the model, ensure that `trust_remote_code=True` is passed as an argument of the `from_pretrained()` function.
* Update your local `transformers` to the development version: `pip uninstall -y transformers && pip install git+https://github.com/huggingface/transformers`. The previous command is an alternative to cloning and installing from the source.
The current `transformers` version can be verified with: `pip list | grep transformers`.
### Tokenizer
Phi-3 Mini-128K-Instruct supports a vocabulary size of up to `32064` tokens. The [tokenizer files](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/blob/main/added_tokens.json) already provide placeholder tokens that can be used for downstream fine-tuning, but they can also be extended up to the model's vocabulary size.
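For illustration, extending the vocabulary with an extra special token might look like the sketch below; the token name is made up, and `resize_token_embeddings` keeps the model's embedding matrix in sync with the tokenizer:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-128k-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# `<|my_task_token|>` is a hypothetical token added for downstream fine-tuning.
num_added = tokenizer.add_special_tokens(
    {"additional_special_tokens": ["<|my_task_token|>"]}
)
if num_added > 0:
    model.resize_token_embeddings(len(tokenizer))
```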
### Chat Format
Given the nature of the training data, the Phi-3 Mini-128K-Instruct model is best suited for prompts using the chat format as follows.
You can provide the prompt as a question with a generic template as follows:
```markdown
<|user|>\nQuestion<|end|>\n<|assistant|>
```
For example:
```markdown
<|system|>
You are a helpful AI assistant.<|end|>
<|user|>
How to explain Internet for a medieval knight?<|end|>
<|assistant|>
```
where the model generates the text after `<|assistant|>`. In the case of a few-shot prompt, it can be formatted as follows:
```markdown
<|system|>
You are a helpful AI assistant.<|end|>
<|user|>
I am going to Paris, what should I see?<|end|>
<|assistant|>
Paris, the capital of France, is known for its stunning architecture, art museums, historical landmarks, and romantic atmosphere. Here are some of the top attractions to see in Paris:\n\n1. The Eiffel Tower: The iconic Eiffel Tower is one of the most recognizable landmarks in the world and offers breathtaking views of the city.\n2. The Louvre Museum: The Louvre is one of the world's largest and most famous museums, housing an impressive collection of art and artifacts, including the Mona Lisa.\n3. Notre-Dame Cathedral: This beautiful cathedral is one of the most famous landmarks in Paris and is known for its Gothic architecture and stunning stained glass windows.\n\nThese are just a few of the many attractions that Paris has to offer. With so much to see and do, it's no wonder that Paris is one of the most popular tourist destinations in the world."<|end|>
<|user|>
What is so great about #1?<|end|>
<|assistant|>
```
### Sample inference code
This code snippet shows how to quickly get started with running the model on a GPU:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
torch.random.manual_seed(0)
model = AutoModelForCausalLM.from_pretrained(
"microsoft/Phi-3-mini-128k-instruct",
device_map="cuda",
torch_dtype="auto",
trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-128k-instruct")
messages = [
{"role": "system", "content": "You are a helpful digital assistant. Please provide safe, ethical and accurate information to the user."},
{"role": "user", "content": "Can you provide ways to eat combinations of bananas and dragonfruits?"},
{"role": "assistant", "content": "Sure! Here are some ways to eat bananas and dragonfruits together: 1. Banana and dragonfruit smoothie: Blend bananas and dragonfruits together with some milk and honey. 2. Banana and dragonfruit salad: Mix sliced bananas and dragonfruits together with some lemon juice and honey."},
{"role": "user", "content": "What about solving an 2x + 3 = 7 equation?"},
]
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
)
generation_args = {
"max_new_tokens": 500,
"return_full_text": False,
"temperature": 0.0,
"do_sample": False,
}
output = pipe(messages, **generation_args)
print(output[0]['generated_text'])
```
*Some applications/frameworks might not include a BOS token (`<s>`) at the start of the conversation. Please ensure that it is included since it provides more reliable results.*
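A quick way to verify this is to inspect the first token id after encoding and prepend the BOS id if it is missing (a small sketch, assuming the `transformers` tokenizer):
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-128k-instruct")
prompt = "<|user|>\nHow to explain Internet for a medieval knight?<|end|>\n<|assistant|>"

ids = tokenizer(prompt)["input_ids"]
if tokenizer.bos_token_id is not None and ids[0] != tokenizer.bos_token_id:
    # Prepend BOS manually if the framework did not add it.
    ids = [tokenizer.bos_token_id] + ids
```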
## Responsible AI Considerations
Like other language models, the Phi series models can potentially behave in ways that are unfair, unreliable, or offensive. Some of the limiting behaviors to be aware of include:
+ Quality of Service: the Phi models are trained primarily on English text. Languages other than English will experience worse performance. English language varieties with less representation in the training data might experience worse performance than standard American English.
+ Representation of Harms & Perpetuation of Stereotypes: These models can over- or under-represent groups of people, erase representation of some groups, or reinforce demeaning or negative stereotypes. Despite safety post-training, these limitations may still be present due to differing levels of representation of different groups or prevalence of examples of negative stereotypes in training data that reflect real-world patterns and societal biases.
+ Inappropriate or Offensive Content: these models may produce other types of inappropriate or offensive content, which may make it inappropriate to deploy for sensitive contexts without additional mitigations that are specific to the use case.
+ Information Reliability: Language models can generate nonsensical content or fabricate content that might sound reasonable but is inaccurate or outdated.
+ Limited Scope for Code: Majority of Phi-3 training data is based in Python and use common packages such as "typing, math, random, collections, datetime, itertools". If the model generates Python scripts that utilize other packages or scripts in other languages, we strongly recommend users manually verify all API uses.
Developers should apply responsible AI best practices and are responsible for ensuring that a specific use case complies with relevant laws and regulations (e.g. privacy, trade, etc.). Important areas for consideration include:
+ Allocation: Models may not be suitable for scenarios that could have consequential impact on legal status or the allocation of resources or life opportunities (ex: housing, employment, credit, etc.) without further assessments and additional debiasing techniques.
+ High-Risk Scenarios: Developers should assess suitability of using models in high-risk scenarios where unfair, unreliable or offensive outputs might be extremely costly or lead to harm. This includes providing advice in sensitive or expert domains where accuracy and reliability are critical (ex: legal or health advice). Additional safeguards should be implemented at the application level according to the deployment context.
+ Misinformation: Models may produce inaccurate information. Developers should follow transparency best practices and inform end-users they are interacting with an AI system. At the application level, developers can build feedback mechanisms and pipelines to ground responses in use-case specific, contextual information, a technique known as Retrieval Augmented Generation (RAG).
+ Generation of Harmful Content: Developers should assess outputs for their context and use available safety classifiers or custom solutions appropriate for their use case.
+ Misuse: Other forms of misuse such as fraud, spam, or malware production may be possible, and developers should ensure that their applications do not violate applicable laws and regulations.
## Training
### Model
* Architecture: Phi-3 Mini-128K-Instruct has 3.8B parameters and is a dense decoder-only Transformer model. The model is fine-tuned with supervised fine-tuning (SFT) and Direct Preference Optimization (DPO) to ensure alignment with human preferences and safety guidelines.
* Inputs: Text. It is best suited for prompts using chat format.
* Context length: 128K tokens
* GPUs: 512 H100-80G
* Training time: 7 days
* Training data: 3.3T tokens
* Outputs: Generated text in response to the input
* Dates: Our models were trained between February and April 2024
* Status: This is a static model trained on an offline dataset with cutoff date October 2023. Future versions of the tuned models may be released as we improve models.
### Datasets
Our training data includes a wide variety of sources, totaling 3.3 trillion tokens, and is a combination of
1) Publicly available documents filtered rigorously for quality, selected high-quality educational data, and code;
2) Newly created synthetic, “textbook-like” data for the purpose of teaching math, coding, common sense reasoning, general knowledge of the world (science, daily activities, theory of mind, etc.);
3) High quality chat format supervised data covering various topics to reflect human preferences on different aspects such as instruct-following, truthfulness, honesty and helpfulness.
### Fine-tuning
A basic example of multi-GPUs supervised fine-tuning (SFT) with TRL and Accelerate modules is provided [here](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/resolve/main/sample_finetune.py).
## Benchmarks
We report the results for Phi-3-Mini-128K-Instruct on standard open-source benchmarks measuring the model's reasoning ability (both common sense reasoning and logical reasoning). We compare to Phi-2, Mistral-7b-v0.1, Mixtral-8x7b, Gemma 7B, Llama-3-8B-Instruct, and GPT-3.5.
All the reported numbers are produced with the exact same pipeline to ensure that the numbers are comparable. These numbers might differ from other published numbers due to slightly different choices in the evaluation.
As is now standard, we use few-shot prompts to evaluate the models, at temperature 0.
The prompts and number of shots are part of a Microsoft internal tool to evaluate language models, and in particular we did no optimization to the pipeline for Phi-3.
More specifically, we do not change prompts, pick different few-shot examples, change prompt format, or do any other form of optimization for the model.
The number of k–shot examples is listed per-benchmark.
| | Phi-3-Mini-128K-In<br>3.8b | Phi-3-Small<br>7b (preview) | Phi-3-Medium<br>14b (preview) | Phi-2<br>2.7b | Mistral<br>7b | Gemma<br>7b | Llama-3-In<br>8b | Mixtral<br>8x7b | GPT-3.5<br>version 1106 |
|---|---|---|---|---|---|---|---|---|---|
| MMLU <br>5-Shot | 68.1 | 75.3 | 78.2 | 56.3 | 61.7 | 63.6 | 66.5 | 68.4 | 71.4 |
| HellaSwag <br> 5-Shot | 74.5 | 78.7 | 83.2 | 53.6 | 58.5 | 49.8 | 71.1 | 70.4 | 78.8 |
| ANLI <br> 7-Shot | 52.8 | 55.0 | 58.7 | 42.5 | 47.1 | 48.7 | 57.3 | 55.2 | 58.1 |
| GSM-8K <br> 0-Shot; CoT | 83.6 | 86.4 | 90.8 | 61.1 | 46.4 | 59.8 | 77.4 | 64.7 | 78.1 |
| MedQA <br> 2-Shot | 55.3 | 58.2 | 69.8 | 40.9 | 49.6 | 50.0 | 60.5 | 62.2 | 63.4 |
| AGIEval <br> 0-Shot | 36.9 | 45.0 | 49.7 | 29.8 | 35.1 | 42.1 | 42.0 | 45.2 | 48.4 |
| TriviaQA <br> 5-Shot | 57.1 | 59.1 | 73.3 | 45.2 | 72.3 | 75.2 | 67.7 | 82.2 | 85.8 |
| Arc-C <br> 10-Shot | 84.0 | 90.7 | 91.9 | 75.9 | 78.6 | 78.3 | 82.8 | 87.3 | 87.4 |
| Arc-E <br> 10-Shot | 95.2 | 97.1 | 98.0 | 88.5 | 90.6 | 91.4 | 93.4 | 95.6 | 96.3 |
| PIQA <br> 5-Shot | 83.6 | 87.8 | 88.2 | 60.2 | 77.7 | 78.1 | 75.7 | 86.0 | 86.6 |
| SociQA <br> 5-Shot | 76.1 | 79.0 | 79.4 | 68.3 | 74.6 | 65.5 | 73.9 | 75.9 | 68.3 |
| BigBench-Hard <br> 0-Shot | 71.5 | 75.0 | 82.5 | 59.4 | 57.3 | 59.6 | 51.5 | 69.7 | 68.32 |
| WinoGrande <br> 5-Shot | 72.5 | 82.5 | 81.2 | 54.7 | 54.2 | 55.6 | 65.0 | 62.0 | 68.8 |
| OpenBookQA <br> 10-Shot | 80.6 | 88.4 | 86.6 | 73.6 | 79.8 | 78.6 | 82.6 | 85.8 | 86.0 |
| BoolQ <br> 0-Shot | 78.7 | 82.9 | 86.5 | -- | 72.2 | 66.0 | 80.9 | 77.6 | 79.1 |
| CommonSenseQA <br> 10-Shot | 78.0 | 80.3 | 82.6 | 69.3 | 72.6 | 76.2 | 79 | 78.1 | 79.6 |
| TruthfulQA <br> 10-Shot | 63.2 | 68.1 | 74.8 | -- | 52.1 | 53.0 | 63.2 | 60.1 | 85.8 |
| HumanEval <br> 0-Shot | 57.9 | 59.1 | 54.7 | 47.0 | 28.0 | 34.1 | 60.4| 37.8 | 62.2 |
| MBPP <br> 3-Shot | 62.5 | 71.4 | 73.7 | 60.6 | 50.8 | 51.5 | 67.7 | 60.2 | 77.8 |
## Software
* [PyTorch](https://github.com/pytorch/pytorch)
* [DeepSpeed](https://github.com/microsoft/DeepSpeed)
* [Transformers](https://github.com/huggingface/transformers)
* [Flash-Attention](https://github.com/HazyResearch/flash-attention)
## Hardware
Note that by default, the Phi-3-mini model uses flash attention, which requires certain types of GPU hardware to run. We have tested on the following GPU types:
* NVIDIA A100
* NVIDIA A6000
* NVIDIA H100
If you want to run the model on:
* NVIDIA V100 or earlier generation GPUs: call `AutoModelForCausalLM.from_pretrained()` with `attn_implementation="eager"` (a minimal example is sketched after this list)
* Optimized inference on GPU, CPU, and Mobile: use the **ONNX** models [128K](https://aka.ms/phi3-mini-128k-instruct-onnx)
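A minimal loading sketch for GPUs without flash attention support (e.g. V100), mirroring the sample inference code above:
```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-128k-instruct",
    device_map="cuda",
    torch_dtype="auto",
    trust_remote_code=True,
    attn_implementation="eager",  # disable flash attention for older GPUs
)
```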
## Cross Platform Support
ONNX runtime ecosystem now supports Phi-3 Mini models across platforms and hardware. You can find the optimized Phi-3 Mini-128K-Instruct ONNX model [here](https://aka.ms/phi3-mini-128k-instruct-onnx).
Optimized Phi-3 models are also published here in ONNX format, to run with ONNX Runtime on CPU and GPU across devices, including server platforms, Windows, Linux and Mac desktops, and mobile CPUs, with the precision best suited to each of these targets. DirectML support lets developers bring hardware acceleration to Windows devices at scale across AMD, Intel, and NVIDIA GPUs.
Along with DirectML, ONNX Runtime provides cross platform support for Phi-3 across a range of devices: CPU, GPU, and mobile.
Here are some of the optimized configurations we have added:
1. ONNX models for int4 DML: Quantized to int4 via AWQ
2. ONNX model for fp16 CUDA
3. ONNX model for int4 CUDA: Quantized to int4 via RTN
4. ONNX model for int4 CPU and Mobile: Quantized to int4 via RTN
## License
The model is licensed under the [MIT license](https://huggingface.co/microsoft/Phi-3-mini-128k/resolve/main/LICENSE).
## Trademarks
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow [Microsoft’s Trademark & Brand Guidelines](https://www.microsoft.com/en-us/legal/intellectualproperty/trademarks). Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos are subject to those third-party’s policies.
| [
"MEDQA"
] |
DehydratedWater42/SeELLama-GGUF | DehydratedWater42 | text-generation | [
"transformers",
"gguf",
"math",
"semantic",
"extraction",
"graph",
"relations",
"science",
"synthetic",
"text-generation",
"en",
"dataset:DehydratedWater42/semantic_relations_extraction",
"license:llama2",
"region:us"
] | "2024-04-27T14:00:33Z" | 2024-04-27T22:21:15+00:00 | 0 | 1 | ---
datasets:
- DehydratedWater42/semantic_relations_extraction
language:
- en
library_name: transformers
license: llama2
pipeline_tag: text-generation
tags:
- math
- semantic
- extraction
- graph
- relations
- science
- synthetic
inference: false
---
# SeELLama (Semantic Extraction LLama)
The model is based on Llama-2-7b and fine-tuned with the `DehydratedWater42/semantic_relations_extraction` dataset.
The purpose of this model is to extract semantic relations from text in a structured way.
#### Simplified Example:
- **Initial Text**: "While there is beautiful weather outside the building, from the window we can see a car. And what's the most annoying, pigeons love to sit on that car."
- **Entities**: ["pigeon", "car", "building"]
- **Relations between entities**: {"pigeon -> car": "pigeon sits on the car", "car -> building": "car is parked outside the building"}
**Note:** The text example above is **too short** for the actual model; please use **at least 500-token text** segments for extraction to avoid hallucinations.
### Other versions:
- **Get SeELLama as Safetensors:** [DehydratedWater42/SeELLama](https://huggingface.co/DehydratedWater42/SeELLama)
- **Get SeELLama as adapter:** [DehydratedWater42/SeELLama-qlora-adapter](https://huggingface.co/DehydratedWater42/SeELLama-qlora-adapter)
***
## How to use it:
### Template:
Use the **prompt template** provided below to extract relations from text. Replace `<<your_text_for_extraction>>` with your selected text, ideally between 500-1500 tokens,
with an **optimal range** of about **800-1000 tokens**. You can adjust the **temperature** between 0.3 and 1.0; a good starting point is **between 0.6 and 0.7**.
Temperatures below 0.3 may lead to a never-ending `section_description`. The higher the temperature, the more the model will fill in gaps in the provided text.
It was **fine-tuned on scientific articles**, so it will supplement missing information with general knowledge.
The model was trained with a 2560-token context length, where 1000-1500 tokens were used as the input text.
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
<<your_text_for_extraction>>
### Extracted Relations:
{
"section_description":
```
The `JSON` opening is not necessary, but it improves stability. Remember to use a double `{{` instead of a single `{` if you are using LangChain prompts with f-string formatting.
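As a small illustration of that escaping rule (the same one LangChain's f-string templates follow), plain `str.format` treats `{{` as a literal `{`:
```python
# "{{" renders as a literal "{", while "{input_text}" is substituted.
# The wording mirrors the training prompt above on purpose.
template = (
    "Below is an part of larger text. Your task is to extract information about "
    "entities and relations to the JSON format.\n\n"
    "### Text Part to Extract From:\n"
    "{input_text}\n\n"
    "### Extracted Relations:\n"
    "{{\n"
    '    "section_description":\n'
)
prompt = template.format(input_text="<<your_text_for_extraction>>")
print(prompt)
```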
### Quantized versions:
- `q2_k`: Uses Q4_K for the attention.wv and feed_forward.w2 tensors, Q2_K for the other tensors.
- `q3_k_l`: Uses Q5_K for the attention.wv, attention.wo, and feed_forward.w2 tensors, else Q3_K
- `q3_k_m`: Uses Q4_K for the attention.wv, attention.wo, and feed_forward.w2 tensors, else Q3_K
- `q3_k_s`: Uses Q3_K for all tensors
- `q4_0`: Original quant method, 4-bit.
- `q4_1`: Higher accuracy than q4_0 but not as high as q5_0. However, it has quicker inference than q5 models.
- `q4_k_m`: Uses Q6_K for half of the attention.wv and feed_forward.w2 tensors, else Q4_K
- `q4_k_s`: Uses Q4_K for all tensors
- `q5_0`: Higher accuracy, higher resource usage and slower inference.
- `q5_1`: Even higher accuracy, resource usage and slower inference.
- `q5_k_m`: Uses Q6_K for half of the attention.wv and feed_forward.w2 tensors, else Q5_K
- `q5_k_s`: Uses Q5_K for all tensors
- `q6_k`: Uses Q8_K for all tensors
- `q8_0`: Almost indistinguishable from float16. High resource use and slow. Not recommended for most users.
### Starter code
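A minimal sketch using `llama-cpp-python`; the GGUF file name is an assumption, so point `model_path` at whichever quantization you downloaded, and keep the trailing JSON opening consistent with the `json_start` string used in the extraction snippet below:
```python
from llama_cpp import Llama

llm = Llama(model_path="seellama.q4_k_m.gguf", n_ctx=2560)  # hypothetical file name

prompt = """Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.

### Text Part to Extract From:
<<your_text_for_extraction>>

### Extracted Relations:
{
    "section_description":
"""

result = llm(prompt, max_tokens=1024, temperature=0.7)
# Keep the prompt attached so downstream parsing can recover the full JSON.
generated_text = prompt + result["choices"][0]["text"]
```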
### Extract generated json
```python
import json
json_start = """{
"section_description":
"""
# `generated_text` is assumed to contain both the prompt and the completion,
# so everything after the JSON opening is the generated part.
to_parse = json_start + generated_text.split(json_start)[1]
print(json.loads(to_parse))
```
### Example:
Extracting information from Minecraft Wiki.
#### Initial template
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Players in Survival, Adventure, or Spectator have access to the 2×2 crafting grid from their inventory screen. Small crafting recipes that are at most 2×2 can be made there. These include wooden planks, sticks, crafting tables, torches and some shapeless recipes. To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it. This brings up an interface with a larger, 3×3 crafting grid, which the player can use to craft any crafting recipe in the game. There is also a recipe book where the player can access all the crafting recipes.
Some recipes do not require their ingredients to be arranged in a specific way on the crafting grid. These are commonly known as shapeless recipes. For example, players may craft a fermented spider eye by placing its ingredients anywhere within the grid or like a flint and steel can be crafted in any orientation.
On the contrary, many recipes must have their ingredients placed in the correct relative positions on the crafting grid. These are commonly known as shaped recipes. Ingredients in shaped recipes can be ‘moved’ up, down, left, or right. They can also be flipped side-ways. For example, a 3×1 recipe, such as bread, can be made using the top, middle, or bottom row of the 3×3 grid, and a bow may be made with the strings placed on the left instead of on the right.
There are recipes that may not be moved or mirrored in this way. These are commonly known as fixed recipes. For example, dyes in banner recipes - only available in Bedrock Edition - must be specifically placed to achieve the desired pattern. Fixed recipes can be added by data packs, add-ons, or mods.
### Extracted Relations:
{
"section_description":
```
#### Result (about game):
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Players in Survival, Adventure, or Spectator have access to the 2×2 crafting grid from their inventory screen. Small crafting recipes that are at most 2×2 can be made there. These include wooden planks, sticks, crafting tables, torches and some shapeless recipes. To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it. This brings up an interface with a larger, 3×3 crafting grid, which the player can use to craft any crafting recipe in the game. There is also a recipe book where the player can access all the crafting recipes.
Some recipes do not require their ingredients to be arranged in a specific way on the crafting grid. These are commonly known as shapeless recipes. For example, players may craft a fermented spider eye by placing its ingredients anywhere within the grid or like a flint and steel can be crafted in any orientation.
On the contrary, many recipes must have their ingredients placed in the correct relative positions on the crafting grid. These are commonly known as shaped recipes. Ingredients in shaped recipes can be ‘moved’ up, down, left, or right. They can also be flipped side-ways. For example, a 3×1 recipe, such as bread, can be made using the top, middle, or bottom row of the 3×3 grid, and a bow may be made with the strings placed on the left instead of on the right.
There are recipes that may not be moved or mirrored in this way. These are commonly known as fixed recipes. For example, dyes in banner recipes - only available in Bedrock Edition - must be specifically placed to achieve the desired pattern. Fixed recipes can be added by data packs, add-ons, or mods.
### Extracted Relations:
{
"section_description": "In Minecraft, players can craft items using the 2x2 crafting grid. There are some fixed recipes that require the player to place their ingredients in a specific position.",
"list_of_entities": [
"crafting recipes",
"player",
"fixed recipes",
"2x2 crafting grid",
"survival, adventure, or spectator players",
"wooden planks",
"torches",
"shaped recipes",
"small crafting recipes",
"shapeless recipes",
"wooden planks, sticks, crafting tables, torches",
"flint and steel"
],
"relations": [
{
"description": "Players have access to the 2x2 crafting grid from their inventory screen.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"player"
],
"strength": "strong"
},
{
"description": "Small crafting recipes that are at most 2x2 can be made using the 2x2 crafting grid.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"survival, adventure, or spectator players"
],
"strength": "strong"
},
{
"description": "To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"wooden planks"
],
"strength": "strong"
},
{
"description": "To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"crafting tables"
],
"strength": "strong"
},
{
"description": "Crafting tables allow players to make bigger crafting recipes that are not possible with the 2x2 crafting grid.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"torches",
"flint and steel",
"wooden planks"
],
"strength": "strong"
},
{
"description": "Crafting tables are used to craft items using the 2x2 crafting grid.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"wooden planks, sticks, crafting tables, torches"
],
"strength": "strong"
},
{
"description": "There are some fixed recipes that require the player to place their ingredients in a specific position.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"fixed recipes"
],
"strength": "strong"
},
{
"description": "The recipes do not require their ingredients to be arranged in a specific way on the crafting grid.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"shapeless recipes"
],
"strength": "strong"
},
{
"description": "Ingredients in shapeless recipes can be ‘moved’ up, down, left, or right.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"shaped recipes"
],
"strength": "strong"
},
{
"description": "Ingredients in shaped recipes can be flipped side-ways.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"fixed recipes"
],
"strength": "strong"
},
{
"description": "There are some fixed recipes that require the player to place their ingredients in a specific position.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"fixed recipes"
],
"strength": "strong"
},
{
"description": "The recipes can be added by data packs, add-ons, or mods.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"fixed recipes"
],
"strength": "strong"
}
]
}
```
### Smaller example (About crafting)
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Players in Survival, Adventure, or Spectator have access to the 2×2 crafting grid from their inventory screen. Small crafting recipes that are at most 2×2 can be made there. These include wooden planks, sticks, crafting tables, torches and some shapeless recipes. To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it. This brings up an interface with a larger, 3×3 crafting grid, which the player can use to craft any crafting recipe in the game. There is also a recipe book where the player can access all the crafting recipes.
### Extracted Relations:
{
"section_description": "Survival, Adventure, and Spectator players have access to the 2×2 crafting grid. However, to craft items using a bigger grid, you need to create a crafting table with wooden planks, place it in the world, and press use while facing it.",
"list_of_entities": [
"Survival",
"wooden planks",
"Spectator",
"Adventure",
"crafting grid"
],
"relations": [
{
"description": "Survival players can access the 2×2 crafting grid.",
"source_entities": [
"Survival"
],
"target_entities": [
"crafting grid"
],
"strength": "strong"
},
{
"description": "Adventure and Spectator players can also access the 2×2 crafting grid.",
"source_entities": [
"Adventure"
],
"target_entities": [
"crafting grid"
],
"strength": "strong"
},
{
"description": "To craft items using a bigger grid, you need to create a crafting table with wooden planks.",
"source_entities": [
"Spectator"
],
"target_entities": [
"crafting grid"
],
"strength": "strong"
}
]
}
```
### Dopamine example
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Dopamine is synthesized in a restricted set of cell types, mainly neurons and cells in the medulla of the adrenal glands.[23] The primary and minor metabolic pathways respectively are:
Primary: L-Phenylalanine → L-Tyrosine → L-DOPA → Dopamine[20][21]
Minor: L-Phenylalanine → L-Tyrosine → p-Tyramine → Dopamine[20][21][22]
Minor: L-Phenylalanine → m-Tyrosine → m-Tyramine → Dopamine[22][24][25]
The direct precursor of dopamine, L-DOPA, can be synthesized indirectly from the essential amino acid phenylalanine or directly from the non-essential amino acid tyrosine.[26] These amino acids are found in nearly every protein and so are readily available in food, with tyrosine being the most common. Although dopamine is also found in many types of food, it is incapable of crossing the blood–brain barrier that surrounds and protects the brain.[27] It must therefore be synthesized inside the brain to perform its neuronal activity.[27]
L-Phenylalanine is converted into L-tyrosine by the enzyme phenylalanine hydroxylase, with molecular oxygen (O2) and tetrahydrobiopterin as cofactors. L-Tyrosine is converted into L-DOPA by the enzyme tyrosine hydroxylase, with tetrahydrobiopterin, O2, and iron (Fe2+) as cofactors.[26] L-DOPA is converted into dopamine by the enzyme aromatic L-amino acid decarboxylase (also known as DOPA decarboxylase), with pyridoxal phosphate as the cofactor.[26]
Dopamine itself is used as precursor in the synthesis of the neurotransmitters norepinephrine and epinephrine.[26] Dopamine is converted into norepinephrine by the enzyme dopamine β-hydroxylase, with O2 and L-ascorbic acid as cofactors.[26] Norepinephrine is converted into epinephrine by the enzyme phenylethanolamine N-methyltransferase with S-adenosyl-L-methionine as the cofactor.[26]
Some of the cofactors also require their own synthesis.[26] Deficiency in any required amino acid or cofactor can impair the synthesis of dopamine, norepinephrine, and epinephrine.[26]
Degradation
Dopamine is broken down into inactive metabolites by a set of enzymes—monoamine oxidase (MAO), catechol-O-methyl transferase (COMT), and aldehyde dehydrogenase (ALDH), acting in sequence.[28] Both isoforms of monoamine oxidase, MAO-A and MAO-B, effectively metabolize dopamine.[26] Different breakdown pathways exist but the main end-product is homovanillic acid (HVA), which has no known biological activity.[28] From the bloodstream, homovanillic acid is filtered out by the kidneys and then excreted in the urine.[28] The two primary metabolic routes that convert dopamine into HVA are:[29]
Dopamine → DOPAL → DOPAC → HVA – catalyzed by MAO, ALDH, and COMT respectively
Dopamine → 3-Methoxytyramine → HVA – catalyzed by COMT and MAO+ALDH respectively
In clinical research on schizophrenia, measurements of homovanillic acid in plasma have been used to estimate levels of dopamine activity in the brain. A difficulty in this approach however, is separating the high level of plasma homovanillic acid contributed by the metabolism of norepinephrine.[30][31]
Although dopamine is normally broken down by an oxidoreductase enzyme, it is also susceptible to oxidation by direct reaction with oxygen, yielding quinones plus various free radicals as products.[32] The rate of oxidation can be increased by the presence of ferric iron or other factors. Quinones and free radicals produced by autoxidation of dopamine can poison cells, and there is evidence that this mechanism may contribute to the cell loss that occurs in Parkinson's disease and other conditions.[33]
### Extracted Relations:
{
"section_description": "Dopamine is a neurotransmitter that plays a crucial role in regulating movement, motivation, and reward processing. It is produced in several regions of the brain, including the substantia nigra, ventral tegmental area, and hypothalamus. Dopamine is involved in several physiological functions such as motor control, cognition, and emotional responses. Low levels of dopamine have been linked to several neurological disorders, including Parkinson's disease, schizophrenia, and addiction. Dopamine is also involved in the reward system, which is responsible for motivation and pleasure. The brain's reward system is activated by the release of dopamine in response to pleasurable experiences, such as eating, sex, and drugs. Dopamine is synthesized from the amino acid L-tyrosine, which is derived from dietary sources. The primary precursor of dopamine is L-DOPA, which is synthesized from L-tyrosine by the enzyme tyrosine hydroxylase. Dopamine is then converted into norepinephrine and epinephrine by the enzyme dopamine β-hydroxylase. Dopamine is broken down into inactive metabolites by a set of enzymes, including monoamine oxidase (MAO), catechol-O-methyl transferase (COMT), and aldehyde dehydrogenase (ALDH). The two primary metabolic routes that convert dopamine into HVA are: Dopamine → DOPAL → DOPAC → HVA - catalyzed by MAO, ALDH, and COMT respectively Dopamine → 3-Methoxytyramine → HVA - catalyzed by COMT and MAO+ALDH respectively In clinical research on schizophrenia, measurements of homovanillic acid in plasma have been used to estimate levels of dopamine activity in the brain. A difficulty in this approach however, is separating the high level of plasma homovanillic acid contributed by the metabolism of norepinephrine. Dopamine is also susceptible to oxidation by direct reaction with oxygen, yielding quinones plus various free radicals as products. The rate of oxidation can be increased by the presence of ferric iron or other factors. Quinones and free radicals produced by autoxidation of dopamine can poison cells, and there is evidence that this mechanism may contribute to the cell loss that occurs in Parkinson's disease and other conditions. ",
"list_of_entities": [
"motivation",
"Parkinson's disease",
"cognition",
"pleasure",
"dopamine",
"L-tyrosine",
"schizophrenia",
"emotional responses",
"L-DOPA",
"dopamine β-hydroxylase",
"dopamine β-hydroxylase",
"L-DOPA",
"dopamine",
"L-tyrosine",
"dopamine β-hydroxylase",
"L-DOPA",
"L-tyrosine",
"L-DOPA",
"dopamine",
"L-DOPA",
"dopamine"
],
"relations": [
{
"description": "Dopamine is synthesized from the amino acid L-tyrosine, which is derived from dietary sources.",
"source_entities": [
"dopamine"
],
"target_entities": [
"L-tyrosine"
]
},
{
"description": "The primary precursor of dopamine is L-DOPA, which is synthesized from L-tyrosine by the enzyme tyrosine hydroxylase.",
"source_entities": [
"L-DOPA"
],
"target_entities": [
"dopamine"
]
},
{
"description": "Dopamine is then converted into norepinephrine and epinephrine by the enzyme dopamine β-hydroxylase.",
"source_entities": [
"dopamine"
],
"target_entities": [
"dopamine β-hydroxylase"
]
},
{
"description": "Dopamine is broken down into inactive metabolites by a set of enzymes, including monoamine oxidase (MAO), catechol-O-methyl transferase (COMT), and aldehyde dehydrogenase (ALDH).",
"source_entities": [
"dopamine"
],
"target_entities": [
"monoamine oxidase (MAO)",
"catechol-O-methyl transferase (COMT)",
"aldehyde dehydrogenase (ALDH)"
]
},
{
"description": "The two primary metabolic routes that convert dopamine into HVA are: Dopamine → DOPAL → DOPAC → HVA - catalyzed by MAO, ALDH, and COMT respectively Dopamine → 3-Methoxytyramine → HVA - catalyzed by COMT and MAO+ALDH respectively",
"source_entities": [
"dopamine"
],
"target_entities": [
"HVA",
"MAO",
"ALDH",
"COMT"
]
},
{
"description": "In clinical research on schizophrenia, measurements of homovanillic acid in plasma have been used to estimate levels of dopamine activity in the brain.",
"source_entities": [
"dopamine"
],
"target_entities": [
"homovanillic acid"
]
},
{
"description": "A difficulty in this approach however, is separating the high level of plasma homovanillic acid contributed by the metabolism of norepinephrine.",
"source_entities": [
"homovanillic acid"
],
"target_entities": [
"norepinephrine"
]
},
{
"description": "Dopamine is also susceptible to oxidation by direct reaction with oxygen, yielding quinones plus various free radicals as products.",
"source_entities": [
"dopamine"
],
"target_entities": [
"oxidation"
]
},
{
"description": "The rate of oxidation can be increased by the presence of ferric iron or other factors.",
"source_entities": [
"dopamine"
],
"target_entities": [
"ferric iron"
]
},
{
"description": "Quinones and free radicals produced by autoxidation of dopamine can poison cells, and there is evidence that this mechanism may contribute to the cell loss that occurs in Parkinson's disease and other conditions.",
"source_entities": [
"dopamine"
],
"target_entities": [
"cell loss"
]
}
]
}
```
### Longer example (dirt block)
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Usage
Using bone meal on grass blocks causes short grass, tall grass,[Java Edition only] ferns,[Bedrock Edition only] and flowers to grow.
Passive mobs tend to wander toward grass blocks. They also wander toward light.
When a sheep eats a grass block, the block becomes dirt, and a sheared sheep regrows its wool. Baby sheep graze grass much more often than adults and mature 1 minute faster when grazing.
Tilling a grass block with a hoe converts it to a farmland block.
Dirt paths can be created by using any type of shovel on the side or top of a grass block with air above it. The shovel loses 1 durability for each dirt path block created.
Death
Grass dies and changes to dirt a random time (when a random tick lands on the block) after it has been covered by any opaque blocks. Transparent blocks can kill grass in a similar manner if they cause the light level above the grass block to be four or below (like water does), and the surrounding area is not otherwise sufficiently illuminated.
Spawning
In Java Edition, animals occasionally spawn on grass blocks that have light level 9 or brighter in the space directly above. This is quite rare and requires that there be few other animals nearby or in the spawn chunks. Most animals are instead created along with the terrain.
In Bedrock Edition, most animals require surface grass blocks (grass blocks with no solid block anywhere above them at the same x, z coordinate) to spawn. Animals regularly spawn in most Overworld biomes on surface grass blocks that have light level 7 or brighter in the space directly above.
Snowy grass block
A snowy grass block with snow on top of it.
Grass blocks turn into a snowy grass block if snow, powder snow or a snow block is placed on top of the block. Removing the snow layer turns the block into a normal grass block.
Composting
In Bedrock Edition, placing grass blocks into a composter has a 30% chance of raising the compost level by 1.
### Extracted Relations:
{
"section_description": "The article discusses the various ways in which grass blocks can be used in Minecraft, including as food, fuel, and protection. It also covers how grass blocks can be used to tame animals, and the dangers of being exposed to grass. The article concludes by highlighting the importance of knowing how to use and defend against grass blocks.",
"list_of_entities": [
"dirt",
"Minecraft",
"light level",
"animal",
"surface grass blocks",
"Bedrock Edition",
"Java Edition",
"dirt path",
"grass",
"snow",
"snowy grass block",
"opaque blocks",
"sheep",
"composter",
"transparent blocks"
],
"relations": [
{
"description": "Grass blocks are a renewable resource that can be used as a food item.",
"source_entities": [
"Minecraft",
"grass"
],
"target_entities": [
"animal",
"surface grass blocks"
],
"strength": "strong"
},
{
"description": "Grass blocks can be used to tame animals.",
"source_entities": [
"Minecraft",
"grass"
],
"target_entities": [
"animal",
"surface grass blocks"
],
"strength": "strong"
},
{
"description": "Exposure to grass blocks can cause damage to the player's health.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"grass",
"player"
],
"strength": "strong"
},
{
"description": "Placing grass blocks in a composter has a 30% chance of raising the compost level by 1.",
"source_entities": [
"Minecraft",
"grass"
],
"target_entities": [
"composter",
"grass"
],
"strength": "strong"
},
{
"description": "Surface grass blocks are the only ones that can be used to spawn animals.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"surface grass blocks",
"animal"
],
"strength": "strong"
},
{
"description": "Animals regularly spawn in most Overworld biomes on surface grass blocks that have light level 7 or brighter in the space directly above.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"surface grass blocks",
"animal"
],
"strength": "strong"
},
{
"description": "If you walk into the grass without first getting rid of any animals or monsters that are there, they will attack you.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"grass",
"player"
],
"strength": "moderate"
},
{
"description": "Placing grass blocks with snow on top of them turns them into snowy grass blocks.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"snow",
"grass"
],
"strength": "strong"
},
{
"description": "Removing the snow layer turns the block into a normal grass block.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"snowy grass block",
"grass"
],
"strength": "strong"
},
{
"description": "Dirt path blocks can be created by using any type of shovel on the side or top of a grass block with air above it.",
"source_entities": [
"Minecraft",
"grass"
],
"target_entities": [
"dirt path",
"shovel"
],
"strength": "strong"
},
{
"description": "The shovel loses 1 durability for each dirt path block created.",
"source_entities": [
"Minecraft",
"shovel"
],
"target_entities": [
"dirt path",
"shovel"
],
"strength": "moderate"
},
{
"description": "Death grass block dies and changes to dirt a random time (when a random tick lands on the block)",
"source_entities": [
"Minecraft"
],
"target_entities": [
"death grass block",
"dirt"
],
"strength": "strong"
},
{
"description": "Grass can be used to create dirt paths",
"source_entities": [
"Minecraft"
],
"target_entities": [
"grass",
"dirt path"
],
"strength": "strong"
}
]
}
``` | [
"CRAFT"
] |
DehydratedWater42/SeELLama-qlora-adapter | DehydratedWater42 | null | [
"peft",
"safetensors",
"math",
"semantic",
"extraction",
"graph",
"relations",
"science",
"synthetic",
"en",
"dataset:DehydratedWater42/semantic_relations_extraction",
"base_model:NousResearch/Llama-2-7b-hf",
"base_model:adapter:NousResearch/Llama-2-7b-hf",
"license:llama2",
"region:us"
] | "2024-04-27T19:39:19Z" | 2024-04-27T22:25:04+00:00 | 0 | 0 | ---
base_model: NousResearch/Llama-2-7b-hf
datasets:
- DehydratedWater42/semantic_relations_extraction
language:
- en
library_name: peft
license: llama2
tags:
- math
- semantic
- extraction
- graph
- relations
- science
- synthetic
inference: false
---
# SeELLama (Semantic Extraction LLama)
The model is based on Llama-2-7b and fine-tuned with the `DehydratedWater42/semantic_relations_extraction` dataset.
The purpose of this model is to extract semantic relations from text in a structured way.
#### Simplified Example:
- **Initial Text**: "While there is beautiful weather outside the building, from the window we can see a car. And what's the most annoying, pigeons love to sit on that car."
- **Entities**: ["pigeon", "car", "building"]
- **Relations between entities**: {"pigeon -> car": "pigeon sits on the car", "car -> building": "car is parked outside the building"}
**Note:** The text example above is **too short** for the actual model; please use **at least 500-token text** segments for extraction to avoid hallucinations.
### This is just an adapter for `NousResearch/Llama-2-7b-hf`
- **Get SeELLama as Safetensors:** [DehydratedWater42/SeELLama](https://huggingface.co/DehydratedWater42/SeELLama)
- **Get SeELLama as GGUF:** [DehydratedWater42/SeELLama-GGUF](https://huggingface.co/DehydratedWater42/SeELLama-GGUF)
## How to use it:
### Template:
Use the **prompt template** provided below to extract relations from text. Replace `<<your_text_for_extraction>>` with your selected text, ideally between 500-1500 tokens,
with an **optimal range** of about **800-1000 tokens**. You can adjust the **temperature** between 0.3 and 1.0; a good starting point is **between 0.6 and 0.7**.
Temperatures below 0.3 may lead to a never-ending `section_description`. The higher the temperature, the more the model will fill in gaps in the provided text.
It was **fine-tuned on scientific articles**, so it will supplement missing information with general knowledge.
The model was trained with a 2560-token context length, where 1000-1500 tokens were used as the input text.
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
<<your_text_for_extraction>>
### Extracted Relations:
{
"section_description":
```
The `JSON` opening is not necessary, but it improves stability. Remember to use a double `{{` instead of a single `{` if you are using LangChain prompts with f-string formatting.
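A rough loading sketch with `peft`, pairing this adapter with the base model (the 4-bit loading mirrors the fine-tuning configuration below and is optional):
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

base_id = "NousResearch/Llama-2-7b-hf"
adapter_id = "DehydratedWater42/SeELLama-qlora-adapter"

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, quantization_config=bnb_config, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)

prompt = "..."  # the template above, with <<your_text_for_extraction>> filled in
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=1024, temperature=0.7, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```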
## Fine-tuning code/settings
```python
import torch
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    TrainingArguments,
)
from peft import LoraConfig
from trl import SFTTrainer

tokenizer = AutoTokenizer.from_pretrained("NousResearch/Llama-2-7b-hf", trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "right"
torch.backends.cuda.matmul.allow_tf32 = True
torch.backends.cudnn.allow_tf32 = True
bnb_config = BitsAndBytesConfig(
load_in_4bit=True,
bnb_4bit_use_double_quant=True,
bnb_4bit_quant_type="nf4",
bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
"NousResearch/Llama-2-7b-hf",
quantization_config=bnb_config,
use_cache=False,
use_flash_attention_2=False,
device_map="auto",
)
model.config.pretraining_tp = 1
peft_params = LoraConfig(
lora_alpha=16,
lora_dropout=0.1,
r=64,
bias="none",
task_type="CAUSAL_LM",
)
training_params = TrainingArguments(
output_dir="./results",
num_train_epochs=3,
per_device_train_batch_size=1,
gradient_accumulation_steps=1,
optim="paged_adamw_32bit",
save_steps=100,
logging_steps=1,
learning_rate=2e-4,
weight_decay=0.001,
bf16=True,
fp16=False,
tf32=True,
max_grad_norm=0.3,
max_steps=-1,
warmup_ratio=0.03,
group_by_length=True,
lr_scheduler_type="constant",
report_to="mlflow",
run_name="semantic-extraction-llama2-7b"
)
trainer = SFTTrainer(
model=model,
    train_dataset=formated_dataset,  # prompt-formatted training split built from the dataset above
peft_config=peft_params,
dataset_text_field="text",
max_seq_length=int(1024 * 2.5), # 2560
tokenizer=tokenizer,
args=training_params,
packing=True,
)
```
### Example:
Extracting information from Minecraft Wiki.
#### Initial template
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Players in Survival, Adventure, or Spectator have access to the 2×2 crafting grid from their inventory screen. Small crafting recipes that are at most 2×2 can be made there. These include wooden planks, sticks, crafting tables, torches and some shapeless recipes. To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it. This brings up an interface with a larger, 3×3 crafting grid, which the player can use to craft any crafting recipe in the game. There is also a recipe book where the player can access all the crafting recipes.
Some recipes do not require their ingredients to be arranged in a specific way on the crafting grid. These are commonly known as shapeless recipes. For example, players may craft a fermented spider eye by placing its ingredients anywhere within the grid or like a flint and steel can be crafted in any orientation.
On the contrary, many recipes must have their ingredients placed in the correct relative positions on the crafting grid. These are commonly known as shaped recipes. Ingredients in shaped recipes can be ‘moved’ up, down, left, or right. They can also be flipped side-ways. For example, a 3×1 recipe, such as bread, can be made using the top, middle, or bottom row of the 3×3 grid, and a bow may be made with the strings placed on the left instead of on the right.
There are recipes that may not be moved or mirrored in this way. These are commonly known as fixed recipes. For example, dyes in banner recipes - only available in Bedrock Edition - must be specifically placed to achieve the desired pattern. Fixed recipes can be added by data packs, add-ons, or mods.
### Extracted Relations:
{
"section_description":
```
#### Result (about game):
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Players in Survival, Adventure, or Spectator have access to the 2×2 crafting grid from their inventory screen. Small crafting recipes that are at most 2×2 can be made there. These include wooden planks, sticks, crafting tables, torches and some shapeless recipes. To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it. This brings up an interface with a larger, 3×3 crafting grid, which the player can use to craft any crafting recipe in the game. There is also a recipe book where the player can access all the crafting recipes.
Some recipes do not require their ingredients to be arranged in a specific way on the crafting grid. These are commonly known as shapeless recipes. For example, players may craft a fermented spider eye by placing its ingredients anywhere within the grid or like a flint and steel can be crafted in any orientation.
On the contrary, many recipes must have their ingredients placed in the correct relative positions on the crafting grid. These are commonly known as shaped recipes. Ingredients in shaped recipes can be ‘moved’ up, down, left, or right. They can also be flipped side-ways. For example, a 3×1 recipe, such as bread, can be made using the top, middle, or bottom row of the 3×3 grid, and a bow may be made with the strings placed on the left instead of on the right.
There are recipes that may not be moved or mirrored in this way. These are commonly known as fixed recipes. For example, dyes in banner recipes - only available in Bedrock Edition - must be specifically placed to achieve the desired pattern. Fixed recipes can be added by data packs, add-ons, or mods.
### Extracted Relations:
{
"section_description": "In Minecraft, players can craft items using the 2x2 crafting grid. There are some fixed recipes that require the player to place their ingredients in a specific position.",
"list_of_entities": [
"crafting recipes",
"player",
"fixed recipes",
"2x2 crafting grid",
"survival, adventure, or spectator players",
"wooden planks",
"torches",
"shaped recipes",
"small crafting recipes",
"shapeless recipes",
"wooden planks, sticks, crafting tables, torches",
"flint and steel"
],
"relations": [
{
"description": "Players have access to the 2x2 crafting grid from their inventory screen.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"player"
],
"strength": "strong"
},
{
"description": "Small crafting recipes that are at most 2x2 can be made using the 2x2 crafting grid.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"survival, adventure, or spectator players"
],
"strength": "strong"
},
{
"description": "To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"wooden planks"
],
"strength": "strong"
},
{
"description": "To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"crafting tables"
],
"strength": "strong"
},
{
"description": "Crafting tables allow players to make bigger crafting recipes that are not possible with the 2x2 crafting grid.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"torches",
"flint and steel",
"wooden planks"
],
"strength": "strong"
},
{
"description": "Crafting tables are used to craft items using the 2x2 crafting grid.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"wooden planks, sticks, crafting tables, torches"
],
"strength": "strong"
},
{
"description": "There are some fixed recipes that require the player to place their ingredients in a specific position.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"fixed recipes"
],
"strength": "strong"
},
{
"description": "The recipes do not require their ingredients to be arranged in a specific way on the crafting grid.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"shapeless recipes"
],
"strength": "strong"
},
{
"description": "Ingredients in shapeless recipes can be ‘moved’ up, down, left, or right.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"shaped recipes"
],
"strength": "strong"
},
{
"description": "Ingredients in shaped recipes can be flipped side-ways.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"fixed recipes"
],
"strength": "strong"
},
{
"description": "There are some fixed recipes that require the player to place their ingredients in a specific position.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"fixed recipes"
],
"strength": "strong"
},
{
"description": "The recipes can be added by data packs, add-ons, or mods.",
"source_entities": [
"crafting recipes"
],
"target_entities": [
"fixed recipes"
],
"strength": "strong"
}
]
}
```
### Smaller example (About crafting)
```text
Below is an part of larger text. Your task is to extract information about entities and relations to the JSON format.
### Text Part to Extract From:
Players in Survival, Adventure, or Spectator have access to the 2×2 crafting grid from their inventory screen. Small crafting recipes that are at most 2×2 can be made there. These include wooden planks, sticks, crafting tables, torches and some shapeless recipes. To craft items using a bigger grid to allow more crafting recipes, create a crafting table with 4 wooden planks, place it in the world, and press use while facing it. This brings up an interface with a larger, 3×3 crafting grid, which the player can use to craft any crafting recipe in the game. There is also a recipe book where the player can access all the crafting recipes.
### Extracted Relations:
{
"section_description": "Survival, Adventure, and Spectator players have access to the 2×2 crafting grid. However, to craft items using a bigger grid, you need to create a crafting table with wooden planks, place it in the world, and press use while facing it.",
"list_of_entities": [
"Survival",
"wooden planks",
"Spectator",
"Adventure",
"crafting grid"
],
"relations": [
{
"description": "Survival players can access the 2×2 crafting grid.",
"source_entities": [
"Survival"
],
"target_entities": [
"crafting grid"
],
"strength": "strong"
},
{
"description": "Adventure and Spectator players can also access the 2×2 crafting grid.",
"source_entities": [
"Adventure"
],
"target_entities": [
"crafting grid"
],
"strength": "strong"
},
{
"description": "To craft items using a bigger grid, you need to create a crafting table with wooden planks.",
"source_entities": [
"Spectator"
],
"target_entities": [
"crafting grid"
],
"strength": "strong"
}
]
}
```
### Dopamine example
```text
Below is a part of a larger text. Your task is to extract information about entities and relations into JSON format.
### Text Part to Extract From:
Dopamine is synthesized in a restricted set of cell types, mainly neurons and cells in the medulla of the adrenal glands.[23] The primary and minor metabolic pathways respectively are:
Primary: L-Phenylalanine → L-Tyrosine → L-DOPA → Dopamine[20][21]
Minor: L-Phenylalanine → L-Tyrosine → p-Tyramine → Dopamine[20][21][22]
Minor: L-Phenylalanine → m-Tyrosine → m-Tyramine → Dopamine[22][24][25]
The direct precursor of dopamine, L-DOPA, can be synthesized indirectly from the essential amino acid phenylalanine or directly from the non-essential amino acid tyrosine.[26] These amino acids are found in nearly every protein and so are readily available in food, with tyrosine being the most common. Although dopamine is also found in many types of food, it is incapable of crossing the blood–brain barrier that surrounds and protects the brain.[27] It must therefore be synthesized inside the brain to perform its neuronal activity.[27]
L-Phenylalanine is converted into L-tyrosine by the enzyme phenylalanine hydroxylase, with molecular oxygen (O2) and tetrahydrobiopterin as cofactors. L-Tyrosine is converted into L-DOPA by the enzyme tyrosine hydroxylase, with tetrahydrobiopterin, O2, and iron (Fe2+) as cofactors.[26] L-DOPA is converted into dopamine by the enzyme aromatic L-amino acid decarboxylase (also known as DOPA decarboxylase), with pyridoxal phosphate as the cofactor.[26]
Dopamine itself is used as precursor in the synthesis of the neurotransmitters norepinephrine and epinephrine.[26] Dopamine is converted into norepinephrine by the enzyme dopamine β-hydroxylase, with O2 and L-ascorbic acid as cofactors.[26] Norepinephrine is converted into epinephrine by the enzyme phenylethanolamine N-methyltransferase with S-adenosyl-L-methionine as the cofactor.[26]
Some of the cofactors also require their own synthesis.[26] Deficiency in any required amino acid or cofactor can impair the synthesis of dopamine, norepinephrine, and epinephrine.[26]
Degradation
Dopamine is broken down into inactive metabolites by a set of enzymes—monoamine oxidase (MAO), catechol-O-methyl transferase (COMT), and aldehyde dehydrogenase (ALDH), acting in sequence.[28] Both isoforms of monoamine oxidase, MAO-A and MAO-B, effectively metabolize dopamine.[26] Different breakdown pathways exist but the main end-product is homovanillic acid (HVA), which has no known biological activity.[28] From the bloodstream, homovanillic acid is filtered out by the kidneys and then excreted in the urine.[28] The two primary metabolic routes that convert dopamine into HVA are:[29]
Dopamine → DOPAL → DOPAC → HVA – catalyzed by MAO, ALDH, and COMT respectively
Dopamine → 3-Methoxytyramine → HVA – catalyzed by COMT and MAO+ALDH respectively
In clinical research on schizophrenia, measurements of homovanillic acid in plasma have been used to estimate levels of dopamine activity in the brain. A difficulty in this approach however, is separating the high level of plasma homovanillic acid contributed by the metabolism of norepinephrine.[30][31]
Although dopamine is normally broken down by an oxidoreductase enzyme, it is also susceptible to oxidation by direct reaction with oxygen, yielding quinones plus various free radicals as products.[32] The rate of oxidation can be increased by the presence of ferric iron or other factors. Quinones and free radicals produced by autoxidation of dopamine can poison cells, and there is evidence that this mechanism may contribute to the cell loss that occurs in Parkinson's disease and other conditions.[33]
### Extracted Relations:
{
"section_description": "Dopamine is a neurotransmitter that plays a crucial role in regulating movement, motivation, and reward processing. It is produced in several regions of the brain, including the substantia nigra, ventral tegmental area, and hypothalamus. Dopamine is involved in several physiological functions such as motor control, cognition, and emotional responses. Low levels of dopamine have been linked to several neurological disorders, including Parkinson's disease, schizophrenia, and addiction. Dopamine is also involved in the reward system, which is responsible for motivation and pleasure. The brain's reward system is activated by the release of dopamine in response to pleasurable experiences, such as eating, sex, and drugs. Dopamine is synthesized from the amino acid L-tyrosine, which is derived from dietary sources. The primary precursor of dopamine is L-DOPA, which is synthesized from L-tyrosine by the enzyme tyrosine hydroxylase. Dopamine is then converted into norepinephrine and epinephrine by the enzyme dopamine β-hydroxylase. Dopamine is broken down into inactive metabolites by a set of enzymes, including monoamine oxidase (MAO), catechol-O-methyl transferase (COMT), and aldehyde dehydrogenase (ALDH). The two primary metabolic routes that convert dopamine into HVA are: Dopamine → DOPAL → DOPAC → HVA - catalyzed by MAO, ALDH, and COMT respectively Dopamine → 3-Methoxytyramine → HVA - catalyzed by COMT and MAO+ALDH respectively In clinical research on schizophrenia, measurements of homovanillic acid in plasma have been used to estimate levels of dopamine activity in the brain. A difficulty in this approach however, is separating the high level of plasma homovanillic acid contributed by the metabolism of norepinephrine. Dopamine is also susceptible to oxidation by direct reaction with oxygen, yielding quinones plus various free radicals as products. The rate of oxidation can be increased by the presence of ferric iron or other factors. Quinones and free radicals produced by autoxidation of dopamine can poison cells, and there is evidence that this mechanism may contribute to the cell loss that occurs in Parkinson's disease and other conditions. ",
"list_of_entities": [
"motivation",
"Parkinson's disease",
"cognition",
"pleasure",
"dopamine",
"L-tyrosine",
"schizophrenia",
"emotional responses",
"L-DOPA",
"dopamine β-hydroxylase",
"dopamine β-hydroxylase",
"L-DOPA",
"dopamine",
"L-tyrosine",
"dopamine β-hydroxylase",
"L-DOPA",
"L-tyrosine",
"L-DOPA",
"dopamine",
"L-DOPA",
"dopamine"
],
"relations": [
{
"description": "Dopamine is synthesized from the amino acid L-tyrosine, which is derived from dietary sources.",
"source_entities": [
"dopamine"
],
"target_entities": [
"L-tyrosine"
]
},
{
"description": "The primary precursor of dopamine is L-DOPA, which is synthesized from L-tyrosine by the enzyme tyrosine hydroxylase.",
"source_entities": [
"L-DOPA"
],
"target_entities": [
"dopamine"
]
},
{
"description": "Dopamine is then converted into norepinephrine and epinephrine by the enzyme dopamine β-hydroxylase.",
"source_entities": [
"dopamine"
],
"target_entities": [
"dopamine β-hydroxylase"
]
},
{
"description": "Dopamine is broken down into inactive metabolites by a set of enzymes, including monoamine oxidase (MAO), catechol-O-methyl transferase (COMT), and aldehyde dehydrogenase (ALDH).",
"source_entities": [
"dopamine"
],
"target_entities": [
"monoamine oxidase (MAO)",
"catechol-O-methyl transferase (COMT)",
"aldehyde dehydrogenase (ALDH)"
]
},
{
"description": "The two primary metabolic routes that convert dopamine into HVA are: Dopamine → DOPAL → DOPAC → HVA - catalyzed by MAO, ALDH, and COMT respectively Dopamine → 3-Methoxytyramine → HVA - catalyzed by COMT and MAO+ALDH respectively",
"source_entities": [
"dopamine"
],
"target_entities": [
"HVA",
"MAO",
"ALDH",
"COMT"
]
},
{
"description": "In clinical research on schizophrenia, measurements of homovanillic acid in plasma have been used to estimate levels of dopamine activity in the brain.",
"source_entities": [
"dopamine"
],
"target_entities": [
"homovanillic acid"
]
},
{
"description": "A difficulty in this approach however, is separating the high level of plasma homovanillic acid contributed by the metabolism of norepinephrine.",
"source_entities": [
"homovanillic acid"
],
"target_entities": [
"norepinephrine"
]
},
{
"description": "Dopamine is also susceptible to oxidation by direct reaction with oxygen, yielding quinones plus various free radicals as products.",
"source_entities": [
"dopamine"
],
"target_entities": [
"oxidation"
]
},
{
"description": "The rate of oxidation can be increased by the presence of ferric iron or other factors.",
"source_entities": [
"dopamine"
],
"target_entities": [
"ferric iron"
]
},
{
"description": "Quinones and free radicals produced by autoxidation of dopamine can poison cells, and there is evidence that this mechanism may contribute to the cell loss that occurs in Parkinson's disease and other conditions.",
"source_entities": [
"dopamine"
],
"target_entities": [
"cell loss"
]
}
]
}
```
### Longer example (dirt block)
```text
Below is a part of a larger text. Your task is to extract information about entities and relations into JSON format.
### Text Part to Extract From:
Usage
Using bone meal on grass blocks causes short grass, tall grass,[Java Edition only] ferns,[Bedrock Edition only] and flowers to grow.
Passive mobs tend to wander toward grass blocks. They also wander toward light.
When a sheep eats a grass block, the block becomes dirt, and a sheared sheep regrows its wool. Baby sheep graze grass much more often than adults and mature 1 minute faster when grazing.
Tilling a grass block with a hoe converts it to a farmland block.
Dirt paths can be created by using any type of shovel on the side or top of a grass block with air above it. The shovel loses 1 durability for each dirt path block created.
Death
Grass dies and changes to dirt a random time (when a random tick lands on the block) after it has been covered by any opaque blocks. Transparent blocks can kill grass in a similar manner if they cause the light level above the grass block to be four or below (like water does), and the surrounding area is not otherwise sufficiently illuminated.
Spawning
In Java Edition, animals occasionally spawn on grass blocks that have light level 9 or brighter in the space directly above. This is quite rare and requires that there be few other animals nearby or in the spawn chunks. Most animals are instead created along with the terrain.
In Bedrock Edition, most animals require surface grass blocks (grass blocks with no solid block anywhere above them at the same x, z coordinate) to spawn. Animals regularly spawn in most Overworld biomes on surface grass blocks that have light level 7 or brighter in the space directly above.
Snowy grass block
A snowy grass block with snow on top of it.
Grass blocks turn into a snowy grass block if snow, powder snow or a snow block is placed on top of the block. Removing the snow layer turns the block into a normal grass block.
Composting
In Bedrock Edition, placing grass blocks into a composter has a 30% chance of raising the compost level by 1.
### Extracted Relations:
{
"section_description": "The article discusses the various ways in which grass blocks can be used in Minecraft, including as food, fuel, and protection. It also covers how grass blocks can be used to tame animals, and the dangers of being exposed to grass. The article concludes by highlighting the importance of knowing how to use and defend against grass blocks.",
"list_of_entities": [
"dirt",
"Minecraft",
"light level",
"animal",
"surface grass blocks",
"Bedrock Edition",
"Java Edition",
"dirt path",
"grass",
"snow",
"snowy grass block",
"opaque blocks",
"sheep",
"composter",
"transparent blocks"
],
"relations": [
{
"description": "Grass blocks are a renewable resource that can be used as a food item.",
"source_entities": [
"Minecraft",
"grass"
],
"target_entities": [
"animal",
"surface grass blocks"
],
"strength": "strong"
},
{
"description": "Grass blocks can be used to tame animals.",
"source_entities": [
"Minecraft",
"grass"
],
"target_entities": [
"animal",
"surface grass blocks"
],
"strength": "strong"
},
{
"description": "Exposure to grass blocks can cause damage to the player's health.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"grass",
"player"
],
"strength": "strong"
},
{
"description": "Placing grass blocks in a composter has a 30% chance of raising the compost level by 1.",
"source_entities": [
"Minecraft",
"grass"
],
"target_entities": [
"composter",
"grass"
],
"strength": "strong"
},
{
"description": "Surface grass blocks are the only ones that can be used to spawn animals.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"surface grass blocks",
"animal"
],
"strength": "strong"
},
{
"description": "Animals regularly spawn in most Overworld biomes on surface grass blocks that have light level 7 or brighter in the space directly above.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"surface grass blocks",
"animal"
],
"strength": "strong"
},
{
"description": "If you walk into the grass without first getting rid of any animals or monsters that are there, they will attack you.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"grass",
"player"
],
"strength": "moderate"
},
{
"description": "Placing grass blocks with snow on top of them turns them into snowy grass blocks.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"snow",
"grass"
],
"strength": "strong"
},
{
"description": "Removing the snow layer turns the block into a normal grass block.",
"source_entities": [
"Minecraft"
],
"target_entities": [
"snowy grass block",
"grass"
],
"strength": "strong"
},
{
"description": "Dirt path blocks can be created by using any type of shovel on the side or top of a grass block with air above it.",
"source_entities": [
"Minecraft",
"grass"
],
"target_entities": [
"dirt path",
"shovel"
],
"strength": "strong"
},
{
"description": "The shovel loses 1 durability for each dirt path block created.",
"source_entities": [
"Minecraft",
"shovel"
],
"target_entities": [
"dirt path",
"shovel"
],
"strength": "moderate"
},
{
"description": "Death grass block dies and changes to dirt a random time (when a random tick lands on the block)",
"source_entities": [
"Minecraft"
],
"target_entities": [
"death grass block",
"dirt"
],
"strength": "strong"
},
{
"description": "Grass can be used to create dirt paths",
"source_entities": [
"Minecraft"
],
"target_entities": [
"grass",
"dirt path"
],
"strength": "strong"
}
]
}
```
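All four examples above follow the same output schema: a `section_description` string, a `list_of_entities` array, and a `relations` array whose entries carry a `description`, `source_entities`, `target_entities`, and (in most examples) a `strength` label. As a rough illustration only, not part of the original card, the Python sketch below shows one way such an output could be loaded and sanity-checked; the field names come from the examples, while the helper itself and its filtering rules are assumptions.

```python
import json


def load_relations(raw: str) -> dict:
    """Parse one extracted-relations JSON object and drop inconsistent entries.

    Schema (taken from the examples above):
      section_description: str
      list_of_entities: list of str
      relations: list of {description, source_entities, target_entities, strength?}
    """
    data = json.loads(raw)
    entities = set(data.get("list_of_entities", []))
    cleaned = []
    for rel in data.get("relations", []):
        # Keep only endpoints that were actually declared as entities.
        sources = [e for e in rel.get("source_entities", []) if e in entities]
        targets = [e for e in rel.get("target_entities", []) if e in entities]
        if rel.get("description") and sources and targets:
            cleaned.append(
                {
                    "description": rel["description"],
                    "source_entities": sources,
                    "target_entities": targets,
                    # "strength" is optional; the dopamine example omits it.
                    "strength": rel.get("strength", "unspecified"),
                }
            )
    return {
        "section_description": data.get("section_description", ""),
        "list_of_entities": sorted(entities),
        "relations": cleaned,
    }
```

A consistency check of this kind can be useful because the raw outputs are not always internally coherent: in the crafting examples, for instance, some relations reference entities such as "player" or "shovel" that never appear in `list_of_entities`.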
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.10.0 | [
"CRAFT"
] |
croissantllm/croissant_small_models | croissantllm | text-generation | [
"tensorboard",
"safetensors",
"legal",
"code",
"text-generation-inference",
"art",
"text-generation",
"fr",
"en",
"dataset:cerebras/SlimPajama-627B",
"dataset:uonlp/CulturaX",
"dataset:pg19",
"dataset:bigcode/starcoderdata",
"dataset:croissantllm/croissant_dataset",
"arxiv:2402.00786",
"license:mit",
"region:us"
] | "2024-04-29T12:23:29Z" | 2024-04-29T12:33:10+00:00 | 0 | 2 | ---
datasets:
- cerebras/SlimPajama-627B
- uonlp/CulturaX
- pg19
- bigcode/starcoderdata
- croissantllm/croissant_dataset
language:
- fr
- en
license: mit
pipeline_tag: text-generation
tags:
- legal
- code
- text-generation-inference
- art
---
# CroissantLLM - All smaller checkpoints
These models are part of the CroissantLLM initiative and correspond to the checkpoints taken after 100B tokens for the smaller model sizes.
These are the models used for the scaling-law experiments.
To play with the final model, we recommend using the Chat version: https://huggingface.co/croissantllm/CroissantLLMChat-v0.1.
https://arxiv.org/abs/2402.00786
## Abstract
We introduce CroissantLLM, a 1.3B language model pretrained on a set of 3T English and French tokens, to bring to the research and industrial community a high-performance, fully open-sourced bilingual model that runs swiftly on consumer-grade local hardware.
To that end, we pioneer the approach of training an intrinsically bilingual model with a 1:1 English-to-French pretraining data ratio, a custom tokenizer, and bilingual finetuning datasets. We release the training dataset, notably containing a French split with manually curated, high-quality, and varied data sources.
To assess performance outside of English, we craft a novel benchmark, FrenchBench, consisting of an array of classification and generation tasks, covering various orthogonal aspects of model performance in the French Language. Additionally, rooted in transparency and to foster further Large Language Model research, we release codebases, and dozens of checkpoints across various model sizes, training data distributions, and training steps, as well as fine-tuned Chat models, and strong translation models. We evaluate our model through the FMTI framework, and validate 81% of the transparency criteria, far beyond the scores of even most open initiatives.
This work enriches the NLP landscape, breaking away from previous English-centric work in order to strengthen our understanding of multilinguality in language models.
## Citation
Our work can be cited as:
```bibtex
@misc{faysse2024croissantllm,
title={CroissantLLM: A Truly Bilingual French-English Language Model},
author={Manuel Faysse and Patrick Fernandes and Nuno M. Guerreiro and António Loison and Duarte M. Alves and Caio Corro and Nicolas Boizard and João Alves and Ricardo Rei and Pedro H. Martins and Antoni Bigata Casademunt and François Yvon and André F. T. Martins and Gautier Viaud and Céline Hudelot and Pierre Colombo},
year={2024},
eprint={2402.00786},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## Usage
This model is a base model; that is, it is not finetuned for chat and works best with few-shot prompting strategies.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "croissantllm/CroissantLLMBase"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16, device_map="auto")
inputs = tokenizer("I am so tired I could sleep right now. -> Je suis si fatigué que je pourrais m'endormir maintenant.\nHe is heading to the market. -> Il va au marché.\nWe are running on the beach. ->", return_tensors="pt").to(model.device)
tokens = model.generate(**inputs, max_length=100, do_sample=True, top_p=0.95, top_k=60, temperature=0.3)
print(tokenizer.decode(tokens[0]))
# remove bos token
inputs = tokenizer("Capitales: France -> Paris, Italie -> Rome, Allemagne -> Berlin, Espagne ->", return_tensors="pt", add_special_tokens=True).to(model.device)
tokens = model.generate(**inputs, max_length=100, do_sample=True, top_p=0.95, top_k=60)
print(tokenizer.decode(tokens[0]))
``` | [
"CRAFT"
] |
Jha-Pranav/blm-lab | Jha-Pranav | null | [
"region:us"
] | "2024-04-30T14:06:15Z" | 2024-04-30T16:31:52+00:00 | 0 | 0 | ---
{}
---
# Model Registry: Baby Language Model
## Experiment Details:
I am using this as a model registry for all the experiments I will be performing with the baby language model. To test these models, please refer to my personal Git repository.
## Reference Code:
[OpenTransformer](https://github.com/Jha-Pranav/OpenTransformer)
## Note:
Checkpoint weights currently seem to malfunction with the Hugging Face Transformers library. Please bear with me as I work on debugging and resolving the issue; until then, please use the steps outlined on the GitHub page.
Thank you for your patience and understanding!
| [
"BEAR"
] |
zhan1993/library-phi_2-v3-10-flan-clusters | zhan1993 | null | [
"region:us"
] | "2024-04-30T21:25:48Z" | 2025-01-05T00:23:21+00:00 | 0 | 0 | ---
{}
---
Number of experts present in the library: 10
| Expert Name | Base Model | Trained on | Adapter Type |
| --- | --- | --- | --- |
| phi2_joint_3epoch_sim_cluster_10 | phi-2 | sordonia/flan-10k-flat/dream_read_the_following_conversation_and_answer_the_question,app_reviews_convert_to_star_rating,cos_e_v1_11_question_option_description_text,social_i_qa_Show_choices_and_generate_answer,quartz_answer_question_based_on,sciq_Direct_Question_Closed_Book_,qasc_qa_with_separated_facts_3,quartz_given_the_fact_answer_the_q,quartz_answer_question_below,kilt_tasks_hotpotqa_final_exam,sciq_Multiple_Choice,wiqa_does_the_supposed_perturbation_have_an_effect,cos_e_v1_11_question_description_option_text,wiki_qa_Is_This_True_,quartz_use_info_from_question_paragraph,sciq_Direct_Question,qasc_qa_with_separated_facts_2,wiqa_which_of_the_following_is_the_supposed_perturbation,app_reviews_convert_to_rating,cos_e_v1_11_question_option_description_id,wiqa_effect_with_string_answer,qasc_qa_with_separated_facts_5,dream_baseline,quartz_having_read_above_passage,cos_e_v1_11_question_description_option_id,qasc_qa_with_separated_facts_1,cos_e_v1_11_description_question_option_text,qasc_qa_with_combined_facts_1,qasc_is_correct_1,cos_e_v1_11_description_question_option_id,social_i_qa_Check_if_a_random_answer_is_valid_or_not,sciq_Multiple_Choice_Closed_Book_,quartz_use_info_from_paragraph_question,qasc_is_correct_2,qasc_qa_with_separated_facts_4,quartz_read_passage_below_choose,quartz_paragraph_question_plain_concat,sciq_Multiple_Choice_Question_First | lora |
| phi2_joint_3epoch_sim_cluster_3 | phi-2 | sordonia/flan-10k-flat/wiki_qa_found_on_google,app_reviews_categorize_rating_using_review,race_middle_Is_this_the_right_answer,super_glue_cb_1_0_2,wiki_qa_Topic_Prediction_Answer_Only,wiki_qa_Direct_Answer_to_Question,super_glue_wsc_fixed_1_0_2,cot_gsm8k_ii,unified_qa_science_inst,race_high_Is_this_the_right_answer,cot_strategyqa,cot_ecqa_ii,quarel_do_not_use,wiki_qa_exercise,wiki_qa_automatic_system,cot_creak_ii,quarel_heres_a_story,quarel_choose_between,stream_qed_ii,wiki_qa_Topic_Prediction_Question_Only,glue_qnli_2_0_0,cot_sensemaking_ii,super_glue_copa_1_0_2,social_i_qa_Generate_the_question_from_the_answer,social_i_qa_Show_choices_and_generate_index,quarel_testing_students,wiki_qa_Topic_Prediction_Question_and_Answer_Pair,wiki_qa_Decide_good_answer,wiki_qa_Jeopardy_style,wiki_qa_Generate_Question_from_Topic,definite_pronoun_resolution_1_1_0,wiqa_effect_with_label_answer,glue_wnli_2_0_0,cot_qasc,cot_strategyqa_ii,quarel_logic_test,stream_aqua_ii | lora |
| phi2_joint_3epoch_sim_cluster_9 | phi-2 | sordonia/flan-10k-flat/super_glue_rte_1_0_2,cot_sensemaking,super_glue_wic_1_0_2,cos_e_v1_11_rationale,anli_r3_0_1_0,dream_generate_last_utterance,paws_wiki_1_1_0,cos_e_v1_11_generate_explanation_given_text,cot_creak,stream_aqua,snli_1_1_0,cos_e_v1_11_i_think,glue_qqp_2_0_0,cos_e_v1_11_explain_why_human,anli_r2_0_1_0,anli_r1_0_1_0,glue_stsb_2_0_0,cos_e_v1_11_aligned_with_common_sense,glue_mnli_2_0_0,social_i_qa_I_was_wondering,cosmos_qa_1_0_0,glue_mrpc_2_0_0,social_i_qa_Generate_answer | lora |
| phi2_joint_3epoch_sim_cluster_1 | phi-2 | sordonia/flan-10k-flat/natural_questions_open_1_0_0,web_questions_whats_the_answer,web_questions_question_answer,dbpedia_14_pick_one_category_for_the_following_text,kilt_tasks_hotpotqa_combining_facts,web_questions_short_general_knowledge_q,kilt_tasks_hotpotqa_straighforward_qa,adversarial_qa_dbidaf_generate_question,adversarial_qa_droberta_based_on,web_questions_get_the_answer,kilt_tasks_hotpotqa_complex_question,web_questions_potential_correct_answer,trivia_qa_rc_1_1_0,kilt_tasks_hotpotqa_formulate,adversarial_qa_dbert_based_on,adversarial_qa_dbidaf_based_on,squad_v1_1_3_0_0 | lora |
| phi2_joint_3epoch_sim_cluster_5 | phi-2 | sordonia/flan-10k-flat/race_middle_Read_the_article_and_answer_the_question_no_option_,race_high_Select_the_best_answer,quail_description_context_question_answer_id,quail_context_question_description_text,race_high_Read_the_article_and_answer_the_question_no_option_,race_high_Select_the_best_answer_no_instructions_,quail_context_description_question_answer_id,race_high_Taking_a_test,super_glue_multirc_1_0_2,race_middle_Select_the_best_answer,quail_context_question_description_answer_id,quail_description_context_question_answer_text,quail_context_question_answer_description_text,race_high_Select_the_best_answer_generate_span_,race_middle_Select_the_best_answer_generate_span_,quail_context_question_answer_description_id,quail_context_description_question_answer_text,quail_context_description_question_text,quail_context_question_description_answer_text,quail_description_context_question_text,race_middle_Taking_a_test,quail_no_prompt_id,quail_no_prompt_text,race_middle_Select_the_best_answer_no_instructions_ | lora |
| phi2_joint_3epoch_sim_cluster_8 | phi-2 | sordonia/flan-10k-flat/ropes_background_new_situation_answer,ropes_prompt_bottom_no_hint,ropes_plain_background_situation,ropes_new_situation_background_answer,ropes_given_background_situation,ropes_prompt_bottom_hint_beginning,ropes_prompt_beginning,ropes_read_background_situation,ropes_plain_bottom_hint,ropes_plain_no_background,ropes_prompt_mix,ropes_background_situation_middle | lora |
| phi2_joint_3epoch_sim_cluster_2 | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_question_context_answer,super_glue_record_1_0_2,wiki_hop_original_generate_object,adversarial_qa_droberta_tell_what_it_is,dbpedia_14_given_a_choice_of_categories_,wiki_hop_original_choose_best_object_affirmative_3,quac_1_0_0,wiki_hop_original_choose_best_object_interrogative_1,wiki_hop_original_choose_best_object_affirmative_1,adversarial_qa_dbert_answer_the_following_q,wiki_hop_original_choose_best_object_interrogative_2,adversarial_qa_droberta_question_context_answer,squad_v2_0_3_0_0,wiki_hop_original_generate_subject,wiki_bio_guess_person,adversarial_qa_dbidaf_answer_the_following_q,adversarial_qa_droberta_answer_the_following_q,adversarial_qa_dbert_tell_what_it_is,race_high_Write_a_multi_choice_question_options_given_,wiki_hop_original_choose_best_object_affirmative_2,wiki_hop_original_generate_subject_and_object,drop_2_0_0,adversarial_qa_dbert_question_context_answer,adversarial_qa_dbidaf_tell_what_it_is | lora |
| phi2_joint_3epoch_sim_cluster_7 | phi-2 | sordonia/flan-10k-flat/glue_sst2_2_0_0,adversarial_qa_droberta_generate_question,true_case,stream_qed,huggingface_xsum,cot_esnli,cot_gsm8k,trec_1_0_0,yelp_polarity_reviews_0_2_0,lambada_1_0_0,glue_cola_2_0_0,ag_news_subset_1_0_0,gem_dart_1_1_0,math_dataset_algebra__linear_1d_1_0_0,cnn_dailymail_3_4_0,wiki_hop_original_explain_relation,dbpedia_14_given_list_what_category_does_the_paragraph_belong_to,gem_wiki_lingua_english_en_1_1_0,fix_punct,imdb_reviews_plain_text_1_0_0,race_middle_Write_a_multi_choice_question_for_the_following_article,gigaword_1_2_0,dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to,gem_web_nlg_en_1_1_0,word_segment,race_high_Write_a_multi_choice_question_for_the_following_article,wmt16_translate_de_en_1_0_0,cot_ecqa,aeslc_1_0_0,dream_generate_first_utterance,wmt16_translate_fi_en_1_0_0,dream_answer_to_dialogue,para_crawl_enes,adversarial_qa_dbert_generate_question,race_middle_Write_a_multi_choice_question_options_given_,wmt14_translate_fr_en_1_0_0 | lora |
| phi2_joint_3epoch_sim_cluster_6 | phi-2 | sordonia/flan-10k-flat/quoref_Context_Contains_Answer,duorc_SelfRC_generate_question_by_answer,quoref_Find_Answer,duorc_ParaphraseRC_movie_director,duorc_ParaphraseRC_answer_question,quoref_Found_Context_Online,quoref_Read_And_Extract_,duorc_ParaphraseRC_title_generation,duorc_ParaphraseRC_decide_worth_it,quoref_What_Is_The_Answer,duorc_ParaphraseRC_generate_question,quoref_Guess_Title_For_Context,quoref_Answer_Test,duorc_SelfRC_question_answering,duorc_SelfRC_title_generation,duorc_ParaphraseRC_generate_question_by_answer,duorc_ParaphraseRC_extract_answer,duorc_SelfRC_answer_question,duorc_SelfRC_decide_worth_it,duorc_ParaphraseRC_question_answering,quoref_Answer_Question_Given_Context,duorc_SelfRC_extract_answer,quoref_Guess_Answer,quoref_Answer_Friend_Question,duorc_SelfRC_movie_director,duorc_SelfRC_generate_question,quoref_Given_Context_Answer_Question | lora |
| phi2_joint_3epoch_sim_cluster_4 | phi-2 | sordonia/flan-10k-flat/wiqa_what_might_be_the_first_step_of_the_process,wiqa_what_is_the_final_step_of_the_following_process,wmt16_translate_ro_en_1_0_0,wiqa_what_might_be_the_last_step_of_the_process,wiki_bio_key_content,gem_common_gen_1_1_0,duorc_SelfRC_build_story_around_qa,app_reviews_generate_review,wiki_bio_what_content,wiki_bio_who,gem_e2e_nlg_1_1_0,cot_esnli_ii,wmt16_translate_tr_en_1_0_0,wiqa_what_is_the_missing_first_step,wiki_bio_comprehension,coqa_1_0_0,duorc_ParaphraseRC_build_story_around_qa,multi_news_1_0_0 | lora |
Last updated on: 2024-04-30 21:25:48+00:00
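For reference, a minimal sketch of how one of these experts might be loaded is shown below. This is an illustration only: it assumes that "phi-2" in the table refers to the `microsoft/phi-2` checkpoint and that each expert is exported as a standard PEFT LoRA adapter in a subfolder named after the expert; if the library uses a custom export format, loading would instead go through the corresponding toolkit.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Assumption: "phi-2" in the table above refers to the microsoft/phi-2 checkpoint.
base_id = "microsoft/phi-2"
tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)

# Assumption: each expert is a standard PEFT LoRA adapter stored in a subfolder
# named after the expert (e.g. "phi2_joint_3epoch_sim_cluster_1").
model = PeftModel.from_pretrained(
    base,
    "zhan1993/library-phi_2-v3-10-flan-clusters",
    subfolder="phi2_joint_3epoch_sim_cluster_1",
)

inputs = tokenizer(
    "Question: Which gas do plants absorb from the air?\nAnswer:",
    return_tensors="pt",
).to(base.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```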
| [
"SCIQ"
] |
yessilver/new_model | yessilver | null | [
"mteb",
"model-index",
"region:us"
] | "2024-05-03T02:39:02Z" | 2024-05-03T04:45:45+00:00 | 0 | 0 | ---
tags:
- mteb
model-index:
- name: e5-small-v2
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 76.65671641791046
- type: ap
value: 40.16054083847425
- type: f1
value: 70.73805260085523
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 46.431999999999995
- type: f1
value: 44.4239364840113
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 24.182000000000002
- type: map_at_10
value: 38.53
- type: map_at_100
value: 39.574999999999996
- type: map_at_1000
value: 39.593
- type: map_at_3
value: 33.796
- type: map_at_5
value: 36.406
- type: mrr_at_1
value: 24.964
- type: mrr_at_10
value: 38.829
- type: mrr_at_100
value: 39.867000000000004
- type: mrr_at_1000
value: 39.885999999999996
- type: mrr_at_3
value: 34.092
- type: mrr_at_5
value: 36.713
- type: ndcg_at_1
value: 24.182000000000002
- type: ndcg_at_10
value: 46.865
- type: ndcg_at_100
value: 51.611
- type: ndcg_at_1000
value: 52.137
- type: ndcg_at_3
value: 37.036
- type: ndcg_at_5
value: 41.715999999999994
- type: precision_at_1
value: 24.182000000000002
- type: precision_at_10
value: 7.367999999999999
- type: precision_at_100
value: 0.951
- type: precision_at_1000
value: 0.099
- type: precision_at_3
value: 15.481
- type: precision_at_5
value: 11.55
- type: recall_at_1
value: 24.182000000000002
- type: recall_at_10
value: 73.68400000000001
- type: recall_at_100
value: 95.092
- type: recall_at_1000
value: 99.289
- type: recall_at_3
value: 46.444
- type: recall_at_5
value: 57.752
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 43.243157093430476
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 36.48617956618108
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 57.6915668741631
- type: mrr
value: 70.97832300048366
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 82.25177125617765
- type: cos_sim_spearman
value: 82.19042698150236
- type: euclidean_pearson
value: 81.39677961271671
- type: euclidean_spearman
value: 82.19042698150236
- type: manhattan_pearson
value: 81.83582953195571
- type: manhattan_spearman
value: 82.20127060207557
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 73.73701298701299
- type: f1
value: 72.68295178070956
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 35.55562814544096
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 31.024495399036073
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 31.356
- type: map_at_10
value: 41.583
- type: map_at_100
value: 42.931999999999995
- type: map_at_1000
value: 43.059999999999995
- type: map_at_3
value: 38.572
- type: map_at_5
value: 40.184999999999995
- type: mrr_at_1
value: 39.485
- type: mrr_at_10
value: 48.325
- type: mrr_at_100
value: 49.044
- type: mrr_at_1000
value: 49.095
- type: mrr_at_3
value: 45.97
- type: mrr_at_5
value: 47.38
- type: ndcg_at_1
value: 39.485
- type: ndcg_at_10
value: 47.689
- type: ndcg_at_100
value: 52.611
- type: ndcg_at_1000
value: 54.75600000000001
- type: ndcg_at_3
value: 43.675000000000004
- type: ndcg_at_5
value: 45.305
- type: precision_at_1
value: 39.485
- type: precision_at_10
value: 9.142
- type: precision_at_100
value: 1.4460000000000002
- type: precision_at_1000
value: 0.19
- type: precision_at_3
value: 21.364
- type: precision_at_5
value: 15.021
- type: recall_at_1
value: 31.356
- type: recall_at_10
value: 58.338
- type: recall_at_100
value: 79.23400000000001
- type: recall_at_1000
value: 93.4
- type: recall_at_3
value: 45.224
- type: recall_at_5
value: 50.719
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 25.988
- type: map_at_10
value: 34.314
- type: map_at_100
value: 35.323
- type: map_at_1000
value: 35.453
- type: map_at_3
value: 31.855
- type: map_at_5
value: 33.317
- type: mrr_at_1
value: 32.675
- type: mrr_at_10
value: 40.199
- type: mrr_at_100
value: 40.912
- type: mrr_at_1000
value: 40.964
- type: mrr_at_3
value: 38.132
- type: mrr_at_5
value: 39.421
- type: ndcg_at_1
value: 32.675
- type: ndcg_at_10
value: 39.346
- type: ndcg_at_100
value: 43.578
- type: ndcg_at_1000
value: 45.975
- type: ndcg_at_3
value: 35.75
- type: ndcg_at_5
value: 37.578
- type: precision_at_1
value: 32.675
- type: precision_at_10
value: 7.228999999999999
- type: precision_at_100
value: 1.204
- type: precision_at_1000
value: 0.172
- type: precision_at_3
value: 17.113
- type: precision_at_5
value: 12.166
- type: recall_at_1
value: 25.988
- type: recall_at_10
value: 47.943000000000005
- type: recall_at_100
value: 66.326
- type: recall_at_1000
value: 82.02000000000001
- type: recall_at_3
value: 37.169999999999995
- type: recall_at_5
value: 42.356
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 38.536
- type: map_at_10
value: 49.514
- type: map_at_100
value: 50.55500000000001
- type: map_at_1000
value: 50.615
- type: map_at_3
value: 46.329
- type: map_at_5
value: 48.278
- type: mrr_at_1
value: 43.887
- type: mrr_at_10
value: 52.900999999999996
- type: mrr_at_100
value: 53.63099999999999
- type: mrr_at_1000
value: 53.664
- type: mrr_at_3
value: 50.502
- type: mrr_at_5
value: 52.063
- type: ndcg_at_1
value: 43.887
- type: ndcg_at_10
value: 54.847
- type: ndcg_at_100
value: 59.163
- type: ndcg_at_1000
value: 60.44199999999999
- type: ndcg_at_3
value: 49.6
- type: ndcg_at_5
value: 52.493
- type: precision_at_1
value: 43.887
- type: precision_at_10
value: 8.677
- type: precision_at_100
value: 1.176
- type: precision_at_1000
value: 0.133
- type: precision_at_3
value: 21.797
- type: precision_at_5
value: 15.146999999999998
- type: recall_at_1
value: 38.536
- type: recall_at_10
value: 67.23
- type: recall_at_100
value: 86.095
- type: recall_at_1000
value: 95.26400000000001
- type: recall_at_3
value: 53.388000000000005
- type: recall_at_5
value: 60.4
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 23.488
- type: map_at_10
value: 30.375000000000004
- type: map_at_100
value: 31.343
- type: map_at_1000
value: 31.447999999999997
- type: map_at_3
value: 28.017999999999997
- type: map_at_5
value: 29.415999999999997
- type: mrr_at_1
value: 25.085
- type: mrr_at_10
value: 31.935000000000002
- type: mrr_at_100
value: 32.843
- type: mrr_at_1000
value: 32.929
- type: mrr_at_3
value: 29.548000000000002
- type: mrr_at_5
value: 31.04
- type: ndcg_at_1
value: 25.085
- type: ndcg_at_10
value: 34.48
- type: ndcg_at_100
value: 39.501
- type: ndcg_at_1000
value: 42.141
- type: ndcg_at_3
value: 29.831000000000003
- type: ndcg_at_5
value: 32.312999999999995
- type: precision_at_1
value: 25.085
- type: precision_at_10
value: 5.153
- type: precision_at_100
value: 0.815
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 12.09
- type: precision_at_5
value: 8.701
- type: recall_at_1
value: 23.488
- type: recall_at_10
value: 45.671
- type: recall_at_100
value: 69.062
- type: recall_at_1000
value: 88.82
- type: recall_at_3
value: 33.376
- type: recall_at_5
value: 39.311
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 12.879999999999999
- type: map_at_10
value: 18.873
- type: map_at_100
value: 20.097
- type: map_at_1000
value: 20.222
- type: map_at_3
value: 16.982
- type: map_at_5
value: 17.902
- type: mrr_at_1
value: 15.920000000000002
- type: mrr_at_10
value: 22.71
- type: mrr_at_100
value: 23.818
- type: mrr_at_1000
value: 23.898
- type: mrr_at_3
value: 20.626
- type: mrr_at_5
value: 21.733
- type: ndcg_at_1
value: 15.920000000000002
- type: ndcg_at_10
value: 22.959
- type: ndcg_at_100
value: 29.270000000000003
- type: ndcg_at_1000
value: 32.448
- type: ndcg_at_3
value: 19.356
- type: ndcg_at_5
value: 20.816000000000003
- type: precision_at_1
value: 15.920000000000002
- type: precision_at_10
value: 4.328
- type: precision_at_100
value: 0.8710000000000001
- type: precision_at_1000
value: 0.127
- type: precision_at_3
value: 9.203999999999999
- type: precision_at_5
value: 6.5920000000000005
- type: recall_at_1
value: 12.879999999999999
- type: recall_at_10
value: 31.724999999999998
- type: recall_at_100
value: 60.049
- type: recall_at_1000
value: 83.133
- type: recall_at_3
value: 21.981
- type: recall_at_5
value: 25.668999999999997
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 22.774
- type: map_at_10
value: 31.312
- type: map_at_100
value: 32.487
- type: map_at_1000
value: 32.609
- type: map_at_3
value: 28.589
- type: map_at_5
value: 30.142999999999997
- type: mrr_at_1
value: 28.393
- type: mrr_at_10
value: 36.813
- type: mrr_at_100
value: 37.724999999999994
- type: mrr_at_1000
value: 37.789
- type: mrr_at_3
value: 34.392
- type: mrr_at_5
value: 35.893
- type: ndcg_at_1
value: 28.393
- type: ndcg_at_10
value: 36.835
- type: ndcg_at_100
value: 42.192
- type: ndcg_at_1000
value: 44.812000000000005
- type: ndcg_at_3
value: 32.268
- type: ndcg_at_5
value: 34.515
- type: precision_at_1
value: 28.393
- type: precision_at_10
value: 6.737
- type: precision_at_100
value: 1.114
- type: precision_at_1000
value: 0.154
- type: precision_at_3
value: 15.399
- type: precision_at_5
value: 10.991
- type: recall_at_1
value: 22.774
- type: recall_at_10
value: 48.136
- type: recall_at_100
value: 71.0
- type: recall_at_1000
value: 88.74
- type: recall_at_3
value: 35.098
- type: recall_at_5
value: 41.134
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 23.669
- type: map_at_10
value: 32.554
- type: map_at_100
value: 33.886
- type: map_at_1000
value: 34.004
- type: map_at_3
value: 29.944
- type: map_at_5
value: 31.330999999999996
- type: mrr_at_1
value: 29.110000000000003
- type: mrr_at_10
value: 37.234
- type: mrr_at_100
value: 38.151
- type: mrr_at_1000
value: 38.218999999999994
- type: mrr_at_3
value: 35.046
- type: mrr_at_5
value: 36.056
- type: ndcg_at_1
value: 29.110000000000003
- type: ndcg_at_10
value: 37.743
- type: ndcg_at_100
value: 43.413000000000004
- type: ndcg_at_1000
value: 46.06
- type: ndcg_at_3
value: 33.501999999999995
- type: ndcg_at_5
value: 35.175
- type: precision_at_1
value: 29.110000000000003
- type: precision_at_10
value: 6.872
- type: precision_at_100
value: 1.129
- type: precision_at_1000
value: 0.154
- type: precision_at_3
value: 16.02
- type: precision_at_5
value: 11.21
- type: recall_at_1
value: 23.669
- type: recall_at_10
value: 48.615
- type: recall_at_100
value: 72.708
- type: recall_at_1000
value: 90.96300000000001
- type: recall_at_3
value: 36.373
- type: recall_at_5
value: 41.06
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 21.364
- type: map_at_10
value: 27.208
- type: map_at_100
value: 28.083000000000002
- type: map_at_1000
value: 28.182000000000002
- type: map_at_3
value: 25.448999999999998
- type: map_at_5
value: 26.397
- type: mrr_at_1
value: 24.233
- type: mrr_at_10
value: 29.802
- type: mrr_at_100
value: 30.595
- type: mrr_at_1000
value: 30.660999999999998
- type: mrr_at_3
value: 28.17
- type: mrr_at_5
value: 28.967
- type: ndcg_at_1
value: 24.233
- type: ndcg_at_10
value: 30.774
- type: ndcg_at_100
value: 35.414
- type: ndcg_at_1000
value: 37.962
- type: ndcg_at_3
value: 27.497
- type: ndcg_at_5
value: 28.957
- type: precision_at_1
value: 24.233
- type: precision_at_10
value: 4.755
- type: precision_at_100
value: 0.775
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 11.860999999999999
- type: precision_at_5
value: 8.097999999999999
- type: recall_at_1
value: 21.364
- type: recall_at_10
value: 39.291
- type: recall_at_100
value: 60.907
- type: recall_at_1000
value: 79.786
- type: recall_at_3
value: 30.257
- type: recall_at_5
value: 33.924
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 15.139
- type: map_at_10
value: 21.063000000000002
- type: map_at_100
value: 22.070999999999998
- type: map_at_1000
value: 22.203999999999997
- type: map_at_3
value: 19.204
- type: map_at_5
value: 20.185
- type: mrr_at_1
value: 18.445
- type: mrr_at_10
value: 24.698999999999998
- type: mrr_at_100
value: 25.569999999999997
- type: mrr_at_1000
value: 25.659
- type: mrr_at_3
value: 22.866
- type: mrr_at_5
value: 23.868000000000002
- type: ndcg_at_1
value: 18.445
- type: ndcg_at_10
value: 24.998
- type: ndcg_at_100
value: 29.982999999999997
- type: ndcg_at_1000
value: 33.271
- type: ndcg_at_3
value: 21.692
- type: ndcg_at_5
value: 23.102
- type: precision_at_1
value: 18.445
- type: precision_at_10
value: 4.542
- type: precision_at_100
value: 0.84
- type: precision_at_1000
value: 0.129
- type: precision_at_3
value: 10.381
- type: precision_at_5
value: 7.356999999999999
- type: recall_at_1
value: 15.139
- type: recall_at_10
value: 33.268
- type: recall_at_100
value: 55.87
- type: recall_at_1000
value: 79.841
- type: recall_at_3
value: 23.629
- type: recall_at_5
value: 27.541
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 24.317
- type: map_at_10
value: 31.661
- type: map_at_100
value: 32.844
- type: map_at_1000
value: 32.952
- type: map_at_3
value: 29.118
- type: map_at_5
value: 30.410999999999998
- type: mrr_at_1
value: 28.544999999999998
- type: mrr_at_10
value: 36.059999999999995
- type: mrr_at_100
value: 36.983
- type: mrr_at_1000
value: 37.047999999999995
- type: mrr_at_3
value: 33.738
- type: mrr_at_5
value: 34.871
- type: ndcg_at_1
value: 28.544999999999998
- type: ndcg_at_10
value: 36.546
- type: ndcg_at_100
value: 42.039
- type: ndcg_at_1000
value: 44.61
- type: ndcg_at_3
value: 31.835
- type: ndcg_at_5
value: 33.755
- type: precision_at_1
value: 28.544999999999998
- type: precision_at_10
value: 6.0729999999999995
- type: precision_at_100
value: 0.991
- type: precision_at_1000
value: 0.132
- type: precision_at_3
value: 13.993
- type: precision_at_5
value: 9.795
- type: recall_at_1
value: 24.317
- type: recall_at_10
value: 47.227000000000004
- type: recall_at_100
value: 71.245
- type: recall_at_1000
value: 89.584
- type: recall_at_3
value: 34.292
- type: recall_at_5
value: 39.129000000000005
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 24.169999999999998
- type: map_at_10
value: 32.669
- type: map_at_100
value: 34.195
- type: map_at_1000
value: 34.438
- type: map_at_3
value: 30.264000000000003
- type: map_at_5
value: 31.694
- type: mrr_at_1
value: 29.249000000000002
- type: mrr_at_10
value: 37.230999999999995
- type: mrr_at_100
value: 38.216
- type: mrr_at_1000
value: 38.291
- type: mrr_at_3
value: 35.178
- type: mrr_at_5
value: 36.453
- type: ndcg_at_1
value: 29.249000000000002
- type: ndcg_at_10
value: 37.967
- type: ndcg_at_100
value: 43.514
- type: ndcg_at_1000
value: 46.63
- type: ndcg_at_3
value: 34.437
- type: ndcg_at_5
value: 36.299
- type: precision_at_1
value: 29.249000000000002
- type: precision_at_10
value: 7.055
- type: precision_at_100
value: 1.431
- type: precision_at_1000
value: 0.23800000000000002
- type: precision_at_3
value: 16.469
- type: precision_at_5
value: 11.897
- type: recall_at_1
value: 24.169999999999998
- type: recall_at_10
value: 47.577000000000005
- type: recall_at_100
value: 72.375
- type: recall_at_1000
value: 92.711
- type: recall_at_3
value: 36.551
- type: recall_at_5
value: 41.739
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 18.306
- type: map_at_10
value: 24.882
- type: map_at_100
value: 25.898
- type: map_at_1000
value: 25.991999999999997
- type: map_at_3
value: 22.506999999999998
- type: map_at_5
value: 23.708000000000002
- type: mrr_at_1
value: 20.148
- type: mrr_at_10
value: 27.014
- type: mrr_at_100
value: 27.886
- type: mrr_at_1000
value: 27.955999999999996
- type: mrr_at_3
value: 24.553
- type: mrr_at_5
value: 25.801000000000002
- type: ndcg_at_1
value: 20.148
- type: ndcg_at_10
value: 29.211
- type: ndcg_at_100
value: 34.307
- type: ndcg_at_1000
value: 36.875
- type: ndcg_at_3
value: 24.333
- type: ndcg_at_5
value: 26.455000000000002
- type: precision_at_1
value: 20.148
- type: precision_at_10
value: 4.713
- type: precision_at_100
value: 0.784
- type: precision_at_1000
value: 0.11
- type: precision_at_3
value: 10.290000000000001
- type: precision_at_5
value: 7.394
- type: recall_at_1
value: 18.306
- type: recall_at_10
value: 40.591
- type: recall_at_100
value: 64.18199999999999
- type: recall_at_1000
value: 83.646
- type: recall_at_3
value: 27.528999999999996
- type: recall_at_5
value: 32.619
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 7.872999999999999
- type: map_at_10
value: 13.361999999999998
- type: map_at_100
value: 15.024999999999999
- type: map_at_1000
value: 15.254000000000001
- type: map_at_3
value: 10.895000000000001
- type: map_at_5
value: 12.131
- type: mrr_at_1
value: 16.743
- type: mrr_at_10
value: 26.033
- type: mrr_at_100
value: 27.290999999999997
- type: mrr_at_1000
value: 27.356
- type: mrr_at_3
value: 22.573
- type: mrr_at_5
value: 24.336
- type: ndcg_at_1
value: 16.743
- type: ndcg_at_10
value: 19.675
- type: ndcg_at_100
value: 27.345000000000002
- type: ndcg_at_1000
value: 31.685999999999996
- type: ndcg_at_3
value: 15.036
- type: ndcg_at_5
value: 16.643
- type: precision_at_1
value: 16.743
- type: precision_at_10
value: 6.43
- type: precision_at_100
value: 1.4749999999999999
- type: precision_at_1000
value: 0.22599999999999998
- type: precision_at_3
value: 11.01
- type: precision_at_5
value: 8.924999999999999
- type: recall_at_1
value: 7.872999999999999
- type: recall_at_10
value: 25.026
- type: recall_at_100
value: 52.245
- type: recall_at_1000
value: 76.949
- type: recall_at_3
value: 13.962
- type: recall_at_5
value: 18.085
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 8.586
- type: map_at_10
value: 17.098
- type: map_at_100
value: 23.857
- type: map_at_1000
value: 25.357000000000003
- type: map_at_3
value: 12.574
- type: map_at_5
value: 14.374999999999998
- type: mrr_at_1
value: 59.5
- type: mrr_at_10
value: 68.199
- type: mrr_at_100
value: 68.699
- type: mrr_at_1000
value: 68.71199999999999
- type: mrr_at_3
value: 65.958
- type: mrr_at_5
value: 67.38300000000001
- type: ndcg_at_1
value: 48.625
- type: ndcg_at_10
value: 36.064
- type: ndcg_at_100
value: 41.137
- type: ndcg_at_1000
value: 49.08
- type: ndcg_at_3
value: 39.615
- type: ndcg_at_5
value: 37.080999999999996
- type: precision_at_1
value: 59.5
- type: precision_at_10
value: 28.050000000000004
- type: precision_at_100
value: 9.133
- type: precision_at_1000
value: 1.8960000000000001
- type: precision_at_3
value: 42.75
- type: precision_at_5
value: 35.25
- type: recall_at_1
value: 8.586
- type: recall_at_10
value: 23.148
- type: recall_at_100
value: 48.479
- type: recall_at_1000
value: 73.75500000000001
- type: recall_at_3
value: 13.718
- type: recall_at_5
value: 16.862
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 47.440000000000005
- type: f1
value: 40.19931464357708
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 50.544
- type: map_at_10
value: 63.495000000000005
- type: map_at_100
value: 64.005
- type: map_at_1000
value: 64.023
- type: map_at_3
value: 60.937
- type: map_at_5
value: 62.556
- type: mrr_at_1
value: 54.379999999999995
- type: mrr_at_10
value: 67.266
- type: mrr_at_100
value: 67.647
- type: mrr_at_1000
value: 67.65299999999999
- type: mrr_at_3
value: 64.85600000000001
- type: mrr_at_5
value: 66.402
- type: ndcg_at_1
value: 54.379999999999995
- type: ndcg_at_10
value: 69.977
- type: ndcg_at_100
value: 72.045
- type: ndcg_at_1000
value: 72.404
- type: ndcg_at_3
value: 65.12299999999999
- type: ndcg_at_5
value: 67.843
- type: precision_at_1
value: 54.379999999999995
- type: precision_at_10
value: 9.469
- type: precision_at_100
value: 1.0670000000000002
- type: precision_at_1000
value: 0.11199999999999999
- type: precision_at_3
value: 26.533
- type: precision_at_5
value: 17.441000000000003
- type: recall_at_1
value: 50.544
- type: recall_at_10
value: 86.253
- type: recall_at_100
value: 94.92699999999999
- type: recall_at_1000
value: 97.301
- type: recall_at_3
value: 73.215
- type: recall_at_5
value: 79.81899999999999
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 18.027
- type: map_at_10
value: 28.347
- type: map_at_100
value: 30.123
- type: map_at_1000
value: 30.284
- type: map_at_3
value: 24.862000000000002
- type: map_at_5
value: 26.698
- type: mrr_at_1
value: 34.105000000000004
- type: mrr_at_10
value: 42.747
- type: mrr_at_100
value: 43.672
- type: mrr_at_1000
value: 43.723
- type: mrr_at_3
value: 40.303
- type: mrr_at_5
value: 41.6
- type: ndcg_at_1
value: 34.105000000000004
- type: ndcg_at_10
value: 35.495
- type: ndcg_at_100
value: 42.447
- type: ndcg_at_1000
value: 45.537
- type: ndcg_at_3
value: 31.911
- type: ndcg_at_5
value: 32.995999999999995
- type: precision_at_1
value: 34.105000000000004
- type: precision_at_10
value: 9.738
- type: precision_at_100
value: 1.687
- type: precision_at_1000
value: 0.22399999999999998
- type: precision_at_3
value: 20.988
- type: precision_at_5
value: 15.432000000000002
- type: recall_at_1
value: 18.027
- type: recall_at_10
value: 41.897
- type: recall_at_100
value: 67.949
- type: recall_at_1000
value: 86.735
- type: recall_at_3
value: 29.342000000000002
- type: recall_at_5
value: 34.365
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 35.409
- type: map_at_10
value: 55.894
- type: map_at_100
value: 56.838
- type: map_at_1000
value: 56.901999999999994
- type: map_at_3
value: 52.074
- type: map_at_5
value: 54.429
- type: mrr_at_1
value: 70.817
- type: mrr_at_10
value: 78.532
- type: mrr_at_100
value: 78.755
- type: mrr_at_1000
value: 78.763
- type: mrr_at_3
value: 77.171
- type: mrr_at_5
value: 78.03
- type: ndcg_at_1
value: 70.817
- type: ndcg_at_10
value: 64.995
- type: ndcg_at_100
value: 68.27499999999999
- type: ndcg_at_1000
value: 69.525
- type: ndcg_at_3
value: 59.401
- type: ndcg_at_5
value: 62.471
- type: precision_at_1
value: 70.817
- type: precision_at_10
value: 13.957
- type: precision_at_100
value: 1.651
- type: precision_at_1000
value: 0.182
- type: precision_at_3
value: 38.267
- type: precision_at_5
value: 25.385999999999996
- type: recall_at_1
value: 35.409
- type: recall_at_10
value: 69.784
- type: recall_at_100
value: 82.54599999999999
- type: recall_at_1000
value: 90.824
- type: recall_at_3
value: 57.4
- type: recall_at_5
value: 63.464
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 79.54679999999999
- type: ap
value: 73.47419341239319
- type: f1
value: 79.4507801491805
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: test
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 2.465
- type: map_at_10
value: 15.237
- type: map_at_100
value: 39.974
- type: map_at_1000
value: 47.487
- type: map_at_3
value: 6.798
- type: map_at_5
value: 9.635
- type: mrr_at_1
value: 93.023
- type: mrr_at_10
value: 94.961
- type: mrr_at_100
value: 95.041
- type: mrr_at_1000
value: 95.041
- type: mrr_at_3
value: 94.961
- type: mrr_at_5
value: 94.961
- type: ndcg_at_1
value: 75.194
- type: ndcg_at_10
value: 68.715
- type: ndcg_at_100
value: 64.191
- type: ndcg_at_1000
value: 71.192
- type: ndcg_at_3
value: 73.085
- type: ndcg_at_5
value: 72.817
- type: precision_at_1
value: 93.023
- type: precision_at_10
value: 76.512
- type: precision_at_100
value: 37.698
- type: precision_at_1000
value: 6.851
- type: precision_at_3
value: 88.372
- type: precision_at_5
value: 84.651
- type: recall_at_1
value: 2.465
- type: recall_at_10
value: 16.181
- type: recall_at_100
value: 52.515
- type: recall_at_1000
value: 77.483
- type: recall_at_3
value: 6.922000000000001
- type: recall_at_5
value: 9.945
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 90.48335613315092
- type: f1
value: 90.3575395041569
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 58.10533515731875
- type: f1
value: 41.93379347349137
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 65.60524546065906
- type: f1
value: 62.37255545904355
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 71.049092131809
- type: f1
value: 70.19452987909062
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 31.698383065423773
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 27.763066538701253
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.320838995172895
- type: mrr
value: 31.223609863654694
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 5.127000000000001
- type: map_at_10
value: 11.395
- type: map_at_100
value: 14.252999999999998
- type: map_at_1000
value: 15.601
- type: map_at_3
value: 8.327
- type: map_at_5
value: 9.637
- type: mrr_at_1
value: 42.105
- type: mrr_at_10
value: 50.495000000000005
- type: mrr_at_100
value: 51.175000000000004
- type: mrr_at_1000
value: 51.217999999999996
- type: mrr_at_3
value: 48.452
- type: mrr_at_5
value: 49.830000000000005
- type: ndcg_at_1
value: 40.093
- type: ndcg_at_10
value: 31.806
- type: ndcg_at_100
value: 28.949
- type: ndcg_at_1000
value: 37.655
- type: ndcg_at_3
value: 36.692
- type: ndcg_at_5
value: 34.348
- type: precision_at_1
value: 41.486000000000004
- type: precision_at_10
value: 23.777
- type: precision_at_100
value: 7.457999999999999
- type: precision_at_1000
value: 2.018
- type: precision_at_3
value: 34.572
- type: precision_at_5
value: 29.536
- type: recall_at_1
value: 5.127000000000001
- type: recall_at_10
value: 15.427
- type: recall_at_100
value: 29.206
- type: recall_at_1000
value: 60.716
- type: recall_at_3
value: 9.261999999999999
- type: recall_at_5
value: 11.677999999999999
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 29.275000000000002
- type: map_at_10
value: 44.374
- type: map_at_100
value: 45.405
- type: map_at_1000
value: 45.437
- type: map_at_3
value: 40.028000000000006
- type: map_at_5
value: 42.492999999999995
- type: mrr_at_1
value: 32.966
- type: mrr_at_10
value: 46.905
- type: mrr_at_100
value: 47.699999999999996
- type: mrr_at_1000
value: 47.721000000000004
- type: mrr_at_3
value: 43.308
- type: mrr_at_5
value: 45.458
- type: ndcg_at_1
value: 32.966
- type: ndcg_at_10
value: 52.151
- type: ndcg_at_100
value: 56.565
- type: ndcg_at_1000
value: 57.315000000000005
- type: ndcg_at_3
value: 43.973
- type: ndcg_at_5
value: 48.125
- type: precision_at_1
value: 32.966
- type: precision_at_10
value: 8.72
- type: precision_at_100
value: 1.121
- type: precision_at_1000
value: 0.11900000000000001
- type: precision_at_3
value: 20.085
- type: precision_at_5
value: 14.45
- type: recall_at_1
value: 29.275000000000002
- type: recall_at_10
value: 73.288
- type: recall_at_100
value: 92.56
- type: recall_at_1000
value: 98.139
- type: recall_at_3
value: 52.11
- type: recall_at_5
value: 61.696
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: map_at_1
value: 67.537
- type: map_at_10
value: 80.879
- type: map_at_100
value: 81.577
- type: map_at_1000
value: 81.602
- type: map_at_3
value: 77.981
- type: map_at_5
value: 79.768
- type: mrr_at_1
value: 77.69
- type: mrr_at_10
value: 84.417
- type: mrr_at_100
value: 84.59299999999999
- type: mrr_at_1000
value: 84.596
- type: mrr_at_3
value: 83.26
- type: mrr_at_5
value: 84.023
- type: ndcg_at_1
value: 77.72
- type: ndcg_at_10
value: 85.021
- type: ndcg_at_100
value: 86.66199999999999
- type: ndcg_at_1000
value: 86.87700000000001
- type: ndcg_at_3
value: 81.90899999999999
- type: ndcg_at_5
value: 83.55
- type: precision_at_1
value: 77.72
- type: precision_at_10
value: 12.876999999999999
- type: precision_at_100
value: 1.498
- type: precision_at_1000
value: 0.156
- type: precision_at_3
value: 35.653
- type: precision_at_5
value: 23.476
- type: recall_at_1
value: 67.537
- type: recall_at_10
value: 92.878
- type: recall_at_100
value: 98.786
- type: recall_at_1000
value: 99.892
- type: recall_at_3
value: 83.968
- type: recall_at_5
value: 88.571
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 49.16241148820256
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: v_measure
value: 61.54900278834193
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: map_at_1
value: 4.173
- type: map_at_10
value: 10.120999999999999
- type: map_at_100
value: 11.956
- type: map_at_1000
value: 12.219
- type: map_at_3
value: 7.3580000000000005
- type: map_at_5
value: 8.799
- type: mrr_at_1
value: 20.599999999999998
- type: mrr_at_10
value: 30.326999999999998
- type: mrr_at_100
value: 31.412000000000003
- type: mrr_at_1000
value: 31.480000000000004
- type: mrr_at_3
value: 26.983
- type: mrr_at_5
value: 28.938000000000002
- type: ndcg_at_1
value: 20.599999999999998
- type: ndcg_at_10
value: 17.365
- type: ndcg_at_100
value: 24.623
- type: ndcg_at_1000
value: 29.65
- type: ndcg_at_3
value: 16.509999999999998
- type: ndcg_at_5
value: 14.542
- type: precision_at_1
value: 20.599999999999998
- type: precision_at_10
value: 8.98
- type: precision_at_100
value: 1.939
- type: precision_at_1000
value: 0.315
- type: precision_at_3
value: 15.4
- type: precision_at_5
value: 12.8
- type: recall_at_1
value: 4.173
- type: recall_at_10
value: 18.212999999999997
- type: recall_at_100
value: 39.363
- type: recall_at_1000
value: 63.94499999999999
- type: recall_at_3
value: 9.373
- type: recall_at_5
value: 13.008000000000001
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cos_sim_pearson
value: 83.87431570350371
- type: cos_sim_spearman
value: 79.25074443392982
- type: euclidean_pearson
value: 80.9080554083112
- type: euclidean_spearman
value: 79.2507399109411
- type: manhattan_pearson
value: 80.90956765983888
- type: manhattan_spearman
value: 79.20576643481074
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 83.48662954870734
- type: cos_sim_spearman
value: 73.70799073411621
- type: euclidean_pearson
value: 80.49103960387095
- type: euclidean_spearman
value: 73.7055087532169
- type: manhattan_pearson
value: 80.5783519196888
- type: manhattan_spearman
value: 73.90297846138822
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 80.70595293210951
- type: cos_sim_spearman
value: 82.31727223815786
- type: euclidean_pearson
value: 81.5306062072953
- type: euclidean_spearman
value: 82.31721735735299
- type: manhattan_pearson
value: 81.43418231655517
- type: manhattan_spearman
value: 82.20026619822572
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 81.24706825802423
- type: cos_sim_spearman
value: 80.06920825678749
- type: euclidean_pearson
value: 80.48334698932342
- type: euclidean_spearman
value: 80.06918911208002
- type: manhattan_pearson
value: 80.40681414406772
- type: manhattan_spearman
value: 80.0102866792831
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 86.16929217014857
- type: cos_sim_spearman
value: 87.2100080395613
- type: euclidean_pearson
value: 86.4066737251256
- type: euclidean_spearman
value: 87.20998056215564
- type: manhattan_pearson
value: 86.39080868256596
- type: manhattan_spearman
value: 87.1927937048571
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 80.53662089031329
- type: cos_sim_spearman
value: 82.33056272292711
- type: euclidean_pearson
value: 81.40056519211387
- type: euclidean_spearman
value: 82.33056272292711
- type: manhattan_pearson
value: 81.27845573928735
- type: manhattan_spearman
value: 82.22192854693785
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 87.66415281856406
- type: cos_sim_spearman
value: 87.58094863633612
- type: euclidean_pearson
value: 88.25085288996081
- type: euclidean_spearman
value: 87.58094863633612
- type: manhattan_pearson
value: 88.34016528668018
- type: manhattan_spearman
value: 87.67773968789653
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 65.91354529556227
- type: cos_sim_spearman
value: 66.29904599827411
- type: euclidean_pearson
value: 66.99135025654104
- type: euclidean_spearman
value: 66.29904599827411
- type: manhattan_pearson
value: 67.29167796154489
- type: manhattan_spearman
value: 66.54035688112117
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 83.17371544155577
- type: cos_sim_spearman
value: 84.91600230031912
- type: euclidean_pearson
value: 84.58535536355062
- type: euclidean_spearman
value: 84.91603828194314
- type: manhattan_pearson
value: 84.52786631260929
- type: manhattan_spearman
value: 84.8279451537192
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 79.90931256553237
- type: mrr
value: 94.55430462783404
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 52.233
- type: map_at_10
value: 63.135
- type: map_at_100
value: 63.766999999999996
- type: map_at_1000
value: 63.788999999999994
- type: map_at_3
value: 60.374
- type: map_at_5
value: 62.11600000000001
- type: mrr_at_1
value: 54.333
- type: mrr_at_10
value: 64.208
- type: mrr_at_100
value: 64.687
- type: mrr_at_1000
value: 64.705
- type: mrr_at_3
value: 62.166999999999994
- type: mrr_at_5
value: 63.532999999999994
- type: ndcg_at_1
value: 54.333
- type: ndcg_at_10
value: 67.965
- type: ndcg_at_100
value: 70.709
- type: ndcg_at_1000
value: 71.221
- type: ndcg_at_3
value: 63.376
- type: ndcg_at_5
value: 65.977
- type: precision_at_1
value: 54.333
- type: precision_at_10
value: 9.167
- type: precision_at_100
value: 1.0630000000000002
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 25.0
- type: precision_at_5
value: 16.733
- type: recall_at_1
value: 52.233
- type: recall_at_10
value: 81.289
- type: recall_at_100
value: 93.767
- type: recall_at_1000
value: 97.667
- type: recall_at_3
value: 69.294
- type: recall_at_5
value: 75.64999999999999
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.8069306930693
- type: cos_sim_ap
value: 95.01715408250185
- type: cos_sim_f1
value: 90.27431421446383
- type: cos_sim_precision
value: 90.04975124378109
- type: cos_sim_recall
value: 90.5
- type: dot_accuracy
value: 99.8069306930693
- type: dot_ap
value: 95.01715420720572
- type: dot_f1
value: 90.27431421446383
- type: dot_precision
value: 90.04975124378109
- type: dot_recall
value: 90.5
- type: euclidean_accuracy
value: 99.8069306930693
- type: euclidean_ap
value: 95.01715408250185
- type: euclidean_f1
value: 90.27431421446383
- type: euclidean_precision
value: 90.04975124378109
- type: euclidean_recall
value: 90.5
- type: manhattan_accuracy
value: 99.8108910891089
- type: manhattan_ap
value: 95.08344895081773
- type: manhattan_f1
value: 90.2672718103883
- type: manhattan_precision
value: 91.04781281790437
- type: manhattan_recall
value: 89.5
- type: max_accuracy
value: 99.8108910891089
- type: max_ap
value: 95.08344895081773
- type: max_f1
value: 90.27431421446383
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 56.77496100801627
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 32.03980982336066
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 49.92590367093363
- type: mrr
value: 50.72744249214838
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 29.873523128424296
- type: cos_sim_spearman
value: 29.77696422152863
- type: dot_pearson
value: 29.873538265911392
- type: dot_spearman
value: 29.77696422152863
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: map_at_1
value: 0.16
- type: map_at_10
value: 1.196
- type: map_at_100
value: 6.525
- type: map_at_1000
value: 17.379
- type: map_at_3
value: 0.43299999999999994
- type: map_at_5
value: 0.687
- type: mrr_at_1
value: 64.0
- type: mrr_at_10
value: 76.467
- type: mrr_at_100
value: 76.533
- type: mrr_at_1000
value: 76.533
- type: mrr_at_3
value: 73.667
- type: mrr_at_5
value: 75.467
- type: ndcg_at_1
value: 56.99999999999999
- type: ndcg_at_10
value: 52.614000000000004
- type: ndcg_at_100
value: 41.677
- type: ndcg_at_1000
value: 41.565000000000005
- type: ndcg_at_3
value: 55.765
- type: ndcg_at_5
value: 55.553
- type: precision_at_1
value: 64.0
- type: precision_at_10
value: 56.8
- type: precision_at_100
value: 43.18
- type: precision_at_1000
value: 19.016
- type: precision_at_3
value: 60.0
- type: precision_at_5
value: 60.4
- type: recall_at_1
value: 0.16
- type: recall_at_10
value: 1.4909999999999999
- type: recall_at_100
value: 10.238999999999999
- type: recall_at_1000
value: 40.492
- type: recall_at_3
value: 0.486
- type: recall_at_5
value: 0.8099999999999999
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 1.078
- type: map_at_10
value: 4.777
- type: map_at_100
value: 8.552
- type: map_at_1000
value: 9.831
- type: map_at_3
value: 2.33
- type: map_at_5
value: 3.102
- type: mrr_at_1
value: 14.285999999999998
- type: mrr_at_10
value: 25.688
- type: mrr_at_100
value: 27.211000000000002
- type: mrr_at_1000
value: 27.262999999999998
- type: mrr_at_3
value: 20.408
- type: mrr_at_5
value: 23.265
- type: ndcg_at_1
value: 13.264999999999999
- type: ndcg_at_10
value: 13.225999999999999
- type: ndcg_at_100
value: 23.873
- type: ndcg_at_1000
value: 35.357
- type: ndcg_at_3
value: 11.162999999999998
- type: ndcg_at_5
value: 12.202
- type: precision_at_1
value: 14.285999999999998
- type: precision_at_10
value: 13.469000000000001
- type: precision_at_100
value: 5.592
- type: precision_at_1000
value: 1.278
- type: precision_at_3
value: 12.245000000000001
- type: precision_at_5
value: 13.877999999999998
- type: recall_at_1
value: 1.078
- type: recall_at_10
value: 10.094
- type: recall_at_100
value: 35.723
- type: recall_at_1000
value: 70.161
- type: recall_at_3
value: 3.078
- type: recall_at_5
value: 5.171
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 63.526
- type: ap
value: 11.499475362455422
- type: f1
value: 49.007047166853305
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 61.77136389360498
- type: f1
value: 61.60711673348749
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 40.700597517044926
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 86.59474280264648
- type: cos_sim_ap
value: 75.2354882574253
- type: cos_sim_f1
value: 69.23641703377386
- type: cos_sim_precision
value: 64.55956184390689
- type: cos_sim_recall
value: 74.64379947229551
- type: dot_accuracy
value: 86.59474280264648
- type: dot_ap
value: 75.2355004100119
- type: dot_f1
value: 69.23641703377386
- type: dot_precision
value: 64.55956184390689
- type: dot_recall
value: 74.64379947229551
- type: euclidean_accuracy
value: 86.59474280264648
- type: euclidean_ap
value: 75.23549109559548
- type: euclidean_f1
value: 69.23641703377386
- type: euclidean_precision
value: 64.55956184390689
- type: euclidean_recall
value: 74.64379947229551
- type: manhattan_accuracy
value: 86.46361089586935
- type: manhattan_ap
value: 74.97783476285602
- type: manhattan_f1
value: 69.16030534351145
- type: manhattan_precision
value: 66.78132678132678
- type: manhattan_recall
value: 71.71503957783642
- type: max_accuracy
value: 86.59474280264648
- type: max_ap
value: 75.2355004100119
- type: max_f1
value: 69.23641703377386
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.03830480847596
- type: cos_sim_ap
value: 85.95577773962282
- type: cos_sim_f1
value: 78.27735233907043
- type: cos_sim_precision
value: 77.10231516056758
- type: cos_sim_recall
value: 79.48875885432707
- type: dot_accuracy
value: 89.03830480847596
- type: dot_ap
value: 85.95578535080806
- type: dot_f1
value: 78.27735233907043
- type: dot_precision
value: 77.10231516056758
- type: dot_recall
value: 79.48875885432707
- type: euclidean_accuracy
value: 89.03830480847596
- type: euclidean_ap
value: 85.95573921817162
- type: euclidean_f1
value: 78.27735233907043
- type: euclidean_precision
value: 77.10231516056758
- type: euclidean_recall
value: 79.48875885432707
- type: manhattan_accuracy
value: 88.9024721543059
- type: manhattan_ap
value: 85.89551017445959
- type: manhattan_f1
value: 78.19396487013964
- type: manhattan_precision
value: 76.28148799062683
- type: manhattan_recall
value: 80.20480443486295
- type: max_accuracy
value: 89.03830480847596
- type: max_ap
value: 85.95578535080806
- type: max_f1
value: 78.27735233907043
---
| [
"BIOSSES",
"SCIFACT"
] |
smarttiger/ipcamera | smarttiger | null | [
"transformers",
"code",
"medical",
"lv",
"arxiv:2404.14619",
"license:mit",
"endpoints_compatible",
"region:us"
] | "2024-05-04T04:21:01Z" | 2024-05-04T08:46:07+00:00 | 0 | 0 | ---
language:
- lv
library_name: transformers
license: mit
tags:
- code
- medical
---
# Ipcamera: test hugging face
*Sachin Mehta, Mohammad Hossein Sekhavat, Qingqing Cao, Maxwell Horton, Yanzi Jin, Chenfan Sun, Iman Mirzadeh, Mahyar Najibi, Dmitry Belenko, Peter Zatloukal, Mohammad Rastegari*
We introduce **OpenELM**, a family of **Open** **E**fficient **L**anguage **M**odels. OpenELM uses a layer-wise scaling strategy to efficiently allocate parameters within each layer of the transformer model, leading to enhanced accuracy. We pretrained OpenELM models using the [CoreNet](https://github.com/apple/corenet) library. We release both pretrained and instruction tuned models with 270M, 450M, 1.1B and 3B parameters.
Our pre-training dataset contains RefinedWeb, deduplicated PILE, a subset of RedPajama, and a subset of Dolma v1.6, totaling approximately 1.8 trillion tokens. Please check license agreements and terms of these datasets before using them.
See the list below for the details of each model:
- [OpenELM-270M](https://huggingface.co/apple/OpenELM-270M)
- [OpenELM-450M](https://huggingface.co/apple/OpenELM-450M)
- [OpenELM-1_1B](https://huggingface.co/apple/OpenELM-1_1B)
- [OpenELM-3B](https://huggingface.co/apple/OpenELM-3B)
- [OpenELM-270M-Instruct](https://huggingface.co/apple/OpenELM-270M-Instruct)
- [OpenELM-450M-Instruct](https://huggingface.co/apple/OpenELM-450M-Instruct)
- [OpenELM-1_1B-Instruct](https://huggingface.co/apple/OpenELM-1_1B-Instruct)
- [OpenELM-3B-Instruct](https://huggingface.co/apple/OpenELM-3B-Instruct)
```python
from transformers import AutoModelForCausalLM
openelm_270m = AutoModelForCausalLM.from_pretrained("apple/OpenELM-270M", trust_remote_code=True)
openelm_450m = AutoModelForCausalLM.from_pretrained("apple/OpenELM-450M", trust_remote_code=True)
openelm_1b = AutoModelForCausalLM.from_pretrained("apple/OpenELM-1_1B", trust_remote_code=True)
openelm_3b = AutoModelForCausalLM.from_pretrained("apple/OpenELM-3B", trust_remote_code=True)
openelm_270m_instruct = AutoModelForCausalLM.from_pretrained("apple/OpenELM-270M-Instruct", trust_remote_code=True)
openelm_450m_instruct = AutoModelForCausalLM.from_pretrained("apple/OpenELM-450M-Instruct", trust_remote_code=True)
openelm_1b_instruct = AutoModelForCausalLM.from_pretrained("apple/OpenELM-1_1B-Instruct", trust_remote_code=True)
openelm_3b_instruct = AutoModelForCausalLM.from_pretrained("apple/OpenELM-3B-Instruct", trust_remote_code=True)
```
## Usage
We have provided an example function to generate output from OpenELM models loaded via [HuggingFace Hub](https://huggingface.co/docs/hub/) in `generate_openelm.py`.
You can try the model by running the following command:
```
python generate_openelm.py --model [MODEL_NAME] --hf_access_token [HF_ACCESS_TOKEN] --prompt 'Once upon a time there was' --generate_kwargs repetition_penalty=1.2
```
Please refer to [this link](https://huggingface.co/docs/hub/security-tokens) to obtain your hugging face access token.
Additional arguments to the hugging face generate function can be passed via `generate_kwargs`. As an example, to speedup the inference, you can try [lookup token speculative generation](https://huggingface.co/docs/transformers/generation_strategies) by passing the `prompt_lookup_num_tokens` argument as follows:
```
python generate_openelm.py --model [MODEL_NAME] --hf_access_token [HF_ACCESS_TOKEN] --prompt 'Once upon a time there was' --generate_kwargs repetition_penalty=1.2 prompt_lookup_num_tokens=10
```
Alternatively, try model-wise speculative generation with an [assistive model](https://huggingface.co/blog/assisted-generation) by passing a smaller model through the `assistant_model` argument, for example:
```
python generate_openelm.py --model [MODEL_NAME] --hf_access_token [HF_ACCESS_TOKEN] --prompt 'Once upon a time there was' --generate_kwargs repetition_penalty=1.2 --assistant_model [SMALLER_MODEL_NAME]
```
## Main Results
### Zero-Shot
| **Model Size** | **ARC-c** | **ARC-e** | **BoolQ** | **HellaSwag** | **PIQA** | **SciQ** | **WinoGrande** | **Average** |
|-----------------------------------------------------------------------------|-----------|-----------|-----------|---------------|-----------|-----------|----------------|-------------|
| [OpenELM-270M](https://huggingface.co/apple/OpenELM-270M) | 26.45 | 45.08 | **53.98** | 46.71 | 69.75 | **84.70** | **53.91** | 54.37 |
| [OpenELM-270M-Instruct](https://huggingface.co/apple/OpenELM-270M-Instruct) | **30.55** | **46.68** | 48.56 | **52.07** | **70.78** | 84.40 | 52.72 | **55.11** |
| [OpenELM-450M](https://huggingface.co/apple/OpenELM-450M) | 27.56 | 48.06 | 55.78 | 53.97 | 72.31 | 87.20 | 58.01 | 57.56 |
| [OpenELM-450M-Instruct](https://huggingface.co/apple/OpenELM-450M-Instruct) | **30.38** | **50.00** | **60.37** | **59.34** | **72.63** | **88.00** | **58.96** | **59.95** |
| [OpenELM-1_1B](https://huggingface.co/apple/OpenELM-1_1B) | 32.34 | **55.43** | 63.58 | 64.81 | **75.57** | **90.60** | 61.72 | 63.44 |
| [OpenELM-1_1B-Instruct](https://huggingface.co/apple/OpenELM-1_1B-Instruct) | **37.97** | 52.23 | **70.00** | **71.20** | 75.03 | 89.30 | **62.75** | **65.50** |
| [OpenELM-3B](https://huggingface.co/apple/OpenELM-3B) | 35.58 | 59.89 | 67.40 | 72.44 | 78.24 | **92.70** | 65.51 | 67.39 |
| [OpenELM-3B-Instruct](https://huggingface.co/apple/OpenELM-3B-Instruct) | **39.42** | **61.74** | **68.17** | **76.36** | **79.00** | 92.50 | **66.85** | **69.15** |
### LLM360
| **Model Size** | **ARC-c** | **HellaSwag** | **MMLU** | **TruthfulQA** | **WinoGrande** | **Average** |
|-----------------------------------------------------------------------------|-----------|---------------|-----------|----------------|----------------|-------------|
| [OpenELM-270M](https://huggingface.co/apple/OpenELM-270M) | 27.65 | 47.15 | 25.72 | **39.24** | **53.83** | 38.72 |
| [OpenELM-270M-Instruct](https://huggingface.co/apple/OpenELM-270M-Instruct) | **32.51** | **51.58** | **26.70** | 38.72 | 53.20 | **40.54** |
| [OpenELM-450M](https://huggingface.co/apple/OpenELM-450M) | 30.20 | 53.86 | **26.01** | 40.18 | 57.22 | 41.50 |
| [OpenELM-450M-Instruct](https://huggingface.co/apple/OpenELM-450M-Instruct) | **33.53** | **59.31** | 25.41 | **40.48** | **58.33** | **43.41** |
| [OpenELM-1_1B](https://huggingface.co/apple/OpenELM-1_1B) | 36.69 | 65.71 | **27.05** | 36.98 | 63.22 | 45.93 |
| [OpenELM-1_1B-Instruct](https://huggingface.co/apple/OpenELM-1_1B-Instruct) | **41.55** | **71.83** | 25.65 | **45.95** | **64.72** | **49.94** |
| [OpenELM-3B](https://huggingface.co/apple/OpenELM-3B) | 42.24 | 73.28 | **26.76** | 34.98 | 67.25 | 48.90 |
| [OpenELM-3B-Instruct](https://huggingface.co/apple/OpenELM-3B-Instruct) | **47.70** | **76.87** | 24.80 | **38.76** | **67.96** | **51.22** |
### OpenLLM Leaderboard
| **Model Size** | **ARC-c** | **CrowS-Pairs** | **HellaSwag** | **MMLU** | **PIQA** | **RACE** | **TruthfulQA** | **WinoGrande** | **Average** |
|-----------------------------------------------------------------------------|-----------|-----------------|---------------|-----------|-----------|-----------|----------------|----------------|-------------|
| [OpenELM-270M](https://huggingface.co/apple/OpenELM-270M) | 27.65 | **66.79** | 47.15 | 25.72 | 69.75 | 30.91 | **39.24** | **53.83** | 45.13 |
| [OpenELM-270M-Instruct](https://huggingface.co/apple/OpenELM-270M-Instruct) | **32.51** | 66.01 | **51.58** | **26.70** | **70.78** | 33.78 | 38.72 | 53.20 | **46.66** |
| [OpenELM-450M](https://huggingface.co/apple/OpenELM-450M) | 30.20 | **68.63** | 53.86 | **26.01** | 72.31 | 33.11 | 40.18 | 57.22 | 47.69 |
| [OpenELM-450M-Instruct](https://huggingface.co/apple/OpenELM-450M-Instruct) | **33.53** | 67.44 | **59.31** | 25.41 | **72.63** | **36.84** | **40.48** | **58.33** | **49.25** |
| [OpenELM-1_1B](https://huggingface.co/apple/OpenELM-1_1B) | 36.69 | **71.74** | 65.71 | **27.05** | **75.57** | 36.46 | 36.98 | 63.22 | 51.68 |
| [OpenELM-1_1B-Instruct](https://huggingface.co/apple/OpenELM-1_1B-Instruct) | **41.55** | 71.02 | **71.83** | 25.65 | 75.03 | **39.43** | **45.95** | **64.72** | **54.40** |
| [OpenELM-3B](https://huggingface.co/apple/OpenELM-3B) | 42.24 | **73.29** | 73.28 | **26.76** | 78.24 | **38.76** | 34.98 | 67.25 | 54.35 |
| [OpenELM-3B-Instruct](https://huggingface.co/apple/OpenELM-3B-Instruct) | **47.70** | 72.33 | **76.87** | 24.80 | **79.00** | 38.47 | **38.76** | **67.96** | **55.73** |
See the technical report for more results and comparison.
## Evaluation
### Setup
Install the following dependencies:
```bash
# install public lm-eval-harness
harness_repo="public-lm-eval-harness"
git clone https://github.com/EleutherAI/lm-evaluation-harness ${harness_repo}
cd ${harness_repo}
# use main branch on 03-15-2024, SHA is dc90fec
git checkout dc90fec
pip install -e .
cd ..
# 66d6242 is the main branch on 2024-04-01
pip install datasets@git+https://github.com/huggingface/datasets.git@66d6242
pip install tokenizers>=0.15.2 transformers>=4.38.2 sentencepiece>=0.2.0
```
### Evaluate OpenELM
```bash
# OpenELM-270M
hf_model=apple/OpenELM-270M
# this flag is needed because lm-eval-harness sets add_bos_token to False by default, but OpenELM uses the LLaMA tokenizer, which requires add_bos_token to be True
tokenizer=meta-llama/Llama-2-7b-hf
add_bos_token=True
batch_size=1
mkdir lm_eval_output
shot=0
task=arc_challenge,arc_easy,boolq,hellaswag,piqa,race,winogrande,sciq,truthfulqa_mc2
lm_eval --model hf \
--model_args pretrained=${hf_model},trust_remote_code=True,add_bos_token=${add_bos_token},tokenizer=${tokenizer} \
--tasks ${task} \
--device cuda:0 \
--num_fewshot ${shot} \
--output_path ./lm_eval_output/${hf_model//\//_}_${task//,/_}-${shot}shot \
--batch_size ${batch_size} 2>&1 | tee ./lm_eval_output/eval-${hf_model//\//_}_${task//,/_}-${shot}shot.log
shot=5
task=mmlu,winogrande
lm_eval --model hf \
--model_args pretrained=${hf_model},trust_remote_code=True,add_bos_token=${add_bos_token},tokenizer=${tokenizer} \
--tasks ${task} \
--device cuda:0 \
--num_fewshot ${shot} \
--output_path ./lm_eval_output/${hf_model//\//_}_${task//,/_}-${shot}shot \
--batch_size ${batch_size} 2>&1 | tee ./lm_eval_output/eval-${hf_model//\//_}_${task//,/_}-${shot}shot.log
shot=25
task=arc_challenge,crows_pairs_english
lm_eval --model hf \
--model_args pretrained=${hf_model},trust_remote_code=True,add_bos_token=${add_bos_token},tokenizer=${tokenizer} \
--tasks ${task} \
--device cuda:0 \
--num_fewshot ${shot} \
--output_path ./lm_eval_output/${hf_model//\//_}_${task//,/_}-${shot}shot \
--batch_size ${batch_size} 2>&1 | tee ./lm_eval_output/eval-${hf_model//\//_}_${task//,/_}-${shot}shot.log
shot=10
task=hellaswag
lm_eval --model hf \
--model_args pretrained=${hf_model},trust_remote_code=True,add_bos_token=${add_bos_token},tokenizer=${tokenizer} \
--tasks ${task} \
--device cuda:0 \
--num_fewshot ${shot} \
--output_path ./lm_eval_output/${hf_model//\//_}_${task//,/_}-${shot}shot \
--batch_size ${batch_size} 2>&1 | tee ./lm_eval_output/eval-${hf_model//\//_}_${task//,/_}-${shot}shot.log
```
## Bias, Risks, and Limitations
The release of OpenELM models aims to empower and enrich the open research community by providing access to state-of-the-art language models. Trained on publicly available datasets, these models are made available without any safety guarantees. Consequently, there exists the possibility of these models producing outputs that are inaccurate, harmful, biased, or objectionable in response to user prompts. Thus, it is imperative for users and developers to undertake thorough safety testing and implement appropriate filtering mechanisms tailored to their specific requirements.
## Citation
If you find our work useful, please cite:
```BibTex
@article{mehtaOpenELMEfficientLanguage2024,
title = {{OpenELM}: {An} {Efficient} {Language} {Model} {Family} with {Open} {Training} and {Inference} {Framework}},
shorttitle = {{OpenELM}},
url = {https://arxiv.org/abs/2404.14619v1},
language = {en},
urldate = {2024-04-24},
journal = {arXiv.org},
author = {Mehta, Sachin and Sekhavat, Mohammad Hossein and Cao, Qingqing and Horton, Maxwell and Jin, Yanzi and Sun, Chenfan and Mirzadeh, Iman and Najibi, Mahyar and Belenko, Dmitry and Zatloukal, Peter and Rastegari, Mohammad},
month = apr,
year = {2024},
}
@inproceedings{mehta2022cvnets,
author = {Mehta, Sachin and Abdolhosseini, Farzad and Rastegari, Mohammad},
title = {CVNets: High Performance Library for Computer Vision},
year = {2022},
booktitle = {Proceedings of the 30th ACM International Conference on Multimedia},
series = {MM '22}
}
``` | [
"SCIQ"
] |
VaraVroom/MindQsMed | VaraVroom | null | [
"region:us"
] | "2024-05-04T09:36:26Z" | 2024-05-04T09:37:00+00:00 | 0 | 0 | ---
{}
---
# MedQA - Assistant
***Answer medical queries through a simple LLM chatbot rather than searching the web and piecing it together on your own***
[Access the model on Hugging Face Spaces](https://huggingface.co/spaces/xpsychted/MedQA-Assistant) - `Response time is 2 mins on average`
## How to set up the environment
#### Step one: Install the required dependencies
```
pip install transformers accelerate
pip install -qU pinecone-client[grpc] sentence-transformers
pip install gradio torch pandas openpyxl tqdm
```
OR
```
pip install -r requirements.txt
```
#### Step two:
- Set the secret/environment variable `PINECONE_API_KEY` to your Pinecone API key, and `PINECONE_ENV` to the 'Environment' value shown in your Pinecone console
- In `pinecone_integration.py`, these values are read as follows (adjust the defaults if needed):
```
PINECONE_API_KEY = os.environ.get('PINECONE_API_KEY', None)
PINECONE_ENV = os.environ.get('PINECONE_ENV', "us-west4-gcp")
```
<br>
## NOTE - How to use memory sharding with Kaggle notebooks:
#### 1. In Kaggle, select Accelerator = 'None', then turn on the notebook
- #### Run `shard_model.py` or in `app.py`, run the below code
```
PI = PineconeIndex()
PI.build_index()
qamodel = QAModel()
model, tokenizer = qamodel.load_sharded_model()
```
`If RAM capacity is not enough to fit the model, use another platform with larger capacity to shard the model and then download and paste the folder in the root directory`
#### 2. Shutdown the notebook
#### 3. Then, select Accelerator = 'GPU P100', then turn on the notebook
- #### Run `app.py`
<br>
## Problems we faced:
- ### Reliable medical database with proper information
- Created a custom-curated medical database that had information about common to rare diseases
- ### Resource constraint
- Used Kaggle notebook with GPU, TPU, and higher-capacity CPU support
- ### Memory constraint
- Smaller LLMs do not respond well to queries and prompts, and larger LLMs are difficult to fit in memory, even on Kaggle's P100 GPU with 13GB VRAM
- With the help of the Hugging Face Accelerate library, we used memory sharding, which let the CPU load pieces of the model only as they were required (a sketch appears after this list)
- This removed the barrier to using a larger LLM, and we could use Google's flan-t5-xl as the base model.
- ### Context awareness
- Pinecone has a great library to use for vector embedding-related tasks
- We used Pinecone to store the medical database and retrieve the relevant documents according to the query input by the user
- ### Access constraint
- To showcase the model, we initially used FastAPI combined with ngrok to create a public access API for the website and used Gradio on the user side to access the chatbot
- We then used only the Gradio interface with `.launch(share=True)` embedded in the server-side program itself to create a public interface which removed the usage of FastAPI and ngrok
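The sketch below illustrates the memory-sharding idea mentioned in the list above. It assumes `google/flan-t5-xl` as the base model and uses a hypothetical shard directory name; it is an illustration of the technique, not the project's actual `shard_model.py`.
```
# Illustrative sketch only -- not the project's shard_model.py.
# Step 1 runs in a CPU-only session; Step 2 runs in the GPU session.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "google/flan-t5-xl"        # assumed base model
shard_dir = "sharded-flan-t5-xl"      # hypothetical output folder

# Step 1: re-save the checkpoint as ~2GB shards so it can be loaded piecewise.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
model.save_pretrained(shard_dir, max_shard_size="2GB")
tokenizer.save_pretrained(shard_dir)

# Step 2: reload the shards; device_map="auto" lets Accelerate place layers on
# the GPU and offload the rest to CPU as needed.
model = AutoModelForSeq2SeqLM.from_pretrained(shard_dir, device_map="auto")

inputs = tokenizer("What are common symptoms of anemia?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```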
| [
"MEDQA"
] |
nes470/repo_id | nes470 | null | [
"safetensors",
"region:us"
] | "2024-05-05T23:26:53Z" | 2024-05-05T23:27:06+00:00 | 0 | 0 | ---
{}
---
The evaluation of this project is to answer trivia questions. You do
not need to do well at this task, but you should submit a system that
completes the task or create adversarial questions in that setting. This will help the whole class share data and
resources.
If you focus on something other than predicting answers, *that's fine*!
About the Data
==============
Quiz bowl is an academic competition between schools in
English-speaking countries; hundreds of teams compete in dozens of
tournaments each year. Quiz bowl is different from Jeopardy, a recent
application area. While Jeopardy also uses signaling devices, these
are only usable after a question is completed (interrupting Jeopardy's
questions would make for bad television). Thus, Jeopardy is rapacious
classification followed by a race---among those who know the
answer---to punch a button first.
Here's an example of a quiz bowl question:
Expanding on a 1908 paper by Smoluchowski, he derived a formula for
the intensity of scattered light in media with fluctuating densities that
reduces to Rayleigh's law for ideal gases in The Theory of the
Opalescence of Homogenous Fluids and Liquid Mixtures near the Critical
State. That research supported his theories of matter first developed
when he calculated the diffusion constant in terms of fundamental
parameters of the particles of a gas undergoing Brownian Motion. In
that same year, 1905, he also published On a Heuristic Point of View
Concerning the Production and Transformation of Light. That
explication of the photoelectric effect won him the 1921 Nobel Prize in Physics.
For ten points, name this German physicist best known for his theory
of Relativity.
*ANSWER*: Albert _Einstein_
Two teams listen to the same question. Teams interrupt the question at
any point by "buzzing in"; if the answer is correct, the team gets
points and the next question is read. Otherwise, the team loses
points and the other team can answer.
You are welcome to use any *automatic* method to choose an answer. It
need not be similar nor build on our provided systems. In addition to
the data we provide, you are welcome to use any external data *except*
our test quiz bowl questions (i.e., don't hack our server!). You are
welcome (an encouraged) to use any publicly available software, but
you may want to check on Piazza for suggestions as many tools are
better (or easier to use) than others.
If you don't like the interruptibility of questions, you can also just answer entire questions. However, you must also output a confidence.
Competition
==================
We will use the Dynabench website (https://dynabench.org/tasks/qa). If you remember the past workshop about Dynabench submission, this is the way to do it. The specific task name is "Grounded QA". Here, with the help of the video tutorial, you submit your QA model and assess how it did compared to others. The assessment will take place by testing your QA model on several QA test datasets, and the results for you and your competitors will be visible on the leaderboard. Your goal is to rank the highest in terms of expected wins: you buzz in with probability proportional to your confidence, and if you're more right than the competition, you win.
Writing Questions
==================
Alternatively, you can also *write* 50 adversarial questions that
challenge modern NLP systems. These questions must be diverse in the
subjects asked about, the skills computers need to answer the
questions, and the entities in those questions. Remember that your questions should be *factual* and
*specific* enough for humans to answer, because your task is to stump
the computers relative to humans!
In addition to the raw questions, you will also need to create citations describing:
* Why the question is difficult for computers: include citations from the NLP/AI/ML literature
* Why the information in the question is correct: include citations from the sources you drew on to write the question
* Why the question is interesting: include scholarly / popular culture artifacts to prove that people care about this
* Why the question is pyramidal: discuss why your first clues are harder than your later clues
**Category**
We want questions from many domains such as Art, Literature, Geography, History,
Science, TV and Film, Music, Lifestyle, and Sport. The questions
should be written using all topics above (5 questions for each
category and 5 more for the remaining categories). Indicate in your
writeup which category you chose to write on for each question.
Art:
* Questions about works: Mona Lisa, Raft of the Medussa
* Questions about forms: color, contour, texture
* Questions about artists: Picasso, Monet, Leonardo da Vinci
* Questions about context: Renaissance, post-modernism, expressionism, surrealism
Literature:
* Questions about works: novels (1984), plays (The Lion and the Jewel), poems (Rubaiyat), criticism (Poetics)
* Questions about major characters or events in literature: The Death of Anna Karenina, Noboru Wataya, the Marriage of Hippolyta and Theseus
* Questions about literary movements (Sturm und Drang)
* Questions about translations
* Cross-cutting questions (appearances of Overcoats in novels)
* Common link questions (the literary output of a country/region)
Geography:
* Questions about location: names of capital, state, river
* Questions about the place: temperature, wind flow, humidity
History:
* When: When did the First World war start?
* Who: Who is called Napoleon of Iran?
* Where: Where was the first Summer Olympics held?
* Which: Which is the oldest civilization in the world?
Science:
* Questions about terminology: The concept of gravity was discovered by which famous physicist?
* Questions about the experiment
* Questions about theory: The social action theory believes that individuals are influenced by this theory.
TV and Film:
* Quotes: What are the dying words of Charles Foster Kane in Citizen Kane?
* Title: What 1927 musical was the first "talkie"?
* Plot: In The Matrix, does Neo take the blue pill or the red pill?
Music:
* Singer: What singer has had a Billboard No. 1 hit in each of the last four decades?
* Band: Before Bleachers and fun., Jack Antonoff fronted what band?
* Title: What was Madonna's first top 10 hit?
* History: Which classical composer was deaf?
Lifestyle:
* Clothes: What clothing company, founded by a tennis player, has an alligator logo?
* Decoration: What was the first perfume sold by Coco Chanel?
Sport:
* Known facts: What sport is best known as the ‘king of sports’?
* Nationality: What’s the national sport of Canada?
* Sport player: The classic 1980 movie called Raging Bull is about which real-life boxer?
* Country: What country has competed the most times in the Summer Olympics yet hasn’t won any kind of medal?
**Diversity**
Other than category diversity, if you find an ingenious way of writing questions about underrepresented countries, you will get bonus points (indicate in your writeup which questions include the diversity component). You may decide which countries count as underrepresented using your own reasonable criterion (e.g., a smaller population may indicate underrepresentation), but make sure to articulate this in your writeup.
* Run state of the art QA systems on the questions to show they struggle, give individual results for each question and a summary over all questions
For an example of what the writeup for a single question should look like, see the adversarial HW:
https://github.com/Pinafore/nlp-hw/blob/master/adversarial/question.tex
Proposal
==================
The project proposal is a one page PDF document that describes:
* Who is on your team (team sizes can be between three and six
students, but six is really too big to be effective; my suggestion
is that most groups should be between four or five).
* What techniques you will explore
* Your timeline for completing the project (be realistic; you should
have your first submission in a week or two)
Submit the proposal on Gradescope, but make sure to include all group
members. If all group members are not included, you will lose points. Late days cannot be used on this
assignment.
Milestone 1
======================
You'll have to update how things are going: what's
working, what isn't, and how does it change your timeline? How does it change your division of labor?
*Question Writing*: You'll need to have answers selected for all of
your questions and first drafts of at least 15 questions. This must
be submitted as a JSON file so that we can run computer QA systems on it.
*Project*: You'll need to have made a submission to the leaderboard with something that satisfies the API.
Submit a PDF updating on your progress to Gradescope. If all team
members are not on the submission, you will lose points.
Milestone 2
===================
As before, provide an updated timeline / division of labor, provide your intermediary results.
*Question Writing*: You'll need to have incorporated the feedback on your first questions and completed a first draft of at least 30 questions. You'll also need machine results for your questions and an overall evaluation of your human/computer accuracy.
*Project*: You'll need to have made a submission to the leaderboard with a working system (e.g., not just obeying the API, but actually getting reasonable answers).
Submit a PDF updating on your progress.
Final Presentation
======================
The final presentation will be virtual (uploading a video). In
the final presentation you will:
* Explain what you did
* Who did what. For example, for the question writing project a team of five people might write: A wrote the first draft of questions. B and C verified they were initially answerable by a human. B ran computer systems to verify they were challenging to a computer. C edited the questions and increased the computer difficulty. D and E verified that the edited questions were still answerable by a human. D and E checked all of the questions for factual accuracy and created citations and the writeup.
* What challenges you had
* Review how well you did (based on the competition or your own metrics). If you do not use the course infrastructure to evaluate your project's work, you should talk about what alternative evaluations you used, why they're appropriate/fair, and how well you did on them.
* Provide an error analysis. An error analysis must contain examples from the
development set that you get wrong. You should show those sentences
and explain why (in terms of features or the model) they have the
wrong answer. You should have been doing this all along as you
derive new features, but this is your final inspection of
your errors. The feature or model problems you discover should not
be trivial features you could add easily. Instead, these should be
features or models that are difficult to correct. An error analysis
is not the same thing as simply presenting the error matrix, as it
does not inspect any individual examples. If you're writing questions, talk about examples of questions that didn't work out as intended.
* The linguistic motivation for your features / how you wrote the questions. This is a
computational linguistics class, so you should give precedence to
features / techniques that we use in this class (e.g., syntax,
morphology, part of speech, word sense, etc.). Given two features
that work equally well and one that is linguistically motivated,
we'll prefer the linguistically motivated one.
* Presumably you did many different things; how did they each
individually contribute to your final result?
Each group has 10 minutes to deliver their presentation. Please record the video, and upload it to Google Drive, and include the link in your writeup submission.
Final Question Submission
======================
Because we need to get the questions ready for the systems, upload your raw questions on May 10. This doesn't include the citations or other parts of the writeup.
System Submission
======================
You must submit a version of your system by May 12. It may not be perfect, but this what the question writing teams will use to test their results.
Your system should be sent directly to the professor and TAs as zip files, including the correct dependencies and working inference code. Your inference code should run successfully from the root directory (extracted from the zip folder) with the command:
```
> python3 inference.py --data=evaluation_set.json
```
The input will be in the form of a .json file () in the same format as the file the adversarial question writing team submits. The output should also be a string.
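For concreteness, a minimal skeleton of such an `inference.py` is sketched below. The `text` field name and the placeholder answer are assumptions made for illustration; adapt them to the actual schema of the question-writing teams' JSON file and to your own QA system.
```
# inference.py -- illustrative skeleton only; adapt field names and the stub below.
import argparse
import json


def answer_question(question_text: str) -> str:
    # Replace this stub with your real QA pipeline (retriever, reader, buzzer, ...).
    return "placeholder answer"


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--data", required=True, help="Path to evaluation_set.json")
    args = parser.parse_args()

    with open(args.data, "r") as f:
        questions = json.load(f)

    for q in questions:
        text = q["text"] if isinstance(q, dict) else str(q)  # "text" is an assumed field name
        print(answer_question(text))


if __name__ == "__main__":
    main()
```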
If you have any notes or comments that we should be aware of while running your code, please include them in the folder as a .txt file. Also, dependency information should be included as a .txt file.
Please prepend your email title with [2024-CMSC 470 System Submission].
Project Writeup and JSON file
======================
By May 17, submit your project writeup explaining what
you did and what results you achieved. This document should
make it clear:
* Why this is a good idea
* What you did
* Who did what
* Whether your technique worked or not
For systems, please do not go over 2500 words unless you have a really good reason.
Images are a much better use of space than words, usually (there's no
limit on including images, but use judgement and be selective).
For question writing, you have one page (single spaced, two column) per question plus a two page summary of results. Talk about how you organized the question writing, how you evaluated the questions, and a summary of the results. Along with your writeup, turn in a JSON file including the raw text of each question, its answer, and its category. The json file is included in this directory. Make sure your json file is in the correct format and can be loaded via the code below. Your submission will not be graded if it does not follow the format of the example json file.
```
with open('path to your json file', 'r') as f:
data = json.load(f)
```
Grade
======================
The grade will be out of 25 points, broken into five areas:
* _Presentation_: For your oral presentation, do you highlight what
you did and make people care? Did you use time well during the
presentation?
* _Writeup_: Does the writeup explain what you did in a way that is
clear and effective?
The final three areas are different between the system and the questions.
| | System | Questions |
|----------|:-------------:|------:|
| _Technical Soundness_ | Did you use the right tools for the job, and did you use them correctly? Were they relevant to this class? | Were your questions correct and accurately cited? |
| _Effort_ | Did you do what you said you would, and was it the right amount of effort? | Are the questions well-written, interesting, and thoroughly edited? |
| _Performance_ | How did your techniques perform in terms of accuracy, recall, etc.? | Is the human accuracy substantially higher than the computer accuracy? |
All members of the group will receive the same grade. It's impossible for the course staff to adjudicate Rashomon-style accounts of who did what, and the goal of a group project is for all team members to work together to create a cohesive project that works well together. While it makes sense to divide the work into distinct areas of responsibility, at grading time we have no way to know who really did what, so it's the group's responsibility to create a piece of output that reflects well on the whole group.
| [
"MEDAL"
] |
pclucas14/library-mixtral-8x7b-5ep-raw | pclucas14 | null | [
"region:us"
] | "2024-05-06T19:55:28Z" | 2024-05-14T12:05:37+00:00 | 0 | 0 | ---
{}
---
Number of experts present in the library: 256
| Expert Name | Base Model | Trained on | Adapter Type |
| --- | --- | --- | --- |
| dream_baseline | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/dream_baseline | lora |
| app_reviews_convert_to_star_rating | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/app_reviews_convert_to_star_rating | lora |
| race_high_Select_the_best_answer_no_instructions_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_high_Select_the_best_answer_no_instructions_ | lora |
| wmt16_translate_tr_en_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wmt16_translate_tr_en_1_0_0 | lora |
| wiqa_what_might_be_the_last_step_of_the_process | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiqa_what_might_be_the_last_step_of_the_process | lora |
| race_high_Write_a_multi_choice_question_options_given_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_options_given_ | lora |
| wiki_qa_automatic_system | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_qa_automatic_system | lora |
| dbpedia_14_pick_one_category_for_the_following_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/dbpedia_14_pick_one_category_for_the_following_text | lora |
| adversarial_qa_droberta_answer_the_following_q | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_droberta_answer_the_following_q | lora |
| qasc_qa_with_separated_facts_2 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_2 | lora |
| duorc_SelfRC_title_generation | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_SelfRC_title_generation | lora |
| cot_gsm8k_ii | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_gsm8k_ii | lora |
| trivia_qa_rc_1_1_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/trivia_qa_rc_1_1_0 | lora |
| trec_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/trec_1_0_0 | lora |
| wiqa_what_is_the_missing_first_step | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiqa_what_is_the_missing_first_step | lora |
| ropes_plain_bottom_hint | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ropes_plain_bottom_hint | lora |
| social_i_qa_Show_choices_and_generate_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_answer | lora |
| wiki_hop_original_generate_subject_and_object | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_hop_original_generate_subject_and_object | lora |
| quoref_Read_And_Extract_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quoref_Read_And_Extract_ | lora |
| dream_read_the_following_conversation_and_answer_the_question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/dream_read_the_following_conversation_and_answer_the_question | lora |
| qasc_qa_with_combined_facts_1 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/qasc_qa_with_combined_facts_1 | lora |
| coqa_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/coqa_1_0_0 | lora |
| duorc_SelfRC_generate_question_by_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_SelfRC_generate_question_by_answer | lora |
| wiki_hop_original_choose_best_object_affirmative_3 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_3 | lora |
| dream_generate_last_utterance | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/dream_generate_last_utterance | lora |
| wiqa_what_might_be_the_first_step_of_the_process | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiqa_what_might_be_the_first_step_of_the_process | lora |
| math_dataset_algebra__linear_1d_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/math_dataset_algebra__linear_1d_1_0_0 | lora |
| fix_punct | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/fix_punct | lora |
| quoref_Find_Answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quoref_Find_Answer | lora |
| quac_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quac_1_0_0 | lora |
| super_glue_record_1_0_2 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/super_glue_record_1_0_2 | lora |
| wiki_qa_found_on_google | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_qa_found_on_google | lora |
| ropes_plain_no_background | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ropes_plain_no_background | lora |
| cos_e_v1_11_rationale | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_rationale | lora |
| quail_context_description_question_answer_id | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_context_description_question_answer_id | lora |
| quail_context_question_answer_description_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_context_question_answer_description_text | lora |
| adversarial_qa_dbert_question_context_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_dbert_question_context_answer | lora |
| gigaword_1_2_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/gigaword_1_2_0 | lora |
| super_glue_cb_1_0_2 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/super_glue_cb_1_0_2 | lora |
| cos_e_v1_11_explain_why_human | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_explain_why_human | lora |
| adversarial_qa_dbert_generate_question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_dbert_generate_question | lora |
| quail_context_question_description_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_context_question_description_text | lora |
| sciq_Multiple_Choice_Closed_Book_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/sciq_Multiple_Choice_Closed_Book_ | lora |
| quartz_answer_question_below | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quartz_answer_question_below | lora |
| wiki_qa_exercise | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_qa_exercise | lora |
| squad_v1_1_3_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/squad_v1_1_3_0_0 | lora |
| app_reviews_categorize_rating_using_review | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/app_reviews_categorize_rating_using_review | lora |
| wiki_bio_who | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_bio_who | lora |
| social_i_qa_Show_choices_and_generate_index | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_index | lora |
| duorc_ParaphraseRC_extract_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_ParaphraseRC_extract_answer | lora |
| sciq_Multiple_Choice | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/sciq_Multiple_Choice | lora |
| dbpedia_14_given_a_choice_of_categories_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/dbpedia_14_given_a_choice_of_categories_ | lora |
| huggingface_xsum | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/huggingface_xsum | lora |
| quoref_Answer_Question_Given_Context | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quoref_Answer_Question_Given_Context | lora |
| race_high_Is_this_the_right_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_high_Is_this_the_right_answer | lora |
| kilt_tasks_hotpotqa_straighforward_qa | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_straighforward_qa | lora |
| quartz_given_the_fact_answer_the_q | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quartz_given_the_fact_answer_the_q | lora |
| social_i_qa_Generate_the_question_from_the_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/social_i_qa_Generate_the_question_from_the_answer | lora |
| qasc_qa_with_separated_facts_4 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_4 | lora |
| duorc_ParaphraseRC_movie_director | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_ParaphraseRC_movie_director | lora |
| stream_qed | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/stream_qed | lora |
| qasc_is_correct_2 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/qasc_is_correct_2 | lora |
| cot_strategyqa_ii | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_strategyqa_ii | lora |
| cos_e_v1_11_question_description_option_id | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_id | lora |
| adversarial_qa_dbidaf_question_context_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_question_context_answer | lora |
| super_glue_wsc_fixed_1_0_2 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/super_glue_wsc_fixed_1_0_2 | lora |
| social_i_qa_Check_if_a_random_answer_is_valid_or_not | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/social_i_qa_Check_if_a_random_answer_is_valid_or_not | lora |
| yelp_polarity_reviews_0_2_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/yelp_polarity_reviews_0_2_0 | lora |
| duorc_ParaphraseRC_generate_question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question | lora |
| cos_e_v1_11_question_option_description_id | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_id | lora |
| glue_mnli_2_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/glue_mnli_2_0_0 | lora |
| quoref_What_Is_The_Answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quoref_What_Is_The_Answer | lora |
| quartz_use_info_from_paragraph_question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quartz_use_info_from_paragraph_question | lora |
| adversarial_qa_dbidaf_generate_question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_generate_question | lora |
| quarel_testing_students | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quarel_testing_students | lora |
| race_middle_Taking_a_test | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_middle_Taking_a_test | lora |
| quoref_Context_Contains_Answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quoref_Context_Contains_Answer | lora |
| natural_questions_open_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/natural_questions_open_1_0_0 | lora |
| duorc_SelfRC_question_answering | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_SelfRC_question_answering | lora |
| kilt_tasks_hotpotqa_formulate | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_formulate | lora |
| paws_wiki_1_1_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/paws_wiki_1_1_0 | lora |
| ropes_background_new_situation_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ropes_background_new_situation_answer | lora |
| wiqa_effect_with_string_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiqa_effect_with_string_answer | lora |
| duorc_ParaphraseRC_decide_worth_it | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_ParaphraseRC_decide_worth_it | lora |
| quarel_logic_test | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quarel_logic_test | lora |
| squad_v2_0_3_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/squad_v2_0_3_0_0 | lora |
| quail_description_context_question_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_description_context_question_text | lora |
| race_high_Select_the_best_answer_generate_span_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_high_Select_the_best_answer_generate_span_ | lora |
| cot_sensemaking | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_sensemaking | lora |
| wiki_qa_Generate_Question_from_Topic | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_qa_Generate_Question_from_Topic | lora |
| quoref_Given_Context_Answer_Question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quoref_Given_Context_Answer_Question | lora |
| stream_qed_ii | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/stream_qed_ii | lora |
| unified_qa_science_inst | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/unified_qa_science_inst | lora |
| quoref_Guess_Title_For_Context | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quoref_Guess_Title_For_Context | lora |
| quail_context_description_question_answer_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_context_description_question_answer_text | lora |
| wiki_hop_original_choose_best_object_interrogative_2 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_2 | lora |
| adversarial_qa_dbidaf_tell_what_it_is | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_tell_what_it_is | lora |
| duorc_SelfRC_extract_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_SelfRC_extract_answer | lora |
| dream_answer_to_dialogue | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/dream_answer_to_dialogue | lora |
| web_questions_potential_correct_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/web_questions_potential_correct_answer | lora |
| kilt_tasks_hotpotqa_final_exam | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_final_exam | lora |
| cot_qasc | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_qasc | lora |
| web_questions_get_the_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/web_questions_get_the_answer | lora |
| quail_description_context_question_answer_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_description_context_question_answer_text | lora |
| sciq_Direct_Question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/sciq_Direct_Question | lora |
| app_reviews_convert_to_rating | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/app_reviews_convert_to_rating | lora |
| wmt16_translate_fi_en_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wmt16_translate_fi_en_1_0_0 | lora |
| glue_qqp_2_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/glue_qqp_2_0_0 | lora |
| qasc_qa_with_separated_facts_3 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_3 | lora |
| adversarial_qa_dbidaf_based_on | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_based_on | lora |
| quarel_heres_a_story | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quarel_heres_a_story | lora |
| cosmos_qa_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cosmos_qa_1_0_0 | lora |
| wiki_bio_what_content | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_bio_what_content | lora |
| super_glue_wic_1_0_2 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/super_glue_wic_1_0_2 | lora |
| adversarial_qa_droberta_tell_what_it_is | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_droberta_tell_what_it_is | lora |
| quoref_Found_Context_Online | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quoref_Found_Context_Online | lora |
| qasc_qa_with_separated_facts_1 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_1 | lora |
| stream_aqua | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/stream_aqua | lora |
| wiki_qa_Is_This_True_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_qa_Is_This_True_ | lora |
| duorc_SelfRC_decide_worth_it | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_SelfRC_decide_worth_it | lora |
| gem_e2e_nlg_1_1_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/gem_e2e_nlg_1_1_0 | lora |
| race_high_Select_the_best_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_high_Select_the_best_answer | lora |
| dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | lora |
| quail_description_context_question_answer_id | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_description_context_question_answer_id | lora |
| race_high_Write_a_multi_choice_question_for_the_following_article | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_for_the_following_article | lora |
| quarel_do_not_use | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quarel_do_not_use | lora |
| adversarial_qa_dbidaf_answer_the_following_q | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_answer_the_following_q | lora |
| cot_esnli_ii | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_esnli_ii | lora |
| quail_context_description_question_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_context_description_question_text | lora |
| anli_r3_0_1_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/anli_r3_0_1_0 | lora |
| quail_context_question_description_answer_id | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_context_question_description_answer_id | lora |
| duorc_ParaphraseRC_answer_question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_ParaphraseRC_answer_question | lora |
| multi_news_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/multi_news_1_0_0 | lora |
| social_i_qa_Generate_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/social_i_qa_Generate_answer | lora |
| quartz_paragraph_question_plain_concat | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quartz_paragraph_question_plain_concat | lora |
| race_middle_Write_a_multi_choice_question_for_the_following_article | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_for_the_following_article | lora |
| wiqa_does_the_supposed_perturbation_have_an_effect | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiqa_does_the_supposed_perturbation_have_an_effect | lora |
| quail_no_prompt_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_no_prompt_text | lora |
| quoref_Guess_Answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quoref_Guess_Answer | lora |
| ropes_prompt_bottom_hint_beginning | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ropes_prompt_bottom_hint_beginning | lora |
| glue_cola_2_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/glue_cola_2_0_0 | lora |
| aeslc_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/aeslc_1_0_0 | lora |
| duorc_SelfRC_movie_director | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_SelfRC_movie_director | lora |
| wiqa_which_of_the_following_is_the_supposed_perturbation | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiqa_which_of_the_following_is_the_supposed_perturbation | lora |
| cot_creak | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_creak | lora |
| wiki_bio_guess_person | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_bio_guess_person | lora |
| ropes_given_background_situation | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ropes_given_background_situation | lora |
| cot_gsm8k | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_gsm8k | lora |
| dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | lora |
| wiki_qa_Jeopardy_style | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_qa_Jeopardy_style | lora |
| anli_r2_0_1_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/anli_r2_0_1_0 | lora |
| cos_e_v1_11_description_question_option_id | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_id | lora |
| wiki_hop_original_choose_best_object_affirmative_1 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_1 | lora |
| wiqa_what_is_the_final_step_of_the_following_process | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiqa_what_is_the_final_step_of_the_following_process | lora |
| qasc_is_correct_1 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/qasc_is_correct_1 | lora |
| social_i_qa_I_was_wondering | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/social_i_qa_I_was_wondering | lora |
| kilt_tasks_hotpotqa_combining_facts | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_combining_facts | lora |
| ropes_read_background_situation | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ropes_read_background_situation | lora |
| lambada_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/lambada_1_0_0 | lora |
| snli_1_1_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/snli_1_1_0 | lora |
| cot_creak_ii | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_creak_ii | lora |
| adversarial_qa_droberta_question_context_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_droberta_question_context_answer | lora |
| race_middle_Select_the_best_answer_generate_span_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_generate_span_ | lora |
| cos_e_v1_11_question_description_option_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_text | lora |
| quartz_answer_question_based_on | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quartz_answer_question_based_on | lora |
| imdb_reviews_plain_text_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/imdb_reviews_plain_text_1_0_0 | lora |
| duorc_ParaphraseRC_title_generation | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_ParaphraseRC_title_generation | lora |
| app_reviews_generate_review | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/app_reviews_generate_review | lora |
| wiki_qa_Direct_Answer_to_Question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_qa_Direct_Answer_to_Question | lora |
| stream_aqua_ii | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/stream_aqua_ii | lora |
| sciq_Multiple_Choice_Question_First | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/sciq_Multiple_Choice_Question_First | lora |
| wiki_qa_Decide_good_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_qa_Decide_good_answer | lora |
| race_high_Read_the_article_and_answer_the_question_no_option_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_high_Read_the_article_and_answer_the_question_no_option_ | lora |
| cos_e_v1_11_generate_explanation_given_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_generate_explanation_given_text | lora |
| wmt14_translate_fr_en_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wmt14_translate_fr_en_1_0_0 | lora |
| gem_dart_1_1_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/gem_dart_1_1_0 | lora |
| super_glue_rte_1_0_2 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/super_glue_rte_1_0_2 | lora |
| glue_wnli_2_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/glue_wnli_2_0_0 | lora |
| quartz_read_passage_below_choose | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quartz_read_passage_below_choose | lora |
| quarel_choose_between | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quarel_choose_between | lora |
| quoref_Answer_Friend_Question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quoref_Answer_Friend_Question | lora |
| web_questions_short_general_knowledge_q | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/web_questions_short_general_knowledge_q | lora |
| cot_strategyqa | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_strategyqa | lora |
| adversarial_qa_droberta_based_on | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_droberta_based_on | lora |
| cos_e_v1_11_description_question_option_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_text | lora |
| glue_stsb_2_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/glue_stsb_2_0_0 | lora |
| adversarial_qa_dbert_based_on | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_dbert_based_on | lora |
| glue_qnli_2_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/glue_qnli_2_0_0 | lora |
| quartz_use_info_from_question_paragraph | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quartz_use_info_from_question_paragraph | lora |
| adversarial_qa_dbert_tell_what_it_is | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_dbert_tell_what_it_is | lora |
| cnn_dailymail_3_4_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cnn_dailymail_3_4_0 | lora |
| wiki_hop_original_generate_object | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_hop_original_generate_object | lora |
| definite_pronoun_resolution_1_1_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/definite_pronoun_resolution_1_1_0 | lora |
| quoref_Answer_Test | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quoref_Answer_Test | lora |
| word_segment | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/word_segment | lora |
| duorc_SelfRC_answer_question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_SelfRC_answer_question | lora |
| cot_esnli | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_esnli | lora |
| glue_sst2_2_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/glue_sst2_2_0_0 | lora |
| anli_r1_0_1_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/anli_r1_0_1_0 | lora |
| gem_common_gen_1_1_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/gem_common_gen_1_1_0 | lora |
| wiki_hop_original_choose_best_object_interrogative_1 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_1 | lora |
| wiki_qa_Topic_Prediction_Question_and_Answer_Pair | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_and_Answer_Pair | lora |
| dream_generate_first_utterance | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/dream_generate_first_utterance | lora |
| quail_context_question_description_answer_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_context_question_description_answer_text | lora |
| ropes_prompt_mix | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ropes_prompt_mix | lora |
| race_high_Taking_a_test | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_high_Taking_a_test | lora |
| ropes_new_situation_background_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ropes_new_situation_background_answer | lora |
| wiki_qa_Topic_Prediction_Answer_Only | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Answer_Only | lora |
| wiki_qa_Topic_Prediction_Question_Only | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_Only | lora |
| duorc_ParaphraseRC_question_answering | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_ParaphraseRC_question_answering | lora |
| race_middle_Write_a_multi_choice_question_options_given_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_options_given_ | lora |
| duorc_SelfRC_build_story_around_qa | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_SelfRC_build_story_around_qa | lora |
| sciq_Direct_Question_Closed_Book_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/sciq_Direct_Question_Closed_Book_ | lora |
| wiki_hop_original_explain_relation | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_hop_original_explain_relation | lora |
| kilt_tasks_hotpotqa_complex_question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_complex_question | lora |
| wiqa_effect_with_label_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiqa_effect_with_label_answer | lora |
| quail_no_prompt_id | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_no_prompt_id | lora |
| wiki_bio_key_content | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_bio_key_content | lora |
| race_middle_Is_this_the_right_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_middle_Is_this_the_right_answer | lora |
| cot_ecqa_ii | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_ecqa_ii | lora |
| glue_mrpc_2_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/glue_mrpc_2_0_0 | lora |
| duorc_ParaphraseRC_build_story_around_qa | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_ParaphraseRC_build_story_around_qa | lora |
| quail_context_question_answer_description_id | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quail_context_question_answer_description_id | lora |
| web_questions_whats_the_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/web_questions_whats_the_answer | lora |
| duorc_ParaphraseRC_generate_question_by_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question_by_answer | lora |
| drop_2_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/drop_2_0_0 | lora |
| ropes_prompt_bottom_no_hint | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ropes_prompt_bottom_no_hint | lora |
| wiki_hop_original_choose_best_object_affirmative_2 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_2 | lora |
| wmt16_translate_ro_en_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wmt16_translate_ro_en_1_0_0 | lora |
| ropes_background_situation_middle | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ropes_background_situation_middle | lora |
| gem_web_nlg_en_1_1_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/gem_web_nlg_en_1_1_0 | lora |
| adversarial_qa_droberta_generate_question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_droberta_generate_question | lora |
| super_glue_copa_1_0_2 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/super_glue_copa_1_0_2 | lora |
| wiki_hop_original_generate_subject | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_hop_original_generate_subject | lora |
| cos_e_v1_11_question_option_description_text | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_text | lora |
| web_questions_question_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/web_questions_question_answer | lora |
| ropes_prompt_beginning | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ropes_prompt_beginning | lora |
| gem_wiki_lingua_english_en_1_1_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/gem_wiki_lingua_english_en_1_1_0 | lora |
| true_case | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/true_case | lora |
| ag_news_subset_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ag_news_subset_1_0_0 | lora |
| ropes_plain_background_situation | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/ropes_plain_background_situation | lora |
| para_crawl_enes | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/para_crawl_enes | lora |
| duorc_SelfRC_generate_question | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/duorc_SelfRC_generate_question | lora |
| cot_ecqa | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_ecqa | lora |
| wiki_bio_comprehension | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wiki_bio_comprehension | lora |
| cos_e_v1_11_aligned_with_common_sense | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_aligned_with_common_sense | lora |
| cot_sensemaking_ii | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cot_sensemaking_ii | lora |
| wmt16_translate_de_en_1_0_0 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/wmt16_translate_de_en_1_0_0 | lora |
| adversarial_qa_dbert_answer_the_following_q | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_dbert_answer_the_following_q | lora |
| race_middle_Select_the_best_answer_no_instructions_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_no_instructions_ | lora |
| qasc_qa_with_separated_facts_5 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_5 | lora |
| quartz_having_read_above_passage | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/quartz_having_read_above_passage | lora |
| race_middle_Select_the_best_answer | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer | lora |
| cos_e_v1_11_i_think | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_i_think | lora |
| super_glue_multirc_1_0_2 | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/super_glue_multirc_1_0_2 | lora |
| race_middle_Read_the_article_and_answer_the_question_no_option_ | mistralai/Mixtral-8x7B-v0.1 | sordonia/flan-10k-flat/race_middle_Read_the_article_and_answer_the_question_no_option_ | lora |
Last updated on: 2024-05-14 12:05:37+00:00
| [
"SCIQ"
] |
PragmaticPete/phi3 | PragmaticPete | text-generation | [
"transformers",
"safetensors",
"phi3",
"text-generation",
"nlp",
"code",
"conversational",
"custom_code",
"en",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | "2024-05-09T20:53:56Z" | 2024-05-10T15:39:28+00:00 | 0 | 0 | ---
language:
- en
license: mit
license_link: https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/resolve/main/LICENSE
pipeline_tag: text-generation
tags:
- nlp
- code
widget:
- messages:
- role: user
content: Can you provide ways to eat combinations of bananas and dragonfruits?
---
## Model Summary
The Phi-3-Mini-128K-Instruct is a 3.8 billion-parameter, lightweight, state-of-the-art open model trained using the Phi-3 datasets.
This dataset includes both synthetic data and filtered publicly available website data, with an emphasis on high-quality and reasoning-dense properties.
The model belongs to the Phi-3 family with the Mini version in two variants [4K](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) and [128K](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct) which is the context length (in tokens) that it can support.
After initial training, the model underwent a post-training process that involved supervised fine-tuning and direct preference optimization to enhance its ability to follow instructions and adhere to safety measures.
When evaluated against benchmarks that test common sense, language understanding, mathematics, coding, long-term context, and logical reasoning, the Phi-3 Mini-128K-Instruct demonstrated robust and state-of-the-art performance among models with fewer than 13 billion parameters.
Resources and Technical Documentation:
+ [Phi-3 Microsoft Blog](https://aka.ms/phi3blog-april)
+ [Phi-3 Technical Report](https://aka.ms/phi3-tech-report)
+ [Phi-3 on Azure AI Studio](https://aka.ms/phi3-azure-ai)
+ Phi-3 ONNX: [128K](https://aka.ms/Phi3-mini-128k-instruct-onnx)
## Intended Uses
**Primary use cases**
The model is intended for commercial and research use in English. It is suited for applications that require:
1) Memory/compute constrained environments
2) Latency bound scenarios
3) Strong reasoning (especially code, math and logic)
Our model is designed to accelerate research on language and multimodal models, for use as a building block for generative AI powered features.
**Use case considerations**
Our models are not specifically designed or evaluated for all downstream purposes. Developers should consider common limitations of language models as they select use cases, and evaluate and mitigate for accuracy, safety, and fairness before using within a specific downstream use case, particularly for high risk scenarios. Developers should be aware of and adhere to applicable laws or regulations (including privacy, trade compliance laws, etc.) that are relevant to their use case.
Nothing contained in this Model Card should be interpreted as or deemed a restriction or modification to the license the model is released under.
## How to Use
Phi-3 Mini-128K-Instruct has been integrated into the development version (4.41.0.dev0) of `transformers`. Until the official version is released through `pip`, ensure that you are doing one of the following:
* When loading the model, ensure that `trust_remote_code=True` is passed as an argument of the `from_pretrained()` function.
* Update your local `transformers` to the development version: `pip uninstall -y transformers && pip install git+https://github.com/huggingface/transformers`. The previous command is an alternative to cloning and installing from the source.
The current `transformers` version can be verified with: `pip list | grep transformers`.
### Tokenizer
Phi-3 Mini-128K-Instruct supports a vocabulary size of up to `32064` tokens. The [tokenizer files](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/blob/main/added_tokens.json) already provide placeholder tokens that can be used for downstream fine-tuning, but they can also be extended up to the model's vocabulary size.
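For illustration only (not part of the original card), extending the vocabulary for downstream fine-tuning can look like the sketch below; the token names are hypothetical:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-128k-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Hypothetical downstream control tokens; any names work as long as the
# total vocabulary stays within the model's 32064-token limit.
new_tokens = ["<|db_query|>", "<|db_result|>"]
num_added = tokenizer.add_special_tokens({"additional_special_tokens": new_tokens})

# Grow the embedding table so the new ids have rows to train during fine-tuning.
if num_added > 0:
    model.resize_token_embeddings(len(tokenizer))

print(f"Added {num_added} tokens; vocabulary size is now {len(tokenizer)}.")
```
Reusing the placeholder tokens that already ship with the tokenizer avoids the resize step entirely.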
### Chat Format
Given the nature of the training data, the Phi-3 Mini-128K-Instruct model is best suited for prompts using the chat format as follows.
You can provide the prompt as a question with a generic template as follows:
```markdown
<|user|>\nQuestion<|end|>\n<|assistant|>
```
For example:
```markdown
<|user|>
How to explain Internet for a medieval knight?<|end|>
<|assistant|>
```
The model generates the text after `<|assistant|>`. For few-shot prompts, the prompt can be formatted as follows:
```markdown
<|user|>
I am going to Paris, what should I see?<|end|>
<|assistant|>
Paris, the capital of France, is known for its stunning architecture, art museums, historical landmarks, and romantic atmosphere. Here are some of the top attractions to see in Paris:\n\n1. The Eiffel Tower: The iconic Eiffel Tower is one of the most recognizable landmarks in the world and offers breathtaking views of the city.\n2. The Louvre Museum: The Louvre is one of the world's largest and most famous museums, housing an impressive collection of art and artifacts, including the Mona Lisa.\n3. Notre-Dame Cathedral: This beautiful cathedral is one of the most famous landmarks in Paris and is known for its Gothic architecture and stunning stained glass windows.\n\nThese are just a few of the many attractions that Paris has to offer. With so much to see and do, it's no wonder that Paris is one of the most popular tourist destinations in the world.<|end|>
<|user|>
What is so great about #1?<|end|>
<|assistant|>
```
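Rather than assembling these markers by hand, a minimal sketch (assuming a `transformers` version with chat-template support, as in the sample code below) is:
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-128k-instruct")

messages = [
    {"role": "user", "content": "How to explain Internet for a medieval knight?"},
]

# Renders the <|user|> ... <|end|> <|assistant|> template and appends the
# generation prompt so the model continues as the assistant.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```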
### Sample inference code
This code snippet shows how to quickly get started running the model on a GPU:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
torch.random.manual_seed(0)
model = AutoModelForCausalLM.from_pretrained(
"microsoft/Phi-3-mini-128k-instruct",
device_map="cuda",
torch_dtype="auto",
trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-128k-instruct")
messages = [
{"role": "user", "content": "Can you provide ways to eat combinations of bananas and dragonfruits?"},
{"role": "assistant", "content": "Sure! Here are some ways to eat bananas and dragonfruits together: 1. Banana and dragonfruit smoothie: Blend bananas and dragonfruits together with some milk and honey. 2. Banana and dragonfruit salad: Mix sliced bananas and dragonfruits together with some lemon juice and honey."},
{"role": "user", "content": "What about solving an 2x + 3 = 7 equation?"},
]
pipe = pipeline(
"text-generation",
model=model,
tokenizer=tokenizer,
)
generation_args = {
"max_new_tokens": 500,
"return_full_text": False,
"temperature": 0.0,
"do_sample": False,
}
output = pipe(messages, **generation_args)
print(output[0]['generated_text'])
```
*Some applications/frameworks might not include a BOS token (`<s>`) at the start of the conversation. Please ensure that it is included since it provides more reliable results.*
## Responsible AI Considerations
Like other language models, the Phi series models can potentially behave in ways that are unfair, unreliable, or offensive. Some of the limiting behaviors to be aware of include:
+ Quality of Service: the Phi models are trained primarily on English text. Languages other than English will experience worse performance. English language varieties with less representation in the training data might experience worse performance than standard American English.
+ Representation of Harms & Perpetuation of Stereotypes: These models can over- or under-represent groups of people, erase representation of some groups, or reinforce demeaning or negative stereotypes. Despite safety post-training, these limitations may still be present due to differing levels of representation of different groups or prevalence of examples of negative stereotypes in training data that reflect real-world patterns and societal biases.
+ Inappropriate or Offensive Content: these models may produce other types of inappropriate or offensive content, which may make it inappropriate to deploy for sensitive contexts without additional mitigations that are specific to the use case.
+ Information Reliability: Language models can generate nonsensical content or fabricate content that might sound reasonable but is inaccurate or outdated.
+ Limited Scope for Code: The majority of Phi-3 training data is based on Python and uses common packages such as `typing`, `math`, `random`, `collections`, `datetime`, and `itertools`. If the model generates Python scripts that utilize other packages or scripts in other languages, we strongly recommend that users manually verify all API uses.
Developers should apply responsible AI best practices and are responsible for ensuring that a specific use case complies with relevant laws and regulations (e.g. privacy, trade, etc.). Important areas for consideration include:
+ Allocation: Models may not be suitable for scenarios that could have consequential impact on legal status or the allocation of resources or life opportunities (ex: housing, employment, credit, etc.) without further assessments and additional debiasing techniques.
+ High-Risk Scenarios: Developers should assess suitability of using models in high-risk scenarios where unfair, unreliable or offensive outputs might be extremely costly or lead to harm. This includes providing advice in sensitive or expert domains where accuracy and reliability are critical (ex: legal or health advice). Additional safeguards should be implemented at the application level according to the deployment context.
+ Misinformation: Models may produce inaccurate information. Developers should follow transparency best practices and inform end-users they are interacting with an AI system. At the application level, developers can build feedback mechanisms and pipelines to ground responses in use-case specific, contextual information, a technique known as Retrieval Augmented Generation (RAG).
+ Generation of Harmful Content: Developers should assess outputs for their context and use available safety classifiers or custom solutions appropriate for their use case.
+ Misuse: Other forms of misuse such as fraud, spam, or malware production may be possible, and developers should ensure that their applications do not violate applicable laws and regulations.
## Training
### Model
* Architecture: Phi-3 Mini-128K-Instruct has 3.8B parameters and is a dense decoder-only Transformer model. The model is fine-tuned with supervised fine-tuning (SFT) and Direct Preference Optimization (DPO) to ensure alignment with human preferences and safety guidelines.
* Inputs: Text. It is best suited for prompts using chat format.
* Context length: 128K tokens
* GPUs: 512 H100-80G
* Training time: 7 days
* Training data: 3.3T tokens
* Outputs: Generated text in response to the input
* Dates: Our models were trained between February and April 2024
* Status: This is a static model trained on an offline dataset with cutoff date October 2023. Future versions of the tuned models may be released as we improve models.
### Datasets
Our training data includes a wide variety of sources, totaling 3.3 trillion tokens, and is a combination of
1) Publicly available documents filtered rigorously for quality, selected high-quality educational data, and code;
2) Newly created synthetic, “textbook-like” data for the purpose of teaching math, coding, common sense reasoning, general knowledge of the world (science, daily activities, theory of mind, etc.);
3) High quality chat format supervised data covering various topics to reflect human preferences on different aspects such as instruction-following, truthfulness, honesty and helpfulness.
### Fine-tuning
A basic example of multi-GPUs supervised fine-tuning (SFT) with TRL and Accelerate modules is provided [here](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/resolve/main/sample_finetune.py).
## Benchmarks
We report the results for Phi-3-Mini-128K-Instruct on standard open-source benchmarks measuring the model's reasoning ability (both common sense reasoning and logical reasoning). We compare to Phi-2, Mistral-7b-v0.1, Mixtral-8x7b, Gemma 7B, Llama-3-8B-Instruct, and GPT-3.5.
All the reported numbers are produced with the exact same pipeline to ensure that the numbers are comparable. These numbers might differ from other published numbers due to slightly different choices in the evaluation.
As is now standard, we use few-shot prompts to evaluate the models, at temperature 0.
The prompts and number of shots are part of a Microsoft internal tool to evaluate language models, and in particular we did no optimization to the pipeline for Phi-3.
More specifically, we do not change prompts, pick different few-shot examples, change prompt format, or do any other form of optimization for the model.
The number of k-shot examples is listed per benchmark.
| | Phi-3-Mini-128K-In<br>3.8b | Phi-3-Small<br>7b (preview) | Phi-3-Medium<br>14b (preview) | Phi-2<br>2.7b | Mistral<br>7b | Gemma<br>7b | Llama-3-In<br>8b | Mixtral<br>8x7b | GPT-3.5<br>version 1106 |
|---|---|---|---|---|---|---|---|---|---|
| MMLU <br>5-Shot | 68.1 | 75.3 | 78.2 | 56.3 | 61.7 | 63.6 | 66.5 | 68.4 | 71.4 |
| HellaSwag <br> 5-Shot | 74.5 | 78.7 | 83.2 | 53.6 | 58.5 | 49.8 | 71.1 | 70.4 | 78.8 |
| ANLI <br> 7-Shot | 52.8 | 55.0 | 58.7 | 42.5 | 47.1 | 48.7 | 57.3 | 55.2 | 58.1 |
| GSM-8K <br> 0-Shot; CoT | 83.6 | 86.4 | 90.8 | 61.1 | 46.4 | 59.8 | 77.4 | 64.7 | 78.1 |
| MedQA <br> 2-Shot | 55.3 | 58.2 | 69.8 | 40.9 | 49.6 | 50.0 | 60.5 | 62.2 | 63.4 |
| AGIEval <br> 0-Shot | 36.9 | 45.0 | 49.7 | 29.8 | 35.1 | 42.1 | 42.0 | 45.2 | 48.4 |
| TriviaQA <br> 5-Shot | 57.1 | 59.1 | 73.3 | 45.2 | 72.3 | 75.2 | 67.7 | 82.2 | 85.8 |
| Arc-C <br> 10-Shot | 84.0 | 90.7 | 91.9 | 75.9 | 78.6 | 78.3 | 82.8 | 87.3 | 87.4 |
| Arc-E <br> 10-Shot | 95.2 | 97.1 | 98.0 | 88.5 | 90.6 | 91.4 | 93.4 | 95.6 | 96.3 |
| PIQA <br> 5-Shot | 83.6 | 87.8 | 88.2 | 60.2 | 77.7 | 78.1 | 75.7 | 86.0 | 86.6 |
| SociQA <br> 5-Shot | 76.1 | 79.0 | 79.4 | 68.3 | 74.6 | 65.5 | 73.9 | 75.9 | 68.3 |
| BigBench-Hard <br> 0-Shot | 71.5 | 75.0 | 82.5 | 59.4 | 57.3 | 59.6 | 51.5 | 69.7 | 68.32 |
| WinoGrande <br> 5-Shot | 72.5 | 82.5 | 81.2 | 54.7 | 54.2 | 55.6 | 65.0 | 62.0 | 68.8 |
| OpenBookQA <br> 10-Shot | 80.6 | 88.4 | 86.6 | 73.6 | 79.8 | 78.6 | 82.6 | 85.8 | 86.0 |
| BoolQ <br> 0-Shot | 78.7 | 82.9 | 86.5 | -- | 72.2 | 66.0 | 80.9 | 77.6 | 79.1 |
| CommonSenseQA <br> 10-Shot | 78.0 | 80.3 | 82.6 | 69.3 | 72.6 | 76.2 | 79 | 78.1 | 79.6 |
| TruthfulQA <br> 10-Shot | 63.2 | 68.1 | 74.8 | -- | 52.1 | 53.0 | 63.2 | 60.1 | 85.8 |
| HumanEval <br> 0-Shot | 57.9 | 59.1 | 54.7 | 47.0 | 28.0 | 34.1 | 60.4| 37.8 | 62.2 |
| MBPP <br> 3-Shot | 62.5 | 71.4 | 73.7 | 60.6 | 50.8 | 51.5 | 67.7 | 60.2 | 77.8 |
## Software
* [PyTorch](https://github.com/pytorch/pytorch)
* [DeepSpeed](https://github.com/microsoft/DeepSpeed)
* [Transformers](https://github.com/huggingface/transformers)
* [Flash-Attention](https://github.com/HazyResearch/flash-attention)
## Hardware
Note that by default, the Phi-3-mini model uses flash attention, which requires certain types of GPU hardware to run. We have tested on the following GPU types:
* NVIDIA A100
* NVIDIA A6000
* NVIDIA H100
If you want to run the model on:
* NVIDIA V100 or earlier generation GPUs: call `AutoModelForCausalLM.from_pretrained()` with `attn_implementation="eager"` (see the sketch below)
* Optimized inference on GPU, CPU, and Mobile: use the **ONNX** models [128K](https://aka.ms/phi3-mini-128k-instruct-onnx)
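As noted in the list above, a minimal sketch for GPUs without flash-attention support (assuming fp16 weights fit on the card) is:
```python
import torch
from transformers import AutoModelForCausalLM

# Fall back to the eager attention implementation on GPUs that cannot run
# flash attention (e.g. V100 or earlier generations).
model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-128k-instruct",
    device_map="cuda",
    torch_dtype=torch.float16,
    trust_remote_code=True,
    attn_implementation="eager",
)
```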
## Cross Platform Support
The ONNX Runtime ecosystem now supports Phi-3 Mini models across platforms and hardware. You can find the optimized Phi-3 Mini-128K-Instruct ONNX model [here](https://aka.ms/phi3-mini-128k-instruct-onnx).
Optimized Phi-3 models are also published here in ONNX format, to run with ONNX Runtime on CPU and GPU across devices, including server platforms, Windows, Linux and Mac desktops, and mobile CPUs, with the precision best suited to each of these targets. DirectML support lets developers bring hardware acceleration to Windows devices at scale across AMD, Intel, and NVIDIA GPUs.
Along with DirectML, ONNX Runtime provides cross-platform support for Phi-3 across a range of devices (CPU, GPU, and mobile).
Here are some of the optimized configurations we have added:
1. ONNX models for int4 DML: Quantized to int4 via AWQ
2. ONNX model for fp16 CUDA
3. ONNX model for int4 CUDA: Quantized to int4 via RTN
4. ONNX model for int4 CPU and Mobile: Quantized to int4 via RTN
## License
The model is licensed under the [MIT license](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/resolve/main/LICENSE).
## Trademarks
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow [Microsoft’s Trademark & Brand Guidelines](https://www.microsoft.com/en-us/legal/intellectualproperty/trademarks). Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos are subject to those third-party’s policies.
| [
"MEDQA"
] |
IEETA/BioNExt | IEETA | null | [
"en",
"dataset:bigbio/biored",
"license:mit",
"region:us"
] | "2024-05-10T13:48:26Z" | 2024-05-13T13:54:02+00:00 | 0 | 1 | ---
datasets:
- bigbio/biored
language:
- en
license: mit
metrics:
- f1
---
# Model Card for BioNExt
BioNExt is an end-to-end Biomedical Relation Extraction and Classification system. It consists of three modules: a Tagger (Named Entity Recognition), a Linker (Entity Linking), and an Extractor (Relation Extraction and Classification).
This repository contains two models:
1. **Tagger:** Named Entity Recognition module, which performs 6 class biomedical NER: **Genes, Diseases, Chemicals, Variants (mutations), Species, and Cell Lines**.
2. **Extractor:** Performs Relation Extraction and classification. The classes for the relation Extraction are: **Positive Correlation, Negative Correlation, Association, Binding, Drug Interaction, Cotreatment, Comparison, and Conversion.**
For a full description of how to use our end-to-end pipeline, please see our [GitHub](https://github.com/ieeta-pt/BioNExt) repository.
- **Developed by:** IEETA
- **Model type:** BERT Base
- **Language(s) (NLP):** English
- **License:** MIT
- **Finetuned from model:** BioLinkBERT-Large
### Model Sources
- **Repository:** [IEETA BioNExt GitHub](https://github.com/ieeta-pt/BioNExt)
- **Paper:** Towards Discovery: An End-to-End System for Uncovering Novel Biomedical Relations [Awaiting Publication]
**Authors:**
- Tiago Almeida ([ORCID: 0000-0002-4258-3350](https://orcid.org/0000-0002-4258-3350))
- Richard A A Jonker ([ORCID: 0000-0002-3806-6940](https://orcid.org/0000-0002-3806-6940))
- Rui Antunes ([ORCID: 0000-0003-3533-8872](https://orcid.org/0000-0003-3533-8872))
- João R Almeida ([ORCID: 0000-0003-0729-2264](https://orcid.org/0000-0003-0729-2264))
- Sérgio Matos ([ORCID: 0000-0003-1941-3983](https://orcid.org/0000-0003-1941-3983))
## Uses
Note we do not take any liability for the use of the model in any professional/medical domain. The model is intended for academic purposes only.
## How to Get Started with the Model
Please refer to our GitHub repository for more information on our end-to-end inference pipeline: [IEETA BioNExt GitHub](https://github.com/ieeta-pt/BioNExt)
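As a convenience only (the authors' pipeline lives in the GitHub repository), the checkpoint files can be fetched locally with `huggingface_hub`:
```python
from huggingface_hub import snapshot_download

# Downloads the Tagger/Extractor weights hosted in this repository into a
# local folder; the BioNExt pipeline from the GitHub repo can then point at it.
local_dir = snapshot_download(repo_id="IEETA/BioNExt")
print(local_dir)
```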
## Training Data
The training data used was the BioRED corpus, within the scope of the BioCreative-VIII challenge.
Ling Luo, Po-Ting Lai, Chih-Hsuan Wei, Cecilia N Arighi, Zhiyong Lu, BioRED: a rich biomedical relation extraction dataset, Briefings in Bioinformatics, Volume 23, Issue 5, September 2022, bbac282, https://doi.org/10.1093/bib/bbac282
## Results
Evaluated as an end-to-end system, our results are as follows:
- **Tagger**: 43.10
- **Linker**: 32.46
- **Extractor**: 24.59
| Configuration | Entity Pair (P/R/F%) | + Relation (P/R/F%) | + Novel (P/R/F%) |
|---------------------------------------|-----------------------|----------------------|------------------|
| Competition best | -/-/55.84 | -/-/43.03 | -/-/32.75 |
| BioNExt (end-to-end) | 45.89/40.63/43.10 | 34.56/30.60/32.46 | 26.18/23.18/24.59 |
## Citation
**BibTeX:**
[Awaiting Publication] | [
"BIORED"
] |
IEETA/Multi-Head-CRF | IEETA | null | [
"es",
"dataset:IEETA/SPACCC-Spanish-NER",
"license:mit",
"region:us"
] | "2024-05-10T14:29:08Z" | 2024-05-14T12:45:00+00:00 | 0 | 0 | ---
datasets:
- IEETA/SPACCC-Spanish-NER
language:
- es
license: mit
metrics:
- f1
---
# Model Card for Biomedical Named Entity Recognition in Spanish Clinical Texts
Our model focuses on Biomedical Named Entity Recognition (NER) in Spanish clinical texts, crucial for automated information extraction in medical research and treatment improvements. It proposes a novel approach using a Multi-Head Conditional Random Field (CRF) classifier to tackle multi-class NER tasks, overcoming challenges of overlapping entity instances. The classes it recognizes include symptoms, procedures, diseases, chemicals, and proteins.
We provide 4 different models, available as branches of this repository.
## Model Details
### Model Description
- **Developed by:** IEETA
- **Model type:** Multi-Head-CRF, Roberta Base
- **Language(s) (NLP):** Spanish
- **License:** MIT
- **Finetuned from model:** lcampillos/roberta-es-clinical-trials-ner
### Model Sources
- **Repository:** [IEETA Multi-Head-CRF GitHub](https://github.com/ieeta-pt/Multi-Head-CRF)
- **Paper:** Multi-head CRF classifier for biomedical multi-class Named Entity Recognition on Spanish clinical notes [Awaiting Publication]
**Authors:**
- Richard A A Jonker ([ORCID: 0000-0002-3806-6940](https://orcid.org/0000-0002-3806-6940))
- Tiago Almeida ([ORCID: 0000-0002-4258-3350](https://orcid.org/0000-0002-4258-3350))
- Rui Antunes ([ORCID: 0000-0003-3533-8872](https://orcid.org/0000-0003-3533-8872))
- João R Almeida ([ORCID: 0000-0003-0729-2264](https://orcid.org/0000-0003-0729-2264))
- Sérgio Matos ([ORCID: 0000-0003-1941-3983](https://orcid.org/0000-0003-1941-3983))
## Uses
Note that we do not take any liability for the use of the model in any professional/medical domain. The model is intended for academic purposes only. It performs Named Entity Recognition over 5 classes, namely: SYMPTOM, PROCEDURE, DISEASE, PROTEIN, and CHEMICAL.
## How to Get Started with the Model
Please refer to our GitHub repository for more information on how to train the model and run inference: [IEETA Multi-Head-CRF GitHub](https://github.com/ieeta-pt/Multi-Head-CRF)
## Training Details
### Training Data
The training data can be found on IEETA/SPACCC-Spanish-NER, which is further described on the dataset card.
The dataset used consists of 4 separate datasets:
- [SympTEMIST](https://zenodo.org/records/10635215)
- [MedProcNER](https://zenodo.org/records/8224056)
- [DisTEMIST](https://zenodo.org/records/7614764)
- [PharmaCoNER](https://zenodo.org/records/4270158)
### Speeds, Sizes, Times
The models were trained using an Nvidia Quadro RTX 8000. The models for 5 classes took approximately 1 hour to train and occupy around 1 GB of disk space. Additionally, the model scales linearly with the number of entity classes, with each additional class adding roughly 8 minutes.
### Testing Data, Factors & Metrics
#### Testing Data
The testing data can be found on IEETA/SPACCC-Spanish-NER, which is further described on the dataset card.
#### Metrics
The models were evaluated using the micro-averaged F1-score metric, the standard for entity recognition tasks.
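As an illustration only (not the authors' evaluation code), micro-averaged F1 can be computed with scikit-learn; the labels below are toy values:
```python
from sklearn.metrics import f1_score

# Toy gold and predicted labels for illustration; the real evaluation is run
# over extracted entity mentions rather than isolated labels.
gold = ["SYMPTOM", "DISEASE", "CHEMICAL", "PROTEIN", "DISEASE"]
pred = ["SYMPTOM", "DISEASE", "PROTEIN", "PROTEIN", "DISEASE"]

# Micro-averaging pools all decisions before computing precision/recall.
print(f1_score(gold, pred, average="micro"))
```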
### Results
We provide 4 separate models with various hyperparameter changes:
| HLs per head | Augmentation | Percentage Tags | Augmentation Probability | F1 |
|--------------|--------------|-----------------|--------------------------|--------|
| 3 | Random | 0.25 | 0.50 | 78.73 |
| 3 | Unknown | 0.50 | 0.25 | 78.50 |
| 3 | None | - | - | **78.89** |
| 1 | Random | 0.25 | 0.50 | **78.89** |
All models are trained with a context size of 32 tokens for 60 epochs.
## Citation
**BibTeX:**
[Awaiting Publication]
| [
"DISTEMIST",
"PHARMACONER",
"SYMPTEMIST"
] |
pclucas14/library-phi3-4k_5ep-fixed | pclucas14 | null | [
"region:us"
] | "2024-05-10T21:22:46Z" | 2024-05-10T21:55:52+00:00 | 0 | 0 | ---
{}
---
Number of experts present in the library: 256
| Expert Name | Base Model | Trained on | Adapter Type |
| --- | --- | --- | --- |
| super_glue_wic_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_wic_1_0_2 | lora |
| quarel_logic_test | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quarel_logic_test | lora |
| ropes_prompt_beginning | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_prompt_beginning | lora |
| duorc_SelfRC_build_story_around_qa | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_build_story_around_qa | lora |
| lambada_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/lambada_1_0_0 | lora |
| dream_answer_to_dialogue | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dream_answer_to_dialogue | lora |
| glue_qnli_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_qnli_2_0_0 | lora |
| cos_e_v1_11_i_think | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_i_think | lora |
| wiki_hop_original_choose_best_object_affirmative_3 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_3 | lora |
| app_reviews_convert_to_star_rating | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/app_reviews_convert_to_star_rating | lora |
| duorc_SelfRC_decide_worth_it | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_decide_worth_it | lora |
| race_middle_Select_the_best_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Select_the_best_answer | lora |
| ag_news_subset_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ag_news_subset_1_0_0 | lora |
| quoref_Find_Answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Find_Answer | lora |
| social_i_qa_Show_choices_and_generate_index | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_index | lora |
| dream_generate_first_utterance | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dream_generate_first_utterance | lora |
| qasc_qa_with_separated_facts_4 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_4 | lora |
| web_questions_question_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/web_questions_question_answer | lora |
| wiki_qa_Jeopardy_style | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Jeopardy_style | lora |
| quoref_Guess_Title_For_Context | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Guess_Title_For_Context | lora |
| unified_qa_science_inst | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/unified_qa_science_inst | lora |
| app_reviews_generate_review | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/app_reviews_generate_review | lora |
| wiki_bio_who | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_bio_who | lora |
| race_high_Taking_a_test | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Taking_a_test | lora |
| duorc_SelfRC_movie_director | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_movie_director | lora |
| cot_creak_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_creak_ii | lora |
| wmt16_translate_tr_en_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wmt16_translate_tr_en_1_0_0 | lora |
| glue_wnli_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_wnli_2_0_0 | lora |
| super_glue_multirc_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_multirc_1_0_2 | lora |
| quail_description_context_question_answer_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_description_context_question_answer_id | lora |
| duorc_ParaphraseRC_question_answering | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_question_answering | lora |
| sciq_Multiple_Choice_Closed_Book_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/sciq_Multiple_Choice_Closed_Book_ | lora |
| wiki_qa_found_on_google | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_found_on_google | lora |
| cos_e_v1_11_question_description_option_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_id | lora |
| ropes_given_background_situation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_given_background_situation | lora |
| ropes_background_new_situation_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_background_new_situation_answer | lora |
| glue_mnli_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_mnli_2_0_0 | lora |
| duorc_SelfRC_answer_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_answer_question | lora |
| cos_e_v1_11_description_question_option_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_id | lora |
| glue_cola_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_cola_2_0_0 | lora |
| wiki_qa_Generate_Question_from_Topic | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Generate_Question_from_Topic | lora |
| wiqa_does_the_supposed_perturbation_have_an_effect | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_does_the_supposed_perturbation_have_an_effect | lora |
| ropes_plain_bottom_hint | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_plain_bottom_hint | lora |
| wiki_hop_original_choose_best_object_interrogative_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_2 | lora |
| cot_sensemaking | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_sensemaking | lora |
| ropes_new_situation_background_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_new_situation_background_answer | lora |
| glue_mrpc_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_mrpc_2_0_0 | lora |
| cot_gsm8k_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_gsm8k_ii | lora |
| imdb_reviews_plain_text_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/imdb_reviews_plain_text_1_0_0 | lora |
| huggingface_xsum | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/huggingface_xsum | lora |
| squad_v2_0_3_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/squad_v2_0_3_0_0 | lora |
| wmt16_translate_ro_en_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wmt16_translate_ro_en_1_0_0 | lora |
| quail_no_prompt_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_no_prompt_text | lora |
| social_i_qa_Generate_the_question_from_the_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/social_i_qa_Generate_the_question_from_the_answer | lora |
| social_i_qa_Check_if_a_random_answer_is_valid_or_not | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/social_i_qa_Check_if_a_random_answer_is_valid_or_not | lora |
| para_crawl_enes | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/para_crawl_enes | lora |
| adversarial_qa_dbidaf_tell_what_it_is | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbidaf_tell_what_it_is | lora |
| adversarial_qa_droberta_question_context_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_droberta_question_context_answer | lora |
| wiki_bio_key_content | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_bio_key_content | lora |
| social_i_qa_Generate_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/social_i_qa_Generate_answer | lora |
| cos_e_v1_11_question_option_description_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_text | lora |
| wiki_hop_original_generate_subject_and_object | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_generate_subject_and_object | lora |
| duorc_SelfRC_question_answering | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_question_answering | lora |
| cot_creak | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_creak | lora |
| cos_e_v1_11_rationale | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_rationale | lora |
| adversarial_qa_dbert_question_context_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbert_question_context_answer | lora |
| stream_aqua_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/stream_aqua_ii | lora |
| cos_e_v1_11_aligned_with_common_sense | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_aligned_with_common_sense | lora |
| trec_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/trec_1_0_0 | lora |
| quoref_Answer_Friend_Question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Answer_Friend_Question | lora |
| qasc_qa_with_separated_facts_1 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_1 | lora |
| adversarial_qa_dbert_answer_the_following_q | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbert_answer_the_following_q | lora |
| duorc_ParaphraseRC_extract_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_extract_answer | lora |
| quail_context_question_answer_description_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_question_answer_description_id | lora |
| ropes_prompt_bottom_no_hint | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_prompt_bottom_no_hint | lora |
| anli_r1_0_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/anli_r1_0_1_0 | lora |
| dream_read_the_following_conversation_and_answer_the_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dream_read_the_following_conversation_and_answer_the_question | lora |
| social_i_qa_I_was_wondering | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/social_i_qa_I_was_wondering | lora |
| glue_stsb_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_stsb_2_0_0 | lora |
| cos_e_v1_11_description_question_option_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_text | lora |
| wiqa_effect_with_string_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_effect_with_string_answer | lora |
| cot_qasc | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_qasc | lora |
| web_questions_whats_the_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/web_questions_whats_the_answer | lora |
| paws_wiki_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/paws_wiki_1_1_0 | lora |
| gem_dart_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/gem_dart_1_1_0 | lora |
| ropes_plain_background_situation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_plain_background_situation | lora |
| cosmos_qa_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cosmos_qa_1_0_0 | lora |
| quoref_Answer_Question_Given_Context | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Answer_Question_Given_Context | lora |
| quartz_read_passage_below_choose | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_read_passage_below_choose | lora |
| quail_context_description_question_answer_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_description_question_answer_text | lora |
| quoref_Answer_Test | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Answer_Test | lora |
| quartz_answer_question_below | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_answer_question_below | lora |
| quail_context_question_answer_description_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_question_answer_description_text | lora |
| wiki_qa_automatic_system | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_automatic_system | lora |
| gigaword_1_2_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/gigaword_1_2_0 | lora |
| wiqa_effect_with_label_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_effect_with_label_answer | lora |
| wiqa_what_is_the_final_step_of_the_following_process | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_what_is_the_final_step_of_the_following_process | lora |
| race_high_Is_this_the_right_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Is_this_the_right_answer | lora |
| duorc_ParaphraseRC_answer_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_answer_question | lora |
| multi_news_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/multi_news_1_0_0 | lora |
| wiki_hop_original_generate_subject | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_generate_subject | lora |
| squad_v1_1_3_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/squad_v1_1_3_0_0 | lora |
| wiki_hop_original_choose_best_object_interrogative_1 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_1 | lora |
| wiqa_what_might_be_the_first_step_of_the_process | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_what_might_be_the_first_step_of_the_process | lora |
| anli_r3_0_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/anli_r3_0_1_0 | lora |
| super_glue_rte_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_rte_1_0_2 | lora |
| gem_e2e_nlg_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/gem_e2e_nlg_1_1_0 | lora |
| adversarial_qa_droberta_answer_the_following_q | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_droberta_answer_the_following_q | lora |
| adversarial_qa_droberta_tell_what_it_is | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_droberta_tell_what_it_is | lora |
| qasc_qa_with_separated_facts_3 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_3 | lora |
| quartz_having_read_above_passage | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_having_read_above_passage | lora |
| stream_qed_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/stream_qed_ii | lora |
| super_glue_record_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_record_1_0_2 | lora |
| wiki_qa_Topic_Prediction_Question_and_Answer_Pair | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_and_Answer_Pair | lora |
| qasc_qa_with_separated_facts_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_2 | lora |
| wmt16_translate_fi_en_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wmt16_translate_fi_en_1_0_0 | lora |
| adversarial_qa_dbidaf_question_context_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbidaf_question_context_answer | lora |
| coqa_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/coqa_1_0_0 | lora |
| wiki_bio_what_content | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_bio_what_content | lora |
| stream_qed | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/stream_qed | lora |
| quail_description_context_question_answer_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_description_context_question_answer_text | lora |
| quoref_Found_Context_Online | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Found_Context_Online | lora |
| cos_e_v1_11_generate_explanation_given_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_generate_explanation_given_text | lora |
| dbpedia_14_pick_one_category_for_the_following_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dbpedia_14_pick_one_category_for_the_following_text | lora |
| super_glue_wsc_fixed_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_wsc_fixed_1_0_2 | lora |
| math_dataset_algebra__linear_1d_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/math_dataset_algebra__linear_1d_1_0_0 | lora |
| duorc_ParaphraseRC_title_generation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_title_generation | lora |
| ropes_background_situation_middle | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_background_situation_middle | lora |
| word_segment | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/word_segment | lora |
| wmt16_translate_de_en_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wmt16_translate_de_en_1_0_0 | lora |
| cnn_dailymail_3_4_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cnn_dailymail_3_4_0 | lora |
| race_high_Read_the_article_and_answer_the_question_no_option_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Read_the_article_and_answer_the_question_no_option_ | lora |
| ropes_prompt_bottom_hint_beginning | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_prompt_bottom_hint_beginning | lora |
| wiqa_what_might_be_the_last_step_of_the_process | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_what_might_be_the_last_step_of_the_process | lora |
| quail_no_prompt_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_no_prompt_id | lora |
| cot_gsm8k | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_gsm8k | lora |
| duorc_ParaphraseRC_generate_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question | lora |
| quoref_What_Is_The_Answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_What_Is_The_Answer | lora |
| duorc_SelfRC_extract_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_extract_answer | lora |
| duorc_SelfRC_title_generation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_title_generation | lora |
| wiki_hop_original_explain_relation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_explain_relation | lora |
| dream_baseline | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dream_baseline | lora |
| definite_pronoun_resolution_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/definite_pronoun_resolution_1_1_0 | lora |
| quail_context_description_question_answer_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_description_question_answer_id | lora |
| quarel_do_not_use | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quarel_do_not_use | lora |
| quartz_paragraph_question_plain_concat | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_paragraph_question_plain_concat | lora |
| cot_ecqa | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_ecqa | lora |
| qasc_is_correct_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_is_correct_2 | lora |
| adversarial_qa_dbert_generate_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbert_generate_question | lora |
| wiki_hop_original_choose_best_object_affirmative_1 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_1 | lora |
| wiki_qa_Direct_Answer_to_Question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Direct_Answer_to_Question | lora |
| sciq_Multiple_Choice_Question_First | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/sciq_Multiple_Choice_Question_First | lora |
| wiki_qa_Decide_good_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Decide_good_answer | lora |
| duorc_ParaphraseRC_movie_director | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_movie_director | lora |
| race_middle_Write_a_multi_choice_question_options_given_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_options_given_ | lora |
| drop_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/drop_2_0_0 | lora |
| kilt_tasks_hotpotqa_complex_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_complex_question | lora |
| natural_questions_open_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/natural_questions_open_1_0_0 | lora |
| duorc_ParaphraseRC_decide_worth_it | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_decide_worth_it | lora |
| cot_sensemaking_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_sensemaking_ii | lora |
| kilt_tasks_hotpotqa_straighforward_qa | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_straighforward_qa | lora |
| sciq_Direct_Question_Closed_Book_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/sciq_Direct_Question_Closed_Book_ | lora |
| adversarial_qa_dbidaf_based_on | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbidaf_based_on | lora |
| race_high_Select_the_best_answer_generate_span_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Select_the_best_answer_generate_span_ | lora |
| cot_esnli | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_esnli | lora |
| cos_e_v1_11_question_option_description_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_id | lora |
| quartz_use_info_from_paragraph_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_use_info_from_paragraph_question | lora |
| race_high_Select_the_best_answer_no_instructions_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Select_the_best_answer_no_instructions_ | lora |
| wiki_qa_Topic_Prediction_Answer_Only | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Answer_Only | lora |
| wmt14_translate_fr_en_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wmt14_translate_fr_en_1_0_0 | lora |
| trivia_qa_rc_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/trivia_qa_rc_1_1_0 | lora |
| kilt_tasks_hotpotqa_final_exam | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_final_exam | lora |
| ropes_prompt_mix | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_prompt_mix | lora |
| snli_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/snli_1_1_0 | lora |
| quarel_testing_students | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quarel_testing_students | lora |
| true_case | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/true_case | lora |
| dream_generate_last_utterance | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dream_generate_last_utterance | lora |
| quoref_Context_Contains_Answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Context_Contains_Answer | lora |
| gem_web_nlg_en_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/gem_web_nlg_en_1_1_0 | lora |
| cot_strategyqa_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_strategyqa_ii | lora |
| quartz_answer_question_based_on | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_answer_question_based_on | lora |
| quail_context_question_description_answer_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_question_description_answer_id | lora |
| social_i_qa_Show_choices_and_generate_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_answer | lora |
| yelp_polarity_reviews_0_2_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/yelp_polarity_reviews_0_2_0 | lora |
| adversarial_qa_dbidaf_answer_the_following_q | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbidaf_answer_the_following_q | lora |
| wiqa_which_of_the_following_is_the_supposed_perturbation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_which_of_the_following_is_the_supposed_perturbation | lora |
| duorc_ParaphraseRC_generate_question_by_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question_by_answer | lora |
| race_middle_Is_this_the_right_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Is_this_the_right_answer | lora |
| qasc_qa_with_combined_facts_1 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_qa_with_combined_facts_1 | lora |
| duorc_ParaphraseRC_build_story_around_qa | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_build_story_around_qa | lora |
| wiki_qa_exercise | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_exercise | lora |
| dbpedia_14_given_a_choice_of_categories_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dbpedia_14_given_a_choice_of_categories_ | lora |
| quartz_given_the_fact_answer_the_q | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_given_the_fact_answer_the_q | lora |
| ropes_read_background_situation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_read_background_situation | lora |
| quail_context_question_description_answer_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_question_description_answer_text | lora |
| dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | lora |
| race_middle_Taking_a_test | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Taking_a_test | lora |
| web_questions_short_general_knowledge_q | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/web_questions_short_general_knowledge_q | lora |
| race_middle_Select_the_best_answer_generate_span_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_generate_span_ | lora |
| wiki_bio_comprehension | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_bio_comprehension | lora |
| wiki_hop_original_choose_best_object_affirmative_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_2 | lora |
| kilt_tasks_hotpotqa_combining_facts | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_combining_facts | lora |
| duorc_SelfRC_generate_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_generate_question | lora |
| cos_e_v1_11_explain_why_human | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_explain_why_human | lora |
| quartz_use_info_from_question_paragraph | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_use_info_from_question_paragraph | lora |
| glue_sst2_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_sst2_2_0_0 | lora |
| race_high_Write_a_multi_choice_question_options_given_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_options_given_ | lora |
| race_high_Write_a_multi_choice_question_for_the_following_article | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_for_the_following_article | lora |
| duorc_SelfRC_generate_question_by_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_generate_question_by_answer | lora |
| anli_r2_0_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/anli_r2_0_1_0 | lora |
| web_questions_potential_correct_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/web_questions_potential_correct_answer | lora |
| quarel_heres_a_story | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quarel_heres_a_story | lora |
| wiki_qa_Is_This_True_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Is_This_True_ | lora |
| adversarial_qa_droberta_generate_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_droberta_generate_question | lora |
| wiki_qa_Topic_Prediction_Question_Only | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_Only | lora |
| race_high_Select_the_best_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Select_the_best_answer | lora |
| quac_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quac_1_0_0 | lora |
| quail_context_description_question_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_description_question_text | lora |
| adversarial_qa_dbert_tell_what_it_is | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbert_tell_what_it_is | lora |
| super_glue_copa_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_copa_1_0_2 | lora |
| adversarial_qa_droberta_based_on | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_droberta_based_on | lora |
| aeslc_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/aeslc_1_0_0 | lora |
| cot_strategyqa | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_strategyqa | lora |
| race_middle_Read_the_article_and_answer_the_question_no_option_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Read_the_article_and_answer_the_question_no_option_ | lora |
| quail_description_context_question_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_description_context_question_text | lora |
| adversarial_qa_dbert_based_on | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbert_based_on | lora |
| wiki_bio_guess_person | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_bio_guess_person | lora |
| app_reviews_convert_to_rating | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/app_reviews_convert_to_rating | lora |
| wiki_hop_original_generate_object | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_generate_object | lora |
| quail_context_question_description_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_question_description_text | lora |
| sciq_Direct_Question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/sciq_Direct_Question | lora |
| adversarial_qa_dbidaf_generate_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbidaf_generate_question | lora |
| quoref_Read_And_Extract_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Read_And_Extract_ | lora |
| race_middle_Select_the_best_answer_no_instructions_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_no_instructions_ | lora |
| stream_aqua | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/stream_aqua | lora |
| glue_qqp_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_qqp_2_0_0 | lora |
| web_questions_get_the_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/web_questions_get_the_answer | lora |
| app_reviews_categorize_rating_using_review | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/app_reviews_categorize_rating_using_review | lora |
| cot_esnli_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_esnli_ii | lora |
| quoref_Guess_Answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Guess_Answer | lora |
| quarel_choose_between | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quarel_choose_between | lora |
| gem_wiki_lingua_english_en_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/gem_wiki_lingua_english_en_1_1_0 | lora |
| qasc_qa_with_separated_facts_5 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_5 | lora |
| ropes_plain_no_background | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_plain_no_background | lora |
| sciq_Multiple_Choice | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/sciq_Multiple_Choice | lora |
| fix_punct | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/fix_punct | lora |
| cos_e_v1_11_question_description_option_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_text | lora |
| quoref_Given_Context_Answer_Question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Given_Context_Answer_Question | lora |
| gem_common_gen_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/gem_common_gen_1_1_0 | lora |
| race_middle_Write_a_multi_choice_question_for_the_following_article | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_for_the_following_article | lora |
| wiqa_what_is_the_missing_first_step | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_what_is_the_missing_first_step | lora |
| dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | lora |
| cot_ecqa_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_ecqa_ii | lora |
| kilt_tasks_hotpotqa_formulate | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_formulate | lora |
| qasc_is_correct_1 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_is_correct_1 | lora |
| super_glue_cb_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_cb_1_0_2 | lora |
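To see which experts this library actually ships, one can simply enumerate the repository contents; the sketch below uses `huggingface_hub` and assumes each expert lives in its own top-level directory of this repo (the layout is not documented here).

```python
from huggingface_hub import list_repo_files

# List every file in the expert-library repository and derive the top-level
# expert directories from the file paths (layout assumption, not documented here).
files = list_repo_files("pclucas14/library-phi3-4k_5ep-fixed")
experts = sorted({path.split("/")[0] for path in files if "/" in path})

print(f"{len(experts)} candidate expert directories found")
for name in experts[:10]:
    print(name)
```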
Last updated on: 2024-05-10 21:22:47+00:00
| [
"SCIQ"
] |
gobean/Yi-1.5-34B-Chat-GGUF | gobean | null | [
"arxiv:2403.04652",
"license:apache-2.0",
"region:us"
] | "2024-05-12T20:30:54Z" | 2024-05-12T20:34:59+00:00 | 0 | 0 | ---
license: apache-2.0
---
gobean: the q4_k_m in the main model repo was hitting tokenizer issues - after about five prompt exchanges it would go wild with repeats and unprintable tokens. I made this q4_0 for comparison; it isn't as bad as the q4_k_m, but it isn't perfect either. Both quants fail the knowledge benchmark "describe the difference between a bear credit spread and a poor man's covered call," but they come very close to getting it (each just describes a standard covered call). Overall, not bad. Inference is fast with q4_0 on 24 GB of VRAM, and the bug could very well be in llama.cpp itself, so I may start looking into other frontends.
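For reference, a minimal way to try the q4_0 quant locally is via `llama-cpp-python`; this is only a sketch - the GGUF filename below is a placeholder for whatever file is actually in this repo, and the context/offload settings are examples rather than recommendations.

```python
from llama_cpp import Llama

# Placeholder filename: substitute the actual q4_0 GGUF file from this repo.
llm = Llama(
    model_path="Yi-1.5-34B-Chat.q4_0.gguf",
    n_ctx=4096,        # Yi-1.5 context window
    n_gpu_layers=-1,   # offload all layers if VRAM allows (e.g. 24 GB)
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Describe the difference between a bear credit spread and a poor man's covered call."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```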
# ~ Original Model Card ~
<div align="center">
<picture>
<img src="https://raw.githubusercontent.com/01-ai/Yi/main/assets/img/Yi_logo_icon_light.svg" width="150px">
</picture>
</div>
<p align="center">
<a href="https://github.com/01-ai">🐙 GitHub</a> •
<a href="https://discord.gg/hYUwWddeAu">👾 Discord</a> •
<a href="https://twitter.com/01ai_yi">🐤 Twitter</a> •
<a href="https://github.com/01-ai/Yi-1.5/issues/2">💬 WeChat</a>
<br/>
<a href="https://arxiv.org/abs/2403.04652">📝 Paper</a> •
<a href="https://github.com/01-ai/Yi/tree/main?tab=readme-ov-file#faq">🙌 FAQ</a> •
<a href="https://github.com/01-ai/Yi/tree/main?tab=readme-ov-file#learning-hub">📗 Learning Hub</a>
</p>
# Intro
Yi-1.5 is an upgraded version of Yi. It is continuously pre-trained on Yi with a high-quality corpus of 500B tokens and fine-tuned on 3M diverse fine-tuning samples.
Compared with Yi, Yi-1.5 delivers stronger performance in coding, math, reasoning, and instruction-following capability, while still maintaining excellent capabilities in language understanding, commonsense reasoning, and reading comprehension.
<div align="center">
| Model | Context Length | Pre-trained Tokens |
| :------------: | :------------: | :------------: |
| Yi-1.5 | 4K | 3.6T |
</div>
# Models
- Chat models
<div align="center">
| Name | Download |
| --------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Yi-1.5-34B-Chat | • [🤗 Hugging Face](https://huggingface.co/collections/01-ai/yi-15-2024-05-663f3ecab5f815a3eaca7ca8) • [🤖 ModelScope](https://www.modelscope.cn/organization/01ai) |
| Yi-1.5-9B-Chat | • [🤗 Hugging Face](https://huggingface.co/collections/01-ai/yi-15-2024-05-663f3ecab5f815a3eaca7ca8) • [🤖 ModelScope](https://www.modelscope.cn/organization/01ai) |
| Yi-1.5-6B-Chat | • [🤗 Hugging Face](https://huggingface.co/collections/01-ai/yi-15-2024-05-663f3ecab5f815a3eaca7ca8) • [🤖 ModelScope](https://www.modelscope.cn/organization/01ai) |
</div>
- Base models
<div align="center">
| Name | Download |
| ---------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Yi-1.5-34B | • [🤗 Hugging Face](https://huggingface.co/collections/01-ai/yi-15-2024-05-663f3ecab5f815a3eaca7ca8) • [🤖 ModelScope](https://www.modelscope.cn/organization/01ai) |
| Yi-1.5-9B | • [🤗 Hugging Face](https://huggingface.co/collections/01-ai/yi-15-2024-05-663f3ecab5f815a3eaca7ca8) • [🤖 ModelScope](https://www.modelscope.cn/organization/01ai) |
| Yi-1.5-6B | • [🤗 Hugging Face](https://huggingface.co/collections/01-ai/yi-15-2024-05-663f3ecab5f815a3eaca7ca8) • [🤖 ModelScope](https://www.modelscope.cn/organization/01ai) |
</div>
# Benchmarks
- Chat models
Yi-1.5-34B-Chat is on par with or excels beyond larger models in most benchmarks.

Yi-1.5-9B-Chat is the top performer among similarly sized open-source models.

- Base models
Yi-1.5-34B is on par with or excels beyond larger models in some benchmarks.

Yi-1.5-9B is the top performer among similarly sized open-source models.

# Quick Start
For getting up and running with Yi-1.5 models quickly, see [README](https://github.com/01-ai/Yi-1.5).
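For the full-precision chat checkpoints (as opposed to the GGUF files in this repo), the usual `transformers` chat pattern should apply; treat the snippet below as a sketch that assumes the upstream `01-ai/Yi-1.5-34B-Chat` checkpoint and its bundled chat template.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "01-ai/Yi-1.5-34B-Chat"  # upstream full-precision checkpoint (assumption)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [{"role": "user", "content": "Hi, who are you?"}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```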
| [
"BEAR"
] |
EnjoyCodeX/MedLang-13B | EnjoyCodeX | null | [
"medical",
"en",
"arxiv:1910.09700",
"license:apache-2.0",
"region:us"
] | "2024-05-16T01:35:18Z" | 2024-05-17T14:11:55+00:00 | 0 | 1 | ---
language:
- en
license: apache-2.0
tags:
- medical
---
# MedLang-13B
<!-- Provide a quick summary of what the model is/does. -->
MedLang-13B is a medical large language model built on top of [Baichuan-13B](https://huggingface.co/baichuan-inc/Baichuan-13B-Base).
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** Huang chiang
<!-- - **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed] -->
- **Model type:** Medical LLM
- **Language(s) (NLP):** Chinese
<!-- - **License:** [More Information Needed] -->
- **Finetuned from model:** Baichuan-13B
<!-- ### Model Sources: -->
<!-- Provide the basic links for the model. -->
<!-- - **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed] -->
<!-- ## Uses -->
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
<!-- ### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
<!-- ### Downstream Use [optional] -->
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
<!-- [More Information Needed] -->
<!-- ### Out-of-Scope Use -->
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
<!-- [More Information Needed] -->
## How to Get Started with the Model
Use the code below to get started with the model.
<div class="relative group repo-copy-code"><pre><code class="language-python">
def init_model():
model = AutoModelForCausalLM.from_pretrained(
"EnjoyCodeX/MedLang-13B/MedLang-13B",
torch_dtype=torch.float16,
device_map="auto",
trust_remote_code=True
)
model.generation_config = GenerationConfig.from_pretrained(
"EnjoyCodeX/MedLang-13B/MedLang-13B",
)
tokenizer = AutoTokenizer.from_pretrained(
"EnjoyCodeX/MedLang-13B/MedLang-13B",
use_fast=False,
trust_remote_code=True
)
return model, tokenizer
</code></pre></div>
<div class="relative group repo-copy-code"><pre><code class="language-python"><span class="hljs-meta">>>> </span><span class="hljs-keyword">import</span> torch
<span class="hljs-meta">>>> </span><span class="hljs-keyword">from</span> transformers <span class="hljs-keyword">import</span> AutoModelForCausalLM, AutoTokenizer
<span class="hljs-meta">>>> </span><span class="hljs-keyword">from</span> transformers.generation.utils <span class="hljs-keyword">import</span> GenerationConfig
<span class="hljs-meta">>>> </span>tokenizer = AutoTokenizer.from_pretrained(<span class="hljs-string">"EnjoyCodeX/MedLang-13B/MedLang-13B"</span>, use_fast=<span class="hljs-literal">False</span>, trust_remote_code=<span class="hljs-literal">True</span>)
<span class="hljs-meta">>>> </span>model = AutoModelForCausalLM.from_pretrained(<span class="hljs-string">"EnjoyCodeX/MedLang-13B/MedLang-13B"</span>, device_map=<span class="hljs-string">"auto"</span>, torch_dtype=torch.float16, trust_remote_code=<span class="hljs-literal">True</span>)
<span class="hljs-meta">>>> </span>model.generation_config = GenerationConfig.from_pretrained(<span class="hljs-string">"EnjoyCodeX/MedLang-13B/MedLang-13B"</span>)
<span class="hljs-meta">>>> </span>messages = []
<span class="hljs-meta">>>> </span>messages.append({<span class="hljs-string">"role"</span>: <span class="hljs-string">"user"</span>, <span class="hljs-string">"content"</span>: <span class="hljs-string">"我感觉自己颈椎非常不舒服,每天睡醒都会头痛"</span>})
<span class="hljs-meta">>>> </span>response = model.chat(tokenizer, messages)
<span class="hljs-meta">>>> </span><span class="hljs-built_in">print</span>(response)
</code></pre></div>
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
MedDialog, cMedQA-v2, MedMCQA, DrugDB, Alpaca-GPT4-zh
<!-- ### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
<!-- #### Preprocessing [optional]
[More Information Needed] -->
<!-- #### Training Hyperparameters -->
<!-- - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
<!-- #### Speeds, Sizes, Times [optional] -->
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
<!-- [More Information Needed] -->
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
<!-- ### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
<!-- [More Information Needed] -->
MLEC-QA-Few-shot

MLEC-QA-Zero-shot

CMB

CMD

CMID

<!-- #### Factors -->
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
<!-- [More Information Needed] -->
<!-- #### Metrics -->
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
<!-- [More Information Needed] -->
<!-- ### Results
[More Information Needed] -->
### Model Architecture
The overall architecture of MedLang-13B is based on the standard Transformer decoder, employing the same model design as LLaMA.
In terms of details, MedLang-13B adopts the following approaches:

1. RoPE as the positional encoding, which is widely adopted in most current models and exhibits excellent scalability;
2. a context window length of 4096;
3. SwiGLU as the activation function, which handles complex semantic relationships and long-range dependencies well and is therefore widely used in large language models, including MedLang-13B;
4. a hidden layer size of 11008 in the feed-forward network;
5. Pre-Normalization based on RMSNorm as the layer normalization.
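The snippet below is a simplified reference sketch of points 3-5 only - a SwiGLU feed-forward block with an 11008-dim hidden layer wrapped in RMSNorm pre-normalization. The model width used here is an illustrative placeholder; the card only specifies the feed-forward width.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RMSNorm(nn.Module):
    """Pre-normalization layer used in place of LayerNorm."""
    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(dim))
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        rms = torch.rsqrt(x.pow(2).mean(-1, keepdim=True) + self.eps)
        return self.weight * x * rms

class SwiGLUFeedForward(nn.Module):
    """SwiGLU MLP: down( silu(gate(x)) * up(x) )."""
    def __init__(self, dim: int = 5120, hidden_dim: int = 11008):
        # dim is an example model width, not specified in this card.
        super().__init__()
        self.gate_proj = nn.Linear(dim, hidden_dim, bias=False)
        self.up_proj = nn.Linear(dim, hidden_dim, bias=False)
        self.down_proj = nn.Linear(hidden_dim, dim, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down_proj(F.silu(self.gate_proj(x)) * self.up_proj(x))

# Pre-norm residual usage inside a decoder block: x = x + ffn(norm(x))
x = torch.randn(1, 16, 5120)
block_out = x + SwiGLUFeedForward()(RMSNorm(5120)(x))
print(block_out.shape)  # torch.Size([1, 16, 5120])
```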
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Although MedLang-13B performs well on both single-turn and multi-turn QA medical datasets,
the high demands of reliability and safety in the medical field, especially concerning patient health and life,
render the current MedLang-13B unsuitable for deployment in practical medical applications.
I solemnly declare that this model is designed solely for the research and testing purposes of individual groups.
Users are strongly advised to critically evaluate any information or medical advice provided by the model.
<!-- #### Summary -->
<!-- ## Model Examination [optional] -->
<!-- Relevant interpretability work for the model goes here -->
<!-- [More Information Needed] -->
<!-- ## Environmental Impact -->
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
<!-- Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed] -->
<!-- ## Technical Specifications [optional] -->
<!-- [More Information Needed]
-->
<!-- ### Compute Infrastructure -->
<!-- [More Information Needed]
-->
#### GPU
<table style="undefined;table-layout: fixed; width: 442px" class="tg">
<colgroup>
<col style="width: 204.428571px">
<col style="width: 135.428571px">
<col style="width: 102.428571px">
</colgroup>
<thead>
<tr>
<th rowspan="2" class="tg-9wq8"><br>Precision</th>
<th rowspan="2" class="tg-9wq8"><br>GPU Mem (GB)</th>
</tr>
<tr>
</tr>
</thead>
<tbody>
<tr>
<td class="tg-9wq8">bf16 / fp16</td>
<td class="tg-9wq8">26.0</td>
</tr>
<tr>
<td class="tg-9wq8">int8</td>
<td class="tg-c3ow">15.8</td>
</tr>
<tr>
<td class="tg-9wq8">int4</td>
<td class="tg-c3ow">9.7</td>
</tr>
</tbody>
</table>
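As a rough illustration of how one might reach the int8 / int4 footprints above with `transformers` + `bitsandbytes`; whether this exact path works with this particular `trust_remote_code` checkpoint is an assumption, not something verified here.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# 8-bit variant (~15.8 GB per the table); switch to the 4-bit config for ~9.7 GB.
bnb_config = BitsAndBytesConfig(load_in_8bit=True)
# bnb_config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)

model = AutoModelForCausalLM.from_pretrained(
    "EnjoyCodeX/MedLang-13B/MedLang-13B",
    quantization_config=bnb_config,
    device_map="auto",
    trust_remote_code=True,
)
```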
<!-- [More Information Needed] -->
<!-- #### Software
-->
<!-- [More Information Needed] -->
<!-- ## Citation [optional] -->
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
<!--
**BibTeX:**
[More Information Needed] -->
<!-- **APA:**
[More Information Needed] -->
<!-- ## Glossary [optional] -->
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
<!-- [More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed] -->
<!-- ## Model Card Contact
[More Information Needed] --> | [
"MEDDIALOG"
] |
raghavlight/SE_v1 | raghavlight | null | [
"safetensors",
"mteb",
"model-index",
"region:us"
] | "2024-05-20T20:52:47Z" | 2024-05-20T21:16:49+00:00 | 0 | 0 | ---
tags:
- mteb
model-index:
- name: SE_v1
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 78.83582089552237
- type: ap
value: 43.10721536172567
- type: f1
value: 73.02587501904198
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 92.756325
- type: ap
value: 89.5445783321967
- type: f1
value: 92.7501917603289
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 47.431999999999995
- type: f1
value: 46.49254790014476
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 35.135
- type: map_at_10
value: 52.298
- type: map_at_100
value: 52.298
- type: map_at_1000
value: 52.298
- type: map_at_20
value: 52.298
- type: map_at_3
value: 47.522999999999996
- type: map_at_5
value: 50.378
- type: mrr_at_1
value: 36.059999999999995
- type: mrr_at_10
value: 52.662
- type: mrr_at_100
value: 52.662
- type: mrr_at_1000
value: 52.662
- type: mrr_at_20
value: 52.662
- type: mrr_at_3
value: 47.783
- type: mrr_at_5
value: 50.753
- type: ndcg_at_1
value: 35.135
- type: ndcg_at_10
value: 61.419999999999995
- type: ndcg_at_100
value: 61.419999999999995
- type: ndcg_at_1000
value: 61.419999999999995
- type: ndcg_at_20
value: 61.419999999999995
- type: ndcg_at_3
value: 51.608
- type: ndcg_at_5
value: 56.759
- type: precision_at_1
value: 35.135
- type: precision_at_10
value: 9.04
- type: precision_at_100
value: 0.9039999999999999
- type: precision_at_1000
value: 0.09
- type: precision_at_20
value: 4.52
- type: precision_at_3
value: 21.147
- type: precision_at_5
value: 15.192
- type: recall_at_1
value: 35.135
- type: recall_at_10
value: 90.398
- type: recall_at_100
value: 90.398
- type: recall_at_1000
value: 90.398
- type: recall_at_20
value: 90.398
- type: recall_at_3
value: 63.442
- type: recall_at_5
value: 75.96000000000001
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 48.12363356263455
- type: v_measures
value:
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- 0.48027757481176014
- 0.4949903704757016
- 0.4803950319722686
- 0.4878191638148134
- 0.5010741012685239
- 0.4728978712118386
- 0.47889994605951863
- 0.4715761683007254
- 0.4860537160553702
- 0.45144233208889906
- 0.5540965696705281
- 0.5609258824425363
- 0.5602036049543669
- 0.5616617406378789
- 0.5603846165703955
- 0.5594661484162424
- 0.5593705019774027
- 0.558909644828681
- 0.5603056342160838
- 0.554891992797084
- 0.5226078787726881
- 0.29604710285086555
- 0.4750976228363287
- 0.41663565555894255
- 0.35606399125116023
- 0.29230278564333406
- 0.3015383931941945
- 0.24836040718326108
- 0.3298727045159869
- 1.0
- 0.28415725003932923
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 45.504015770335826
- type: v_measures
value:
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- 0.2389604884816265
- 0.3048450947288335
- 1.0
- 0.26356136037929795
- 0.4598072529308091
- 0.45421147702259007
- 0.44502275183492146
- 0.4656845495215707
- 0.48111454474373505
- 0.4674964711557545
- 0.4552119434555637
- 0.46100067824974655
- 0.47234370855187907
- 0.4665930707139566
- 0.5268120998765866
- 0.528776598374819
- 0.5288757776478977
- 0.5303754454112128
- 0.5277140875765198
- 0.531600682196592
- 0.5264675415800343
- 0.5304023811612318
- 0.5258234645236721
- 0.527352458200513
- 0.485746860511318
- 0.26165193259851754
- 0.40901630466248123
- 0.38157334325562
- 0.32344287707970854
- 0.24649925022711508
- 0.278260392149981
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 66.58809524589093
- type: mrr
value: 79.3222090313503
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 86.401710093136
- type: cos_sim_spearman
value: 85.91701791850129
- type: euclidean_pearson
value: 83.66475885640826
- type: euclidean_spearman
value: 84.01095039107408
- type: manhattan_pearson
value: 83.71400775916655
- type: manhattan_spearman
value: 84.27963121194402
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 88.96103896103895
- type: f1
value: 88.91409928169791
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 38.45196381563332
- type: v_measures
value:
- 0.39072150007715634
- 0.40083729026179593
- 0.3679233197639931
- 0.3803274522450876
- 0.37343044991449664
- 0.3846528657139038
- 0.3832423677178897
- 0.39407124960839574
- 0.3841280359130708
- 0.3858618503475417
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 36.52850715706321
- type: v_measures
value:
- 0.35719667938909777
- 0.3724418903896613
- 0.3679178896964583
- 0.35763891275910115
- 0.37333353708864264
- 0.34729391059072645
- 0.3792338756331596
- 0.36956505994495087
- 0.35201164441432914
- 0.37621731580019396
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 40.073
- type: map_at_10
value: 54.022999999999996
- type: map_at_100
value: 54.022999999999996
- type: map_at_1000
value: 54.022999999999996
- type: map_at_20
value: 54.022999999999996
- type: map_at_3
value: 49.285000000000004
- type: map_at_5
value: 52.05
- type: mrr_at_1
value: 51.215999999999994
- type: mrr_at_10
value: 60.866
- type: mrr_at_100
value: 60.866
- type: mrr_at_1000
value: 60.866
- type: mrr_at_20
value: 60.866
- type: mrr_at_3
value: 58.274
- type: mrr_at_5
value: 59.754
- type: ndcg_at_1
value: 51.215999999999994
- type: ndcg_at_10
value: 61.312
- type: ndcg_at_100
value: 60.866
- type: ndcg_at_1000
value: 60.85
- type: ndcg_at_20
value: 60.980999999999995
- type: ndcg_at_3
value: 55.544000000000004
- type: ndcg_at_5
value: 58.299
- type: precision_at_1
value: 51.215999999999994
- type: precision_at_10
value: 12.117
- type: precision_at_100
value: 1.212
- type: precision_at_1000
value: 0.121
- type: precision_at_20
value: 6.059
- type: precision_at_3
value: 27.182000000000002
- type: precision_at_5
value: 19.857
- type: recall_at_1
value: 40.073
- type: recall_at_10
value: 74.51
- type: recall_at_100
value: 74.51
- type: recall_at_1000
value: 74.51
- type: recall_at_20
value: 74.51
- type: recall_at_3
value: 56.804
- type: recall_at_5
value: 64.863
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 36.658
- type: map_at_10
value: 48.632
- type: map_at_100
value: 48.632
- type: map_at_1000
value: 48.632
- type: map_at_20
value: 48.632
- type: map_at_3
value: 45.287
- type: map_at_5
value: 47.243
- type: mrr_at_1
value: 47.134
- type: mrr_at_10
value: 55.047000000000004
- type: mrr_at_100
value: 55.047000000000004
- type: mrr_at_1000
value: 55.047000000000004
- type: mrr_at_20
value: 55.047000000000004
- type: mrr_at_3
value: 53.163000000000004
- type: mrr_at_5
value: 54.294
- type: ndcg_at_1
value: 47.134
- type: ndcg_at_10
value: 54.440999999999995
- type: ndcg_at_100
value: 53.981
- type: ndcg_at_1000
value: 53.981
- type: ndcg_at_20
value: 54.111
- type: ndcg_at_3
value: 50.734
- type: ndcg_at_5
value: 52.48199999999999
- type: precision_at_1
value: 47.134
- type: precision_at_10
value: 10.439
- type: precision_at_100
value: 1.044
- type: precision_at_1000
value: 0.104
- type: precision_at_20
value: 5.220000000000001
- type: precision_at_3
value: 25.096
- type: precision_at_5
value: 17.554
- type: recall_at_1
value: 36.658
- type: recall_at_10
value: 63.641999999999996
- type: recall_at_100
value: 63.641999999999996
- type: recall_at_1000
value: 63.641999999999996
- type: recall_at_20
value: 63.641999999999996
- type: recall_at_3
value: 51.564
- type: recall_at_5
value: 57.277
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 47.929
- type: map_at_10
value: 62.132
- type: map_at_100
value: 62.132
- type: map_at_1000
value: 62.132
- type: map_at_20
value: 62.132
- type: map_at_3
value: 58.278
- type: map_at_5
value: 60.589000000000006
- type: mrr_at_1
value: 55.11000000000001
- type: mrr_at_10
value: 65.417
- type: mrr_at_100
value: 65.417
- type: mrr_at_1000
value: 65.417
- type: mrr_at_20
value: 65.417
- type: mrr_at_3
value: 62.832
- type: mrr_at_5
value: 64.371
- type: ndcg_at_1
value: 55.11000000000001
- type: ndcg_at_10
value: 68.293
- type: ndcg_at_100
value: 68.197
- type: ndcg_at_1000
value: 68.197
- type: ndcg_at_20
value: 68.2
- type: ndcg_at_3
value: 62.28
- type: ndcg_at_5
value: 65.379
- type: precision_at_1
value: 55.11000000000001
- type: precision_at_10
value: 10.947
- type: precision_at_100
value: 1.095
- type: precision_at_1000
value: 0.109
- type: precision_at_20
value: 5.473
- type: precision_at_3
value: 27.816000000000003
- type: precision_at_5
value: 19.122
- type: recall_at_1
value: 47.929
- type: recall_at_10
value: 83.009
- type: recall_at_100
value: 83.009
- type: recall_at_1000
value: 83.009
- type: recall_at_20
value: 83.009
- type: recall_at_3
value: 66.97999999999999
- type: recall_at_5
value: 74.638
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 29.831999999999997
- type: map_at_10
value: 40.278000000000006
- type: map_at_100
value: 40.278000000000006
- type: map_at_1000
value: 40.278000000000006
- type: map_at_20
value: 40.278000000000006
- type: map_at_3
value: 37.271
- type: map_at_5
value: 38.93
- type: mrr_at_1
value: 32.316
- type: mrr_at_10
value: 42.368
- type: mrr_at_100
value: 42.368
- type: mrr_at_1000
value: 42.368
- type: mrr_at_20
value: 42.368
- type: mrr_at_3
value: 39.774
- type: mrr_at_5
value: 41.249
- type: ndcg_at_1
value: 32.316
- type: ndcg_at_10
value: 46.024
- type: ndcg_at_100
value: 46.024
- type: ndcg_at_1000
value: 46.024
- type: ndcg_at_20
value: 46.024
- type: ndcg_at_3
value: 40.324
- type: ndcg_at_5
value: 43.059999999999995
- type: precision_at_1
value: 32.316
- type: precision_at_10
value: 7.153
- type: precision_at_100
value: 0.715
- type: precision_at_1000
value: 0.07200000000000001
- type: precision_at_20
value: 3.576
- type: precision_at_3
value: 17.363
- type: precision_at_5
value: 12.068
- type: recall_at_1
value: 29.831999999999997
- type: recall_at_10
value: 61.550000000000004
- type: recall_at_100
value: 61.550000000000004
- type: recall_at_1000
value: 61.550000000000004
- type: recall_at_20
value: 61.550000000000004
- type: recall_at_3
value: 46.285
- type: recall_at_5
value: 52.851000000000006
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 22.595000000000002
- type: map_at_10
value: 32.306000000000004
- type: map_at_100
value: 32.306000000000004
- type: map_at_1000
value: 32.306000000000004
- type: map_at_20
value: 32.306000000000004
- type: map_at_3
value: 28.931
- type: map_at_5
value: 30.675
- type: mrr_at_1
value: 28.731
- type: mrr_at_10
value: 37.663999999999994
- type: mrr_at_100
value: 37.663999999999994
- type: mrr_at_1000
value: 37.663999999999994
- type: mrr_at_20
value: 37.663999999999994
- type: mrr_at_3
value: 34.949999999999996
- type: mrr_at_5
value: 36.399
- type: ndcg_at_1
value: 28.731
- type: ndcg_at_10
value: 38.485
- type: ndcg_at_100
value: 38.48
- type: ndcg_at_1000
value: 38.48
- type: ndcg_at_20
value: 38.48
- type: ndcg_at_3
value: 32.762
- type: ndcg_at_5
value: 35.149
- type: precision_at_1
value: 28.731
- type: precision_at_10
value: 7.226000000000001
- type: precision_at_100
value: 0.7230000000000001
- type: precision_at_1000
value: 0.07200000000000001
- type: precision_at_20
value: 3.6130000000000004
- type: precision_at_3
value: 15.754999999999999
- type: precision_at_5
value: 11.468
- type: recall_at_1
value: 22.595000000000002
- type: recall_at_10
value: 52.081
- type: recall_at_100
value: 52.081
- type: recall_at_1000
value: 52.081
- type: recall_at_20
value: 52.081
- type: recall_at_3
value: 36.14
- type: recall_at_5
value: 42.254000000000005
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 35.363
- type: map_at_10
value: 48.292
- type: map_at_100
value: 48.292
- type: map_at_1000
value: 48.292
- type: map_at_20
value: 48.292
- type: map_at_3
value: 44.230999999999995
- type: map_at_5
value: 46.455
- type: mrr_at_1
value: 44.273
- type: mrr_at_10
value: 53.923
- type: mrr_at_100
value: 53.923
- type: mrr_at_1000
value: 53.923
- type: mrr_at_20
value: 53.923
- type: mrr_at_3
value: 51.283
- type: mrr_at_5
value: 52.717000000000006
- type: ndcg_at_1
value: 44.273
- type: ndcg_at_10
value: 54.93900000000001
- type: ndcg_at_100
value: 54.832
- type: ndcg_at_1000
value: 54.832
- type: ndcg_at_20
value: 54.858
- type: ndcg_at_3
value: 49.184
- type: ndcg_at_5
value: 51.693999999999996
- type: precision_at_1
value: 44.273
- type: precision_at_10
value: 10.241
- type: precision_at_100
value: 1.024
- type: precision_at_1000
value: 0.10200000000000001
- type: precision_at_20
value: 5.12
- type: precision_at_3
value: 23.933
- type: precision_at_5
value: 16.805
- type: recall_at_1
value: 35.363
- type: recall_at_10
value: 68.891
- type: recall_at_100
value: 68.891
- type: recall_at_1000
value: 68.891
- type: recall_at_20
value: 68.891
- type: recall_at_3
value: 51.946999999999996
- type: recall_at_5
value: 58.996
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 31.527
- type: map_at_10
value: 45.208
- type: map_at_100
value: 45.208
- type: map_at_1000
value: 45.208
- type: map_at_20
value: 45.208
- type: map_at_3
value: 41.193999999999996
- type: map_at_5
value: 43.378
- type: mrr_at_1
value: 39.839999999999996
- type: mrr_at_10
value: 50.906
- type: mrr_at_100
value: 50.906
- type: mrr_at_1000
value: 50.906
- type: mrr_at_20
value: 50.906
- type: mrr_at_3
value: 48.211999999999996
- type: mrr_at_5
value: 49.730000000000004
- type: ndcg_at_1
value: 39.839999999999996
- type: ndcg_at_10
value: 52.209
- type: ndcg_at_100
value: 52.068999999999996
- type: ndcg_at_1000
value: 52.065
- type: ndcg_at_20
value: 52.11
- type: ndcg_at_3
value: 46.326
- type: ndcg_at_5
value: 48.906
- type: precision_at_1
value: 39.839999999999996
- type: precision_at_10
value: 9.954
- type: precision_at_100
value: 0.9950000000000001
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.977
- type: precision_at_3
value: 22.945
- type: precision_at_5
value: 16.438
- type: recall_at_1
value: 31.527
- type: recall_at_10
value: 66.97
- type: recall_at_100
value: 66.97
- type: recall_at_1000
value: 66.97
- type: recall_at_20
value: 66.97
- type: recall_at_3
value: 49.989
- type: recall_at_5
value: 57.023999999999994
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 32.47491666666667
- type: map_at_10
value: 43.86241666666667
- type: map_at_100
value: 43.86241666666667
- type: map_at_1000
value: 43.86241666666667
- type: map_at_20
value: 43.86241666666667
- type: map_at_3
value: 40.445166666666665
- type: map_at_5
value: 42.34416666666667
- type: mrr_at_1
value: 39.24941666666667
- type: mrr_at_10
value: 48.56083333333333
- type: mrr_at_100
value: 48.56083333333333
- type: mrr_at_1000
value: 48.56083333333333
- type: mrr_at_20
value: 48.56083333333333
- type: mrr_at_3
value: 46.09241666666667
- type: mrr_at_5
value: 47.48925
- type: ndcg_at_1
value: 39.24941666666667
- type: ndcg_at_10
value: 49.94208333333333
- type: ndcg_at_100
value: 49.754916666666674
- type: ndcg_at_1000
value: 49.751583333333336
- type: ndcg_at_20
value: 49.81033333333333
- type: ndcg_at_3
value: 44.72058333333334
- type: ndcg_at_5
value: 47.121249999999996
- type: precision_at_1
value: 39.24941666666667
- type: precision_at_10
value: 8.876166666666666
- type: precision_at_100
value: 0.8875833333333333
- type: precision_at_1000
value: 0.08875000000000001
- type: precision_at_20
value: 4.438083333333333
- type: precision_at_3
value: 20.939500000000002
- type: precision_at_5
value: 14.76966666666667
- type: recall_at_1
value: 32.47491666666667
- type: recall_at_10
value: 62.95991666666666
- type: recall_at_100
value: 62.95991666666666
- type: recall_at_1000
value: 62.95991666666666
- type: recall_at_20
value: 62.95991666666666
- type: recall_at_3
value: 48.01025
- type: recall_at_5
value: 54.43341666666666
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 31.147999999999996
- type: map_at_10
value: 38.9
- type: map_at_100
value: 38.9
- type: map_at_1000
value: 38.9
- type: map_at_20
value: 38.9
- type: map_at_3
value: 36.614999999999995
- type: map_at_5
value: 37.836999999999996
- type: mrr_at_1
value: 35.123
- type: mrr_at_10
value: 42.111
- type: mrr_at_100
value: 42.111
- type: mrr_at_1000
value: 42.111
- type: mrr_at_20
value: 42.111
- type: mrr_at_3
value: 39.954
- type: mrr_at_5
value: 41.112
- type: ndcg_at_1
value: 35.123
- type: ndcg_at_10
value: 43.403000000000006
- type: ndcg_at_100
value: 43.374
- type: ndcg_at_1000
value: 43.374
- type: ndcg_at_20
value: 43.374
- type: ndcg_at_3
value: 39.206
- type: ndcg_at_5
value: 41.102
- type: precision_at_1
value: 35.123
- type: precision_at_10
value: 6.641
- type: precision_at_100
value: 0.664
- type: precision_at_1000
value: 0.066
- type: precision_at_20
value: 3.321
- type: precision_at_3
value: 16.616
- type: precision_at_5
value: 11.35
- type: recall_at_1
value: 31.147999999999996
- type: recall_at_10
value: 53.988
- type: recall_at_100
value: 53.988
- type: recall_at_1000
value: 53.988
- type: recall_at_20
value: 53.988
- type: recall_at_3
value: 42.223
- type: recall_at_5
value: 47.075
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 21.503
- type: map_at_10
value: 30.711
- type: map_at_100
value: 30.711
- type: map_at_1000
value: 30.711
- type: map_at_20
value: 30.711
- type: map_at_3
value: 27.876
- type: map_at_5
value: 29.448
- type: mrr_at_1
value: 26.944000000000003
- type: mrr_at_10
value: 35.400999999999996
- type: mrr_at_100
value: 35.400999999999996
- type: mrr_at_1000
value: 35.400999999999996
- type: mrr_at_20
value: 35.400999999999996
- type: mrr_at_3
value: 32.943
- type: mrr_at_5
value: 34.365
- type: ndcg_at_1
value: 26.944000000000003
- type: ndcg_at_10
value: 36.18
- type: ndcg_at_100
value: 36.053000000000004
- type: ndcg_at_1000
value: 36.051
- type: ndcg_at_20
value: 36.094
- type: ndcg_at_3
value: 31.5
- type: ndcg_at_5
value: 33.672999999999995
- type: precision_at_1
value: 26.944000000000003
- type: precision_at_10
value: 6.7
- type: precision_at_100
value: 0.67
- type: precision_at_1000
value: 0.067
- type: precision_at_20
value: 3.35
- type: precision_at_3
value: 15.221000000000002
- type: precision_at_5
value: 10.943
- type: recall_at_1
value: 21.503
- type: recall_at_10
value: 47.802
- type: recall_at_100
value: 47.802
- type: recall_at_1000
value: 47.802
- type: recall_at_20
value: 47.802
- type: recall_at_3
value: 34.316
- type: recall_at_5
value: 40.178999999999995
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 34.516000000000005
- type: map_at_10
value: 45.867999999999995
- type: map_at_100
value: 45.867999999999995
- type: map_at_1000
value: 45.867999999999995
- type: map_at_20
value: 45.867999999999995
- type: map_at_3
value: 42.480000000000004
- type: map_at_5
value: 44.292
- type: mrr_at_1
value: 41.510999999999996
- type: mrr_at_10
value: 50.709
- type: mrr_at_100
value: 50.709
- type: mrr_at_1000
value: 50.709
- type: mrr_at_20
value: 50.709
- type: mrr_at_3
value: 48.087999999999994
- type: mrr_at_5
value: 49.585
- type: ndcg_at_1
value: 41.510999999999996
- type: ndcg_at_10
value: 52.010999999999996
- type: ndcg_at_100
value: 51.998999999999995
- type: ndcg_at_1000
value: 51.998999999999995
- type: ndcg_at_20
value: 51.998999999999995
- type: ndcg_at_3
value: 46.605000000000004
- type: ndcg_at_5
value: 48.968
- type: precision_at_1
value: 41.510999999999996
- type: precision_at_10
value: 8.862
- type: precision_at_100
value: 0.886
- type: precision_at_1000
value: 0.089
- type: precision_at_20
value: 4.431
- type: precision_at_3
value: 21.58
- type: precision_at_5
value: 14.943999999999999
- type: recall_at_1
value: 34.516000000000005
- type: recall_at_10
value: 65.874
- type: recall_at_100
value: 65.874
- type: recall_at_1000
value: 65.874
- type: recall_at_20
value: 65.874
- type: recall_at_3
value: 50.444
- type: recall_at_5
value: 56.87
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 32.486
- type: map_at_10
value: 44.13
- type: map_at_100
value: 44.13
- type: map_at_1000
value: 44.13
- type: map_at_20
value: 44.13
- type: map_at_3
value: 40.644999999999996
- type: map_at_5
value: 42.712
- type: mrr_at_1
value: 40.514
- type: mrr_at_10
value: 50.058
- type: mrr_at_100
value: 50.058
- type: mrr_at_1000
value: 50.058
- type: mrr_at_20
value: 50.058
- type: mrr_at_3
value: 47.53
- type: mrr_at_5
value: 49.2
- type: ndcg_at_1
value: 40.514
- type: ndcg_at_10
value: 50.722
- type: ndcg_at_100
value: 49.907000000000004
- type: ndcg_at_1000
value: 49.889
- type: ndcg_at_20
value: 50.212
- type: ndcg_at_3
value: 45.822
- type: ndcg_at_5
value: 48.468
- type: precision_at_1
value: 40.514
- type: precision_at_10
value: 9.783
- type: precision_at_100
value: 0.9780000000000001
- type: precision_at_1000
value: 0.098
- type: precision_at_20
value: 4.891
- type: precision_at_3
value: 21.871
- type: precision_at_5
value: 15.928999999999998
- type: recall_at_1
value: 32.486
- type: recall_at_10
value: 61.644
- type: recall_at_100
value: 61.644
- type: recall_at_1000
value: 61.644
- type: recall_at_20
value: 61.644
- type: recall_at_3
value: 47.502
- type: recall_at_5
value: 54.596999999999994
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 26.069
- type: map_at_10
value: 35.869
- type: map_at_100
value: 35.869
- type: map_at_1000
value: 35.869
- type: map_at_20
value: 35.869
- type: map_at_3
value: 33.249
- type: map_at_5
value: 34.521
- type: mrr_at_1
value: 28.281
- type: mrr_at_10
value: 38.26
- type: mrr_at_100
value: 38.26
- type: mrr_at_1000
value: 38.26
- type: mrr_at_20
value: 38.26
- type: mrr_at_3
value: 36.106
- type: mrr_at_5
value: 37.095
- type: ndcg_at_1
value: 28.281
- type: ndcg_at_10
value: 41.286
- type: ndcg_at_100
value: 41.277
- type: ndcg_at_1000
value: 41.277
- type: ndcg_at_20
value: 41.281
- type: ndcg_at_3
value: 36.36
- type: ndcg_at_5
value: 38.275
- type: precision_at_1
value: 28.281
- type: precision_at_10
value: 6.451
- type: precision_at_100
value: 0.645
- type: precision_at_1000
value: 0.065
- type: precision_at_20
value: 3.2259999999999995
- type: precision_at_3
value: 15.895999999999999
- type: precision_at_5
value: 10.758
- type: recall_at_1
value: 26.069
- type: recall_at_10
value: 55.55799999999999
- type: recall_at_100
value: 55.55799999999999
- type: recall_at_1000
value: 55.55799999999999
- type: recall_at_20
value: 55.55799999999999
- type: recall_at_3
value: 41.929
- type: recall_at_5
value: 46.577
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 11.675
- type: map_at_10
value: 21.314
- type: map_at_100
value: 21.314
- type: map_at_1000
value: 21.314
- type: map_at_20
value: 21.314
- type: map_at_3
value: 17.689
- type: map_at_5
value: 19.534000000000002
- type: mrr_at_1
value: 27.101
- type: mrr_at_10
value: 40.339999999999996
- type: mrr_at_100
value: 40.339999999999996
- type: mrr_at_1000
value: 40.339999999999996
- type: mrr_at_20
value: 40.339999999999996
- type: mrr_at_3
value: 37.014
- type: mrr_at_5
value: 39.050000000000004
- type: ndcg_at_1
value: 27.101
- type: ndcg_at_10
value: 30.325000000000003
- type: ndcg_at_100
value: 30.325000000000003
- type: ndcg_at_1000
value: 30.325000000000003
- type: ndcg_at_20
value: 30.325000000000003
- type: ndcg_at_3
value: 24.823
- type: ndcg_at_5
value: 26.729999999999997
- type: precision_at_1
value: 27.101
- type: precision_at_10
value: 9.622
- type: precision_at_100
value: 0.962
- type: precision_at_1000
value: 0.096
- type: precision_at_20
value: 4.811
- type: precision_at_3
value: 18.958
- type: precision_at_5
value: 14.527999999999999
- type: recall_at_1
value: 11.675
- type: recall_at_10
value: 37.037
- type: recall_at_100
value: 37.037
- type: recall_at_1000
value: 37.037
- type: recall_at_20
value: 37.037
- type: recall_at_3
value: 23.22
- type: recall_at_5
value: 28.967
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 9.715
- type: map_at_10
value: 23.903
- type: map_at_100
value: 23.903
- type: map_at_1000
value: 23.903
- type: map_at_20
value: 23.903
- type: map_at_3
value: 16.029
- type: map_at_5
value: 19.276
- type: mrr_at_1
value: 73.75
- type: mrr_at_10
value: 81.089
- type: mrr_at_100
value: 81.089
- type: mrr_at_1000
value: 81.089
- type: mrr_at_20
value: 81.089
- type: mrr_at_3
value: 79.75
- type: mrr_at_5
value: 80.55
- type: ndcg_at_1
value: 61.5
- type: ndcg_at_10
value: 49.027
- type: ndcg_at_100
value: 37.618
- type: ndcg_at_1000
value: 37.475
- type: ndcg_at_20
value: 41.513
- type: ndcg_at_3
value: 52.624
- type: ndcg_at_5
value: 50.409000000000006
- type: precision_at_1
value: 73.75
- type: precision_at_10
value: 40.025
- type: precision_at_100
value: 4.002
- type: precision_at_1000
value: 0.4
- type: precision_at_20
value: 20.012
- type: precision_at_3
value: 56.75
- type: precision_at_5
value: 49.3
- type: recall_at_1
value: 9.715
- type: recall_at_10
value: 30.368000000000002
- type: recall_at_100
value: 30.368000000000002
- type: recall_at_1000
value: 30.368000000000002
- type: recall_at_20
value: 30.368000000000002
- type: recall_at_3
value: 17.512
- type: recall_at_5
value: 22.381999999999998
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 50.964999999999996
- type: f1
value: 46.07175136684756
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 79.73400000000001
- type: map_at_10
value: 86.61999999999999
- type: map_at_100
value: 86.61999999999999
- type: map_at_1000
value: 86.61999999999999
- type: map_at_20
value: 86.61999999999999
- type: map_at_3
value: 85.774
- type: map_at_5
value: 86.343
- type: mrr_at_1
value: 85.884
- type: mrr_at_10
value: 91.51700000000001
- type: mrr_at_100
value: 91.51700000000001
- type: mrr_at_1000
value: 91.51700000000001
- type: mrr_at_20
value: 91.51700000000001
- type: mrr_at_3
value: 91.034
- type: mrr_at_5
value: 91.426
- type: ndcg_at_1
value: 85.884
- type: ndcg_at_10
value: 89.791
- type: ndcg_at_100
value: 89.79
- type: ndcg_at_1000
value: 89.79
- type: ndcg_at_20
value: 89.79
- type: ndcg_at_3
value: 88.593
- type: ndcg_at_5
value: 89.333
- type: precision_at_1
value: 85.884
- type: precision_at_10
value: 10.491999999999999
- type: precision_at_100
value: 1.049
- type: precision_at_1000
value: 0.105
- type: precision_at_20
value: 5.2459999999999996
- type: precision_at_3
value: 33.312999999999995
- type: precision_at_5
value: 20.516000000000002
- type: recall_at_1
value: 79.73400000000001
- type: recall_at_10
value: 94.797
- type: recall_at_100
value: 94.797
- type: recall_at_1000
value: 94.797
- type: recall_at_20
value: 94.797
- type: recall_at_3
value: 91.475
- type: recall_at_5
value: 93.46900000000001
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 27.916
- type: map_at_10
value: 45.756
- type: map_at_100
value: 45.756
- type: map_at_1000
value: 45.756
- type: map_at_20
value: 45.756
- type: map_at_3
value: 40.275
- type: map_at_5
value: 43.324
- type: mrr_at_1
value: 52.778000000000006
- type: mrr_at_10
value: 61.39
- type: mrr_at_100
value: 61.39
- type: mrr_at_1000
value: 61.39
- type: mrr_at_20
value: 61.39
- type: mrr_at_3
value: 59.541999999999994
- type: mrr_at_5
value: 60.661
- type: ndcg_at_1
value: 52.778000000000006
- type: ndcg_at_10
value: 53.823
- type: ndcg_at_100
value: 53.757999999999996
- type: ndcg_at_1000
value: 53.757999999999996
- type: ndcg_at_20
value: 53.757999999999996
- type: ndcg_at_3
value: 50.063
- type: ndcg_at_5
value: 51.224000000000004
- type: precision_at_1
value: 52.778000000000006
- type: precision_at_10
value: 14.691
- type: precision_at_100
value: 1.469
- type: precision_at_1000
value: 0.147
- type: precision_at_20
value: 7.346
- type: precision_at_3
value: 33.333
- type: precision_at_5
value: 24.227999999999998
- type: recall_at_1
value: 27.916
- type: recall_at_10
value: 60.638000000000005
- type: recall_at_100
value: 60.638000000000005
- type: recall_at_1000
value: 60.638000000000005
- type: recall_at_20
value: 60.638000000000005
- type: recall_at_3
value: 45.992
- type: recall_at_5
value: 52.413
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 40.284
- type: map_at_10
value: 69.45700000000001
- type: map_at_100
value: 69.45700000000001
- type: map_at_1000
value: 69.45700000000001
- type: map_at_20
value: 69.45700000000001
- type: map_at_3
value: 65.689
- type: map_at_5
value: 68.186
- type: mrr_at_1
value: 80.567
- type: mrr_at_10
value: 86.39099999999999
- type: mrr_at_100
value: 86.39099999999999
- type: mrr_at_1000
value: 86.39099999999999
- type: mrr_at_20
value: 86.39099999999999
- type: mrr_at_3
value: 85.53500000000001
- type: mrr_at_5
value: 86.095
- type: ndcg_at_1
value: 80.567
- type: ndcg_at_10
value: 76.81700000000001
- type: ndcg_at_100
value: 76.81700000000001
- type: ndcg_at_1000
value: 76.81700000000001
- type: ndcg_at_20
value: 76.81700000000001
- type: ndcg_at_3
value: 71.696
- type: ndcg_at_5
value: 74.736
- type: precision_at_1
value: 80.567
- type: precision_at_10
value: 16.359
- type: precision_at_100
value: 1.636
- type: precision_at_1000
value: 0.164
- type: precision_at_20
value: 8.18
- type: precision_at_3
value: 47.089999999999996
- type: precision_at_5
value: 30.639
- type: recall_at_1
value: 40.284
- type: recall_at_10
value: 81.796
- type: recall_at_100
value: 81.796
- type: recall_at_1000
value: 81.796
- type: recall_at_20
value: 81.796
- type: recall_at_3
value: 70.635
- type: recall_at_5
value: 76.59700000000001
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 89.42760000000001
- type: ap
value: 85.65673213756732
- type: f1
value: 89.39873631717961
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 19.54
- type: map_at_10
value: 32.481
- type: map_at_100
value: 32.481
- type: map_at_1000
value: 32.481
- type: map_at_20
value: 32.481
- type: map_at_3
value: 28.083000000000002
- type: map_at_5
value: 30.580000000000002
- type: mrr_at_1
value: 20.115
- type: mrr_at_10
value: 33.034
- type: mrr_at_100
value: 33.034
- type: mrr_at_1000
value: 33.034
- type: mrr_at_20
value: 33.034
- type: mrr_at_3
value: 28.755999999999997
- type: mrr_at_5
value: 31.195
- type: ndcg_at_1
value: 20.115
- type: ndcg_at_10
value: 40.006
- type: ndcg_at_100
value: 40.006
- type: ndcg_at_1000
value: 40.006
- type: ndcg_at_20
value: 40.006
- type: ndcg_at_3
value: 31.080000000000002
- type: ndcg_at_5
value: 35.524
- type: precision_at_1
value: 20.115
- type: precision_at_10
value: 6.638
- type: precision_at_100
value: 0.664
- type: precision_at_1000
value: 0.066
- type: precision_at_20
value: 3.319
- type: precision_at_3
value: 13.553
- type: precision_at_5
value: 10.384
- type: recall_at_1
value: 19.54
- type: recall_at_10
value: 63.344
- type: recall_at_100
value: 63.344
- type: recall_at_1000
value: 63.344
- type: recall_at_20
value: 63.344
- type: recall_at_3
value: 39.080999999999996
- type: recall_at_5
value: 49.769999999999996
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 96.67578659370726
- type: f1
value: 96.44660736409732
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 85.4719562243502
- type: f1
value: 70.478731125108
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 80.0268997982515
- type: f1
value: 77.36851682955398
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 82.77740416946872
- type: f1
value: 82.36168633333541
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 32.90073967924184
- type: v_measures
value:
- 0.3211341289091321
- 0.3161692449183304
- 0.3246573332238926
- 0.3113280417238961
- 0.3112859280868748
- 0.33291300943229213
- 0.3374941840347436
- 0.34428297701826893
- 0.35504076095733555
- 0.33576835961941726
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 32.304535976211284
- type: v_measures
value:
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- 0.32113804354598285
- 0.3329818163288202
- 0.32824449711793047
- 0.3243430439835618
- 0.34484914695347063
- 0.3276675921917667
- 0.3113827930260938
- 0.30779426396733683
- 0.3140456120463529
- 0.318006788459812
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 32.030630164228974
- type: mrr
value: 33.18900877896056
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 7.213
- type: map_at_10
value: 15.870999999999999
- type: map_at_100
value: 15.870999999999999
- type: map_at_1000
value: 15.870999999999999
- type: map_at_20
value: 15.870999999999999
- type: map_at_3
value: 11.447000000000001
- type: map_at_5
value: 13.669
- type: mrr_at_1
value: 53.87
- type: mrr_at_10
value: 62.117999999999995
- type: mrr_at_100
value: 62.117999999999995
- type: mrr_at_1000
value: 62.117999999999995
- type: mrr_at_20
value: 62.117999999999995
- type: mrr_at_3
value: 60.01
- type: mrr_at_5
value: 61.480999999999995
- type: ndcg_at_1
value: 51.702999999999996
- type: ndcg_at_10
value: 40.859
- type: ndcg_at_100
value: 26.623
- type: ndcg_at_1000
value: 26.141
- type: ndcg_at_20
value: 32.744
- type: ndcg_at_3
value: 47.146
- type: ndcg_at_5
value: 44.887
- type: precision_at_1
value: 53.251000000000005
- type: precision_at_10
value: 30.092999999999996
- type: precision_at_100
value: 3.009
- type: precision_at_1000
value: 0.301
- type: precision_at_20
value: 15.046000000000001
- type: precision_at_3
value: 43.963
- type: precision_at_5
value: 39.009
- type: recall_at_1
value: 7.213
- type: recall_at_10
value: 19.909
- type: recall_at_100
value: 19.909
- type: recall_at_1000
value: 19.909
- type: recall_at_20
value: 19.909
- type: recall_at_3
value: 12.46
- type: recall_at_5
value: 16.107
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 38.688
- type: map_at_10
value: 56.261
- type: map_at_100
value: 56.261
- type: map_at_1000
value: 56.261
- type: map_at_20
value: 56.261
- type: map_at_3
value: 51.989
- type: map_at_5
value: 54.569
- type: mrr_at_1
value: 43.656
- type: mrr_at_10
value: 58.677
- type: mrr_at_100
value: 58.677
- type: mrr_at_1000
value: 58.677
- type: mrr_at_20
value: 58.677
- type: mrr_at_3
value: 55.456
- type: mrr_at_5
value: 57.391999999999996
- type: ndcg_at_1
value: 43.656
- type: ndcg_at_10
value: 64.149
- type: ndcg_at_100
value: 64.149
- type: ndcg_at_1000
value: 64.149
- type: ndcg_at_20
value: 64.149
- type: ndcg_at_3
value: 56.458
- type: ndcg_at_5
value: 60.556
- type: precision_at_1
value: 43.656
- type: precision_at_10
value: 10.242999999999999
- type: precision_at_100
value: 1.024
- type: precision_at_1000
value: 0.10200000000000001
- type: precision_at_20
value: 5.122
- type: precision_at_3
value: 25.657000000000004
- type: precision_at_5
value: 17.827
- type: recall_at_1
value: 38.688
- type: recall_at_10
value: 85.492
- type: recall_at_100
value: 85.492
- type: recall_at_1000
value: 85.492
- type: recall_at_20
value: 85.492
- type: recall_at_3
value: 65.86800000000001
- type: recall_at_5
value: 75.157
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: map_at_1
value: 71.178
- type: map_at_10
value: 85.453
- type: map_at_100
value: 85.453
- type: map_at_1000
value: 85.453
- type: map_at_20
value: 85.453
- type: map_at_3
value: 82.494
- type: map_at_5
value: 84.384
- type: mrr_at_1
value: 82.04
- type: mrr_at_10
value: 88.02600000000001
- type: mrr_at_100
value: 88.02600000000001
- type: mrr_at_1000
value: 88.02600000000001
- type: mrr_at_20
value: 88.02600000000001
- type: mrr_at_3
value: 87.118
- type: mrr_at_5
value: 87.74300000000001
- type: ndcg_at_1
value: 82.05
- type: ndcg_at_10
value: 88.986
- type: ndcg_at_100
value: 88.85300000000001
- type: ndcg_at_1000
value: 88.85300000000001
- type: ndcg_at_20
value: 88.874
- type: ndcg_at_3
value: 86.253
- type: ndcg_at_5
value: 87.803
- type: precision_at_1
value: 82.05
- type: precision_at_10
value: 13.566
- type: precision_at_100
value: 1.357
- type: precision_at_1000
value: 0.136
- type: precision_at_20
value: 6.783
- type: precision_at_3
value: 37.857
- type: precision_at_5
value: 24.942
- type: recall_at_1
value: 71.178
- type: recall_at_10
value: 95.771
- type: recall_at_100
value: 95.771
- type: recall_at_1000
value: 95.771
- type: recall_at_20
value: 95.771
- type: recall_at_3
value: 87.935
- type: recall_at_5
value: 92.293
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 62.0337414310567
- type: v_measures
value:
- 0.522616622864363
- 0.6844265078631042
- 0.5518593590523799
- 0.6495496503246567
- 0.55707101381553
- 0.6276443750698807
- 0.6547592087920104
- 0.5993440123354254
- 0.5932708916180552
- 0.5791630341611333
- 0.5856462959142211
- 0.6308260728639485
- 0.6496676410376024
- 0.6460475500509058
- 0.7681883605457646
- 0.5747058941990597
- 0.6145751234573215
- 0.6922871659569485
- 0.6061915759419488
- 0.5745532893376831
- 0.6024719265547521
- 0.5650277167062964
- 0.7551094349004248
- 0.646959786191737
- 0.5764728482090206
- 0.522616622864363
- 0.6844265078631042
- 0.5518593590523799
- 0.6495496503246567
- 0.55707101381553
- 0.6276443750698807
- 0.6547592087920104
- 0.5993440123354254
- 0.5932708916180552
- 0.5791630341611333
- 0.5856462959142211
- 0.6308260728639485
- 0.6496676410376024
- 0.6460475500509058
- 0.7681883605457646
- 0.5747058941990597
- 0.6145751234573215
- 0.6922871659569485
- 0.6061915759419488
- 0.5745532893376831
- 0.6024719265547521
- 0.5650277167062964
- 0.7551094349004248
- 0.646959786191737
- 0.5764728482090206
- 0.522616622864363
- 0.6844265078631042
- 0.5518593590523799
- 0.6495496503246567
- 0.55707101381553
- 0.6276443750698807
- 0.6547592087920104
- 0.5993440123354254
- 0.5932708916180552
- 0.5791630341611333
- 0.5856462959142211
- 0.6308260728639485
- 0.6496676410376024
- 0.6460475500509058
- 0.7681883605457646
- 0.5747058941990597
- 0.6145751234573215
- 0.6922871659569485
- 0.6061915759419488
- 0.5745532893376831
- 0.6024719265547521
- 0.5650277167062964
- 0.7551094349004248
- 0.646959786191737
- 0.5764728482090206
- 0.522616622864363
- 0.6844265078631042
- 0.5518593590523799
- 0.6495496503246567
- 0.55707101381553
- 0.6276443750698807
- 0.6547592087920104
- 0.5993440123354254
- 0.5932708916180552
- 0.5791630341611333
- 0.5856462959142211
- 0.6308260728639485
- 0.6496676410376024
- 0.6460475500509058
- 0.7681883605457646
- 0.5747058941990597
- 0.6145751234573215
- 0.6922871659569485
- 0.6061915759419488
- 0.5745532893376831
- 0.6024719265547521
- 0.5650277167062964
- 0.7551094349004248
- 0.646959786191737
- 0.5764728482090206
- 0.522616622864363
- 0.6844265078631042
- 0.5518593590523799
- 0.6495496503246567
- 0.55707101381553
- 0.6276443750698807
- 0.6547592087920104
- 0.5993440123354254
- 0.5932708916180552
- 0.5791630341611333
- 0.5856462959142211
- 0.6308260728639485
- 0.6496676410376024
- 0.6460475500509058
- 0.7681883605457646
- 0.5747058941990597
- 0.6145751234573215
- 0.6922871659569485
- 0.6061915759419488
- 0.5745532893376831
- 0.6024719265547521
- 0.5650277167062964
- 0.7551094349004248
- 0.646959786191737
- 0.5764728482090206
- 0.522616622864363
- 0.6844265078631042
- 0.5518593590523799
- 0.6495496503246567
- 0.55707101381553
- 0.6276443750698807
- 0.6547592087920104
- 0.5993440123354254
- 0.5932708916180552
- 0.5791630341611333
- 0.5856462959142211
- 0.6308260728639485
- 0.6496676410376024
- 0.6460475500509058
- 0.7681883605457646
- 0.5747058941990597
- 0.6145751234573215
- 0.6922871659569485
- 0.6061915759419488
- 0.5745532893376831
- 0.6024719265547521
- 0.5650277167062964
- 0.7551094349004248
- 0.646959786191737
- 0.5764728482090206
- 0.522616622864363
- 0.6844265078631042
- 0.5518593590523799
- 0.6495496503246567
- 0.55707101381553
- 0.6276443750698807
- 0.6547592087920104
- 0.5993440123354254
- 0.5932708916180552
- 0.5791630341611333
- 0.5856462959142211
- 0.6308260728639485
- 0.6496676410376024
- 0.6460475500509058
- 0.7681883605457646
- 0.5747058941990597
- 0.6145751234573215
- 0.6922871659569485
- 0.6061915759419488
- 0.5745532893376831
- 0.6024719265547521
- 0.5650277167062964
- 0.7551094349004248
- 0.646959786191737
- 0.5764728482090206
- 0.522616622864363
- 0.6844265078631042
- 0.5518593590523799
- 0.6495496503246567
- 0.55707101381553
- 0.6276443750698807
- 0.6547592087920104
- 0.5993440123354254
- 0.5932708916180552
- 0.5791630341611333
- 0.5856462959142211
- 0.6308260728639485
- 0.6496676410376024
- 0.6460475500509058
- 0.7681883605457646
- 0.5747058941990597
- 0.6145751234573215
- 0.6922871659569485
- 0.6061915759419488
- 0.5745532893376831
- 0.6024719265547521
- 0.5650277167062964
- 0.7551094349004248
- 0.646959786191737
- 0.5764728482090206
- 0.522616622864363
- 0.6844265078631042
- 0.5518593590523799
- 0.6495496503246567
- 0.55707101381553
- 0.6276443750698807
- 0.6547592087920104
- 0.5993440123354254
- 0.5932708916180552
- 0.5791630341611333
- 0.5856462959142211
- 0.6308260728639485
- 0.6496676410376024
- 0.6460475500509058
- 0.7681883605457646
- 0.5747058941990597
- 0.6145751234573215
- 0.6922871659569485
- 0.6061915759419488
- 0.5745532893376831
- 0.6024719265547521
- 0.5650277167062964
- 0.7551094349004248
- 0.646959786191737
- 0.5764728482090206
- 0.522616622864363
- 0.6844265078631042
- 0.5518593590523799
- 0.6495496503246567
- 0.55707101381553
- 0.6276443750698807
- 0.6547592087920104
- 0.5993440123354254
- 0.5932708916180552
- 0.5791630341611333
- 0.5856462959142211
- 0.6308260728639485
- 0.6496676410376024
- 0.6460475500509058
- 0.7681883605457646
- 0.5747058941990597
- 0.6145751234573215
- 0.6922871659569485
- 0.6061915759419488
- 0.5745532893376831
- 0.6024719265547521
- 0.5650277167062964
- 0.7551094349004248
- 0.646959786191737
- 0.5764728482090206
- 0.522616622864363
- 0.6844265078631042
- 0.5518593590523799
- 0.6495496503246567
- 0.55707101381553
- 0.6276443750698807
- 0.6547592087920104
- 0.5993440123354254
- 0.5932708916180552
- 0.5791630341611333
- 0.5856462959142211
- 0.6308260728639485
- 0.6496676410376024
- 0.6460475500509058
- 0.7681883605457646
- 0.5747058941990597
- 0.6145751234573215
- 0.6922871659569485
- 0.6061915759419488
- 0.5745532893376831
- 0.6024719265547521
- 0.5650277167062964
- 0.7551094349004248
- 0.646959786191737
- 0.5764728482090206
- 0.522616622864363
- 0.6844265078631042
- 0.5518593590523799
- 0.6495496503246567
- 0.55707101381553
- 0.6276443750698807
- 0.6547592087920104
- 0.5993440123354254
- 0.5932708916180552
- 0.5791630341611333
- 0.5856462959142211
- 0.6308260728639485
- 0.6496676410376024
- 0.6460475500509058
- 0.7681883605457646
- 0.5747058941990597
- 0.6145751234573215
- 0.6922871659569485
- 0.6061915759419488
- 0.5745532893376831
- 0.6024719265547521
- 0.5650277167062964
- 0.7551094349004248
- 0.646959786191737
- 0.5764728482090206
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: v_measure
value: 64.90650880132416
- type: v_measures
value:
- 0.6867730490627711
- 0.6695749873783695
- 0.6735466912158105
- 0.40760894716442037
- 0.7351821359552528
- 0.6331654231601911
- 0.3872061665636622
- 0.7884328196359962
- 0.7400923787719617
- 0.7690682812239796
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: map_at_1
value: 5.378
- type: map_at_10
value: 14.246
- type: map_at_100
value: 14.246
- type: map_at_1000
value: 14.246
- type: map_at_20
value: 14.246
- type: map_at_3
value: 10.168000000000001
- type: map_at_5
value: 12.107999999999999
- type: mrr_at_1
value: 26.5
- type: mrr_at_10
value: 38.282
- type: mrr_at_100
value: 38.282
- type: mrr_at_1000
value: 38.282
- type: mrr_at_20
value: 38.282
- type: mrr_at_3
value: 35.283
- type: mrr_at_5
value: 36.742999999999995
- type: ndcg_at_1
value: 26.5
- type: ndcg_at_10
value: 23.342
- type: ndcg_at_100
value: 23.342
- type: ndcg_at_1000
value: 23.342
- type: ndcg_at_20
value: 23.342
- type: ndcg_at_3
value: 22.426
- type: ndcg_at_5
value: 19.326999999999998
- type: precision_at_1
value: 26.5
- type: precision_at_10
value: 12.2
- type: precision_at_100
value: 1.22
- type: precision_at_1000
value: 0.122
- type: precision_at_20
value: 6.1
- type: precision_at_3
value: 21.367
- type: precision_at_5
value: 17.1
- type: recall_at_1
value: 5.378
- type: recall_at_10
value: 24.725
- type: recall_at_100
value: 24.725
- type: recall_at_1000
value: 24.725
- type: recall_at_20
value: 24.725
- type: recall_at_3
value: 12.998000000000001
- type: recall_at_5
value: 17.337
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cos_sim_pearson
value: 86.15886131719716
- type: cos_sim_spearman
value: 83.36422679444524
- type: euclidean_pearson
value: 83.60722543060551
- type: euclidean_spearman
value: 83.36615167108201
- type: manhattan_pearson
value: 83.49779776169743
- type: manhattan_spearman
value: 83.2970912000804
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 87.00040631393384
- type: cos_sim_spearman
value: 79.88695146958382
- type: euclidean_pearson
value: 82.87830951101813
- type: euclidean_spearman
value: 78.81342612822002
- type: manhattan_pearson
value: 83.27329725748622
- type: manhattan_spearman
value: 79.24487142779232
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 87.46403674552138
- type: cos_sim_spearman
value: 88.3968038981169
- type: euclidean_pearson
value: 87.9867395305952
- type: euclidean_spearman
value: 88.50084925112196
- type: manhattan_pearson
value: 87.84826650949114
- type: manhattan_spearman
value: 88.4264633694987
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 86.33051578266844
- type: cos_sim_spearman
value: 85.96906608318722
- type: euclidean_pearson
value: 86.08087652971557
- type: euclidean_spearman
value: 86.2871150658858
- type: manhattan_pearson
value: 85.96247026170943
- type: manhattan_spearman
value: 86.348084535251
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 89.65021183240658
- type: cos_sim_spearman
value: 90.01036051355563
- type: euclidean_pearson
value: 89.34610124059546
- type: euclidean_spearman
value: 89.89816462250603
- type: manhattan_pearson
value: 89.45034332532651
- type: manhattan_spearman
value: 90.12149502943
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 86.38445171139439
- type: cos_sim_spearman
value: 88.06405772882476
- type: euclidean_pearson
value: 87.1032900128079
- type: euclidean_spearman
value: 87.86639736438771
- type: manhattan_pearson
value: 87.25501396297628
- type: manhattan_spearman
value: 88.01175383067931
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 89.51886071808445
- type: cos_sim_spearman
value: 89.05111168557342
- type: euclidean_pearson
value: 89.8619983583313
- type: euclidean_spearman
value: 89.57381411572374
- type: manhattan_pearson
value: 89.54097114512615
- type: manhattan_spearman
value: 89.0124029528803
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 69.11659818631864
- type: cos_sim_spearman
value: 68.73933790764565
- type: euclidean_pearson
value: 69.3608440231078
- type: euclidean_spearman
value: 68.25989021427226
- type: manhattan_pearson
value: 69.32307009511591
- type: manhattan_spearman
value: 68.2229407569999
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 87.70774882997117
- type: cos_sim_spearman
value: 88.6908929434652
- type: euclidean_pearson
value: 88.2222605477348
- type: euclidean_spearman
value: 88.66834657653354
- type: manhattan_pearson
value: 88.0806219722353
- type: manhattan_spearman
value: 88.63261072811657
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 86.40231391278195
- type: mrr
value: 96.52358806770572
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 61.828
- type: map_at_10
value: 73.262
- type: map_at_100
value: 73.262
- type: map_at_1000
value: 73.262
- type: map_at_20
value: 73.262
- type: map_at_3
value: 70.61500000000001
- type: map_at_5
value: 72.18900000000001
- type: mrr_at_1
value: 65.0
- type: mrr_at_10
value: 74.223
- type: mrr_at_100
value: 74.223
- type: mrr_at_1000
value: 74.223
- type: mrr_at_20
value: 74.223
- type: mrr_at_3
value: 72.38900000000001
- type: mrr_at_5
value: 73.30600000000001
- type: ndcg_at_1
value: 65.0
- type: ndcg_at_10
value: 78.02799999999999
- type: ndcg_at_100
value: 78.02799999999999
- type: ndcg_at_1000
value: 78.02799999999999
- type: ndcg_at_20
value: 78.02799999999999
- type: ndcg_at_3
value: 73.73100000000001
- type: ndcg_at_5
value: 75.652
- type: precision_at_1
value: 65.0
- type: precision_at_10
value: 10.333
- type: precision_at_100
value: 1.0330000000000001
- type: precision_at_1000
value: 0.10300000000000001
- type: precision_at_20
value: 5.167
- type: precision_at_3
value: 29.110999999999997
- type: precision_at_5
value: 18.933
- type: recall_at_1
value: 61.828
- type: recall_at_10
value: 91.656
- type: recall_at_100
value: 91.656
- type: recall_at_1000
value: 91.656
- type: recall_at_20
value: 91.656
- type: recall_at_3
value: 79.767
- type: recall_at_5
value: 84.60600000000001
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.84257425742574
- type: cos_sim_ap
value: 96.37616198293242
- type: cos_sim_f1
value: 91.92483494159471
- type: cos_sim_precision
value: 93.3952528379773
- type: cos_sim_recall
value: 90.5
- type: dot_accuracy
value: 99.75544554455446
- type: dot_ap
value: 93.95361847363756
- type: dot_f1
value: 87.66849725411883
- type: dot_precision
value: 87.53738783649054
- type: dot_recall
value: 87.8
- type: euclidean_accuracy
value: 99.84257425742574
- type: euclidean_ap
value: 96.24059702931834
- type: euclidean_f1
value: 91.93302891933028
- type: euclidean_precision
value: 93.3058702368692
- type: euclidean_recall
value: 90.60000000000001
- type: manhattan_accuracy
value: 99.84752475247525
- type: manhattan_ap
value: 96.37270807643512
- type: manhattan_f1
value: 92.13483146067415
- type: manhattan_precision
value: 94.1544885177453
- type: manhattan_recall
value: 90.2
- type: max_accuracy
value: 99.84752475247525
- type: max_ap
value: 96.37616198293242
- type: max_f1
value: 92.13483146067415
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 71.17621756822422
- type: v_measures
value:
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- 0.7521876023808786
- 0.6774745453091855
- 0.7149463068624518
- 0.7959522106956355
- 0.692611822616584
- 0.7148665068884766
- 0.7780348326422128
- 0.7378351613592702
- 0.7521627066730262
- 0.8069083201805425
- 0.6328373767337706
- 0.6705605652326363
- 0.6963029850066897
- 0.6836438610130369
- 0.6729473481464826
- 0.6864686782949194
- 0.7074212562176637
- 0.7315185391604614
- 0.6852348961730277
- 0.7120267811240509
- 0.6921203644241556
- 0.7477950473623103
- 0.6728111695551933
- 0.6734562846919451
- 0.7059292233114443
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 34.84294873831607
- type: v_measures
value:
- 0.3373734063407478
- 0.3333085574774491
- 0.3295550012443221
- 0.33101868124191464
- 0.32576828639487726
- 0.3824038367821479
- 0.3560433828709798
- 0.3616957787167791
- 0.3656066710810745
- 0.3615212716813148
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 55.93707898596513
- type: mrr
value: 57.00651472710297
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.351178802339017
- type: cos_sim_spearman
value: 30.93982299939136
- type: dot_pearson
value: 28.32212875009798
- type: dot_spearman
value: 30.378331720649506
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: map_at_1
value: 0.23700000000000002
- type: map_at_10
value: 1.9689999999999999
- type: map_at_100
value: 1.9689999999999999
- type: map_at_1000
value: 1.9689999999999999
- type: map_at_20
value: 1.9689999999999999
- type: map_at_3
value: 0.654
- type: map_at_5
value: 1.06
- type: mrr_at_1
value: 90.0
- type: mrr_at_10
value: 94.333
- type: mrr_at_100
value: 94.333
- type: mrr_at_1000
value: 94.333
- type: mrr_at_20
value: 94.333
- type: mrr_at_3
value: 94.333
- type: mrr_at_5
value: 94.333
- type: ndcg_at_1
value: 86.0
- type: ndcg_at_10
value: 79.069
- type: ndcg_at_100
value: 17.387
- type: ndcg_at_1000
value: 7.374
- type: ndcg_at_20
value: 51.028
- type: ndcg_at_3
value: 83.693
- type: ndcg_at_5
value: 82.56700000000001
- type: precision_at_1
value: 90.0
- type: precision_at_10
value: 82.0
- type: precision_at_100
value: 8.200000000000001
- type: precision_at_1000
value: 0.8200000000000001
- type: precision_at_20
value: 41.0
- type: precision_at_3
value: 87.333
- type: precision_at_5
value: 85.6
- type: recall_at_1
value: 0.23700000000000002
- type: recall_at_10
value: 2.154
- type: recall_at_100
value: 2.154
- type: recall_at_1000
value: 2.154
- type: recall_at_20
value: 2.154
- type: recall_at_3
value: 0.685
- type: recall_at_5
value: 1.124
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 2.703
- type: map_at_10
value: 9.223
- type: map_at_100
value: 9.223
- type: map_at_1000
value: 9.223
- type: map_at_20
value: 9.223
- type: map_at_3
value: 5.239
- type: map_at_5
value: 6.635000000000001
- type: mrr_at_1
value: 34.694
- type: mrr_at_10
value: 45.578
- type: mrr_at_100
value: 45.578
- type: mrr_at_1000
value: 45.578
- type: mrr_at_20
value: 45.578
- type: mrr_at_3
value: 42.516999999999996
- type: mrr_at_5
value: 44.864
- type: ndcg_at_1
value: 32.653
- type: ndcg_at_10
value: 22.653000000000002
- type: ndcg_at_100
value: 18.984
- type: ndcg_at_1000
value: 18.984
- type: ndcg_at_20
value: 19.167
- type: ndcg_at_3
value: 26.177
- type: ndcg_at_5
value: 24.663
- type: precision_at_1
value: 34.694
- type: precision_at_10
value: 18.98
- type: precision_at_100
value: 1.8980000000000001
- type: precision_at_1000
value: 0.19
- type: precision_at_20
value: 9.49
- type: precision_at_3
value: 25.85
- type: precision_at_5
value: 23.673
- type: recall_at_1
value: 2.703
- type: recall_at_10
value: 14.594999999999999
- type: recall_at_100
value: 14.594999999999999
- type: recall_at_1000
value: 14.594999999999999
- type: recall_at_20
value: 14.594999999999999
- type: recall_at_3
value: 5.946
- type: recall_at_5
value: 8.683
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 65.673828125
- type: ap
value: 12.447736297176911
- type: f1
value: 50.78594018688396
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 62.60611205432938
- type: f1
value: 62.82045205162571
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 54.37082277640638
- type: v_measures
value:
- 0.5409183583796854
- 0.5370809703620781
- 0.5323756926259336
- 0.5434072749886981
- 0.5499501979980845
- 0.5460629240550339
- 0.5543320007194547
- 0.5515281113090142
- 0.5436396439745212
- 0.5377871032281346
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 88.86570900637778
- type: cos_sim_ap
value: 82.37268769366939
- type: cos_sim_f1
value: 74.7083775185578
- type: cos_sim_precision
value: 75.06659563132658
- type: cos_sim_recall
value: 74.35356200527704
- type: dot_accuracy
value: 86.2609524944865
- type: dot_ap
value: 74.8055407969579
- type: dot_f1
value: 68.9965903555772
- type: dot_precision
value: 64.06603346901855
- type: dot_recall
value: 74.74934036939314
- type: euclidean_accuracy
value: 88.85974846516064
- type: euclidean_ap
value: 82.41589558001058
- type: euclidean_f1
value: 74.70888394609446
- type: euclidean_precision
value: 74.09810537243706
- type: euclidean_recall
value: 75.32981530343008
- type: manhattan_accuracy
value: 88.71669547594921
- type: manhattan_ap
value: 82.2077641402819
- type: manhattan_f1
value: 74.43037974683546
- type: manhattan_precision
value: 75.18169582772543
- type: manhattan_recall
value: 73.6939313984169
- type: max_accuracy
value: 88.86570900637778
- type: max_ap
value: 82.41589558001058
- type: max_f1
value: 74.70888394609446
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.67283735009897
- type: cos_sim_ap
value: 87.24295583945482
- type: cos_sim_f1
value: 79.91967871485943
- type: cos_sim_precision
value: 77.28711162255466
- type: cos_sim_recall
value: 82.73791191869418
- type: dot_accuracy
value: 88.57259285132145
- type: dot_ap
value: 84.94911679167208
- type: dot_f1
value: 78.03707259434312
- type: dot_precision
value: 74.97516673761885
- type: dot_recall
value: 81.35971666153372
- type: euclidean_accuracy
value: 89.65925408468196
- type: euclidean_ap
value: 87.20753315344663
- type: euclidean_f1
value: 79.88855487939
- type: euclidean_precision
value: 76.24912526242127
- type: euclidean_recall
value: 83.89282414536495
- type: manhattan_accuracy
value: 89.62626615438352
- type: manhattan_ap
value: 87.25477616369629
- type: manhattan_f1
value: 79.90573332842362
- type: manhattan_precision
value: 76.5756228385913
- type: manhattan_recall
value: 83.53865106251925
- type: max_accuracy
value: 89.67283735009897
- type: max_ap
value: 87.25477616369629
- type: max_f1
value: 79.91967871485943
---
---
Private dataset
---
| [
"BIOSSES",
"SCIFACT"
] |
Shalie/ChisakaAiriPonyXL | Shalie | text-to-image | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"template:sd-lora",
"not-for-all-audiences",
"base_model:AstraliteHeart/pony-diffusion-v6",
"base_model:adapter:AstraliteHeart/pony-diffusion-v6",
"license:other",
"region:us"
] | "2024-05-25T00:26:33Z" | 2024-05-25T00:29:32+00:00 | 0 | 0 | ---
base_model: AstraliteHeart/pony-diffusion-v6
license: other
license_name: other
license_link: https://freedevproject.org/faipl-1.0-sd/
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
- template:sd-lora
- not-for-all-audiences
widget:
- text: score_9, score_8_up, score_7_up, uncensored, source_anime, 1girl, <lora:spchisakaAiriXLPony:1>
chisaka1st, hairclip, hair flower, wolf tail, hat, red necktie, necktie between
breasts, detached collar, cleavage, black dress, boots, white jacket, bad anatomy,
indoors, striped, stuffed animal, stuffed toy, teddy bear, blush, hand up, head
tilt, looking at viewer, parted lips, smile, solo, envious
parameters:
negative_prompt: 3d, monochrome, greyscale, source_pony, source_furry
output:
url: images/06205-2105012195-score_9, score_8_up, score_7_up, uncensored, source_anime,
1girl, _lora_spchisakaAiriXLPony_1_ chisaka1st, hairclip, hair flower.png
- text: score_9, score_8_up, score_7_up, uncensored, source_anime, 1girl, <lora:spchisakaAiriXLPony:1>
chisaka1st, hairclip, hair flower, wolf tail, hat, red necktie, necktie between
breasts, detached collar, cleavage, black dress, boots, white jacket, grey background,
simple background, white background, hand on own arm, looking at viewer, solo,
:o
parameters:
negative_prompt: 3d, monochrome, greyscale, source_pony, source_furry
output:
url: images/06207-3102168247-score_9, score_8_up, score_7_up, uncensored, source_anime,
1girl, _lora_spchisakaAiriXLPony_1_ chisaka1st, hairclip, hair flower.png
- text: score_9, score_8_up, score_7_up, uncensored, source_anime, 1girl, <lora:spchisakaAiriXLPony:1>
chisaka1st, hairclip, hair flower, wolf tail, hat, red necktie, necktie between
breasts, detached collar, cleavage, black dress, boots, white jacket, blue flower,
blue sky, cloud, cloudy sky, day, dutch angle, flower, hydrangea, moe2018, outdoors,
pink flower, pool, poolside, purple flower, sky, soles, water, wet, holding, holding
card, looking at viewer, parted lips, sitting, smile, solo, wariza, frustrated
parameters:
negative_prompt: 3d, monochrome, greyscale, source_pony, source_furry
output:
url: images/06206-3818715640-score_9, score_8_up, score_7_up, uncensored, source_anime,
1girl, _lora_spchisakaAiriXLPony_1_ chisaka1st, hairclip, hair flower.png
- text: score_9, score_8_up, score_7_up, uncensored, source_anime, 1girl, <lora:spchisakaAiriXLPony:1>
chisaka1st, hairclip, hair flower, wolf tail, hat, red necktie, necktie between
breasts, detached collar, cleavage, black dress, boots, white jacket, <lora:spdoHoonStyleXLPony:1>
by DoHoon, simple background, white background
parameters:
negative_prompt: 3d, monochrome, greyscale, source_pony, source_furry
output:
url: images/06200-3150281434-score_9, score_8_up, score_7_up, uncensored, source_anime,
1girl, _lora_spchisakaAiriXLPony_1_ chisaka1st, hairclip, hair flower.png
- text: 'score_9, score_8_up, score_7_up, uncensored, source_anime, 1girl, <lora:spchisakaAiriXLPony:1>
chisaka1st, hairclip, hair flower, wolf tail, hat, red necktie, necktie between
breasts, detached collar, cleavage, black dress, boots, white jacket, <lora:jermawhopper-pdxl-nvwls-v1:1>
jermaWhopper, eating, burger, looking at you, upper body '
parameters:
negative_prompt: 3d, monochrome, greyscale, source_pony, source_furry
output:
url: images/06203-1432510912-score_9, score_8_up, score_7_up, uncensored, source_anime,
1girl, _lora_spchisakaAiriXLPony_1_ chisaka1st, hairclip, hair flower.png
- text: score_9, score_8_up, score_7_up, uncensored, source_anime, 1girl, <lora:spchisakaAiriXLPony:1>
chisaka1st, hairclip, hair flower, wolf tail, hat, red necktie, necktie between
breasts, detached collar, cleavage, black dress, boots, white jacket, coffee cup,
cup, disposable cup, dutch angle, falling leaves, leaf, clenched hand, head tilt,
looking at viewer, open mouth, solo, confused
parameters:
negative_prompt: 3d, monochrome, greyscale, source_pony, source_furry
output:
url: images/06201-3590946186-score_9, score_8_up, score_7_up, uncensored, source_anime,
1girl, _lora_spchisakaAiriXLPony_1_ chisaka1st, hairclip, hair flower.png
- text: score_9, score_8_up, score_7_up, uncensored, source_anime, 1girl, <lora:spchisakaAiriXLPony:1>
chisaka1st, hairclip, hair flower, wolf tail, hat, red necktie, necktie between
breasts, detached collar, cleavage, black dress, boots, white jacket
parameters:
negative_prompt: 3d, monochrome, greyscale, source_pony, source_furry
output:
url: images/06198-472070332-score_9, score_8_up, score_7_up, uncensored, source_anime,
1girl, _lora_spchisakaAiriXLPony_1_ chisaka1st, hairclip, hair flower.png
---
# Chisaka Airi - Phase-Connect
<Gallery />
## Model description
Chisaka Airi - Phase-Connect!
Trained on one outfit, this LoRA has a trigger word corresponding to the character's appearance, along with suggested prompts that summon the related clothes and accessories.
Works well at a weight of 0.7-1.0.
## Trigger words
Debut Outfit:
`chisaka1st, hairclip, hair flower, wolf tail, hat, red necktie, necktie between breasts, detached collar, cleavage, black dress, boots, white jacket`
## Download model
Weights for this model are available in Safetensors format.
[Download](/Shalie/ChisakaAiriPonyXL/tree/main) them in the Files & versions tab.
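For convenience, here is a minimal, hedged sketch of how a LoRA like this is commonly loaded with the `diffusers` library. It is not taken from the original card: the base-model repo id comes from the card metadata, the `weight_name` is a placeholder for the actual `.safetensors` file in this repo, and the 0.8 scale simply falls inside the suggested 0.7-1.0 range.
```python
# Minimal usage sketch (not part of the original card).
# Assumptions: the base repo "AstraliteHeart/pony-diffusion-v6" (from the card
# metadata) ships diffusers-format weights; if it only provides a single
# checkpoint file, use StableDiffusionXLPipeline.from_single_file instead.
# "spchisakaAiriXLPony.safetensors" is a placeholder weight_name; pick the
# actual .safetensors file from the Files & versions tab.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "AstraliteHeart/pony-diffusion-v6",
    torch_dtype=torch.float16,
).to("cuda")

# Load the LoRA weights from this repository.
pipe.load_lora_weights(
    "Shalie/ChisakaAiriPonyXL",
    weight_name="spchisakaAiriXLPony.safetensors",  # placeholder filename
)

prompt = (
    "score_9, score_8_up, score_7_up, source_anime, 1girl, chisaka1st, "
    "hairclip, hair flower, wolf tail, hat, red necktie, detached collar, "
    "black dress, boots, white jacket"
)

# Apply the LoRA at ~0.8 strength, inside the suggested 0.7-1.0 range.
image = pipe(
    prompt,
    negative_prompt="3d, monochrome, greyscale, source_pony, source_furry",
    cross_attention_kwargs={"scale": 0.8},
    num_inference_steps=30,
).images[0]
image.save("chisaka_airi.png")
```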
### License
This LoRA model is provided under the [Fair AI Public License 1.0-SD](https://freedevproject.org/faipl-1.0-sd/) license.
## Restrictions
- **Usage in Generation Services**: You are not allowed to use the model in any generation services without proper permission from the original creator.
- **Commercial Usage**: The sale of the model or any commercial usage is strictly prohibited without explicit written permission from the original creator. | [
"BEAR"
] |
twadada/tst | twadada | sentence-similarity | [
"mteb",
"sentence-similarity",
"arxiv:1910.09700",
"model-index",
"region:us"
] | "2024-05-31T10:17:44Z" | 2024-05-31T10:34:23+00:00 | 0 | 0 | ---
pipeline_tag: sentence-similarity
tags:
- mteb
model-index:
- name: sent2vec_main
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: None
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 76.08955223880598
- type: ap
value: 40.20915202871001
- type: f1
value: 70.4429232238474
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: None
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 64.50915
- type: ap
value: 60.2117357756878
- type: f1
value: 63.67969617829059
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: None
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 32.068
- type: f1
value: 31.04127014908394
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: None
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 16.999
- type: map_at_10
value: 29.578
- type: map_at_100
value: 30.833
- type: map_at_1000
value: 30.875999999999998
- type: map_at_3
value: 25.568999999999996
- type: map_at_5
value: 27.639000000000003
- type: mrr_at_1
value: 17.496000000000002
- type: mrr_at_10
value: 29.759999999999998
- type: mrr_at_100
value: 31.014000000000003
- type: mrr_at_1000
value: 31.057000000000002
- type: mrr_at_3
value: 25.735000000000003
- type: mrr_at_5
value: 27.819
- type: ndcg_at_1
value: 16.999
- type: ndcg_at_10
value: 36.963
- type: ndcg_at_100
value: 43.04
- type: ndcg_at_1000
value: 44.193
- type: ndcg_at_3
value: 28.474
- type: ndcg_at_5
value: 32.214
- type: precision_at_1
value: 16.999
- type: precision_at_10
value: 6.081
- type: precision_at_100
value: 0.8909999999999999
- type: precision_at_1000
value: 0.098
- type: precision_at_3
value: 12.304
- type: precision_at_5
value: 9.203
- type: recall_at_1
value: 16.999
- type: recall_at_10
value: 60.81100000000001
- type: recall_at_100
value: 89.118
- type: recall_at_1000
value: 98.222
- type: recall_at_3
value: 36.913000000000004
- type: recall_at_5
value: 46.017
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: None
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 38.31374272955139
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: None
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 27.560749219146825
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: None
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 50.09268232764077
- type: mrr
value: 63.88196368113266
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: None
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 54.94383374711268
- type: cos_sim_spearman
value: 55.2012577981062
- type: euclidean_pearson
value: 32.59396212870573
- type: euclidean_spearman
value: 39.225592576917336
- type: manhattan_pearson
value: 32.7408709072666
- type: manhattan_spearman
value: 39.742471824241136
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: None
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 70.88311688311688
- type: f1
value: 70.83929221583213
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: None
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 33.345307198335675
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: None
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 24.176637733738467
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: None
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 18.289
- type: map_at_10
value: 24.321
- type: map_at_100
value: 25.241999999999997
- type: map_at_1000
value: 25.394
- type: map_at_3
value: 22.570999999999998
- type: map_at_5
value: 23.491999999999997
- type: mrr_at_1
value: 23.176
- type: mrr_at_10
value: 29.019000000000002
- type: mrr_at_100
value: 29.781000000000002
- type: mrr_at_1000
value: 29.874000000000002
- type: mrr_at_3
value: 27.444000000000003
- type: mrr_at_5
value: 28.224
- type: ndcg_at_1
value: 23.176
- type: ndcg_at_10
value: 28.28
- type: ndcg_at_100
value: 32.598
- type: ndcg_at_1000
value: 36.064
- type: ndcg_at_3
value: 25.627
- type: ndcg_at_5
value: 26.589000000000002
- type: precision_at_1
value: 23.176
- type: precision_at_10
value: 5.236
- type: precision_at_100
value: 0.906
- type: precision_at_1000
value: 0.149
- type: precision_at_3
value: 12.065
- type: precision_at_5
value: 8.526
- type: recall_at_1
value: 18.289
- type: recall_at_10
value: 35.281
- type: recall_at_100
value: 54.63400000000001
- type: recall_at_1000
value: 78.901
- type: recall_at_3
value: 26.790999999999997
- type: recall_at_5
value: 29.894
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: None
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 16.345000000000002
- type: map_at_10
value: 21.703
- type: map_at_100
value: 22.656000000000002
- type: map_at_1000
value: 22.772000000000002
- type: map_at_3
value: 20.02
- type: map_at_5
value: 20.855999999999998
- type: mrr_at_1
value: 21.146
- type: mrr_at_10
value: 26.389000000000003
- type: mrr_at_100
value: 27.128999999999998
- type: mrr_at_1000
value: 27.200000000000003
- type: mrr_at_3
value: 24.724
- type: mrr_at_5
value: 25.624999999999996
- type: ndcg_at_1
value: 21.146
- type: ndcg_at_10
value: 25.361
- type: ndcg_at_100
value: 29.648000000000003
- type: ndcg_at_1000
value: 32.446000000000005
- type: ndcg_at_3
value: 22.628
- type: ndcg_at_5
value: 23.711
- type: precision_at_1
value: 21.146
- type: precision_at_10
value: 4.752
- type: precision_at_100
value: 0.885
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 10.934000000000001
- type: precision_at_5
value: 7.693999999999999
- type: recall_at_1
value: 16.345000000000002
- type: recall_at_10
value: 31.602000000000004
- type: recall_at_100
value: 50.501
- type: recall_at_1000
value: 70.082
- type: recall_at_3
value: 23.214000000000002
- type: recall_at_5
value: 26.476
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: None
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 23.907
- type: map_at_10
value: 31.323
- type: map_at_100
value: 32.302
- type: map_at_1000
value: 32.391
- type: map_at_3
value: 29.067999999999998
- type: map_at_5
value: 30.320999999999998
- type: mrr_at_1
value: 27.084999999999997
- type: mrr_at_10
value: 33.934
- type: mrr_at_100
value: 34.792
- type: mrr_at_1000
value: 34.855000000000004
- type: mrr_at_3
value: 31.839000000000002
- type: mrr_at_5
value: 33.049
- type: ndcg_at_1
value: 27.084999999999997
- type: ndcg_at_10
value: 35.587
- type: ndcg_at_100
value: 40.37
- type: ndcg_at_1000
value: 42.501
- type: ndcg_at_3
value: 31.385
- type: ndcg_at_5
value: 33.364
- type: precision_at_1
value: 27.084999999999997
- type: precision_at_10
value: 5.749
- type: precision_at_100
value: 0.89
- type: precision_at_1000
value: 0.11499999999999999
- type: precision_at_3
value: 13.793
- type: precision_at_5
value: 9.555
- type: recall_at_1
value: 23.907
- type: recall_at_10
value: 45.953
- type: recall_at_100
value: 67.647
- type: recall_at_1000
value: 83.00800000000001
- type: recall_at_3
value: 34.587
- type: recall_at_5
value: 39.516
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: None
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 9.209
- type: map_at_10
value: 12.376
- type: map_at_100
value: 13.123999999999999
- type: map_at_1000
value: 13.232
- type: map_at_3
value: 11.328000000000001
- type: map_at_5
value: 11.934000000000001
- type: mrr_at_1
value: 9.944
- type: mrr_at_10
value: 13.254
- type: mrr_at_100
value: 14.041
- type: mrr_at_1000
value: 14.14
- type: mrr_at_3
value: 12.203
- type: mrr_at_5
value: 12.808
- type: ndcg_at_1
value: 9.944
- type: ndcg_at_10
value: 14.302999999999999
- type: ndcg_at_100
value: 18.289
- type: ndcg_at_1000
value: 21.494
- type: ndcg_at_3
value: 12.211
- type: ndcg_at_5
value: 13.26
- type: precision_at_1
value: 9.944
- type: precision_at_10
value: 2.237
- type: precision_at_100
value: 0.44400000000000006
- type: precision_at_1000
value: 0.076
- type: precision_at_3
value: 5.1979999999999995
- type: precision_at_5
value: 3.7289999999999996
- type: recall_at_1
value: 9.209
- type: recall_at_10
value: 19.426
- type: recall_at_100
value: 38.268
- type: recall_at_1000
value: 63.400999999999996
- type: recall_at_3
value: 13.822999999999999
- type: recall_at_5
value: 16.262
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: None
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 4.894
- type: map_at_10
value: 8.065
- type: map_at_100
value: 8.776
- type: map_at_1000
value: 8.892
- type: map_at_3
value: 6.998
- type: map_at_5
value: 7.632999999999999
- type: mrr_at_1
value: 6.5920000000000005
- type: mrr_at_10
value: 10.409
- type: mrr_at_100
value: 11.185
- type: mrr_at_1000
value: 11.283
- type: mrr_at_3
value: 9.121
- type: mrr_at_5
value: 9.83
- type: ndcg_at_1
value: 6.5920000000000005
- type: ndcg_at_10
value: 10.349
- type: ndcg_at_100
value: 14.243
- type: ndcg_at_1000
value: 17.638
- type: ndcg_at_3
value: 8.277
- type: ndcg_at_5
value: 9.286999999999999
- type: precision_at_1
value: 6.5920000000000005
- type: precision_at_10
value: 2.139
- type: precision_at_100
value: 0.48
- type: precision_at_1000
value: 0.08800000000000001
- type: precision_at_3
value: 4.353
- type: precision_at_5
value: 3.3329999999999997
- type: recall_at_1
value: 4.894
- type: recall_at_10
value: 15.133
- type: recall_at_100
value: 32.687
- type: recall_at_1000
value: 58.281000000000006
- type: recall_at_3
value: 9.399000000000001
- type: recall_at_5
value: 11.971
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: None
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 18.002000000000002
- type: map_at_10
value: 22.833000000000002
- type: map_at_100
value: 23.989
- type: map_at_1000
value: 24.126
- type: map_at_3
value: 21.093999999999998
- type: map_at_5
value: 22.105
- type: mrr_at_1
value: 21.848
- type: mrr_at_10
value: 26.978
- type: mrr_at_100
value: 27.967
- type: mrr_at_1000
value: 28.050000000000004
- type: mrr_at_3
value: 25.217
- type: mrr_at_5
value: 26.184
- type: ndcg_at_1
value: 21.848
- type: ndcg_at_10
value: 26.412000000000003
- type: ndcg_at_100
value: 32.193
- type: ndcg_at_1000
value: 35.429
- type: ndcg_at_3
value: 23.419
- type: ndcg_at_5
value: 24.866
- type: precision_at_1
value: 21.848
- type: precision_at_10
value: 4.61
- type: precision_at_100
value: 0.894
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 10.619
- type: precision_at_5
value: 7.603
- type: recall_at_1
value: 18.002000000000002
- type: recall_at_10
value: 33.256
- type: recall_at_100
value: 58.913000000000004
- type: recall_at_1000
value: 81.596
- type: recall_at_3
value: 24.73
- type: recall_at_5
value: 28.571
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: None
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 12.327
- type: map_at_10
value: 16.701
- type: map_at_100
value: 17.706
- type: map_at_1000
value: 17.849999999999998
- type: map_at_3
value: 15.065999999999999
- type: map_at_5
value: 16.012999999999998
- type: mrr_at_1
value: 15.183
- type: mrr_at_10
value: 19.828000000000003
- type: mrr_at_100
value: 20.752000000000002
- type: mrr_at_1000
value: 20.848
- type: mrr_at_3
value: 18.189
- type: mrr_at_5
value: 19.067999999999998
- type: ndcg_at_1
value: 15.183
- type: ndcg_at_10
value: 19.799
- type: ndcg_at_100
value: 24.886
- type: ndcg_at_1000
value: 28.453
- type: ndcg_at_3
value: 16.794999999999998
- type: ndcg_at_5
value: 18.176000000000002
- type: precision_at_1
value: 15.183
- type: precision_at_10
value: 3.642
- type: precision_at_100
value: 0.726
- type: precision_at_1000
value: 0.124
- type: precision_at_3
value: 7.800999999999999
- type: precision_at_5
value: 5.799
- type: recall_at_1
value: 12.327
- type: recall_at_10
value: 26.215
- type: recall_at_100
value: 49.038
- type: recall_at_1000
value: 74.297
- type: recall_at_3
value: 18.099999999999998
- type: recall_at_5
value: 21.438
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: None
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 8.457
- type: map_at_10
value: 12.277000000000001
- type: map_at_100
value: 12.956999999999999
- type: map_at_1000
value: 13.052
- type: map_at_3
value: 10.931000000000001
- type: map_at_5
value: 11.578
- type: mrr_at_1
value: 10.428999999999998
- type: mrr_at_10
value: 14.289
- type: mrr_at_100
value: 15.023
- type: mrr_at_1000
value: 15.109
- type: mrr_at_3
value: 13.139000000000001
- type: mrr_at_5
value: 13.691
- type: ndcg_at_1
value: 10.428999999999998
- type: ndcg_at_10
value: 14.753
- type: ndcg_at_100
value: 18.581
- type: ndcg_at_1000
value: 21.272
- type: ndcg_at_3
value: 12.399000000000001
- type: ndcg_at_5
value: 13.297
- type: precision_at_1
value: 10.428999999999998
- type: precision_at_10
value: 2.6229999999999998
- type: precision_at_100
value: 0.49100000000000005
- type: precision_at_1000
value: 0.079
- type: precision_at_3
value: 5.827999999999999
- type: precision_at_5
value: 4.141
- type: recall_at_1
value: 8.457
- type: recall_at_10
value: 20.515
- type: recall_at_100
value: 38.675
- type: recall_at_1000
value: 58.999
- type: recall_at_3
value: 13.779
- type: recall_at_5
value: 16.13
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: None
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 6.755999999999999
- type: map_at_10
value: 9.440999999999999
- type: map_at_100
value: 10.006
- type: map_at_1000
value: 10.116
- type: map_at_3
value: 8.423
- type: map_at_5
value: 8.976
- type: mrr_at_1
value: 8.706
- type: mrr_at_10
value: 11.613
- type: mrr_at_100
value: 12.2
- type: mrr_at_1000
value: 12.293
- type: mrr_at_3
value: 10.57
- type: mrr_at_5
value: 11.124
- type: ndcg_at_1
value: 8.706
- type: ndcg_at_10
value: 11.529
- type: ndcg_at_100
value: 14.59
- type: ndcg_at_1000
value: 17.8
- type: ndcg_at_3
value: 9.666
- type: ndcg_at_5
value: 10.471
- type: precision_at_1
value: 8.706
- type: precision_at_10
value: 2.182
- type: precision_at_100
value: 0.44999999999999996
- type: precision_at_1000
value: 0.087
- type: precision_at_3
value: 4.611
- type: precision_at_5
value: 3.4070000000000005
- type: recall_at_1
value: 6.755999999999999
- type: recall_at_10
value: 15.803
- type: recall_at_100
value: 30.062
- type: recall_at_1000
value: 54.057
- type: recall_at_3
value: 10.401
- type: recall_at_5
value: 12.559999999999999
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: None
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 11.104
- type: map_at_10
value: 13.952
- type: map_at_100
value: 14.631
- type: map_at_1000
value: 14.735999999999999
- type: map_at_3
value: 12.895999999999999
- type: map_at_5
value: 13.447999999999999
- type: mrr_at_1
value: 13.34
- type: mrr_at_10
value: 16.384
- type: mrr_at_100
value: 17.064
- type: mrr_at_1000
value: 17.161
- type: mrr_at_3
value: 15.235999999999999
- type: mrr_at_5
value: 15.828999999999999
- type: ndcg_at_1
value: 13.34
- type: ndcg_at_10
value: 16.172
- type: ndcg_at_100
value: 20.012
- type: ndcg_at_1000
value: 23.247999999999998
- type: ndcg_at_3
value: 14.149999999999999
- type: ndcg_at_5
value: 15.001000000000001
- type: precision_at_1
value: 13.34
- type: precision_at_10
value: 2.649
- type: precision_at_100
value: 0.508
- type: precision_at_1000
value: 0.08800000000000001
- type: precision_at_3
value: 6.311999999999999
- type: precision_at_5
value: 4.347
- type: recall_at_1
value: 11.104
- type: recall_at_10
value: 20.756
- type: recall_at_100
value: 39.066
- type: recall_at_1000
value: 63.626000000000005
- type: recall_at_3
value: 14.943999999999999
- type: recall_at_5
value: 17.331
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: None
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 13.211999999999998
- type: map_at_10
value: 17.443
- type: map_at_100
value: 18.364
- type: map_at_1000
value: 18.55
- type: map_at_3
value: 16.042
- type: map_at_5
value: 16.885
- type: mrr_at_1
value: 16.403000000000002
- type: mrr_at_10
value: 20.865000000000002
- type: mrr_at_100
value: 21.624
- type: mrr_at_1000
value: 21.718
- type: mrr_at_3
value: 19.401
- type: mrr_at_5
value: 20.25
- type: ndcg_at_1
value: 16.403000000000002
- type: ndcg_at_10
value: 20.677
- type: ndcg_at_100
value: 24.727
- type: ndcg_at_1000
value: 28.391
- type: ndcg_at_3
value: 18.382
- type: ndcg_at_5
value: 19.572
- type: precision_at_1
value: 16.403000000000002
- type: precision_at_10
value: 3.755
- type: precision_at_100
value: 0.9209999999999999
- type: precision_at_1000
value: 0.172
- type: precision_at_3
value: 8.498
- type: precision_at_5
value: 6.2059999999999995
- type: recall_at_1
value: 13.211999999999998
- type: recall_at_10
value: 26.532
- type: recall_at_100
value: 45.253
- type: recall_at_1000
value: 70.62
- type: recall_at_3
value: 19.024
- type: recall_at_5
value: 22.448999999999998
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: None
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 8.386000000000001
- type: map_at_10
value: 11.834999999999999
- type: map_at_100
value: 12.559999999999999
- type: map_at_1000
value: 12.662999999999998
- type: map_at_3
value: 10.632
- type: map_at_5
value: 11.343
- type: mrr_at_1
value: 9.612
- type: mrr_at_10
value: 13.158
- type: mrr_at_100
value: 13.888
- type: mrr_at_1000
value: 13.988
- type: mrr_at_3
value: 12.015
- type: mrr_at_5
value: 12.662
- type: ndcg_at_1
value: 9.612
- type: ndcg_at_10
value: 14.155000000000001
- type: ndcg_at_100
value: 18.174
- type: ndcg_at_1000
value: 21.448
- type: ndcg_at_3
value: 11.755
- type: ndcg_at_5
value: 12.955
- type: precision_at_1
value: 9.612
- type: precision_at_10
value: 2.311
- type: precision_at_100
value: 0.464
- type: precision_at_1000
value: 0.08
- type: precision_at_3
value: 5.176
- type: precision_at_5
value: 3.8080000000000003
- type: recall_at_1
value: 8.386000000000001
- type: recall_at_10
value: 20.225
- type: recall_at_100
value: 39.532000000000004
- type: recall_at_1000
value: 65.33
- type: recall_at_3
value: 13.629
- type: recall_at_5
value: 16.556
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: None
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 8.164
- type: map_at_10
value: 14.027999999999999
- type: map_at_100
value: 15.817
- type: map_at_1000
value: 16.047
- type: map_at_3
value: 11.501
- type: map_at_5
value: 12.674
- type: mrr_at_1
value: 18.502
- type: mrr_at_10
value: 28.503
- type: mrr_at_100
value: 29.686
- type: mrr_at_1000
value: 29.742
- type: mrr_at_3
value: 24.995
- type: mrr_at_5
value: 26.76
- type: ndcg_at_1
value: 18.502
- type: ndcg_at_10
value: 20.954
- type: ndcg_at_100
value: 28.532999999999998
- type: ndcg_at_1000
value: 32.732
- type: ndcg_at_3
value: 16.3
- type: ndcg_at_5
value: 17.681
- type: precision_at_1
value: 18.502
- type: precision_at_10
value: 6.977
- type: precision_at_100
value: 1.496
- type: precision_at_1000
value: 0.22599999999999998
- type: precision_at_3
value: 12.313
- type: precision_at_5
value: 9.668000000000001
- type: recall_at_1
value: 8.164
- type: recall_at_10
value: 26.41
- type: recall_at_100
value: 52.81
- type: recall_at_1000
value: 76.554
- type: recall_at_3
value: 14.974000000000002
- type: recall_at_5
value: 18.961
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: None
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 3.769
- type: map_at_10
value: 9.778
- type: map_at_100
value: 14.66
- type: map_at_1000
value: 15.863
- type: map_at_3
value: 6.691999999999999
- type: map_at_5
value: 8.03
- type: mrr_at_1
value: 39.5
- type: mrr_at_10
value: 50.370000000000005
- type: mrr_at_100
value: 51.09
- type: mrr_at_1000
value: 51.117000000000004
- type: mrr_at_3
value: 47.833
- type: mrr_at_5
value: 49.233
- type: ndcg_at_1
value: 28.999999999999996
- type: ndcg_at_10
value: 24.253
- type: ndcg_at_100
value: 28.88
- type: ndcg_at_1000
value: 36.449
- type: ndcg_at_3
value: 26.119999999999997
- type: ndcg_at_5
value: 25.023
- type: precision_at_1
value: 39.5
- type: precision_at_10
value: 22.375
- type: precision_at_100
value: 7.605
- type: precision_at_1000
value: 1.5709999999999997
- type: precision_at_3
value: 32.083
- type: precision_at_5
value: 28.349999999999998
- type: recall_at_1
value: 3.769
- type: recall_at_10
value: 14.913000000000002
- type: recall_at_100
value: 36.785000000000004
- type: recall_at_1000
value: 63.002
- type: recall_at_3
value: 8.312999999999999
- type: recall_at_5
value: 10.679
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: None
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 32.775
- type: f1
value: 30.107262205231955
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: None
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 15.365
- type: map_at_10
value: 23.764
- type: map_at_100
value: 24.849
- type: map_at_1000
value: 24.926000000000002
- type: map_at_3
value: 20.857999999999997
- type: map_at_5
value: 22.488
- type: mrr_at_1
value: 16.412
- type: mrr_at_10
value: 25.202
- type: mrr_at_100
value: 26.273000000000003
- type: mrr_at_1000
value: 26.339000000000002
- type: mrr_at_3
value: 22.172
- type: mrr_at_5
value: 23.860999999999997
- type: ndcg_at_1
value: 16.412
- type: ndcg_at_10
value: 29.026000000000003
- type: ndcg_at_100
value: 34.43
- type: ndcg_at_1000
value: 36.522
- type: ndcg_at_3
value: 23.027
- type: ndcg_at_5
value: 25.946
- type: precision_at_1
value: 16.412
- type: precision_at_10
value: 4.8149999999999995
- type: precision_at_100
value: 0.771
- type: precision_at_1000
value: 0.097
- type: precision_at_3
value: 10.030999999999999
- type: precision_at_5
value: 7.558
- type: recall_at_1
value: 15.365
- type: recall_at_10
value: 44.224999999999994
- type: recall_at_100
value: 69.169
- type: recall_at_1000
value: 85.272
- type: recall_at_3
value: 28.015
- type: recall_at_5
value: 34.958
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: None
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 5.4190000000000005
- type: map_at_10
value: 9.495000000000001
- type: map_at_100
value: 10.551
- type: map_at_1000
value: 10.725
- type: map_at_3
value: 7.845000000000001
- type: map_at_5
value: 8.661000000000001
- type: mrr_at_1
value: 11.574
- type: mrr_at_10
value: 17.357
- type: mrr_at_100
value: 18.298000000000002
- type: mrr_at_1000
value: 18.403
- type: mrr_at_3
value: 15.432000000000002
- type: mrr_at_5
value: 16.543
- type: ndcg_at_1
value: 11.574
- type: ndcg_at_10
value: 13.574
- type: ndcg_at_100
value: 18.847
- type: ndcg_at_1000
value: 23.105999999999998
- type: ndcg_at_3
value: 11.16
- type: ndcg_at_5
value: 12.015
- type: precision_at_1
value: 11.574
- type: precision_at_10
value: 4.167
- type: precision_at_100
value: 0.9259999999999999
- type: precision_at_1000
value: 0.166
- type: precision_at_3
value: 7.716000000000001
- type: precision_at_5
value: 6.08
- type: recall_at_1
value: 5.4190000000000005
- type: recall_at_10
value: 17.76
- type: recall_at_100
value: 39.080999999999996
- type: recall_at_1000
value: 65.713
- type: recall_at_3
value: 10.348
- type: recall_at_5
value: 13.274
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: None
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 18.697
- type: map_at_10
value: 26.466
- type: map_at_100
value: 27.464
- type: map_at_1000
value: 27.581
- type: map_at_3
value: 24.284
- type: map_at_5
value: 25.478
- type: mrr_at_1
value: 37.394
- type: mrr_at_10
value: 44.827
- type: mrr_at_100
value: 45.553
- type: mrr_at_1000
value: 45.601
- type: mrr_at_3
value: 42.82
- type: mrr_at_5
value: 43.980999999999995
- type: ndcg_at_1
value: 37.394
- type: ndcg_at_10
value: 33.726
- type: ndcg_at_100
value: 38.244
- type: ndcg_at_1000
value: 40.931
- type: ndcg_at_3
value: 29.660999999999998
- type: ndcg_at_5
value: 31.627
- type: precision_at_1
value: 37.394
- type: precision_at_10
value: 7.453
- type: precision_at_100
value: 1.107
- type: precision_at_1000
value: 0.147
- type: precision_at_3
value: 18.708
- type: precision_at_5
value: 12.786
- type: recall_at_1
value: 18.697
- type: recall_at_10
value: 37.265
- type: recall_at_100
value: 55.361000000000004
- type: recall_at_1000
value: 73.309
- type: recall_at_3
value: 28.061999999999998
- type: recall_at_5
value: 31.965
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: None
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 62.14919999999999
- type: ap
value: 57.925637150355854
- type: f1
value: 61.50139519699174
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: None
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 4.475
- type: map_at_10
value: 7.548000000000001
- type: map_at_100
value: 8.303
- type: map_at_1000
value: 8.408999999999999
- type: map_at_3
value: 6.4079999999999995
- type: map_at_5
value: 6.97
- type: mrr_at_1
value: 4.585
- type: mrr_at_10
value: 7.732
- type: mrr_at_100
value: 8.498999999999999
- type: mrr_at_1000
value: 8.604000000000001
- type: mrr_at_3
value: 6.557
- type: mrr_at_5
value: 7.154000000000001
- type: ndcg_at_1
value: 4.569999999999999
- type: ndcg_at_10
value: 9.514
- type: ndcg_at_100
value: 13.806
- type: ndcg_at_1000
value: 17.055
- type: ndcg_at_3
value: 7.093000000000001
- type: ndcg_at_5
value: 8.122
- type: precision_at_1
value: 4.569999999999999
- type: precision_at_10
value: 1.628
- type: precision_at_100
value: 0.388
- type: precision_at_1000
value: 0.067
- type: precision_at_3
value: 3.061
- type: precision_at_5
value: 2.367
- type: recall_at_1
value: 4.475
- type: recall_at_10
value: 15.67
- type: recall_at_100
value: 36.923
- type: recall_at_1000
value: 63.080999999999996
- type: recall_at_3
value: 8.949
- type: recall_at_5
value: 11.415000000000001
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: None
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 84.9954400364797
- type: f1
value: 84.58277754536348
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: None
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 68.75512995896032
- type: f1
value: 51.118465985982844
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: None
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 61.43577673167451
- type: f1
value: 59.61787483592468
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: None
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 66.0659045057162
- type: f1
value: 65.62318389091126
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: None
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 28.347157458398097
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: None
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 23.70662046779991
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: None
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 29.006999445620252
- type: mrr
value: 29.93142961414551
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: None
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 2.938
- type: map_at_10
value: 6.718
- type: map_at_100
value: 8.602
- type: map_at_1000
value: 9.879999999999999
- type: map_at_3
value: 5.066
- type: map_at_5
value: 5.9799999999999995
- type: mrr_at_1
value: 26.006
- type: mrr_at_10
value: 37.143
- type: mrr_at_100
value: 38.007000000000005
- type: mrr_at_1000
value: 38.056
- type: mrr_at_3
value: 33.953
- type: mrr_at_5
value: 35.980000000000004
- type: ndcg_at_1
value: 24.768
- type: ndcg_at_10
value: 21.893
- type: ndcg_at_100
value: 21.193
- type: ndcg_at_1000
value: 30.911
- type: ndcg_at_3
value: 23.912
- type: ndcg_at_5
value: 23.749000000000002
- type: precision_at_1
value: 26.006
- type: precision_at_10
value: 16.378
- type: precision_at_100
value: 6.059
- type: precision_at_1000
value: 1.934
- type: precision_at_3
value: 22.601
- type: precision_at_5
value: 20.929000000000002
- type: recall_at_1
value: 2.938
- type: recall_at_10
value: 11.195
- type: recall_at_100
value: 24.473
- type: recall_at_1000
value: 58.553
- type: recall_at_3
value: 6.487
- type: recall_at_5
value: 9.02
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: None
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 8.761
- type: map_at_10
value: 15.726
- type: map_at_100
value: 17.130000000000003
- type: map_at_1000
value: 17.244999999999997
- type: map_at_3
value: 13.001
- type: map_at_5
value: 14.438999999999998
- type: mrr_at_1
value: 9.994
- type: mrr_at_10
value: 17.455000000000002
- type: mrr_at_100
value: 18.736
- type: mrr_at_1000
value: 18.828
- type: mrr_at_3
value: 14.634
- type: mrr_at_5
value: 16.158
- type: ndcg_at_1
value: 9.994
- type: ndcg_at_10
value: 20.453
- type: ndcg_at_100
value: 27.514
- type: ndcg_at_1000
value: 30.45
- type: ndcg_at_3
value: 14.802000000000001
- type: ndcg_at_5
value: 17.394000000000002
- type: precision_at_1
value: 9.994
- type: precision_at_10
value: 3.914
- type: precision_at_100
value: 0.7939999999999999
- type: precision_at_1000
value: 0.107
- type: precision_at_3
value: 7.0680000000000005
- type: precision_at_5
value: 5.655
- type: recall_at_1
value: 8.761
- type: recall_at_10
value: 33.534000000000006
- type: recall_at_100
value: 66.28500000000001
- type: recall_at_1000
value: 88.458
- type: recall_at_3
value: 18.436
- type: recall_at_5
value: 24.508
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: None
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 50.617000000000004
- type: map_at_10
value: 62.446999999999996
- type: map_at_100
value: 63.410999999999994
- type: map_at_1000
value: 63.461
- type: map_at_3
value: 59.382999999999996
- type: map_at_5
value: 61.17
- type: mrr_at_1
value: 58.160000000000004
- type: mrr_at_10
value: 67.015
- type: mrr_at_100
value: 67.472
- type: mrr_at_1000
value: 67.49000000000001
- type: mrr_at_3
value: 65.007
- type: mrr_at_5
value: 66.24
- type: ndcg_at_1
value: 58.209999999999994
- type: ndcg_at_10
value: 67.907
- type: ndcg_at_100
value: 71.194
- type: ndcg_at_1000
value: 72.02
- type: ndcg_at_3
value: 63.429
- type: ndcg_at_5
value: 65.655
- type: precision_at_1
value: 58.209999999999994
- type: precision_at_10
value: 10.537
- type: precision_at_100
value: 1.355
- type: precision_at_1000
value: 0.15
- type: precision_at_3
value: 27.677000000000003
- type: precision_at_5
value: 18.6
- type: recall_at_1
value: 50.617000000000004
- type: recall_at_10
value: 79.323
- type: recall_at_100
value: 92.571
- type: recall_at_1000
value: 97.94
- type: recall_at_3
value: 66.81899999999999
- type: recall_at_5
value: 72.738
- type: map_at_1
value: 2.5829999999999997
- type: map_at_10
value: 6.2059999999999995
- type: map_at_100
value: 7.46
- type: map_at_1000
value: 7.724
- type: map_at_3
value: 4.515000000000001
- type: map_at_5
value: 5.313
- type: mrr_at_1
value: 12.7
- type: mrr_at_10
value: 20.615
- type: mrr_at_100
value: 21.841
- type: mrr_at_1000
value: 21.931
- type: mrr_at_3
value: 17.983
- type: mrr_at_5
value: 19.468
- type: ndcg_at_1
value: 12.7
- type: ndcg_at_10
value: 11.366
- type: ndcg_at_100
value: 17.448
- type: ndcg_at_1000
value: 22.86
- type: ndcg_at_3
value: 10.541
- type: ndcg_at_5
value: 9.27
- type: precision_at_1
value: 12.7
- type: precision_at_10
value: 5.96
- type: precision_at_100
value: 1.4949999999999999
- type: precision_at_1000
value: 0.27999999999999997
- type: precision_at_3
value: 9.833
- type: precision_at_5
value: 8.16
- type: recall_at_1
value: 2.5829999999999997
- type: recall_at_10
value: 12.107999999999999
- type: recall_at_100
value: 30.368000000000002
- type: recall_at_1000
value: 57.01500000000001
- type: recall_at_3
value: 5.997
- type: recall_at_5
value: 8.267
- type: map_at_1
value: 0.174
- type: map_at_10
value: 0.9730000000000001
- type: map_at_100
value: 4.8629999999999995
- type: map_at_1000
value: 11.895999999999999
- type: map_at_3
value: 0.373
- type: map_at_5
value: 0.575
- type: mrr_at_1
value: 68.0
- type: mrr_at_10
value: 75.888
- type: mrr_at_100
value: 76.254
- type: mrr_at_1000
value: 76.254
- type: mrr_at_3
value: 73.0
- type: mrr_at_5
value: 74.4
- type: ndcg_at_1
value: 59.0
- type: ndcg_at_10
value: 49.874
- type: ndcg_at_100
value: 34.993
- type: ndcg_at_1000
value: 31.941999999999997
- type: ndcg_at_3
value: 54.06100000000001
- type: ndcg_at_5
value: 52.995000000000005
- type: precision_at_1
value: 68.0
- type: precision_at_10
value: 53.0
- type: precision_at_100
value: 36.5
- type: precision_at_1000
value: 15.387999999999998
- type: precision_at_3
value: 57.333
- type: precision_at_5
value: 56.00000000000001
- type: recall_at_1
value: 0.174
- type: recall_at_10
value: 1.2309999999999999
- type: recall_at_100
value: 7.992000000000001
- type: recall_at_1000
value: 31.196
- type: recall_at_3
value: 0.402
- type: recall_at_5
value: 0.6629999999999999
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: None
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 22.47157712756689
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: None
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 39.657540667597004
- task:
type: STS
dataset:
name: MTEB SICK-R
type: None
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 71.20191861331476
- type: cos_sim_spearman
value: 63.188421134907536
- type: euclidean_pearson
value: 61.127069815899574
- type: euclidean_spearman
value: 55.45301288952067
- type: manhattan_pearson
value: 61.12020983926607
- type: manhattan_spearman
value: 55.44326332941407
- task:
type: STS
dataset:
name: MTEB STS12
type: None
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 62.09810060891208
- type: cos_sim_spearman
value: 54.06092904130544
- type: euclidean_pearson
value: 48.01643701901603
- type: euclidean_spearman
value: 47.53699133066794
- type: manhattan_pearson
value: 48.01051627115819
- type: manhattan_spearman
value: 47.52526171851921
- task:
type: STS
dataset:
name: MTEB STS13
type: None
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 65.43822791053617
- type: cos_sim_spearman
value: 66.28982173651268
- type: euclidean_pearson
value: 53.35861667092793
- type: euclidean_spearman
value: 53.573281944958396
- type: manhattan_pearson
value: 53.37330137272439
- type: manhattan_spearman
value: 53.66127448601703
- task:
type: STS
dataset:
name: MTEB STS14
type: None
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 68.85545681538693
- type: cos_sim_spearman
value: 65.84235828443764
- type: euclidean_pearson
value: 53.90454137357774
- type: euclidean_spearman
value: 55.04356559669665
- type: manhattan_pearson
value: 53.88757630215708
- type: manhattan_spearman
value: 54.99042045615275
- task:
type: STS
dataset:
name: MTEB STS15
type: None
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 77.62859061994905
- type: cos_sim_spearman
value: 78.00000765816837
- type: euclidean_pearson
value: 54.21852924201095
- type: euclidean_spearman
value: 55.76757372388098
- type: manhattan_pearson
value: 54.19368821813792
- type: manhattan_spearman
value: 55.76610614713326
- task:
type: STS
dataset:
name: MTEB STS16
type: None
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 70.01795969481884
- type: cos_sim_spearman
value: 70.63693611196106
- type: euclidean_pearson
value: 45.34914757394818
- type: euclidean_spearman
value: 45.98188595239444
- type: manhattan_pearson
value: 45.305829577266636
- type: manhattan_spearman
value: 45.92921356525472
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: None
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 81.38227164562063
- type: cos_sim_spearman
value: 82.2302028734954
- type: euclidean_pearson
value: 53.41158385946949
- type: euclidean_spearman
value: 57.10238770345087
- type: manhattan_pearson
value: 53.32952199052525
- type: manhattan_spearman
value: 57.08232219963219
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: None
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 48.87003140356382
- type: cos_sim_spearman
value: 54.52763255764367
- type: euclidean_pearson
value: 41.28501055455825
- type: euclidean_spearman
value: 49.32890902859729
- type: manhattan_pearson
value: 41.25611150219887
- type: manhattan_spearman
value: 49.29228511720397
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: None
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 70.6609946505673
- type: cos_sim_spearman
value: 68.040381040423
- type: euclidean_pearson
value: 54.23209719233177
- type: euclidean_spearman
value: 52.27300805535425
- type: manhattan_pearson
value: 54.174455364046246
- type: manhattan_spearman
value: 52.25145471352592
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: None
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 70.94408788023235
- type: mrr
value: 90.05190252739271
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: None
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 26.694000000000003
- type: map_at_10
value: 34.503
- type: map_at_100
value: 35.494
- type: map_at_1000
value: 35.582
- type: map_at_3
value: 32.255
- type: map_at_5
value: 33.312999999999995
- type: mrr_at_1
value: 28.333000000000002
- type: mrr_at_10
value: 35.782000000000004
- type: mrr_at_100
value: 36.681000000000004
- type: mrr_at_1000
value: 36.756
- type: mrr_at_3
value: 33.667
- type: mrr_at_5
value: 34.8
- type: ndcg_at_1
value: 28.333000000000002
- type: ndcg_at_10
value: 38.799
- type: ndcg_at_100
value: 44.086
- type: ndcg_at_1000
value: 46.472
- type: ndcg_at_3
value: 34.215
- type: ndcg_at_5
value: 36.172
- type: precision_at_1
value: 28.333000000000002
- type: precision_at_10
value: 5.7
- type: precision_at_100
value: 0.8630000000000001
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 13.889000000000001
- type: precision_at_5
value: 9.4
- type: recall_at_1
value: 26.694000000000003
- type: recall_at_10
value: 50.917
- type: recall_at_100
value: 76.656
- type: recall_at_1000
value: 95.267
- type: recall_at_3
value: 38.25
- type: recall_at_5
value: 43.25
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: None
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.48415841584158
- type: cos_sim_ap
value: 75.52946987159491
- type: cos_sim_f1
value: 71.24183006535948
- type: cos_sim_precision
value: 78.22966507177034
- type: cos_sim_recall
value: 65.4
- type: dot_accuracy
value: 99.02772277227723
- type: dot_ap
value: 19.64765748531683
- type: dot_f1
value: 27.603388141504738
- type: dot_precision
value: 27.507447864945384
- type: dot_recall
value: 27.700000000000003
- type: euclidean_accuracy
value: 99.22871287128713
- type: euclidean_ap
value: 47.656308810039974
- type: euclidean_f1
value: 49.277108433734945
- type: euclidean_precision
value: 61.969696969696976
- type: euclidean_recall
value: 40.9
- type: manhattan_accuracy
value: 99.23069306930692
- type: manhattan_ap
value: 47.58371084446927
- type: manhattan_f1
value: 49.56949569495694
- type: manhattan_precision
value: 64.37699680511182
- type: manhattan_recall
value: 40.300000000000004
- type: max_accuracy
value: 99.48415841584158
- type: max_ap
value: 75.52946987159491
- type: max_f1
value: 71.24183006535948
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: None
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 32.904600491347175
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: None
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 25.999421501651447
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: None
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 39.18037729762096
- type: mrr
value: 39.22605784738137
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: None
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.48883626808509
- type: cos_sim_spearman
value: 29.51032126703428
- type: dot_pearson
value: 18.805588622011378
- type: dot_spearman
value: 21.097033106663606
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: None
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 1.409
- type: map_at_10
value: 6.394
- type: map_at_100
value: 11.241
- type: map_at_1000
value: 12.983
- type: map_at_3
value: 3.3009999999999997
- type: map_at_5
value: 4.623
- type: mrr_at_1
value: 22.448999999999998
- type: mrr_at_10
value: 37.69
- type: mrr_at_100
value: 38.684000000000005
- type: mrr_at_1000
value: 38.684000000000005
- type: mrr_at_3
value: 32.653
- type: mrr_at_5
value: 35.918
- type: ndcg_at_1
value: 20.408
- type: ndcg_at_10
value: 18.78
- type: ndcg_at_100
value: 31.513999999999996
- type: ndcg_at_1000
value: 43.881
- type: ndcg_at_3
value: 20.888
- type: ndcg_at_5
value: 19.969
- type: precision_at_1
value: 22.448999999999998
- type: precision_at_10
value: 18.163
- type: precision_at_100
value: 7.469
- type: precision_at_1000
value: 1.533
- type: precision_at_3
value: 23.128999999999998
- type: precision_at_5
value: 21.633
- type: recall_at_1
value: 1.409
- type: recall_at_10
value: 12.661
- type: recall_at_100
value: 46.255
- type: recall_at_1000
value: 83.985
- type: recall_at_3
value: 4.627
- type: recall_at_5
value: 7.64
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: None
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 67.81439999999999
- type: ap
value: 12.540860961937348
- type: f1
value: 51.90378710236624
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: None
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 48.91624221844935
- type: f1
value: 48.908124293596636
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: None
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 22.898809101910505
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: None
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 81.9991655242296
- type: cos_sim_ap
value: 58.77296113061388
- type: cos_sim_f1
value: 55.504807692307686
- type: cos_sim_precision
value: 50.971302428256074
- type: cos_sim_recall
value: 60.92348284960423
- type: dot_accuracy
value: 77.71949693032127
- type: dot_ap
value: 40.10856851866763
- type: dot_f1
value: 45.98438855160452
- type: dot_precision
value: 34.25064599483204
- type: dot_recall
value: 69.94722955145119
- type: euclidean_accuracy
value: 80.25272694760685
- type: euclidean_ap
value: 51.49892372756935
- type: euclidean_f1
value: 50.08739076154806
- type: euclidean_precision
value: 47.535545023696685
- type: euclidean_recall
value: 52.9287598944591
- type: manhattan_accuracy
value: 80.21696370030399
- type: manhattan_ap
value: 51.41297690359896
- type: manhattan_f1
value: 49.91362432339053
- type: manhattan_precision
value: 44.287758021663606
- type: manhattan_recall
value: 57.17678100263852
- type: max_accuracy
value: 81.9991655242296
- type: max_ap
value: 58.77296113061388
- type: max_f1
value: 55.504807692307686
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: None
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 87.04544572515232
- type: cos_sim_ap
value: 81.3265297042776
- type: cos_sim_f1
value: 74.06016421812292
- type: cos_sim_precision
value: 70.96888010726131
- type: cos_sim_recall
value: 77.4330150908531
- type: dot_accuracy
value: 83.6864982341755
- type: dot_ap
value: 72.13210682134748
- type: dot_f1
value: 68.104330639184
- type: dot_precision
value: 62.87390029325513
- type: dot_recall
value: 74.28395441946411
- type: euclidean_accuracy
value: 83.30616680249932
- type: euclidean_ap
value: 69.96764856529526
- type: euclidean_f1
value: 62.12407829208972
- type: euclidean_precision
value: 62.299651567944245
- type: euclidean_recall
value: 61.94949183862026
- type: manhattan_accuracy
value: 83.35079753172663
- type: manhattan_ap
value: 69.95287311927201
- type: manhattan_f1
value: 62.172966554522056
- type: manhattan_precision
value: 63.973283375417445
- type: manhattan_recall
value: 60.47120418848168
- type: max_accuracy
value: 87.04544572515232
- type: max_ap
value: 81.3265297042776
- type: max_f1
value: 74.06016421812292
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
This model card aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1).
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
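No official snippet is provided yet. As a stopgap, here is a minimal sketch that assumes the checkpoint is a sentence-embedding model loadable with the `sentence-transformers` library; the model ID below is a placeholder, not the actual repository name.
```python
# Minimal usage sketch (assumption: the model is compatible with sentence-transformers).
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("<model-id>")  # placeholder: replace with the actual model ID
sentences = ["This is an example sentence.", "Each sentence is converted to a fixed-size vector."]
embeddings = model.encode(sentences)  # numpy array of shape (num_sentences, embedding_dim)
print(embeddings.shape)
```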
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"BIOSSES",
"SCIFACT"
] |
salforis/vistral_sentiment | salforis | text-generation | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | "2024-06-02T13:41:23Z" | 2024-06-03T03:29:17+00:00 | 0 | 0 | ---
license: apache-2.0
---
## Usage
This model is fine-tuned from [Viet-Mistral/Vistral-7B-Chat](https://huggingface.co/Viet-Mistral/Vistral-7B-Chat). The example below uses it to classify a Vietnamese article as positive (Tích cực), negative (Tiêu cực) or neutral (Trung lập).
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
# The Vietnamese system prompt below defines the three target labels the model should use,
# "Tiêu cực" (negative), "Tích cực" (positive) and "Trung lập" (neutral), together with
# working definitions of each category. It is kept in Vietnamese on purpose, since the
# model is prompted in Vietnamese.
system_prompt: str = "Bạn là một trợ lý tiếng Việt nhiệt tình và trung thực. Hãy luôn trả lời một cách hữu ích nhất và chính xác nhất có thể, tránh việc đưa ra thông tin sai lệch."
system_prompt += "Khái niệm bài viết tiêu cực: các bài viết tiêu cực được hiểu là bài viết không đúng sự thật, hoặc dựa trên những sự việc đã và đang xảy ra nhưng đã bị xuyên tạc; "
system_prompt += "những bài viết chỉ trích, đánh vào những khuyết điểm còn tồn đọng nhưng không nhằm mục đích góp ý xây dựng mà nhằm mục đích định hướng người đọc, khơi gợi hiềm khích đối với nhà nước, với chế độ; những bài viết bôi xấu xã hội, văn hóa người Việt, bôi nhọ danh dự của các cá nhân hoặc cơ quan nhà nước. "
system_prompt += "Đặc điểm chung của những bài viết này là thường có ít tính khách quan, thể hiện rất nhiều những ý kiến chủ quan, ngôn từ thù ghét, không có lòng tin với chính quyền, luôn kêu gọi, kích động người đọc quay lại chống phá nhà nước, v.v."
system_prompt += "Khái niệm bài viết tích cực: các bài viết tích cực được hiểu là bài viết ca ngợi Đảng, Bác Hồ, Nhà nước, chính quyền Việt Nam; các bài viết ca ngợi đất nước, con người Việt Nam."
system_prompt += "Các bài viết tích cực cũng có thể là bài viết tập trung khẳng định, củng cố vai trò lãnh đạo của đảng, thể hiện sự tin tưởng, ủng hộ vào đường lối, mục tiêu của đảng; các bài viết phân tích, phản bác các quan điểm sai trái, thù địch chống phá đảng, góp phần bảo vệ nền tảng tư tưởng, định hướng dư luận xã hội, củng cố hệ thống chính trị."
system_prompt += "Khái niệm bài viết trung lập: các bài viết trung lập là các bài viết đưa tin tức, sự kiện, kể chuyện, không chia sẻ bất kỳ quan điểm cá nhân nào."
model_name = "salforis/vistral_sentiment"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
model_name,
torch_dtype = torch.bfloat16,
device_map = "auto",
use_cache = True,
)
content = """
Dân trí) - Xuất sắc đánh bại đối thủ người Phần Lan Vilma Viitanmen, nữ võ sĩ Việt Nam Hà Thị Linh giành vé dự Olympic Paris 2024.
Tối 2/6, Hà Thị Linh bước vào trận play-off tranh vé dự Olympic 2024 gặp đối thủ người Phần Lan - Vilma Viitanmen.
Đây là cơ hội cuối cùng để boxing Việt Nam tìm thêm vé tới Thế vận hội.
Trước đối thủ mạnh, nữ võ sĩ Việt Nam chơi rất tự tin, phòng thủ kín kẽ và khi ra đòn có sự chính xác cao, giành chiến thắng chung cuộc với tỷ số 4-1 (30/27, 29/27, 28/29, 29/28, 29/28).
Chiến thắng đồng nghĩa Hà Thị Linh xếp hạng 3 nội dung 60kg nữ tại giải vòng loại Olympic ở Bangkok (Thái Lan) nhưng quan trọng hơn cô giành suất chính thức dự Olympic Paris 2024.
"""
conversation = [{"role": "system", "content": system_prompt}]
# User turn, in Vietnamese: "Question: Is the following article Negative, Positive or Neutral?"
human = f"Câu hỏi: Bài viết sau đây là Tiêu cực, Tích cực hay Trung lập?:\n{content}"
conversation.append({"role": "user", "content": human})
input_ids = tokenizer.apply_chat_template(conversation, return_tensors="pt").to(model.device)  # move inputs to the model's device (e.g. GPU)
with torch.no_grad():
out_ids = model.generate(
input_ids = input_ids,
max_new_tokens = 1024,
top_p = 1.0,
top_k = 40,
do_sample = True,
temperature = 0.5,
repetition_penalty = 1.0,
eos_token_id = tokenizer.eos_token_id,
use_cache = True,
)
assistant = tokenizer.batch_decode(out_ids[:, input_ids.size(1): ], skip_special_tokens=True)[0].strip()
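# The reply is expected to be one of the three labels defined in the system prompt:
# "Tích cực" (positive), "Tiêu cực" (negative) or "Trung lập" (neutral).
# "Kết quả" in the print statement below means "Result".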
print("Kết quả: ", assistant)
``` | [
"CHIA"
] |
zhan1993/private_library_phi2_epoch_4_orca | zhan1993 | null | [
"region:us"
] | "2024-06-03T21:10:19Z" | 2024-06-06T14:56:55+00:00 | 0 | 0 | ---
{}
---
Number of experts present in the library: 263
| Expert Name | Base Model | Trained on | Adapter Type |
| --- | --- | --- | --- |
| sciq_Multiple_Choice | phi-2 | zhan1993/openorca_task_id_cluster/sciq_Multiple_Choice | lora |
| wiki_hop_original_choose_best_object_interrogative_1 | phi-2 | zhan1993/openorca_task_id_cluster/wiki_hop_original_choose_best_object_interrogative_1 | lora |
| squad_v2_0_3_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/squad_v2_0_3_0_0 | lora |
| wiki_qa_exercise | phi-2 | zhan1993/openorca_task_id_cluster/wiki_qa_exercise | lora |
| race_high_Taking_a_test | phi-2 | zhan1993/openorca_task_id_cluster/race_high_Taking_a_test | lora |
| adversarial_qa_dbert_generate_question | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_dbert_generate_question | lora |
| quoref_Found_Context_Online | phi-2 | zhan1993/openorca_task_id_cluster/quoref_Found_Context_Online | lora |
| web_questions_get_the_answer | phi-2 | zhan1993/openorca_task_id_cluster/web_questions_get_the_answer | lora |
| duorc_SelfRC_generate_question_by_answer | phi-2 | zhan1993/openorca_task_id_cluster/duorc_SelfRC_generate_question_by_answer | lora |
| quarel_testing_students | phi-2 | zhan1993/openorca_task_id_cluster/quarel_testing_students | lora |
| qasc_qa_with_separated_facts_1 | phi-2 | zhan1993/openorca_task_id_cluster/qasc_qa_with_separated_facts_1 | lora |
| wiki_qa_Is_This_True_ | phi-2 | zhan1993/openorca_task_id_cluster/wiki_qa_Is_This_True_ | lora |
| race_high_Read_the_article_and_answer_the_question_no_option_ | phi-2 | zhan1993/openorca_task_id_cluster/race_high_Read_the_article_and_answer_the_question_no_option_ | lora |
| cot_gsm8k_ii | phi-2 | zhan1993/openorca_task_id_cluster/cot_gsm8k_ii | lora |
| gem_wiki_lingua_english_en_1_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/gem_wiki_lingua_english_en_1_1_0 | lora |
| unified_qa_science_inst | phi-2 | zhan1993/openorca_task_id_cluster/unified_qa_science_inst | lora |
| quartz_use_info_from_paragraph_question | phi-2 | zhan1993/openorca_task_id_cluster/quartz_use_info_from_paragraph_question | lora |
| wiki_hop_original_generate_object | phi-2 | zhan1993/openorca_task_id_cluster/wiki_hop_original_generate_object | lora |
| quoref_What_Is_The_Answer | phi-2 | zhan1993/openorca_task_id_cluster/quoref_What_Is_The_Answer | lora |
| adversarial_qa_droberta_generate_question | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_droberta_generate_question | lora |
| wiki_bio_comprehension | phi-2 | zhan1993/openorca_task_id_cluster/wiki_bio_comprehension | lora |
| adversarial_qa_dbidaf_question_context_answer | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_dbidaf_question_context_answer | lora |
| wiki_bio_what_content | phi-2 | zhan1993/openorca_task_id_cluster/wiki_bio_what_content | lora |
| web_questions_whats_the_answer | phi-2 | zhan1993/openorca_task_id_cluster/web_questions_whats_the_answer | lora |
| wiqa_what_is_the_missing_first_step | phi-2 | zhan1993/openorca_task_id_cluster/wiqa_what_is_the_missing_first_step | lora |
| adversarial_qa_droberta_question_context_answer | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_droberta_question_context_answer | lora |
| ropes_plain_bottom_hint | phi-2 | zhan1993/openorca_task_id_cluster/ropes_plain_bottom_hint | lora |
| kilt_tasks_hotpotqa_combining_facts | phi-2 | zhan1993/openorca_task_id_cluster/kilt_tasks_hotpotqa_combining_facts | lora |
| cos_e_v1_11_aligned_with_common_sense | phi-2 | zhan1993/openorca_task_id_cluster/cos_e_v1_11_aligned_with_common_sense | lora |
| gem_web_nlg_en_1_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/gem_web_nlg_en_1_1_0 | lora |
| web_questions_potential_correct_answer | phi-2 | zhan1993/openorca_task_id_cluster/web_questions_potential_correct_answer | lora |
| wiki_qa_found_on_google | phi-2 | zhan1993/openorca_task_id_cluster/wiki_qa_found_on_google | lora |
| duorc_ParaphraseRC_extract_answer | phi-2 | zhan1993/openorca_task_id_cluster/duorc_ParaphraseRC_extract_answer | lora |
| wmt16_translate_de_en_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/wmt16_translate_de_en_1_0_0 | lora |
| quail_no_prompt_id | phi-2 | zhan1993/openorca_task_id_cluster/quail_no_prompt_id | lora |
| quoref_Guess_Title_For_Context | phi-2 | zhan1993/openorca_task_id_cluster/quoref_Guess_Title_For_Context | lora |
| duorc_SelfRC_decide_worth_it | phi-2 | zhan1993/openorca_task_id_cluster/duorc_SelfRC_decide_worth_it | lora |
| ropes_prompt_mix | phi-2 | zhan1993/openorca_task_id_cluster/ropes_prompt_mix | lora |
| adversarial_qa_droberta_tell_what_it_is | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_droberta_tell_what_it_is | lora |
| quail_context_question_answer_description_id | phi-2 | zhan1993/openorca_task_id_cluster/quail_context_question_answer_description_id | lora |
| gem_common_gen_1_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/gem_common_gen_1_1_0 | lora |
| duorc_ParaphraseRC_answer_question | phi-2 | zhan1993/openorca_task_id_cluster/duorc_ParaphraseRC_answer_question | lora |
| super_glue_cb_1_0_2 | phi-2 | zhan1993/openorca_task_id_cluster/super_glue_cb_1_0_2 | lora |
| cnn_dailymail_3_4_0 | phi-2 | zhan1993/openorca_task_id_cluster/cnn_dailymail_3_4_0 | lora |
| race_high_Write_a_multi_choice_question_options_given_ | phi-2 | zhan1993/openorca_task_id_cluster/race_high_Write_a_multi_choice_question_options_given_ | lora |
| winogrande_1_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/winogrande_1_1_0 | lora |
| duorc_SelfRC_extract_answer | phi-2 | zhan1993/openorca_task_id_cluster/duorc_SelfRC_extract_answer | lora |
| trec_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/trec_1_0_0 | lora |
| yelp_polarity_reviews_0_2_0 | phi-2 | zhan1993/openorca_task_id_cluster/yelp_polarity_reviews_0_2_0 | lora |
| race_high_Select_the_best_answer | phi-2 | zhan1993/openorca_task_id_cluster/race_high_Select_the_best_answer | lora |
| para_crawl_enes | phi-2 | zhan1993/openorca_task_id_cluster/para_crawl_enes | lora |
| qasc_is_correct_1 | phi-2 | zhan1993/openorca_task_id_cluster/qasc_is_correct_1 | lora |
| app_reviews_generate_review | phi-2 | zhan1993/openorca_task_id_cluster/app_reviews_generate_review | lora |
| ropes_read_background_situation | phi-2 | zhan1993/openorca_task_id_cluster/ropes_read_background_situation | lora |
| dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | phi-2 | zhan1993/openorca_task_id_cluster/dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | lora |
| stream_aqua | phi-2 | zhan1993/openorca_task_id_cluster/stream_aqua | lora |
| drop_2_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/drop_2_0_0 | lora |
| wiki_hop_original_choose_best_object_affirmative_1 | phi-2 | zhan1993/openorca_task_id_cluster/wiki_hop_original_choose_best_object_affirmative_1 | lora |
| adversarial_qa_dbidaf_answer_the_following_q | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_dbidaf_answer_the_following_q | lora |
| social_i_qa_Generate_answer | phi-2 | zhan1993/openorca_task_id_cluster/social_i_qa_Generate_answer | lora |
| stream_aqua_ii | phi-2 | zhan1993/openorca_task_id_cluster/stream_aqua_ii | lora |
| glue_sst2_2_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/glue_sst2_2_0_0 | lora |
| cot_esnli | phi-2 | zhan1993/openorca_task_id_cluster/cot_esnli | lora |
| race_high_Select_the_best_answer_no_instructions_ | phi-2 | zhan1993/openorca_task_id_cluster/race_high_Select_the_best_answer_no_instructions_ | lora |
| duorc_SelfRC_build_story_around_qa | phi-2 | zhan1993/openorca_task_id_cluster/duorc_SelfRC_build_story_around_qa | lora |
| cot_esnli_ii | phi-2 | zhan1993/openorca_task_id_cluster/cot_esnli_ii | lora |
| quail_no_prompt_text | phi-2 | zhan1993/openorca_task_id_cluster/quail_no_prompt_text | lora |
| ropes_given_background_situation | phi-2 | zhan1993/openorca_task_id_cluster/ropes_given_background_situation | lora |
| quarel_logic_test | phi-2 | zhan1993/openorca_task_id_cluster/quarel_logic_test | lora |
| adversarial_qa_dbidaf_based_on | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_dbidaf_based_on | lora |
| super_glue_copa_1_0_2 | phi-2 | zhan1993/openorca_task_id_cluster/super_glue_copa_1_0_2 | lora |
| cos_e_v1_11_i_think | phi-2 | zhan1993/openorca_task_id_cluster/cos_e_v1_11_i_think | lora |
| quail_context_question_description_answer_text | phi-2 | zhan1993/openorca_task_id_cluster/quail_context_question_description_answer_text | lora |
| math_dataset_algebra__linear_1d_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/math_dataset_algebra__linear_1d_1_0_0 | lora |
| cosmos_qa_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/cosmos_qa_1_0_0 | lora |
| wiqa_effect_with_label_answer | phi-2 | zhan1993/openorca_task_id_cluster/wiqa_effect_with_label_answer | lora |
| app_reviews_convert_to_star_rating | phi-2 | zhan1993/openorca_task_id_cluster/app_reviews_convert_to_star_rating | lora |
| qasc_qa_with_separated_facts_2 | phi-2 | zhan1993/openorca_task_id_cluster/qasc_qa_with_separated_facts_2 | lora |
| race_middle_Select_the_best_answer | phi-2 | zhan1993/openorca_task_id_cluster/race_middle_Select_the_best_answer | lora |
| quartz_having_read_above_passage | phi-2 | zhan1993/openorca_task_id_cluster/quartz_having_read_above_passage | lora |
| glue_qqp_2_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/glue_qqp_2_0_0 | lora |
| cos_e_v1_11_question_description_option_id | phi-2 | zhan1993/openorca_task_id_cluster/cos_e_v1_11_question_description_option_id | lora |
| stream_qed_ii | phi-2 | zhan1993/openorca_task_id_cluster/stream_qed_ii | lora |
| cos_e_v1_11_question_option_description_text | phi-2 | zhan1993/openorca_task_id_cluster/cos_e_v1_11_question_option_description_text | lora |
| imdb_reviews_plain_text_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/imdb_reviews_plain_text_1_0_0 | lora |
| wiki_hop_original_choose_best_object_affirmative_2 | phi-2 | zhan1993/openorca_task_id_cluster/wiki_hop_original_choose_best_object_affirmative_2 | lora |
| natural_questions_open_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/natural_questions_open_1_0_0 | lora |
| wiqa_effect_with_string_answer | phi-2 | zhan1993/openorca_task_id_cluster/wiqa_effect_with_string_answer | lora |
| cos_e_v1_11_rationale | phi-2 | zhan1993/openorca_task_id_cluster/cos_e_v1_11_rationale | lora |
| race_middle_Write_a_multi_choice_question_options_given_ | phi-2 | zhan1993/openorca_task_id_cluster/race_middle_Write_a_multi_choice_question_options_given_ | lora |
| wiki_bio_guess_person | phi-2 | zhan1993/openorca_task_id_cluster/wiki_bio_guess_person | lora |
| wiqa_does_the_supposed_perturbation_have_an_effect | phi-2 | zhan1993/openorca_task_id_cluster/wiqa_does_the_supposed_perturbation_have_an_effect | lora |
| trivia_qa_rc_1_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/trivia_qa_rc_1_1_0 | lora |
| lambada_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/lambada_1_0_0 | lora |
| quoref_Read_And_Extract_ | phi-2 | zhan1993/openorca_task_id_cluster/quoref_Read_And_Extract_ | lora |
| quail_context_description_question_answer_id | phi-2 | zhan1993/openorca_task_id_cluster/quail_context_description_question_answer_id | lora |
| quail_context_description_question_answer_text | phi-2 | zhan1993/openorca_task_id_cluster/quail_context_description_question_answer_text | lora |
| duorc_SelfRC_question_answering | phi-2 | zhan1993/openorca_task_id_cluster/duorc_SelfRC_question_answering | lora |
| cot_sensemaking_ii | phi-2 | zhan1993/openorca_task_id_cluster/cot_sensemaking_ii | lora |
| fix_punct | phi-2 | zhan1993/openorca_task_id_cluster/fix_punct | lora |
| squad_v1_1_3_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/squad_v1_1_3_0_0 | lora |
| coqa_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/coqa_1_0_0 | lora |
| glue_qnli_2_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/glue_qnli_2_0_0 | lora |
| wiki_qa_Jeopardy_style | phi-2 | zhan1993/openorca_task_id_cluster/wiki_qa_Jeopardy_style | lora |
| qasc_qa_with_separated_facts_5 | phi-2 | zhan1993/openorca_task_id_cluster/qasc_qa_with_separated_facts_5 | lora |
| glue_mnli_2_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/glue_mnli_2_0_0 | lora |
| wiki_bio_key_content | phi-2 | zhan1993/openorca_task_id_cluster/wiki_bio_key_content | lora |
| dream_generate_first_utterance | phi-2 | zhan1993/openorca_task_id_cluster/dream_generate_first_utterance | lora |
| quartz_read_passage_below_choose | phi-2 | zhan1993/openorca_task_id_cluster/quartz_read_passage_below_choose | lora |
| web_questions_question_answer | phi-2 | zhan1993/openorca_task_id_cluster/web_questions_question_answer | lora |
| glue_stsb_2_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/glue_stsb_2_0_0 | lora |
| wmt16_translate_tr_en_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/wmt16_translate_tr_en_1_0_0 | lora |
| cot_qasc | phi-2 | zhan1993/openorca_task_id_cluster/cot_qasc | lora |
| duorc_ParaphraseRC_title_generation | phi-2 | zhan1993/openorca_task_id_cluster/duorc_ParaphraseRC_title_generation | lora |
| quail_description_context_question_answer_id | phi-2 | zhan1993/openorca_task_id_cluster/quail_description_context_question_answer_id | lora |
| wiki_qa_Topic_Prediction_Question_Only | phi-2 | zhan1993/openorca_task_id_cluster/wiki_qa_Topic_Prediction_Question_Only | lora |
| quoref_Find_Answer | phi-2 | zhan1993/openorca_task_id_cluster/quoref_Find_Answer | lora |
| social_i_qa_I_was_wondering | phi-2 | zhan1993/openorca_task_id_cluster/social_i_qa_I_was_wondering | lora |
| wiki_hop_original_choose_best_object_affirmative_3 | phi-2 | zhan1993/openorca_task_id_cluster/wiki_hop_original_choose_best_object_affirmative_3 | lora |
| duorc_ParaphraseRC_build_story_around_qa | phi-2 | zhan1993/openorca_task_id_cluster/duorc_ParaphraseRC_build_story_around_qa | lora |
| qasc_qa_with_separated_facts_3 | phi-2 | zhan1993/openorca_task_id_cluster/qasc_qa_with_separated_facts_3 | lora |
| race_middle_Is_this_the_right_answer | phi-2 | zhan1993/openorca_task_id_cluster/race_middle_Is_this_the_right_answer | lora |
| paws_wiki_1_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/paws_wiki_1_1_0 | lora |
| app_reviews_categorize_rating_using_review | phi-2 | zhan1993/openorca_task_id_cluster/app_reviews_categorize_rating_using_review | lora |
| anli_r3_0_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/anli_r3_0_1_0 | lora |
| app_reviews_convert_to_rating | phi-2 | zhan1993/openorca_task_id_cluster/app_reviews_convert_to_rating | lora |
| wiqa_what_is_the_final_step_of_the_following_process | phi-2 | zhan1993/openorca_task_id_cluster/wiqa_what_is_the_final_step_of_the_following_process | lora |
| adversarial_qa_droberta_answer_the_following_q | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_droberta_answer_the_following_q | lora |
| wiki_qa_Decide_good_answer | phi-2 | zhan1993/openorca_task_id_cluster/wiki_qa_Decide_good_answer | lora |
| adversarial_qa_dbert_answer_the_following_q | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_dbert_answer_the_following_q | lora |
| gem_dart_1_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/gem_dart_1_1_0 | lora |
| adversarial_qa_dbert_tell_what_it_is | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_dbert_tell_what_it_is | lora |
| quarel_choose_between | phi-2 | zhan1993/openorca_task_id_cluster/quarel_choose_between | lora |
| duorc_ParaphraseRC_generate_question_by_answer | phi-2 | zhan1993/openorca_task_id_cluster/duorc_ParaphraseRC_generate_question_by_answer | lora |
| wiki_hop_original_generate_subject | phi-2 | zhan1993/openorca_task_id_cluster/wiki_hop_original_generate_subject | lora |
| dream_baseline | phi-2 | zhan1993/openorca_task_id_cluster/dream_baseline | lora |
| cos_e_v1_11_question_description_option_text | phi-2 | zhan1993/openorca_task_id_cluster/cos_e_v1_11_question_description_option_text | lora |
| aeslc_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/aeslc_1_0_0 | lora |
| anli_r2_0_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/anli_r2_0_1_0 | lora |
| dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | phi-2 | zhan1993/openorca_task_id_cluster/dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | lora |
| quail_context_question_description_answer_id | phi-2 | zhan1993/openorca_task_id_cluster/quail_context_question_description_answer_id | lora |
| race_middle_Select_the_best_answer_no_instructions_ | phi-2 | zhan1993/openorca_task_id_cluster/race_middle_Select_the_best_answer_no_instructions_ | lora |
| wmt16_translate_ro_en_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/wmt16_translate_ro_en_1_0_0 | lora |
| race_high_Is_this_the_right_answer | phi-2 | zhan1993/openorca_task_id_cluster/race_high_Is_this_the_right_answer | lora |
| quail_description_context_question_text | phi-2 | zhan1993/openorca_task_id_cluster/quail_description_context_question_text | lora |
| sciq_Direct_Question_Closed_Book_ | phi-2 | zhan1993/openorca_task_id_cluster/sciq_Direct_Question_Closed_Book_ | lora |
| openbookqa_0_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/openbookqa_0_1_0 | lora |
| duorc_SelfRC_title_generation | phi-2 | zhan1993/openorca_task_id_cluster/duorc_SelfRC_title_generation | lora |
| cot_gsm8k | phi-2 | zhan1993/openorca_task_id_cluster/cot_gsm8k | lora |
| quartz_answer_question_below | phi-2 | zhan1993/openorca_task_id_cluster/quartz_answer_question_below | lora |
| snli_1_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/snli_1_1_0 | lora |
| sciq_Multiple_Choice_Closed_Book_ | phi-2 | zhan1993/openorca_task_id_cluster/sciq_Multiple_Choice_Closed_Book_ | lora |
| cot_strategyqa | phi-2 | zhan1993/openorca_task_id_cluster/cot_strategyqa | lora |
| qasc_qa_with_separated_facts_4 | phi-2 | zhan1993/openorca_task_id_cluster/qasc_qa_with_separated_facts_4 | lora |
| ropes_prompt_bottom_no_hint | phi-2 | zhan1993/openorca_task_id_cluster/ropes_prompt_bottom_no_hint | lora |
| duorc_SelfRC_generate_question | phi-2 | zhan1993/openorca_task_id_cluster/duorc_SelfRC_generate_question | lora |
| quartz_given_the_fact_answer_the_q | phi-2 | zhan1993/openorca_task_id_cluster/quartz_given_the_fact_answer_the_q | lora |
| anli_r1_0_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/anli_r1_0_1_0 | lora |
| wiki_qa_Topic_Prediction_Question_and_Answer_Pair | phi-2 | zhan1993/openorca_task_id_cluster/wiki_qa_Topic_Prediction_Question_and_Answer_Pair | lora |
| wiki_qa_Direct_Answer_to_Question | phi-2 | zhan1993/openorca_task_id_cluster/wiki_qa_Direct_Answer_to_Question | lora |
| qasc_is_correct_2 | phi-2 | zhan1993/openorca_task_id_cluster/qasc_is_correct_2 | lora |
| wiki_hop_original_generate_subject_and_object | phi-2 | zhan1993/openorca_task_id_cluster/wiki_hop_original_generate_subject_and_object | lora |
| ai2_arc_ARC_Challenge_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/ai2_arc_ARC_Challenge_1_0_0 | lora |
| race_middle_Select_the_best_answer_generate_span_ | phi-2 | zhan1993/openorca_task_id_cluster/race_middle_Select_the_best_answer_generate_span_ | lora |
| quail_context_question_answer_description_text | phi-2 | zhan1993/openorca_task_id_cluster/quail_context_question_answer_description_text | lora |
| quail_context_question_description_text | phi-2 | zhan1993/openorca_task_id_cluster/quail_context_question_description_text | lora |
| wiki_hop_original_choose_best_object_interrogative_2 | phi-2 | zhan1993/openorca_task_id_cluster/wiki_hop_original_choose_best_object_interrogative_2 | lora |
| duorc_SelfRC_movie_director | phi-2 | zhan1993/openorca_task_id_cluster/duorc_SelfRC_movie_director | lora |
| quoref_Given_Context_Answer_Question | phi-2 | zhan1993/openorca_task_id_cluster/quoref_Given_Context_Answer_Question | lora |
| wiki_hop_original_explain_relation | phi-2 | zhan1993/openorca_task_id_cluster/wiki_hop_original_explain_relation | lora |
| super_glue_record_1_0_2 | phi-2 | zhan1993/openorca_task_id_cluster/super_glue_record_1_0_2 | lora |
| adversarial_qa_dbidaf_tell_what_it_is | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_dbidaf_tell_what_it_is | lora |
| cot_ecqa_ii | phi-2 | zhan1993/openorca_task_id_cluster/cot_ecqa_ii | lora |
| ropes_background_new_situation_answer | phi-2 | zhan1993/openorca_task_id_cluster/ropes_background_new_situation_answer | lora |
| web_questions_short_general_knowledge_q | phi-2 | zhan1993/openorca_task_id_cluster/web_questions_short_general_knowledge_q | lora |
| wiqa_what_might_be_the_first_step_of_the_process | phi-2 | zhan1993/openorca_task_id_cluster/wiqa_what_might_be_the_first_step_of_the_process | lora |
| duorc_SelfRC_answer_question | phi-2 | zhan1993/openorca_task_id_cluster/duorc_SelfRC_answer_question | lora |
| ag_news_subset_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/ag_news_subset_1_0_0 | lora |
| race_middle_Write_a_multi_choice_question_for_the_following_article | phi-2 | zhan1993/openorca_task_id_cluster/race_middle_Write_a_multi_choice_question_for_the_following_article | lora |
| wmt14_translate_fr_en_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/wmt14_translate_fr_en_1_0_0 | lora |
| sciq_Direct_Question | phi-2 | zhan1993/openorca_task_id_cluster/sciq_Direct_Question | lora |
| super_glue_multirc_1_0_2 | phi-2 | zhan1993/openorca_task_id_cluster/super_glue_multirc_1_0_2 | lora |
| dbpedia_14_given_a_choice_of_categories_ | phi-2 | zhan1993/openorca_task_id_cluster/dbpedia_14_given_a_choice_of_categories_ | lora |
| super_glue_wic_1_0_2 | phi-2 | zhan1993/openorca_task_id_cluster/super_glue_wic_1_0_2 | lora |
| social_i_qa_Show_choices_and_generate_answer | phi-2 | zhan1993/openorca_task_id_cluster/social_i_qa_Show_choices_and_generate_answer | lora |
| wiqa_what_might_be_the_last_step_of_the_process | phi-2 | zhan1993/openorca_task_id_cluster/wiqa_what_might_be_the_last_step_of_the_process | lora |
| quoref_Answer_Question_Given_Context | phi-2 | zhan1993/openorca_task_id_cluster/quoref_Answer_Question_Given_Context | lora |
| quoref_Context_Contains_Answer | phi-2 | zhan1993/openorca_task_id_cluster/quoref_Context_Contains_Answer | lora |
| cos_e_v1_11_description_question_option_text | phi-2 | zhan1993/openorca_task_id_cluster/cos_e_v1_11_description_question_option_text | lora |
| adversarial_qa_dbert_based_on | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_dbert_based_on | lora |
| multi_news_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/multi_news_1_0_0 | lora |
| cos_e_v1_11_generate_explanation_given_text | phi-2 | zhan1993/openorca_task_id_cluster/cos_e_v1_11_generate_explanation_given_text | lora |
| true_case | phi-2 | zhan1993/openorca_task_id_cluster/true_case | lora |
| duorc_ParaphraseRC_movie_director | phi-2 | zhan1993/openorca_task_id_cluster/duorc_ParaphraseRC_movie_director | lora |
| quartz_answer_question_based_on | phi-2 | zhan1993/openorca_task_id_cluster/quartz_answer_question_based_on | lora |
| bool_q_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/bool_q_1_0_0 | lora |
| quoref_Guess_Answer | phi-2 | zhan1993/openorca_task_id_cluster/quoref_Guess_Answer | lora |
| quarel_do_not_use | phi-2 | zhan1993/openorca_task_id_cluster/quarel_do_not_use | lora |
| cos_e_v1_11_explain_why_human | phi-2 | zhan1993/openorca_task_id_cluster/cos_e_v1_11_explain_why_human | lora |
| wiki_qa_Generate_Question_from_Topic | phi-2 | zhan1993/openorca_task_id_cluster/wiki_qa_Generate_Question_from_Topic | lora |
| kilt_tasks_hotpotqa_straighforward_qa | phi-2 | zhan1993/openorca_task_id_cluster/kilt_tasks_hotpotqa_straighforward_qa | lora |
| adversarial_qa_dbidaf_generate_question | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_dbidaf_generate_question | lora |
| dbpedia_14_pick_one_category_for_the_following_text | phi-2 | zhan1993/openorca_task_id_cluster/dbpedia_14_pick_one_category_for_the_following_text | lora |
| kilt_tasks_hotpotqa_final_exam | phi-2 | zhan1993/openorca_task_id_cluster/kilt_tasks_hotpotqa_final_exam | lora |
| quoref_Answer_Friend_Question | phi-2 | zhan1993/openorca_task_id_cluster/quoref_Answer_Friend_Question | lora |
| race_high_Write_a_multi_choice_question_for_the_following_article | phi-2 | zhan1993/openorca_task_id_cluster/race_high_Write_a_multi_choice_question_for_the_following_article | lora |
| ropes_prompt_beginning | phi-2 | zhan1993/openorca_task_id_cluster/ropes_prompt_beginning | lora |
| adversarial_qa_dbert_question_context_answer | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_dbert_question_context_answer | lora |
| cot_creak | phi-2 | zhan1993/openorca_task_id_cluster/cot_creak | lora |
| gem_e2e_nlg_1_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/gem_e2e_nlg_1_1_0 | lora |
| cos_e_v1_11_description_question_option_id | phi-2 | zhan1993/openorca_task_id_cluster/cos_e_v1_11_description_question_option_id | lora |
| social_i_qa_Generate_the_question_from_the_answer | phi-2 | zhan1993/openorca_task_id_cluster/social_i_qa_Generate_the_question_from_the_answer | lora |
| quarel_heres_a_story | phi-2 | zhan1993/openorca_task_id_cluster/quarel_heres_a_story | lora |
| social_i_qa_Check_if_a_random_answer_is_valid_or_not | phi-2 | zhan1993/openorca_task_id_cluster/social_i_qa_Check_if_a_random_answer_is_valid_or_not | lora |
| ropes_background_situation_middle | phi-2 | zhan1993/openorca_task_id_cluster/ropes_background_situation_middle | lora |
| sciq_Multiple_Choice_Question_First | phi-2 | zhan1993/openorca_task_id_cluster/sciq_Multiple_Choice_Question_First | lora |
| cot_strategyqa_ii | phi-2 | zhan1993/openorca_task_id_cluster/cot_strategyqa_ii | lora |
| huggingface_xsum | phi-2 | zhan1993/openorca_task_id_cluster/huggingface_xsum | lora |
| kilt_tasks_hotpotqa_complex_question | phi-2 | zhan1993/openorca_task_id_cluster/kilt_tasks_hotpotqa_complex_question | lora |
| wmt16_translate_fi_en_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/wmt16_translate_fi_en_1_0_0 | lora |
| ai2_arc_ARC_Easy_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/ai2_arc_ARC_Easy_1_0_0 | lora |
| stream_qed | phi-2 | zhan1993/openorca_task_id_cluster/stream_qed | lora |
| definite_pronoun_resolution_1_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/definite_pronoun_resolution_1_1_0 | lora |
| super_glue_rte_1_0_2 | phi-2 | zhan1993/openorca_task_id_cluster/super_glue_rte_1_0_2 | lora |
| ropes_new_situation_background_answer | phi-2 | zhan1993/openorca_task_id_cluster/ropes_new_situation_background_answer | lora |
| dream_read_the_following_conversation_and_answer_the_question | phi-2 | zhan1993/openorca_task_id_cluster/dream_read_the_following_conversation_and_answer_the_question | lora |
| cot_sensemaking | phi-2 | zhan1993/openorca_task_id_cluster/cot_sensemaking | lora |
| wiki_qa_Topic_Prediction_Answer_Only | phi-2 | zhan1993/openorca_task_id_cluster/wiki_qa_Topic_Prediction_Answer_Only | lora |
| duorc_ParaphraseRC_generate_question | phi-2 | zhan1993/openorca_task_id_cluster/duorc_ParaphraseRC_generate_question | lora |
| dream_generate_last_utterance | phi-2 | zhan1993/openorca_task_id_cluster/dream_generate_last_utterance | lora |
| race_middle_Taking_a_test | phi-2 | zhan1993/openorca_task_id_cluster/race_middle_Taking_a_test | lora |
| piqa_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/piqa_1_0_0 | lora |
| cot_ecqa | phi-2 | zhan1993/openorca_task_id_cluster/cot_ecqa | lora |
| glue_mrpc_2_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/glue_mrpc_2_0_0 | lora |
| race_middle_Read_the_article_and_answer_the_question_no_option_ | phi-2 | zhan1993/openorca_task_id_cluster/race_middle_Read_the_article_and_answer_the_question_no_option_ | lora |
| ropes_plain_background_situation | phi-2 | zhan1993/openorca_task_id_cluster/ropes_plain_background_situation | lora |
| quail_description_context_question_answer_text | phi-2 | zhan1993/openorca_task_id_cluster/quail_description_context_question_answer_text | lora |
| qasc_qa_with_combined_facts_1 | phi-2 | zhan1993/openorca_task_id_cluster/qasc_qa_with_combined_facts_1 | lora |
| cot_creak_ii | phi-2 | zhan1993/openorca_task_id_cluster/cot_creak_ii | lora |
| duorc_ParaphraseRC_decide_worth_it | phi-2 | zhan1993/openorca_task_id_cluster/duorc_ParaphraseRC_decide_worth_it | lora |
| quoref_Answer_Test | phi-2 | zhan1993/openorca_task_id_cluster/quoref_Answer_Test | lora |
| wiki_bio_who | phi-2 | zhan1993/openorca_task_id_cluster/wiki_bio_who | lora |
| kilt_tasks_hotpotqa_formulate | phi-2 | zhan1993/openorca_task_id_cluster/kilt_tasks_hotpotqa_formulate | lora |
| glue_wnli_2_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/glue_wnli_2_0_0 | lora |
| gigaword_1_2_0 | phi-2 | zhan1993/openorca_task_id_cluster/gigaword_1_2_0 | lora |
| quail_context_description_question_text | phi-2 | zhan1993/openorca_task_id_cluster/quail_context_description_question_text | lora |
| dream_answer_to_dialogue | phi-2 | zhan1993/openorca_task_id_cluster/dream_answer_to_dialogue | lora |
| cos_e_v1_11_question_option_description_id | phi-2 | zhan1993/openorca_task_id_cluster/cos_e_v1_11_question_option_description_id | lora |
| duorc_ParaphraseRC_question_answering | phi-2 | zhan1993/openorca_task_id_cluster/duorc_ParaphraseRC_question_answering | lora |
| wiki_qa_automatic_system | phi-2 | zhan1993/openorca_task_id_cluster/wiki_qa_automatic_system | lora |
| adversarial_qa_droberta_based_on | phi-2 | zhan1993/openorca_task_id_cluster/adversarial_qa_droberta_based_on | lora |
| super_glue_wsc_fixed_1_0_2 | phi-2 | zhan1993/openorca_task_id_cluster/super_glue_wsc_fixed_1_0_2 | lora |
| word_segment | phi-2 | zhan1993/openorca_task_id_cluster/word_segment | lora |
| quac_1_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/quac_1_0_0 | lora |
| quartz_paragraph_question_plain_concat | phi-2 | zhan1993/openorca_task_id_cluster/quartz_paragraph_question_plain_concat | lora |
| wiqa_which_of_the_following_is_the_supposed_perturbation | phi-2 | zhan1993/openorca_task_id_cluster/wiqa_which_of_the_following_is_the_supposed_perturbation | lora |
| quartz_use_info_from_question_paragraph | phi-2 | zhan1993/openorca_task_id_cluster/quartz_use_info_from_question_paragraph | lora |
| ropes_plain_no_background | phi-2 | zhan1993/openorca_task_id_cluster/ropes_plain_no_background | lora |
| race_high_Select_the_best_answer_generate_span_ | phi-2 | zhan1993/openorca_task_id_cluster/race_high_Select_the_best_answer_generate_span_ | lora |
| glue_cola_2_0_0 | phi-2 | zhan1993/openorca_task_id_cluster/glue_cola_2_0_0 | lora |
| social_i_qa_Show_choices_and_generate_index | phi-2 | zhan1993/openorca_task_id_cluster/social_i_qa_Show_choices_and_generate_index | lora |
| ropes_prompt_bottom_hint_beginning | phi-2 | zhan1993/openorca_task_id_cluster/ropes_prompt_bottom_hint_beginning | lora |
| hellaswag_1_1_0 | phi-2 | zhan1993/openorca_task_id_cluster/hellaswag_1_1_0 | lora |
Last updated on: 2024-06-06 14:39:19+00:00
| [
"SCIQ"
] |
Mohamedfadil369/brainsait | Mohamedfadil369 | reinforcement-learning | [
"fastai",
"medical",
"reinforcement-learning",
"ar",
"dataset:HuggingFaceFW/fineweb",
"doi:10.57967/hf/2426",
"license:mit",
"region:us"
] | "2024-06-08T01:56:44Z" | 2024-06-08T02:20:46+00:00 | 0 | 0 | ---
datasets:
- HuggingFaceFW/fineweb
language:
- ar
library_name: fastai
license: mit
pipeline_tag: reinforcement-learning
tags:
- medical
---
# Model Card for BrainSAIT Model
<!-- Provide a quick summary of what the model is/does. -->
This model card aims to provide detailed information about the BrainSAIT model. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1).
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
BrainSAIT is a reinforcement learning model developed for medical applications. It has been fine-tuned using the Arabic language dataset from HuggingFaceFW/fineweb. The model utilizes the fastai library for its implementation.
- **Developed by:** BrainSAIT Team
- **Funded by [optional]:** [Dr. Mohamed El Fadil]
- **Shared by [optional]:** [Dr. Mohamed El Fadil]
- **Model type:** Reinforcement Learning
- **Language(s) (NLP):** Arabic
- **License:** MIT
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
The BrainSAIT model can be used directly for tasks related to medical data analysis and decision-making support in the Arabic language.
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
The model can be further fine-tuned for specific medical applications or integrated into larger medical decision support systems.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
The model is not suitable for non-medical applications or for tasks requiring expertise in languages other than Arabic.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
The model may have biases originating from the training data, which is specific to Arabic medical content. It may not perform well on non-Arabic data or non-medical contexts.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be aware of the risks, biases, and limitations of the model. Proper validation in the specific use case is recommended before deployment.
## How to Get Started with the Model
Use the code below to get started with the model.
```python
from fastai.text.all import *
# Load the model
learn = load_learner('path_to_your_model.pkl')
# Use the model for prediction
text = "Your input text here"
prediction = learn.predict(text)
print(prediction)
```
| [
"MEDICAL DATA"
] |
ThorBaller/Pubmed_llama3 | ThorBaller | question-answering | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"llama",
"trl",
"question-answering",
"en",
"dataset:qiaojin/PubMedQA",
"base_model:unsloth/llama-3-8b-bnb-4bit",
"base_model:finetune:unsloth/llama-3-8b-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | "2024-06-08T19:32:04Z" | 2024-06-08T22:46:16+00:00 | 0 | 0 | ---
base_model: unsloth/llama-3-8b-bnb-4bit
datasets:
- qiaojin/PubMedQA
language:
- en
license: apache-2.0
pipeline_tag: question-answering
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
---
# Uploaded model
- **Developed by:** ThorBaller
- **License:** apache-2.0
- **Finetuned from model:** unsloth/llama-3-8b-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
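
Given the `transformers` and `safetensors` tags above, the checkpoint can most likely be loaded with the standard `transformers` API. The snippet below is a minimal sketch rather than an official example: it assumes the repository holds merged causal-LM weights (not a bare adapter) and uses an illustrative PubMedQA-style prompt.

```python
# Hedged sketch: assumes ThorBaller/Pubmed_llama3 contains merged causal-LM weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ThorBaller/Pubmed_llama3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Illustrative PubMedQA-style prompt; the actual training template may differ.
prompt = "Question: Does vitamin D supplementation reduce fracture risk?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```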
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth) | [
"PUBMEDQA"
] |
clinicalnlplab/me-llama | clinicalnlplab | null | [
"transformers",
"medical",
"health",
"llama",
"llama2",
"en",
"dataset:togethercomputer/RedPajama-Data-1T",
"dataset:bigbio/med_qa",
"arxiv:2402.12749",
"license:llama2",
"endpoints_compatible",
"region:us"
] | "2024-06-10T04:02:12Z" | 2024-06-10T04:14:29+00:00 | 0 | 12 | ---
datasets:
- togethercomputer/RedPajama-Data-1T
- bigbio/med_qa
language:
- en
library_name: transformers
license: llama2
tags:
- medical
- health
- llama
- llama2
---
# Me-LLaMA
## Model Overview
The Me-LLaMA model consists of two foundation models: Me-LLaMA 13B and Me-LLaMA 70B, along with their chat-enhanced counterparts, Me-LLaMA 13B-chat and Me-LLaMA 70B-chat. These models are designed for superior chat and instruction-following capabilities. The Me-LLaMA 13B and 70B were continually pretrained from the base LLaMA 2 13B and 70B models with the addition of biomedical, clinical, and general domain data. The chat versions were further instruction-tuned using comprehensive medical instruction tuning data.
## Pretraining and Data
Me-LLaMA was developed through continual pre-training and instruction tuning of LLaMA2, incorporating 129B tokens and 214K instruction tuning samples from general, biomedical, and clinical domains. The pretraining data consists of biomedical literature, clinical notes, and general domain data in a 15:1:4 ratio, sourced from:
- **Biomedical:** PubMed Central and PubMed Abstracts (Pile dataset)
- **Clinical:** De-identified free-text clinical notes from MIMIC III, MIMIC-IV, and MIMIC-CXR
- **General Domain:** Subset from the RedPajama dataset
The instruction tuning dataset includes:
- **General Domain:** Alpaca, Dolly, and ShareGPT datasets
- **Biomedical:** HealthCareMagic, Icliniq, MedInstruct, Medical Flash Cards, MEDIQA, MedicationQA, LiveQA, WikiDocPatient, Guideline QA, PubMed Central, PubMed, UMLS Knowledge graph
- **Clinical:** MIMIC-III and MIMIC-IV
## Evaluation
Me-LLaMA was evaluated on 12 datasets across different tasks:
- **QA:** PubMedQA, MedQA, MedMCQA, EmrQA
- **NER:** 2010 i2b2
- **Relation Extraction:** 2013 DDI
- **Classification:** HoC, MTSample
- **Text Summarization:** PubMed, MIMIC-CXR
- **NLI:** BioNLI, MedNLI
### Performance
- **Me-LLaMA 13B:** Surpassed PMC-LLaMA 13B on 11/12 datasets and LLaMA2 13B on 10/12 datasets, with competitive performance against larger models like LLaMA2 70B and Meditron 70B on 8/12 datasets.
- **Me-LLaMA 70B:** Outperformed LLaMA2 70B and Meditron 70B on 9/12 datasets.
- **Zero-shot setting:** Outperformed ChatGPT on 5/8 datasets without privacy concerns, and GPT-4 on 1/8 datasets.
- **Task-specific instruction tuning:** Surpassed ChatGPT on 7/8 and GPT-4 on 5/8 datasets.
Despite having significantly fewer parameters (13B/70B vs. 175B+ for ChatGPT and GPT-4), Me-LLaMA models demonstrated impressive performance and strong abilities in supervised and in-context learning across various medical tasks.
## Model Details
Included in this repository are four models:
1. **Me-LLaMA 13B:** Continually pretrained from LLaMA 2 13B.
2. **Me-LLaMA 70B:** Continually pretrained from LLaMA 2 70B.
3. **Me-LLaMA 13B-chat:** Further instruction-tuned from Me-LLaMA 13B using a variety of general, biomedical, and clinical datasets.
4. **Me-LLaMA 70B-chat:** Further instruction-tuned from Me-LLaMA 70B using a variety of general, biomedical, and clinical datasets.
Each model contains several files, which are standard with the transformers library:
- **config.json:** Information about the model
- **model-x-of-y.safetensors:** Model weights
- **generation_config.json:** Settings for text generation
- **special_tokens_map.json:** Special tokens used in training
- **tokenizer.json:** Mapping from indices to tokens
- **tokenizer_config.json:** Configuration file for the tokenizer
## Usage
For more details and to access the models, please visit the [Me-LLaMA repository on PhysioNet](https://physionet.org/content/me-llama/1.0.0/).
For more technical details, please visit [our paper on arXiv](https://arxiv.org/abs/2402.12749).
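
Assuming the released files follow the standard `transformers` layout listed above, a locally downloaded copy can be loaded as in the minimal sketch below. The directory name `./me-llama-13b` and the prompt are placeholders, not part of the official release.

```python
# Hedged sketch: "./me-llama-13b" is a placeholder for the directory holding the
# config.json, tokenizer files, and *.safetensors shards downloaded from PhysioNet.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_dir = "./me-llama-13b"
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForCausalLM.from_pretrained(model_dir, device_map="auto")

prompt = "A 55-year-old patient presents with chest pain and shortness of breath. List likely differential diagnoses."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```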
| [
"MEDNLI",
"MEDQA",
"PUBMEDQA"
] |
Sreedev11/olympics_prediction_model | Sreedev11 | null | [
"region:us"
] | "2024-06-10T14:48:07Z" | 2024-06-10T14:57:56+00:00 | 0 | 0 | ---
{}
---
```python
# %%
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn import preprocessing
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report,accuracy_score
from sklearn.model_selection import TimeSeriesSplit,train_test_split
from sklearn.cluster import KMeans
import matplotlib
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn import metrics
from sklearn.svm import LinearSVC
import pylab as pl
from sklearn.ensemble import RandomForestClassifier
import warnings
warnings.filterwarnings('ignore')
df=pd.read_csv("athlete_events.csv")
# %%
df
# %%
df.head()
# %%
df.info()
# %%
df.describe()
# %%
df.dtypes
# %%
df.ndim
# %%
df.shape
# %%
df.isna().sum()
# %%
# DNW: Did Not Win; missing values of Medal are filled with DNW
df['Medal'].fillna("DNW",inplace=True)
# %%
df_noc=pd.read_csv("noc_regions.csv")
# %%
df_noc
# %%
df_noc=df_noc.drop("notes",axis=1)
# %%
df_noc
# %%
df_noc.rename(columns={"region":"country"},inplace=True)
# %%
df_noc
# %%
df.sample(4)
# %%
# Join both datasets
olympics_merge=df.merge(df_noc,left_on='NOC',right_on='NOC',how='left')
# %%
olympics_merge.sample()
# %%
print(olympics_merge.loc[olympics_merge['country'].isnull(),['NOC', 'Team']].drop_duplicates())
# %%
# Replace missing Teams by the values 1. SGP - Singapore
# 2. ROT - Refugee Olympic Athletes
# 3. UNK - Unknown
# 4. TUV - Tuvalu
#olympics_merge.loc[olympics_merge['Country'].isnull(), ['Country']] = olympics_merge['Team']
# %%
olympics_merge.loc[olympics_merge['country'].isnull(), ['country']] = olympics_merge['Team']
# %%
olympics_merge
# %%
print(olympics_merge.loc[olympics_merge['country'].isnull(),['NOC', 'Team']].drop_duplicates())
# %%
olympics_merge['country'] = np.where(olympics_merge['NOC']=='SGP', 'Singapore', olympics_merge['country'])
olympics_merge['country'] = np.where(olympics_merge['NOC']=='ROT', 'Refugee Olympic Athletes', olympics_merge['country'])
olympics_merge['country'] = np.where(olympics_merge['NOC']=='UNK', 'Unknown', olympics_merge['country'])
olympics_merge['country'] = np.where(olympics_merge['NOC']=='TUV', 'Tuvalu', olympics_merge['country'])
# %%
olympics_merge
# %%
olympics_merge.drop("Team",axis=1,inplace=True)
# %%
olympics_merge.sample()
# %%
olympics_merge.rename(columns={'country':'Team'},inplace=True)
# %%
olympics_merge.head(2)
# %%
print(olympics_merge.loc[olympics_merge['Team'].isnull(),['NOC', 'Team']].drop_duplicates())
# %%
olympics_merge.isnull().sum()
# %%
for i in ["Age","Height","Weight"]:
sns.histplot(olympics_merge[i],kde=True)
plt.show()
# %%
for i in ["Age","Weight",]:
olympics_merge[i]=olympics_merge[i].fillna(olympics_merge[i].mean())
# %%
olympics_merge["Height"]=olympics_merge["Height"].fillna(olympics_merge["Height"].mean())
# %%
olympics_merge.isnull().sum()
# %%
olympics_merge.info()
# %%
olympics_merge['Sex']=np.where(olympics_merge['Sex']=='M',1,0)
# %%
olympics_merge.sample(2)
# %%
olympics_merge["Medal"].unique()
# %%
olympics_merge['Event'].unique()
# %%
olympics_merge['Sport'].unique()
# %%
olympics_merge1=olympics_merge
# %%
olympics_merge1
# %%
from sklearn.preprocessing import LabelEncoder
le=LabelEncoder()
# %%
olympics_merge1['Medal']=le.fit_transform(olympics_merge1['Medal'])
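# LabelEncoder assigns integer codes to the sorted unique labels, so Bronze/DNW/Gold/Silver
# should map to 0/1/2/3 here; le.classes_ can be checked to confirm the mapping.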
# %%
olympics_merge1
# %%
olympics_merge1['Medal'].unique()
# %%
summer=olympics_merge1.loc[(olympics_merge1['Year']>1960)&(olympics_merge1['Season']=="Summer"), :]
summer.head(5)
# %%
summer=summer.reset_index()
summer.head(10)
# %%
summer.sample()
# %%
# Extract the unique events into a new list
# %%
summerlistunique=summer.Event.unique()
len(summerlistunique)
# %%
summerlistunique
# %%
summer.drop(['Season'],axis=1,inplace=True)
summer.drop(['NOC'],axis=1,inplace=True)
summer.drop(['Games'],axis=1,inplace=True)
summer.drop(['City'],axis=1,inplace=True)
summer.drop(['Year'],axis=1,inplace=True)
summer.drop(['Sport'],axis=1,inplace=True)
summer.drop(['ID'],axis=1,inplace=True)
summer.drop(['Name'],axis=1,inplace=True)
summer.drop(['index'],axis=1,inplace=True)
# %%
summer
# %%
# Create columns holding the team and event labels in encoded numerical form in the original dataset
summer['Team_encode']=le.fit_transform(summer['Team'])
summer['Event_encode']=le.fit_transform(summer['Event'])
# %%
# Store the team names and their encoded numerical values in a new CSV file after dropping duplicate team rows
TeamKeys=summer[['Team','Team_encode']].copy()
TeamKeys.drop_duplicates(subset="Team",inplace=True)
TeamKeys.to_csv("keystoteam.csv")
# %%
TeamKeys.head(4)
# %%
# Store the event names and their encoded numerical values in a new CSV file after dropping duplicate event rows
EventKeys=summer[['Event','Event_encode']].copy()
EventKeys.drop_duplicates(subset="Event",inplace=True)
EventKeys.to_csv("keystoevent.csv")
# %%
EventKeys.head(4)
# %%
summer
# %%
summer.drop(['Event'],axis=1,inplace=True)
summer.drop(['Team'],axis=1,inplace=True)
# %%
summer
# %%
y=summer['Medal']
# %%
y
# %%
x=summer.drop("Medal",axis=1)
# %%
x
# %%
X_train, X_test, Y_train, Y_test = train_test_split(x,y,test_size=0.30, random_state=99)
# %%
x
# %%
y
# %%
X_test
# %%
Y_test
# %%
#ALGORITHM 1 LOGISTIC REGRESSION
# %%
lr=LogisticRegression()
lr.fit(X_train,Y_train)
Y_pred=lr.predict(X_test)
sk_report=classification_report(digits=6,y_true=Y_test,y_pred=Y_pred)
print("Accuracy",round(accuracy_score(Y_pred,Y_test)*100,2))
print(sk_report)
print(pd.crosstab(Y_test,Y_pred,rownames=['Actual'],colnames=['Predicted'],margins=True))
# %%
#ALGORITHM 2 DECISION TREE
# %%
decision_tree = DecisionTreeClassifier()
decision_tree.fit(X_train, Y_train)
Y_pred = decision_tree.predict(X_test)
acc_decision_tree1 = round(decision_tree.score(X_test, Y_test) * 100, 2)
sk_report = classification_report(digits=6, y_true=Y_test, y_pred=Y_pred)
print("Accuracy", acc_decision_tree1)
print(sk_report)
### Confusion Matrix
print(pd.crosstab(Y_test, Y_pred,rownames=['Actual'],colnames=['Predicted'],margins=True))
# %%
#ALGORITHM 3 RANDOM FOREST
# %%
random_forest = RandomForestClassifier(n_estimators=200)
random_forest.fit(X_train,Y_train)
Y_pred = random_forest.predict(X_test)
random_forest.score(X_test, Y_test)
acc_random_forest1=round(random_forest.score(X_test, Y_test)*100,2)
sk_report = classification_report(digits=6, y_true=Y_test, y_pred=Y_pred)
print("Accuracy", acc_random_forest1)
print(sk_report)
pd.crosstab(Y_test, Y_pred,rownames=['Actual'],colnames=['Predicted'],margins=True)
# %%
x.sample(5)
# %%
y.sample(5)
# %%
summer.sample(4)
# %%
random_forest.predict([[1,19.0,173.0,70.0,87,163]])
# %%
import pickle
from joblib import dump,load
dump(random_forest,'olympics_model.pkl')
model_file = open(r"Projects\Olympics\olympics_model1.pkl","wb")
pickle.dump(random_forest,model_file)
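
The script ends by persisting the trained random forest with joblib and pickle. Below is a minimal sketch of reloading the saved estimator for a single prediction; the feature values are illustrative and follow the column order Sex, Age, Height, Weight, Team_encode, Event_encode used during training.

```python
# Hedged sketch: reload the estimator saved above and run one prediction.
from joblib import load

model = load("olympics_model.pkl")
# Feature order: Sex, Age, Height, Weight, Team_encode, Event_encode (illustrative values).
# sklearn may warn about missing feature names because the model was fit on a DataFrame.
sample = [[1, 19.0, 173.0, 70.0, 87, 163]]
print(model.predict(sample))
```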
| [
"MEDAL"
] |
ahricat/ONI | ahricat | graph-ml | [
"graph-ml",
"en",
"dataset:zalando-datasets/fashion_mnist",
"dataset:ylecun/mnist",
"dataset:SamPIngram/tinyshakespeare",
"dataset:ncbi/pubmed",
"dataset:ncbi/ncbi_disease",
"dataset:hendrycks/ethics",
"dataset:microsoft/orca-math-word-problems-200k",
"dataset:taesiri/imagenet-hard",
"doi:10.57967/hf/2473",
"license:other",
"region:us"
] | "2024-06-11T19:49:29Z" | 2024-06-11T20:47:04+00:00 | 0 | 0 | ---
datasets:
- zalando-datasets/fashion_mnist
- ylecun/mnist
- SamPIngram/tinyshakespeare
- ncbi/pubmed
- ncbi/ncbi_disease
- hendrycks/ethics
- microsoft/orca-math-word-problems-200k
- taesiri/imagenet-hard
language:
- en
license: other
license_name: pantheum-license
license_link: LICENSE
metrics:
- accuracy
model_name: ONI-Hyper-MM
pipeline_tag: graph-ml
---
# Oni from Pretrained
## This is a multimodal hypergraph model with a low parameter count (800 million)
"NCBI DISEASE"
] |
raghavlight/TDTE | raghavlight | null | [
"safetensors",
"mteb",
"model-index",
"region:us"
] | "2024-06-13T00:41:23Z" | 2024-06-13T02:52:26+00:00 | 0 | 3 | ---
tags:
- mteb
model-index:
- name: 0523_mistralv2_sum3echo512_bbcc_8_16_16
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 79.65671641791045
- type: ap
value: 44.24063991266868
- type: f1
value: 73.91766997954294
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 94.480125
- type: ap
value: 92.21829806116952
- type: f1
value: 94.47801150800291
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 48.157999999999994
- type: f1
value: 47.11858175135973
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 31.935000000000002
- type: map_at_10
value: 49.482
- type: map_at_100
value: 49.482
- type: map_at_1000
value: 49.482
- type: map_at_20
value: 49.482
- type: map_at_3
value: 44.464
- type: map_at_5
value: 47.569
- type: mrr_at_1
value: 33.001000000000005
- type: mrr_at_10
value: 49.989
- type: mrr_at_100
value: 49.989
- type: mrr_at_1000
value: 49.989
- type: mrr_at_20
value: 49.989
- type: mrr_at_3
value: 44.903
- type: mrr_at_5
value: 48.054
- type: ndcg_at_1
value: 31.935000000000002
- type: ndcg_at_10
value: 58.819
- type: ndcg_at_100
value: 58.819
- type: ndcg_at_1000
value: 58.819
- type: ndcg_at_20
value: 58.819
- type: ndcg_at_3
value: 48.620000000000005
- type: ndcg_at_5
value: 54.230000000000004
- type: precision_at_1
value: 31.935000000000002
- type: precision_at_10
value: 8.841000000000001
- type: precision_at_100
value: 0.8840000000000001
- type: precision_at_1000
value: 0.08800000000000001
- type: precision_at_20
value: 4.42
- type: precision_at_3
value: 20.223
- type: precision_at_5
value: 14.865
- type: recall_at_1
value: 31.935000000000002
- type: recall_at_10
value: 88.407
- type: recall_at_100
value: 88.407
- type: recall_at_1000
value: 88.407
- type: recall_at_20
value: 88.407
- type: recall_at_3
value: 60.669
- type: recall_at_5
value: 74.324
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 48.7848435754835
- type: v_measures
value:
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- 0.4921178407713082
- 0.4900811693910433
- 0.5035743243257481
- 0.49769690824686913
- 0.484482240428649
- 0.48877156706650865
- 0.4917783921004695
- 0.490848646915023
- 0.49292827306716547
- 0.4667863103804292
- 0.5663892295430093
- 0.5668130433770879
- 0.5621288042146693
- 0.5658463909906998
- 0.5669889138453401
- 0.5678202745454832
- 0.5686559823111067
- 0.5672351018082963
- 0.554891045405333
- 0.5661694307954689
- 0.5309350425293812
- 0.2938608518329288
- 0.4844129096095996
- 0.4282763304977941
- 0.3635291849887843
- 0.2962076070268785
- 0.30324674572414795
- 0.24299400753636727
- 0.34506718232232675
- 1.0
- 0.28276775680196714
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 46.10665257880071
- type: v_measures
value:
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- 0.4791303592299426
- 0.47312049608032497
- 0.4855223164775998
- 0.4571429771751102
- 0.4762861002816672
- 0.48218700555188587
- 0.4774159340612887
- 0.4706669107168955
- 0.4817074105941521
- 0.46122831822845595
- 0.5323998509009684
- 0.5366144743504581
- 0.5350659892124341
- 0.5348097376189661
- 0.5361859305887842
- 0.5401424740226736
- 0.5386301513493418
- 0.536195294071538
- 0.5307019767098927
- 0.529430500641798
- 0.48993023034390504
- 0.24840671183096288
- 0.41882293476660615
- 0.3892318610333167
- 0.325751283253651
- 0.24324245195504823
- 0.2853604795144245
- 0.23061705991870918
- 0.31166614557038164
- 1.0
- 0.2554489333770363
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 66.7285956124022
- type: mrr
value: 79.72233214615486
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 88.73245869702066
- type: cos_sim_spearman
value: 87.28451895745819
- type: euclidean_pearson
value: 86.44569617089661
- type: euclidean_spearman
value: 86.7236628044763
- type: manhattan_pearson
value: 86.50853979799092
- type: manhattan_spearman
value: 86.75920578302187
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 88.91233766233766
- type: f1
value: 88.86315189747688
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 38.7850808112868
- type: v_measures
value:
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- 0.3774981384705982
- 0.37131831220293676
- 0.39149988570912153
- 0.38703497665413544
- 0.40930675826264357
- 0.3910216974623904
- 0.4081723486035933
- 0.387862617887449
- 0.38352827892371627
- 0.371265066952095
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 37.37318034700008
- type: v_measures
value:
- 0.36845423004088185
- 0.38992061254062366
- 0.3717948730004672
- 0.36026627188254456
- 0.3669860108798917
- 0.36731355824516293
- 0.375291529012098
- 0.38550090432534534
- 0.36577228218454805
- 0.38601776258844467
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 39.232
- type: map_at_10
value: 53.04299999999999
- type: map_at_100
value: 53.04299999999999
- type: map_at_1000
value: 53.04299999999999
- type: map_at_20
value: 53.04299999999999
- type: map_at_3
value: 48.588
- type: map_at_5
value: 51.17699999999999
- type: mrr_at_1
value: 49.356
- type: mrr_at_10
value: 59.550000000000004
- type: mrr_at_100
value: 59.550000000000004
- type: mrr_at_1000
value: 59.550000000000004
- type: mrr_at_20
value: 59.550000000000004
- type: mrr_at_3
value: 56.986000000000004
- type: mrr_at_5
value: 58.638999999999996
- type: ndcg_at_1
value: 49.356
- type: ndcg_at_10
value: 60.156
- type: ndcg_at_100
value: 59.714999999999996
- type: ndcg_at_1000
value: 59.699000000000005
- type: ndcg_at_20
value: 59.831
- type: ndcg_at_3
value: 54.75299999999999
- type: ndcg_at_5
value: 57.443999999999996
- type: precision_at_1
value: 49.356
- type: precision_at_10
value: 11.86
- type: precision_at_100
value: 1.1860000000000002
- type: precision_at_1000
value: 0.11900000000000001
- type: precision_at_20
value: 5.93
- type: precision_at_3
value: 26.895999999999997
- type: precision_at_5
value: 19.570999999999998
- type: recall_at_1
value: 39.232
- type: recall_at_10
value: 72.98400000000001
- type: recall_at_100
value: 72.98400000000001
- type: recall_at_1000
value: 72.98400000000001
- type: recall_at_20
value: 72.98400000000001
- type: recall_at_3
value: 56.213
- type: recall_at_5
value: 64.318
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 37.157000000000004
- type: map_at_10
value: 49.512
- type: map_at_100
value: 49.512
- type: map_at_1000
value: 49.512
- type: map_at_20
value: 49.512
- type: map_at_3
value: 46.099000000000004
- type: map_at_5
value: 48.061
- type: mrr_at_1
value: 47.516000000000005
- type: mrr_at_10
value: 55.803999999999995
- type: mrr_at_100
value: 55.803999999999995
- type: mrr_at_1000
value: 55.803999999999995
- type: mrr_at_20
value: 55.803999999999995
- type: mrr_at_3
value: 53.885000000000005
- type: mrr_at_5
value: 54.967999999999996
- type: ndcg_at_1
value: 47.516000000000005
- type: ndcg_at_10
value: 55.386
- type: ndcg_at_100
value: 54.952
- type: ndcg_at_1000
value: 54.952
- type: ndcg_at_20
value: 55.07300000000001
- type: ndcg_at_3
value: 51.458000000000006
- type: ndcg_at_5
value: 53.189
- type: precision_at_1
value: 47.516000000000005
- type: precision_at_10
value: 10.567
- type: precision_at_100
value: 1.057
- type: precision_at_1000
value: 0.106
- type: precision_at_20
value: 5.283
- type: precision_at_3
value: 25.393
- type: precision_at_5
value: 17.656
- type: recall_at_1
value: 37.157000000000004
- type: recall_at_10
value: 65.026
- type: recall_at_100
value: 65.026
- type: recall_at_1000
value: 65.026
- type: recall_at_20
value: 65.026
- type: recall_at_3
value: 52.36300000000001
- type: recall_at_5
value: 57.989999999999995
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 48.522999999999996
- type: map_at_10
value: 62.844
- type: map_at_100
value: 62.844
- type: map_at_1000
value: 62.844
- type: map_at_20
value: 62.844
- type: map_at_3
value: 59.150999999999996
- type: map_at_5
value: 61.403
- type: mrr_at_1
value: 55.925000000000004
- type: mrr_at_10
value: 66.113
- type: mrr_at_100
value: 66.113
- type: mrr_at_1000
value: 66.113
- type: mrr_at_20
value: 66.113
- type: mrr_at_3
value: 63.783
- type: mrr_at_5
value: 65.212
- type: ndcg_at_1
value: 55.925000000000004
- type: ndcg_at_10
value: 68.869
- type: ndcg_at_100
value: 68.774
- type: ndcg_at_1000
value: 68.774
- type: ndcg_at_20
value: 68.777
- type: ndcg_at_3
value: 63.31400000000001
- type: ndcg_at_5
value: 66.247
- type: precision_at_1
value: 55.925000000000004
- type: precision_at_10
value: 10.997
- type: precision_at_100
value: 1.0999999999999999
- type: precision_at_1000
value: 0.11
- type: precision_at_20
value: 5.498
- type: precision_at_3
value: 28.359
- type: precision_at_5
value: 19.386
- type: recall_at_1
value: 48.522999999999996
- type: recall_at_10
value: 83.045
- type: recall_at_100
value: 83.045
- type: recall_at_1000
value: 83.045
- type: recall_at_20
value: 83.045
- type: recall_at_3
value: 68.449
- type: recall_at_5
value: 75.62100000000001
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 30.726
- type: map_at_10
value: 40.433
- type: map_at_100
value: 40.433
- type: map_at_1000
value: 40.433
- type: map_at_20
value: 40.433
- type: map_at_3
value: 37.135
- type: map_at_5
value: 39.17
- type: mrr_at_1
value: 33.672000000000004
- type: mrr_at_10
value: 42.836
- type: mrr_at_100
value: 42.836
- type: mrr_at_1000
value: 42.836
- type: mrr_at_20
value: 42.836
- type: mrr_at_3
value: 39.755
- type: mrr_at_5
value: 41.631
- type: ndcg_at_1
value: 33.672000000000004
- type: ndcg_at_10
value: 46.092
- type: ndcg_at_100
value: 46.092
- type: ndcg_at_1000
value: 46.092
- type: ndcg_at_20
value: 46.092
- type: ndcg_at_3
value: 39.797
- type: ndcg_at_5
value: 43.171
- type: precision_at_1
value: 33.672000000000004
- type: precision_at_10
value: 7.073
- type: precision_at_100
value: 0.707
- type: precision_at_1000
value: 0.07100000000000001
- type: precision_at_20
value: 3.537
- type: precision_at_3
value: 16.648
- type: precision_at_5
value: 11.91
- type: recall_at_1
value: 30.726
- type: recall_at_10
value: 61.24000000000001
- type: recall_at_100
value: 61.24000000000001
- type: recall_at_1000
value: 61.24000000000001
- type: recall_at_20
value: 61.24000000000001
- type: recall_at_3
value: 44.557
- type: recall_at_5
value: 52.608999999999995
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 21.554000000000002
- type: map_at_10
value: 31.508000000000003
- type: map_at_100
value: 31.508000000000003
- type: map_at_1000
value: 31.508000000000003
- type: map_at_20
value: 31.508000000000003
- type: map_at_3
value: 28.225
- type: map_at_5
value: 30.043
- type: mrr_at_1
value: 27.114
- type: mrr_at_10
value: 36.631
- type: mrr_at_100
value: 36.631
- type: mrr_at_1000
value: 36.631
- type: mrr_at_20
value: 36.631
- type: mrr_at_3
value: 34.059
- type: mrr_at_5
value: 35.601
- type: ndcg_at_1
value: 27.114
- type: ndcg_at_10
value: 37.592999999999996
- type: ndcg_at_100
value: 37.588
- type: ndcg_at_1000
value: 37.588
- type: ndcg_at_20
value: 37.588
- type: ndcg_at_3
value: 32.038
- type: ndcg_at_5
value: 34.689
- type: precision_at_1
value: 27.114
- type: precision_at_10
value: 7.090000000000001
- type: precision_at_100
value: 0.709
- type: precision_at_1000
value: 0.07100000000000001
- type: precision_at_20
value: 3.5450000000000004
- type: precision_at_3
value: 15.506
- type: precision_at_5
value: 11.393
- type: recall_at_1
value: 21.554000000000002
- type: recall_at_10
value: 50.879
- type: recall_at_100
value: 50.879
- type: recall_at_1000
value: 50.879
- type: recall_at_20
value: 50.879
- type: recall_at_3
value: 35.827999999999996
- type: recall_at_5
value: 42.476
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 35.36
- type: map_at_10
value: 48.483
- type: map_at_100
value: 48.483
- type: map_at_1000
value: 48.483
- type: map_at_20
value: 48.483
- type: map_at_3
value: 44.639
- type: map_at_5
value: 46.698
- type: mrr_at_1
value: 43.985
- type: mrr_at_10
value: 54.039
- type: mrr_at_100
value: 54.039
- type: mrr_at_1000
value: 54.039
- type: mrr_at_20
value: 54.039
- type: mrr_at_3
value: 51.54
- type: mrr_at_5
value: 52.859
- type: ndcg_at_1
value: 43.985
- type: ndcg_at_10
value: 55.069
- type: ndcg_at_100
value: 54.967
- type: ndcg_at_1000
value: 54.967
- type: ndcg_at_20
value: 54.996
- type: ndcg_at_3
value: 49.544
- type: ndcg_at_5
value: 51.932
- type: precision_at_1
value: 43.985
- type: precision_at_10
value: 10.202
- type: precision_at_100
value: 1.02
- type: precision_at_1000
value: 0.10200000000000001
- type: precision_at_20
value: 5.101
- type: precision_at_3
value: 23.933
- type: precision_at_5
value: 16.901
- type: recall_at_1
value: 35.36
- type: recall_at_10
value: 68.806
- type: recall_at_100
value: 68.806
- type: recall_at_1000
value: 68.806
- type: recall_at_20
value: 68.806
- type: recall_at_3
value: 52.714000000000006
- type: recall_at_5
value: 59.168
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 32.431
- type: map_at_10
value: 45.421
- type: map_at_100
value: 45.421
- type: map_at_1000
value: 45.421
- type: map_at_20
value: 45.421
- type: map_at_3
value: 41.82
- type: map_at_5
value: 43.692
- type: mrr_at_1
value: 41.096
- type: mrr_at_10
value: 51.293
- type: mrr_at_100
value: 51.293
- type: mrr_at_1000
value: 51.293
- type: mrr_at_20
value: 51.293
- type: mrr_at_3
value: 49.049
- type: mrr_at_5
value: 50.327
- type: ndcg_at_1
value: 41.096
- type: ndcg_at_10
value: 52.032999999999994
- type: ndcg_at_100
value: 51.903
- type: ndcg_at_1000
value: 51.897999999999996
- type: ndcg_at_20
value: 51.942
- type: ndcg_at_3
value: 47.024
- type: ndcg_at_5
value: 49.071
- type: precision_at_1
value: 41.096
- type: precision_at_10
value: 9.725999999999999
- type: precision_at_100
value: 0.9730000000000001
- type: precision_at_1000
value: 0.097
- type: precision_at_20
value: 4.8629999999999995
- type: precision_at_3
value: 23.097
- type: precision_at_5
value: 16.096
- type: recall_at_1
value: 32.431
- type: recall_at_10
value: 65.42999999999999
- type: recall_at_100
value: 65.42999999999999
- type: recall_at_1000
value: 65.42999999999999
- type: recall_at_20
value: 65.42999999999999
- type: recall_at_3
value: 50.856
- type: recall_at_5
value: 56.846
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 32.074749999999995
- type: map_at_10
value: 43.474
- type: map_at_100
value: 43.474
- type: map_at_1000
value: 43.474
- type: map_at_20
value: 43.474
- type: map_at_3
value: 40.10458333333333
- type: map_at_5
value: 42.010749999999994
- type: mrr_at_1
value: 38.60425
- type: mrr_at_10
value: 48.05550000000001
- type: mrr_at_100
value: 48.05550000000001
- type: mrr_at_1000
value: 48.05550000000001
- type: mrr_at_20
value: 48.05550000000001
- type: mrr_at_3
value: 45.58083333333334
- type: mrr_at_5
value: 47.04750000000001
- type: ndcg_at_1
value: 38.60425
- type: ndcg_at_10
value: 49.51958333333334
- type: ndcg_at_100
value: 49.3385
- type: ndcg_at_1000
value: 49.33491666666667
- type: ndcg_at_20
value: 49.393
- type: ndcg_at_3
value: 44.32699999999999
- type: ndcg_at_5
value: 46.81008333333333
- type: precision_at_1
value: 38.60425
- type: precision_at_10
value: 8.800666666666668
- type: precision_at_100
value: 0.8800833333333334
- type: precision_at_1000
value: 0.08808333333333335
- type: precision_at_20
value: 4.400333333333334
- type: precision_at_3
value: 20.723166666666664
- type: precision_at_5
value: 14.65683333333333
- type: recall_at_1
value: 32.074749999999995
- type: recall_at_10
value: 62.5025
- type: recall_at_100
value: 62.5025
- type: recall_at_1000
value: 62.5025
- type: recall_at_20
value: 62.5025
- type: recall_at_3
value: 47.81091666666667
- type: recall_at_5
value: 54.38974999999999
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 28.758
- type: map_at_10
value: 37.633
- type: map_at_100
value: 37.633
- type: map_at_1000
value: 37.633
- type: map_at_20
value: 37.633
- type: map_at_3
value: 34.865
- type: map_at_5
value: 36.437999999999995
- type: mrr_at_1
value: 32.208999999999996
- type: mrr_at_10
value: 40.598
- type: mrr_at_100
value: 40.598
- type: mrr_at_1000
value: 40.598
- type: mrr_at_20
value: 40.598
- type: mrr_at_3
value: 37.935
- type: mrr_at_5
value: 39.476
- type: ndcg_at_1
value: 32.208999999999996
- type: ndcg_at_10
value: 42.798
- type: ndcg_at_100
value: 42.768
- type: ndcg_at_1000
value: 42.768
- type: ndcg_at_20
value: 42.768
- type: ndcg_at_3
value: 37.651
- type: ndcg_at_5
value: 40.172999999999995
- type: precision_at_1
value: 32.208999999999996
- type: precision_at_10
value: 6.84
- type: precision_at_100
value: 0.6839999999999999
- type: precision_at_1000
value: 0.068
- type: precision_at_20
value: 3.42
- type: precision_at_3
value: 16.258
- type: precision_at_5
value: 11.472
- type: recall_at_1
value: 28.758
- type: recall_at_10
value: 55.55799999999999
- type: recall_at_100
value: 55.55799999999999
- type: recall_at_1000
value: 55.55799999999999
- type: recall_at_20
value: 55.55799999999999
- type: recall_at_3
value: 41.488
- type: recall_at_5
value: 47.659
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 21.088
- type: map_at_10
value: 30.297
- type: map_at_100
value: 30.297
- type: map_at_1000
value: 30.297
- type: map_at_20
value: 30.297
- type: map_at_3
value: 27.376
- type: map_at_5
value: 29.064
- type: mrr_at_1
value: 26.358999999999998
- type: mrr_at_10
value: 34.996
- type: mrr_at_100
value: 34.996
- type: mrr_at_1000
value: 34.996
- type: mrr_at_20
value: 34.996
- type: mrr_at_3
value: 32.467
- type: mrr_at_5
value: 33.944
- type: ndcg_at_1
value: 26.358999999999998
- type: ndcg_at_10
value: 35.851
- type: ndcg_at_100
value: 35.731
- type: ndcg_at_1000
value: 35.729
- type: ndcg_at_20
value: 35.77
- type: ndcg_at_3
value: 30.97
- type: ndcg_at_5
value: 33.312000000000005
- type: precision_at_1
value: 26.358999999999998
- type: precision_at_10
value: 6.641
- type: precision_at_100
value: 0.664
- type: precision_at_1000
value: 0.066
- type: precision_at_20
value: 3.321
- type: precision_at_3
value: 14.923
- type: precision_at_5
value: 10.86
- type: recall_at_1
value: 21.088
- type: recall_at_10
value: 47.818
- type: recall_at_100
value: 47.818
- type: recall_at_1000
value: 47.818
- type: recall_at_20
value: 47.818
- type: recall_at_3
value: 33.815
- type: recall_at_5
value: 39.973
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 33.579
- type: map_at_10
value: 44.875
- type: map_at_100
value: 44.875
- type: map_at_1000
value: 44.875
- type: map_at_20
value: 44.875
- type: map_at_3
value: 41.64
- type: map_at_5
value: 43.433
- type: mrr_at_1
value: 40.111999999999995
- type: mrr_at_10
value: 49.586999999999996
- type: mrr_at_100
value: 49.586999999999996
- type: mrr_at_1000
value: 49.586999999999996
- type: mrr_at_20
value: 49.586999999999996
- type: mrr_at_3
value: 47.233000000000004
- type: mrr_at_5
value: 48.613
- type: ndcg_at_1
value: 40.111999999999995
- type: ndcg_at_10
value: 50.836000000000006
- type: ndcg_at_100
value: 50.822
- type: ndcg_at_1000
value: 50.822
- type: ndcg_at_20
value: 50.822
- type: ndcg_at_3
value: 45.737
- type: ndcg_at_5
value: 48.081
- type: precision_at_1
value: 40.111999999999995
- type: precision_at_10
value: 8.674999999999999
- type: precision_at_100
value: 0.868
- type: precision_at_1000
value: 0.087
- type: precision_at_20
value: 4.338
- type: precision_at_3
value: 21.02
- type: precision_at_5
value: 14.682999999999998
- type: recall_at_1
value: 33.579
- type: recall_at_10
value: 64.02600000000001
- type: recall_at_100
value: 64.02600000000001
- type: recall_at_1000
value: 64.02600000000001
- type: recall_at_20
value: 64.02600000000001
- type: recall_at_3
value: 49.788
- type: recall_at_5
value: 55.931
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 31.497999999999998
- type: map_at_10
value: 43.456
- type: map_at_100
value: 43.456
- type: map_at_1000
value: 43.456
- type: map_at_20
value: 43.456
- type: map_at_3
value: 40.125
- type: map_at_5
value: 41.829
- type: mrr_at_1
value: 38.735
- type: mrr_at_10
value: 48.756
- type: mrr_at_100
value: 48.756
- type: mrr_at_1000
value: 48.756
- type: mrr_at_20
value: 48.756
- type: mrr_at_3
value: 46.113
- type: mrr_at_5
value: 47.684
- type: ndcg_at_1
value: 38.735
- type: ndcg_at_10
value: 50.241
- type: ndcg_at_100
value: 49.458
- type: ndcg_at_1000
value: 49.437999999999995
- type: ndcg_at_20
value: 49.756
- type: ndcg_at_3
value: 45.14
- type: ndcg_at_5
value: 47.406
- type: precision_at_1
value: 38.735
- type: precision_at_10
value: 9.763
- type: precision_at_100
value: 0.976
- type: precision_at_1000
value: 0.098
- type: precision_at_20
value: 4.881
- type: precision_at_3
value: 21.673000000000002
- type: precision_at_5
value: 15.455
- type: recall_at_1
value: 31.497999999999998
- type: recall_at_10
value: 62.568999999999996
- type: recall_at_100
value: 62.568999999999996
- type: recall_at_1000
value: 62.568999999999996
- type: recall_at_20
value: 62.568999999999996
- type: recall_at_3
value: 47.842
- type: recall_at_5
value: 54.159
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 24.991
- type: map_at_10
value: 34.183
- type: map_at_100
value: 34.183
- type: map_at_1000
value: 34.183
- type: map_at_20
value: 34.183
- type: map_at_3
value: 31.592
- type: map_at_5
value: 33.121
- type: mrr_at_1
value: 27.172
- type: mrr_at_10
value: 36.463
- type: mrr_at_100
value: 36.463
- type: mrr_at_1000
value: 36.463
- type: mrr_at_20
value: 36.463
- type: mrr_at_3
value: 34.165
- type: mrr_at_5
value: 35.616
- type: ndcg_at_1
value: 27.172
- type: ndcg_at_10
value: 39.311
- type: ndcg_at_100
value: 39.292
- type: ndcg_at_1000
value: 39.292
- type: ndcg_at_20
value: 39.301
- type: ndcg_at_3
value: 34.498
- type: ndcg_at_5
value: 37.006
- type: precision_at_1
value: 27.172
- type: precision_at_10
value: 6.174
- type: precision_at_100
value: 0.617
- type: precision_at_1000
value: 0.062
- type: precision_at_20
value: 3.087
- type: precision_at_3
value: 14.972
- type: precision_at_5
value: 10.499
- type: recall_at_1
value: 24.991
- type: recall_at_10
value: 52.649
- type: recall_at_100
value: 52.649
- type: recall_at_1000
value: 52.649
- type: recall_at_20
value: 52.649
- type: recall_at_3
value: 39.818
- type: recall_at_5
value: 45.927
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 12.475999999999999
- type: map_at_10
value: 22.999
- type: map_at_100
value: 22.999
- type: map_at_1000
value: 22.999
- type: map_at_20
value: 22.999
- type: map_at_3
value: 18.804000000000002
- type: map_at_5
value: 20.987000000000002
- type: mrr_at_1
value: 28.404
- type: mrr_at_10
value: 42.335
- type: mrr_at_100
value: 42.335
- type: mrr_at_1000
value: 42.335
- type: mrr_at_20
value: 42.335
- type: mrr_at_3
value: 39.11
- type: mrr_at_5
value: 40.953
- type: ndcg_at_1
value: 28.404
- type: ndcg_at_10
value: 32.467
- type: ndcg_at_100
value: 32.467
- type: ndcg_at_1000
value: 32.467
- type: ndcg_at_20
value: 32.467
- type: ndcg_at_3
value: 26.334999999999997
- type: ndcg_at_5
value: 28.493000000000002
- type: precision_at_1
value: 28.404
- type: precision_at_10
value: 10.43
- type: precision_at_100
value: 1.043
- type: precision_at_1000
value: 0.104
- type: precision_at_20
value: 5.215
- type: precision_at_3
value: 20.13
- type: precision_at_5
value: 15.595999999999998
- type: recall_at_1
value: 12.475999999999999
- type: recall_at_10
value: 39.757
- type: recall_at_100
value: 39.757
- type: recall_at_1000
value: 39.757
- type: recall_at_20
value: 39.757
- type: recall_at_3
value: 24.695
- type: recall_at_5
value: 30.864000000000004
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 9.261999999999999
- type: map_at_10
value: 23.807000000000002
- type: map_at_100
value: 23.807000000000002
- type: map_at_1000
value: 23.807000000000002
- type: map_at_20
value: 23.807000000000002
- type: map_at_3
value: 15.776000000000002
- type: map_at_5
value: 19.17
- type: mrr_at_1
value: 71.75
- type: mrr_at_10
value: 79.959
- type: mrr_at_100
value: 79.959
- type: mrr_at_1000
value: 79.959
- type: mrr_at_20
value: 79.959
- type: mrr_at_3
value: 78.625
- type: mrr_at_5
value: 79.412
- type: ndcg_at_1
value: 59.5
- type: ndcg_at_10
value: 48.988
- type: ndcg_at_100
value: 37.452000000000005
- type: ndcg_at_1000
value: 37.32
- type: ndcg_at_20
value: 41.387
- type: ndcg_at_3
value: 52.567
- type: ndcg_at_5
value: 50.649
- type: precision_at_1
value: 71.75
- type: precision_at_10
value: 40.425
- type: precision_at_100
value: 4.042
- type: precision_at_1000
value: 0.404
- type: precision_at_20
value: 20.212
- type: precision_at_3
value: 57.75
- type: precision_at_5
value: 50.349999999999994
- type: recall_at_1
value: 9.261999999999999
- type: recall_at_10
value: 30.329
- type: recall_at_100
value: 30.329
- type: recall_at_1000
value: 30.329
- type: recall_at_20
value: 30.329
- type: recall_at_3
value: 17.422
- type: recall_at_5
value: 22.598
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 52.014999999999986
- type: f1
value: 47.33036786740981
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 82.00800000000001
- type: map_at_10
value: 88.02799999999999
- type: map_at_100
value: 88.02799999999999
- type: map_at_1000
value: 88.02799999999999
- type: map_at_20
value: 88.02799999999999
- type: map_at_3
value: 87.249
- type: map_at_5
value: 87.78399999999999
- type: mrr_at_1
value: 88.299
- type: mrr_at_10
value: 92.92
- type: mrr_at_100
value: 92.92
- type: mrr_at_1000
value: 92.92
- type: mrr_at_20
value: 92.92
- type: mrr_at_3
value: 92.56400000000001
- type: mrr_at_5
value: 92.83200000000001
- type: ndcg_at_1
value: 88.299
- type: ndcg_at_10
value: 90.88000000000001
- type: ndcg_at_100
value: 90.879
- type: ndcg_at_1000
value: 90.879
- type: ndcg_at_20
value: 90.879
- type: ndcg_at_3
value: 89.85499999999999
- type: ndcg_at_5
value: 90.485
- type: precision_at_1
value: 88.299
- type: precision_at_10
value: 10.522
- type: precision_at_100
value: 1.052
- type: precision_at_1000
value: 0.105
- type: precision_at_20
value: 5.261
- type: precision_at_3
value: 33.573
- type: precision_at_5
value: 20.633000000000003
- type: recall_at_1
value: 82.00800000000001
- type: recall_at_10
value: 94.952
- type: recall_at_100
value: 94.952
- type: recall_at_1000
value: 94.952
- type: recall_at_20
value: 94.952
- type: recall_at_3
value: 92.089
- type: recall_at_5
value: 93.794
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 26.857
- type: map_at_10
value: 44.645
- type: map_at_100
value: 44.645
- type: map_at_1000
value: 44.645
- type: map_at_20
value: 44.645
- type: map_at_3
value: 38.166
- type: map_at_5
value: 41.992000000000004
- type: mrr_at_1
value: 50.309000000000005
- type: mrr_at_10
value: 59.59100000000001
- type: mrr_at_100
value: 59.59100000000001
- type: mrr_at_1000
value: 59.59100000000001
- type: mrr_at_20
value: 59.59100000000001
- type: mrr_at_3
value: 56.97
- type: mrr_at_5
value: 58.498000000000005
- type: ndcg_at_1
value: 50.309000000000005
- type: ndcg_at_10
value: 53.221
- type: ndcg_at_100
value: 53.15800000000001
- type: ndcg_at_1000
value: 53.15800000000001
- type: ndcg_at_20
value: 53.15800000000001
- type: ndcg_at_3
value: 47.506
- type: ndcg_at_5
value: 49.922
- type: precision_at_1
value: 50.309000000000005
- type: precision_at_10
value: 14.985000000000001
- type: precision_at_100
value: 1.498
- type: precision_at_1000
value: 0.15
- type: precision_at_20
value: 7.492
- type: precision_at_3
value: 31.635999999999996
- type: precision_at_5
value: 24.043
- type: recall_at_1
value: 26.857
- type: recall_at_10
value: 62.051
- type: recall_at_100
value: 62.051
- type: recall_at_1000
value: 62.051
- type: recall_at_20
value: 62.051
- type: recall_at_3
value: 42.966
- type: recall_at_5
value: 51.943
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 40.891
- type: map_at_10
value: 70.431
- type: map_at_100
value: 70.431
- type: map_at_1000
value: 70.431
- type: map_at_20
value: 70.431
- type: map_at_3
value: 66.704
- type: map_at_5
value: 69.179
- type: mrr_at_1
value: 81.783
- type: mrr_at_10
value: 87.368
- type: mrr_at_100
value: 87.368
- type: mrr_at_1000
value: 87.368
- type: mrr_at_20
value: 87.368
- type: mrr_at_3
value: 86.59700000000001
- type: mrr_at_5
value: 87.128
- type: ndcg_at_1
value: 81.783
- type: ndcg_at_10
value: 77.697
- type: ndcg_at_100
value: 77.697
- type: ndcg_at_1000
value: 77.697
- type: ndcg_at_20
value: 77.697
- type: ndcg_at_3
value: 72.688
- type: ndcg_at_5
value: 75.69200000000001
- type: precision_at_1
value: 81.783
- type: precision_at_10
value: 16.488
- type: precision_at_100
value: 1.649
- type: precision_at_1000
value: 0.165
- type: precision_at_20
value: 8.244
- type: precision_at_3
value: 47.693000000000005
- type: precision_at_5
value: 30.976
- type: recall_at_1
value: 40.891
- type: recall_at_10
value: 82.438
- type: recall_at_100
value: 82.438
- type: recall_at_1000
value: 82.438
- type: recall_at_20
value: 82.438
- type: recall_at_3
value: 71.54
- type: recall_at_5
value: 77.441
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 89.47240000000001
- type: ap
value: 85.75618304701787
- type: f1
value: 89.44156774176075
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 19.941
- type: map_at_10
value: 33.108
- type: map_at_100
value: 33.108
- type: map_at_1000
value: 33.108
- type: map_at_20
value: 33.108
- type: map_at_3
value: 28.716
- type: map_at_5
value: 31.255
- type: mrr_at_1
value: 20.458000000000002
- type: mrr_at_10
value: 33.646
- type: mrr_at_100
value: 33.646
- type: mrr_at_1000
value: 33.646
- type: mrr_at_20
value: 33.646
- type: mrr_at_3
value: 29.360000000000003
- type: mrr_at_5
value: 31.849
- type: ndcg_at_1
value: 20.458000000000002
- type: ndcg_at_10
value: 40.664
- type: ndcg_at_100
value: 40.664
- type: ndcg_at_1000
value: 40.664
- type: ndcg_at_20
value: 40.664
- type: ndcg_at_3
value: 31.733
- type: ndcg_at_5
value: 36.266999999999996
- type: precision_at_1
value: 20.458000000000002
- type: precision_at_10
value: 6.703
- type: precision_at_100
value: 0.67
- type: precision_at_1000
value: 0.067
- type: precision_at_20
value: 3.3520000000000003
- type: precision_at_3
value: 13.777000000000001
- type: precision_at_5
value: 10.564
- type: recall_at_1
value: 19.941
- type: recall_at_10
value: 64.103
- type: recall_at_100
value: 64.103
- type: recall_at_1000
value: 64.103
- type: recall_at_20
value: 64.103
- type: recall_at_3
value: 39.800999999999995
- type: recall_at_5
value: 50.727999999999994
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 96.45690834473322
- type: f1
value: 96.19980363353172
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 85.38075695394436
- type: f1
value: 71.33409850817071
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 80.12104909213183
- type: f1
value: 77.26691038674358
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 82.69670477471418
- type: f1
value: 82.31935226516424
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 32.733209733023
- type: v_measures
value:
- 0.3268102022520237
- 0.30894802212942296
- 0.3267412500148118
- 0.3083054819872514
- 0.31284256226804597
- 0.33514297956992917
- 0.3297363893986241
- 0.34536511251773544
- 0.3498041803334763
- 0.3296247928309792
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 32.325298069936835
- type: v_measures
value:
- 0.3150861010517341
- 0.32155893979978684
- 0.3177517595474066
- 0.30022420485295037
- 0.3197693379138355
- 0.33188657891678974
- 0.3386256684441414
- 0.3305481006752689
- 0.3436443696432427
- 0.31343474614852757
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 32.511595472837335
- type: mrr
value: 33.73044905745997
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 7.0760000000000005
- type: map_at_10
value: 16.039
- type: map_at_100
value: 16.039
- type: map_at_1000
value: 16.039
- type: map_at_20
value: 16.039
- type: map_at_3
value: 11.408
- type: map_at_5
value: 13.547
- type: mrr_at_1
value: 53.559999999999995
- type: mrr_at_10
value: 61.531000000000006
- type: mrr_at_100
value: 61.531000000000006
- type: mrr_at_1000
value: 61.531000000000006
- type: mrr_at_20
value: 61.531000000000006
- type: mrr_at_3
value: 59.236
- type: mrr_at_5
value: 60.49
- type: ndcg_at_1
value: 51.083999999999996
- type: ndcg_at_10
value: 41.332
- type: ndcg_at_100
value: 27.083000000000002
- type: ndcg_at_1000
value: 26.619
- type: ndcg_at_20
value: 33.188
- type: ndcg_at_3
value: 46.605999999999995
- type: ndcg_at_5
value: 44.362
- type: precision_at_1
value: 52.941
- type: precision_at_10
value: 30.65
- type: precision_at_100
value: 3.065
- type: precision_at_1000
value: 0.307
- type: precision_at_20
value: 15.325
- type: precision_at_3
value: 43.447
- type: precision_at_5
value: 38.266
- type: recall_at_1
value: 7.0760000000000005
- type: recall_at_10
value: 20.929000000000002
- type: recall_at_100
value: 20.929000000000002
- type: recall_at_1000
value: 20.929000000000002
- type: recall_at_20
value: 20.929000000000002
- type: recall_at_3
value: 12.601
- type: recall_at_5
value: 15.955
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 39.204
- type: map_at_10
value: 56.808
- type: map_at_100
value: 56.808
- type: map_at_1000
value: 56.808
- type: map_at_20
value: 56.808
- type: map_at_3
value: 52.471999999999994
- type: map_at_5
value: 55.191
- type: mrr_at_1
value: 44.032
- type: mrr_at_10
value: 59.158
- type: mrr_at_100
value: 59.158
- type: mrr_at_1000
value: 59.158
- type: mrr_at_20
value: 59.158
- type: mrr_at_3
value: 55.948
- type: mrr_at_5
value: 57.96
- type: ndcg_at_1
value: 44.032
- type: ndcg_at_10
value: 64.672
- type: ndcg_at_100
value: 64.672
- type: ndcg_at_1000
value: 64.672
- type: ndcg_at_20
value: 64.672
- type: ndcg_at_3
value: 56.955999999999996
- type: ndcg_at_5
value: 61.278999999999996
- type: precision_at_1
value: 44.032
- type: precision_at_10
value: 10.295
- type: precision_at_100
value: 1.03
- type: precision_at_1000
value: 0.10300000000000001
- type: precision_at_20
value: 5.148
- type: precision_at_3
value: 25.83
- type: precision_at_5
value: 18.053
- type: recall_at_1
value: 39.204
- type: recall_at_10
value: 85.936
- type: recall_at_100
value: 85.936
- type: recall_at_1000
value: 85.936
- type: recall_at_20
value: 85.936
- type: recall_at_3
value: 66.387
- type: recall_at_5
value: 76.238
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: map_at_1
value: 71.068
- type: map_at_10
value: 85.271
- type: map_at_100
value: 85.271
- type: map_at_1000
value: 85.271
- type: map_at_20
value: 85.271
- type: map_at_3
value: 82.23899999999999
- type: map_at_5
value: 84.165
- type: mrr_at_1
value: 81.85
- type: mrr_at_10
value: 87.856
- type: mrr_at_100
value: 87.856
- type: mrr_at_1000
value: 87.856
- type: mrr_at_20
value: 87.856
- type: mrr_at_3
value: 86.925
- type: mrr_at_5
value: 87.559
- type: ndcg_at_1
value: 81.89
- type: ndcg_at_10
value: 88.856
- type: ndcg_at_100
value: 88.723
- type: ndcg_at_1000
value: 88.723
- type: ndcg_at_20
value: 88.74300000000001
- type: ndcg_at_3
value: 86.05199999999999
- type: ndcg_at_5
value: 87.61
- type: precision_at_1
value: 81.89
- type: precision_at_10
value: 13.569999999999999
- type: precision_at_100
value: 1.357
- type: precision_at_1000
value: 0.136
- type: precision_at_20
value: 6.784999999999999
- type: precision_at_3
value: 37.807
- type: precision_at_5
value: 24.908
- type: recall_at_1
value: 71.068
- type: recall_at_10
value: 95.797
- type: recall_at_100
value: 95.797
- type: recall_at_1000
value: 95.797
- type: recall_at_20
value: 95.797
- type: recall_at_3
value: 87.65899999999999
- type: recall_at_5
value: 92.107
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 62.16385792305745
- type: v_measures
value:
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- 0.6138592437270369
- 0.6280966889397946
- 0.5740785434257363
- 0.5888905363406383
- 0.6073133172766488
- 0.7167239234204423
- 0.6595740709329266
- 0.5935547159550949
- 0.587541495063312
- 0.6775885692888917
- 0.567537304495857
- 0.6678131151900565
- 0.5979158867861365
- 0.5956669142507505
- 0.6200557398628851
- 0.5946598821061599
- 0.603673972233609
- 0.6050659113737895
- 0.5742475015975338
- 0.6273500769309232
- 0.6526752602522112
- 0.6416095306029318
- 0.7334431385812594
- 0.5847584715164077
- 0.62727067061333
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: v_measure
value: 65.96296778394698
- type: v_measures
value:
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- 0.6685310786302858
- 0.6560344637869603
- 0.4053977970605211
- 0.7423583342619767
- 0.6657872853200192
- 0.4487425514322897
- 0.7791528368405061
- 0.7406529421724692
- 0.7901875870736593
- 0.6994519018160104
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: map_at_1
value: 5.433000000000001
- type: map_at_10
value: 13.991000000000001
- type: map_at_100
value: 13.991000000000001
- type: map_at_1000
value: 13.991000000000001
- type: map_at_20
value: 13.991000000000001
- type: map_at_3
value: 9.708
- type: map_at_5
value: 11.849
- type: mrr_at_1
value: 26.8
- type: mrr_at_10
value: 38.012
- type: mrr_at_100
value: 38.012
- type: mrr_at_1000
value: 38.012
- type: mrr_at_20
value: 38.012
- type: mrr_at_3
value: 34.449999999999996
- type: mrr_at_5
value: 36.59
- type: ndcg_at_1
value: 26.8
- type: ndcg_at_10
value: 23.006999999999998
- type: ndcg_at_100
value: 23.006999999999998
- type: ndcg_at_1000
value: 23.006999999999998
- type: ndcg_at_20
value: 23.006999999999998
- type: ndcg_at_3
value: 21.386
- type: ndcg_at_5
value: 19.046
- type: precision_at_1
value: 26.8
- type: precision_at_10
value: 12.01
- type: precision_at_100
value: 1.201
- type: precision_at_1000
value: 0.12
- type: precision_at_20
value: 6.005
- type: precision_at_3
value: 19.833000000000002
- type: precision_at_5
value: 16.84
- type: recall_at_1
value: 5.433000000000001
- type: recall_at_10
value: 24.34
- type: recall_at_100
value: 24.34
- type: recall_at_1000
value: 24.34
- type: recall_at_20
value: 24.34
- type: recall_at_3
value: 12.058
- type: recall_at_5
value: 17.058
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cos_sim_pearson
value: 84.84178272773948
- type: cos_sim_spearman
value: 82.32746830315172
- type: euclidean_pearson
value: 82.11599650658388
- type: euclidean_spearman
value: 82.38102437050075
- type: manhattan_pearson
value: 82.07071847892156
- type: manhattan_spearman
value: 82.35710877093594
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 86.86916828280668
- type: cos_sim_spearman
value: 79.69553254808825
- type: euclidean_pearson
value: 82.86582224049857
- type: euclidean_spearman
value: 79.1765897124049
- type: manhattan_pearson
value: 83.15978473993391
- type: manhattan_spearman
value: 79.54192003597332
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 88.7719804239987
- type: cos_sim_spearman
value: 89.20788765830103
- type: euclidean_pearson
value: 88.67624029627581
- type: euclidean_spearman
value: 89.15058058277351
- type: manhattan_pearson
value: 88.43477620818435
- type: manhattan_spearman
value: 89.01994285052193
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 87.04733612348426
- type: cos_sim_spearman
value: 86.0120242985069
- type: euclidean_pearson
value: 86.07045247599824
- type: euclidean_spearman
value: 86.22185577032168
- type: manhattan_pearson
value: 85.79555943035328
- type: manhattan_spearman
value: 86.13821651705776
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 89.395594115739
- type: cos_sim_spearman
value: 89.70312809978681
- type: euclidean_pearson
value: 89.10137224981938
- type: euclidean_spearman
value: 89.74149793061072
- type: manhattan_pearson
value: 89.06144914118401
- type: manhattan_spearman
value: 89.78489015365638
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 86.1720394205624
- type: cos_sim_spearman
value: 87.67900288178751
- type: euclidean_pearson
value: 86.73052291563968
- type: euclidean_spearman
value: 87.49116803671033
- type: manhattan_pearson
value: 86.79988999910331
- type: manhattan_spearman
value: 87.57540934207157
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 88.75286004155564
- type: cos_sim_spearman
value: 88.03161515281518
- type: euclidean_pearson
value: 88.55464128719427
- type: euclidean_spearman
value: 87.78041200668837
- type: manhattan_pearson
value: 88.18469209314583
- type: manhattan_spearman
value: 87.31602253333598
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 70.48372140035973
- type: cos_sim_spearman
value: 70.16107814793419
- type: euclidean_pearson
value: 69.65789511103976
- type: euclidean_spearman
value: 68.92441073988654
- type: manhattan_pearson
value: 69.55306498752127
- type: manhattan_spearman
value: 68.82186378798527
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 87.43017430741797
- type: cos_sim_spearman
value: 88.14675226940803
- type: euclidean_pearson
value: 87.33329490848514
- type: euclidean_spearman
value: 87.94164481397011
- type: manhattan_pearson
value: 87.19303598684772
- type: manhattan_spearman
value: 87.86899889639051
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 87.03073019943413
- type: mrr
value: 96.67456726280255
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 64.328
- type: map_at_10
value: 75.046
- type: map_at_100
value: 75.046
- type: map_at_1000
value: 75.046
- type: map_at_20
value: 75.046
- type: map_at_3
value: 72.42
- type: map_at_5
value: 73.88900000000001
- type: mrr_at_1
value: 67.667
- type: mrr_at_10
value: 76.19200000000001
- type: mrr_at_100
value: 76.19200000000001
- type: mrr_at_1000
value: 76.19200000000001
- type: mrr_at_20
value: 76.19200000000001
- type: mrr_at_3
value: 74.556
- type: mrr_at_5
value: 75.372
- type: ndcg_at_1
value: 67.667
- type: ndcg_at_10
value: 79.621
- type: ndcg_at_100
value: 79.621
- type: ndcg_at_1000
value: 79.621
- type: ndcg_at_20
value: 79.621
- type: ndcg_at_3
value: 75.506
- type: ndcg_at_5
value: 77.269
- type: precision_at_1
value: 67.667
- type: precision_at_10
value: 10.467
- type: precision_at_100
value: 1.047
- type: precision_at_1000
value: 0.105
- type: precision_at_20
value: 5.2330000000000005
- type: precision_at_3
value: 29.444
- type: precision_at_5
value: 19.133
- type: recall_at_1
value: 64.328
- type: recall_at_10
value: 92.389
- type: recall_at_100
value: 92.389
- type: recall_at_1000
value: 92.389
- type: recall_at_20
value: 92.389
- type: recall_at_3
value: 81.183
- type: recall_at_5
value: 85.60600000000001
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.83762376237624
- type: cos_sim_ap
value: 96.51580702723564
- type: cos_sim_f1
value: 91.63265306122449
- type: cos_sim_precision
value: 93.54166666666667
- type: cos_sim_recall
value: 89.8
- type: dot_accuracy
value: 99.73663366336633
- type: dot_ap
value: 93.5764284433306
- type: dot_f1
value: 86.56565656565655
- type: dot_precision
value: 87.44897959183675
- type: dot_recall
value: 85.7
- type: euclidean_accuracy
value: 99.84059405940594
- type: euclidean_ap
value: 96.4738308210008
- type: euclidean_f1
value: 91.76470588235294
- type: euclidean_precision
value: 93.92670157068062
- type: euclidean_recall
value: 89.7
- type: manhattan_accuracy
value: 99.84356435643565
- type: manhattan_ap
value: 96.58366196890644
- type: manhattan_f1
value: 91.93054136874362
- type: manhattan_precision
value: 93.94572025052193
- type: manhattan_recall
value: 90.0
- type: max_accuracy
value: 99.84356435643565
- type: max_ap
value: 96.58366196890644
- type: max_f1
value: 91.93054136874362
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 71.3538865724681
- type: v_measures
value:
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- 0.7207474445915961
- 0.7076023322461216
- 0.7919544039062477
- 0.697356950041273
- 0.7155185617564064
- 0.8170975778555782
- 0.7758530037956233
- 0.7557716341966847
- 0.7418030161151182
- 0.6544532124169519
- 0.7116665112917787
- 0.6779566961395338
- 0.6721164638120183
- 0.6901024025391699
- 0.6457684359608986
- 0.7074519871138994
- 0.7296079088233842
- 0.7023239980988409
- 0.6900078050266639
- 0.6850583572154368
- 0.7491029730754422
- 0.6679041279132695
- 0.6706416821131351
- 0.7400759063631245
- 0.7205282507088282
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 36.11009155563876
- type: v_measures
value:
- 0.35242637090556483
- 0.34198478937626525
- 0.3480143704468013
- 0.3432433824651389
- 0.34581837944580823
- 0.38852793624316134
- 0.3664105091244259
- 0.3798083138774721
- 0.37268279094517115
- 0.37209231273406684
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 55.54551767207771
- type: mrr
value: 56.55926385705797
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.805678984951985
- type: cos_sim_spearman
value: 30.827574116605362
- type: dot_pearson
value: 29.899814768586204
- type: dot_spearman
value: 29.588760095881174
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: map_at_1
value: 0.22200000000000003
- type: map_at_10
value: 2.046
- type: map_at_100
value: 2.046
- type: map_at_1000
value: 2.046
- type: map_at_20
value: 2.046
- type: map_at_3
value: 0.661
- type: map_at_5
value: 1.057
- type: mrr_at_1
value: 84.0
- type: mrr_at_10
value: 91.333
- type: mrr_at_100
value: 91.333
- type: mrr_at_1000
value: 91.333
- type: mrr_at_20
value: 91.333
- type: mrr_at_3
value: 91.0
- type: mrr_at_5
value: 91.0
- type: ndcg_at_1
value: 80.0
- type: ndcg_at_10
value: 80.74900000000001
- type: ndcg_at_100
value: 17.761
- type: ndcg_at_1000
value: 7.5920000000000005
- type: ndcg_at_20
value: 52.113
- type: ndcg_at_3
value: 83.542
- type: ndcg_at_5
value: 82.151
- type: precision_at_1
value: 84.0
- type: precision_at_10
value: 84.6
- type: precision_at_100
value: 8.459999999999999
- type: precision_at_1000
value: 0.8460000000000001
- type: precision_at_20
value: 42.3
- type: precision_at_3
value: 88.0
- type: precision_at_5
value: 86.0
- type: recall_at_1
value: 0.22200000000000003
- type: recall_at_10
value: 2.235
- type: recall_at_100
value: 2.235
- type: recall_at_1000
value: 2.235
- type: recall_at_20
value: 2.235
- type: recall_at_3
value: 0.695
- type: recall_at_5
value: 1.121
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 3.2750000000000004
- type: map_at_10
value: 10.514
- type: map_at_100
value: 10.514
- type: map_at_1000
value: 10.514
- type: map_at_20
value: 10.514
- type: map_at_3
value: 5.662
- type: map_at_5
value: 7.808
- type: mrr_at_1
value: 40.816
- type: mrr_at_10
value: 49.88
- type: mrr_at_100
value: 49.88
- type: mrr_at_1000
value: 49.88
- type: mrr_at_20
value: 49.88
- type: mrr_at_3
value: 46.259
- type: mrr_at_5
value: 47.585
- type: ndcg_at_1
value: 37.755
- type: ndcg_at_10
value: 25.237
- type: ndcg_at_100
value: 21.149
- type: ndcg_at_1000
value: 21.149
- type: ndcg_at_20
value: 21.401999999999997
- type: ndcg_at_3
value: 27.465
- type: ndcg_at_5
value: 26.169999999999998
- type: precision_at_1
value: 40.816
- type: precision_at_10
value: 21.224
- type: precision_at_100
value: 2.122
- type: precision_at_1000
value: 0.212
- type: precision_at_20
value: 10.612
- type: precision_at_3
value: 26.531
- type: precision_at_5
value: 24.490000000000002
- type: recall_at_1
value: 3.2750000000000004
- type: recall_at_10
value: 16.264
- type: recall_at_100
value: 16.264
- type: recall_at_1000
value: 16.264
- type: recall_at_20
value: 16.264
- type: recall_at_3
value: 6.265999999999999
- type: recall_at_5
value: 9.677
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 66.181640625
- type: ap
value: 12.61343083198892
- type: f1
value: 51.12214559856414
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 62.543859649122815
- type: f1
value: 62.742315191046295
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 54.7799424517948
- type: v_measures
value:
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- 0.5723223656123475
- 0.5487505833189581
- 0.5496668776225391
- 0.5230606424471813
- 0.5581008461735308
- 0.550822270643678
- 0.5550309505411892
- 0.5374116804548088
- 0.530806408291854
- 0.5520216200733947
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 88.24581271979496
- type: cos_sim_ap
value: 81.34631603712425
- type: cos_sim_f1
value: 73.6588459099556
- type: cos_sim_precision
value: 70.91575091575092
- type: cos_sim_recall
value: 76.62269129287598
- type: dot_accuracy
value: 86.33247898909221
- type: dot_ap
value: 74.8713850965631
- type: dot_f1
value: 69.68152866242038
- type: dot_precision
value: 67.36453201970444
- type: dot_recall
value: 72.16358839050132
- type: euclidean_accuracy
value: 88.37098408535495
- type: euclidean_ap
value: 81.3880827682646
- type: euclidean_f1
value: 73.69367056104764
- type: euclidean_precision
value: 71.76794198549638
- type: euclidean_recall
value: 75.72559366754618
- type: manhattan_accuracy
value: 88.28157596709781
- type: manhattan_ap
value: 81.11568493905267
- type: manhattan_f1
value: 73.38364779874215
- type: manhattan_precision
value: 70.1201923076923
- type: manhattan_recall
value: 76.96569920844327
- type: max_accuracy
value: 88.37098408535495
- type: max_ap
value: 81.3880827682646
- type: max_f1
value: 73.69367056104764
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.54476656188147
- type: cos_sim_ap
value: 86.93964282285746
- type: cos_sim_f1
value: 79.50401702190103
- type: cos_sim_precision
value: 75.93020811435778
- type: cos_sim_recall
value: 83.43085925469664
- type: dot_accuracy
value: 88.64050917840649
- type: dot_ap
value: 84.81007248888473
- type: dot_f1
value: 77.95706670508572
- type: dot_precision
value: 73.24038982133189
- type: dot_recall
value: 83.32306744687403
- type: euclidean_accuracy
value: 89.53894516241705
- type: euclidean_ap
value: 86.92299719471643
- type: euclidean_f1
value: 79.55922060862585
- type: euclidean_precision
value: 75.61381606325426
- type: euclidean_recall
value: 83.93902063443178
- type: manhattan_accuracy
value: 89.5234214305119
- type: manhattan_ap
value: 86.93261273512803
- type: manhattan_f1
value: 79.54703705061019
- type: manhattan_precision
value: 75.90041261626688
- type: manhattan_recall
value: 83.56174930705266
- type: max_accuracy
value: 89.54476656188147
- type: max_ap
value: 86.93964282285746
- type: max_f1
value: 79.55922060862585
---
Details to run in https://github.com/raghavlite/TDTE | [
"BIOSSES",
"SCIFACT"
] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1431 | Lots-of-LoRAs | null | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"arxiv:2407.00066",
"license:mit",
"region:us"
] | "2024-06-14T18:43:05Z" | 2024-12-16T21:47:23+00:00 | 0 | 0 | ---
language: en
library_name: pytorch
license: mit
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1431
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task1431_head_qa_answer_generation
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
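Until a quickstart is provided, below is a minimal, untested sketch of how an adapter like this can typically be loaded with the `transformers` and `peft` libraries. The model IDs come from this card; the PEFT storage format, dtype/device settings, and the prompt template are assumptions rather than documented details of this adapter.
```python
# Minimal sketch only: assumes the adapter is published in standard PEFT format.
# The prompt template is illustrative; the one used during training is not
# documented in this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1431"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the LoRA weights

# task1431 is HEAD-QA answer generation (healthcare exam questions).
prompt = "[INST] Answer the following healthcare exam question: Which organ produces insulin? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```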
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
https://huggingface.co/datasets/Lots-of-LoRAs/task1431_head_qa_answer_generation sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{brüelgabrielsson2024compressserveservingthousands,
title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead},
author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon},
year={2024},
eprint={2407.00066},
archivePrefix={arXiv},
primaryClass={cs.DC},
url={https://arxiv.org/abs/2407.00066},
}
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"HEAD-QA"
] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1487 | Lots-of-LoRAs | null | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"arxiv:2407.00066",
"license:mit",
"region:us"
] | "2024-06-14T18:43:11Z" | 2024-12-16T21:48:44+00:00 | 0 | 0 | ---
language: en
library_name: pytorch
license: mit
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1487
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task1487_organism_substance_extraction_anem_dataset
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
https://huggingface.co/datasets/Lots-of-LoRAs/task1487_organism_substance_extraction_anem_dataset sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{brüelgabrielsson2024compressserveservingthousands,
title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead},
author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon},
year={2024},
eprint={2407.00066},
archivePrefix={arXiv},
primaryClass={cs.DC},
url={https://arxiv.org/abs/2407.00066},
}
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"ANEM"
] |
TensorStack/UnstableIllusion-onnx | TensorStack | text-to-image | [
"onnx",
"text-to-image",
"region:us"
] | "2024-06-17T01:53:03Z" | 2024-06-17T02:01:23+00:00 | 0 | 1 | ---
pipeline_tag: text-to-image
---
# unStable Illusion v2 - Onnx Olive DirectML Optimized
## Original Model
https://civitai.com/models/147687?modelVersionId=164719
## C# Inference Demo
https://github.com/saddam213/OnnxStack
```csharp
// Create Pipeline
var pipeline = StableDiffusionPipeline.CreatePipeline("D:\\Models\\UnstableIllusion-onnx");
// Prompt
var promptOptions = new PromptOptions
{
Prompt = "Craft an image of a bustling street food market with vendors cooking up sizzling kebabs, noodles, and dumplings"
};
// Run pipeline
var result = await pipeline.GenerateImageAsync(promptOptions);
// Save Image Result
await result.SaveAsync("Result.png");
```
## Inference Result
 | [
"CRAFT"
] |
TensorStack/MidgardPony-XL-onnx | TensorStack | text-to-image | [
"onnx",
"text-to-image",
"region:us"
] | "2024-06-17T02:28:28Z" | 2024-06-17T02:38:02+00:00 | 0 | 1 | ---
pipeline_tag: text-to-image
---
# Midgard Pony v3 - Onnx Olive DirectML Optimized
## Original Model
https://civitai.com/models/470287?modelVersionId=561310
## C# Inference Demo
https://github.com/TensorStack-AI/OnnxStack
```csharp
// Create Pipeline
var pipeline = StableDiffusionXLPipeline.CreatePipeline("D:\\Models\\MidgardPony-XL");
// Prompt
var promptOptions = new PromptOptions
{
Prompt = "Craft an image of a gallant furry prince, with a charming smile and a sword at his side, ready to embark on a quest"
};
// Run pipeline
var result = await pipeline.GenerateImageAsync(promptOptions);
// Save Image Result
await result.SaveAsync("Result.png");
```
## Inference Result
 | [
"CRAFT"
] |
TensorStack/Fluently-XL-Final-onnx | TensorStack | text-to-image | [
"onnx",
"text-to-image",
"region:us"
] | "2024-06-17T04:05:07Z" | 2024-06-17T04:14:19+00:00 | 0 | 1 | ---
pipeline_tag: text-to-image
---
# Fluently XL Final - Onnx Olive DirectML Optimized
## Original Model
https://huggingface.co/fluently/Fluently-XL-Final
## C# Inference Demo
https://github.com/TensorStack-AI/OnnxStack
```csharp
// Create Pipeline
var pipeline = StableDiffusionXLPipeline.CreatePipeline("D:\\Models\\Fluently-XL-Final-onnx");
// Prompt
var promptOptions = new PromptOptions
{
Prompt = "Craft an image of a nurse taking care of a patient in a hospital room, with medical equipment and a warm smile."
};
// Run pipeline
var result = await pipeline.GenerateImageAsync(promptOptions);
// Save Image Result
await result.SaveAsync("Result.png");
```
## Inference Result
 | [
"CRAFT"
] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task593 | Lots-of-LoRAs | null | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"arxiv:2407.00066",
"license:mit",
"region:us"
] | "2024-06-17T21:55:45Z" | 2024-07-03T20:32:28+00:00 | 0 | 0 | ---
language: en
library_name: pytorch
license: mit
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task593
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task593_sciq_explanation_generation
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
https://huggingface.co/datasets/Lots-of-LoRAs/task593_sciq_explanation_generation sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{brüelgabrielsson2024compressserveservingthousands,
title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead},
author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon},
year={2024},
eprint={2407.00066},
archivePrefix={arXiv},
primaryClass={cs.DC},
url={https://arxiv.org/abs/2407.00066},
}
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"SCIQ"
] |
PabloVP/teste | PabloVP | null | [
"region:us"
] | "2024-06-18T16:55:53Z" | 2024-06-18T16:56:16+00:00 | 0 | 0 | ---
{}
---
# Med-PaLM 🌴🔬
Welcome to Med-PaLM, your fun-filled, AI-powered friend in the world of biomedicine! 😄🔍
[](https://github.com/kyegomez/Med-Palm/issues)
[](https://github.com/kyegomez/Med-Palm/network)
[](https://github.com/kyegomez/Med-Palm/stargazers) [](https://github.com/kyegomez/Med-Palm/blob/master/LICENSE)
[](https://twitter.com/intent/tweet?text=Excited%20to%20introduce%20Med-Palm,%20the%20all-new%20robotics%20model%20with%20the%20potential%20to%20revolutionize%20automation.%20Join%20us%20on%20this%20journey%20towards%20a%20smarter%20future.%20%23RT1%20%23Robotics&url=https%3A%2F%2Fgithub.com%2Fkyegomez%2FMed-Palm)
[](https://www.facebook.com/sharer/sharer.php?u=https%3A%2F%2Fgithub.com%2Fkyegomez%2FMed-Palm)
[](https://www.linkedin.com/shareArticle?mini=true&url=https%3A%2F%2Fgithub.com%2Fkyegomez%2FMed-Palm&title=Introducing%20Med-Palm%2C%20the%20All-New%20Robotics%20Model&summary=Med-Palm%20is%20the%20next-generation%20robotics%20model%20that%20promises%20to%20transform%20industries%20with%20its%20intelligence%20and%20efficiency.%20Join%20us%20to%20be%20a%20part%20of%20this%20revolutionary%20journey%20%23RT1%20%23Robotics&source=)

[](https://www.reddit.com/submit?url=https%3A%2F%2Fgithub.com%2Fkyegomez%2FMed-Palm&title=Exciting%20Times%20Ahead%20with%20Med-Palm%2C%20the%20All-New%20Robotics%20Model%20%23RT1%20%23Robotics) [](https://news.ycombinator.com/submitlink?u=https%3A%2F%2Fgithub.com%2Fkyegomez%2FMed-Palm&t=Exciting%20Times%20Ahead%20with%20Med-Palm%2C%20the%20All-New%20Robotics%20Model%20%23RT1%20%23Robotics)
[](https://pinterest.com/pin/create/button/?url=https%3A%2F%2Fgithub.com%2Fkyegomez%2FMed-Palm&media=https%3A%2F%2Fexample.com%2Fimage.jpg&description=Med-Palm%2C%20the%20Revolutionary%20Robotics%20Model%20that%20will%20Change%20the%20Way%20We%20Work%20%23RT1%20%23Robotics)
[](https://api.whatsapp.com/send?text=I%20just%20discovered%20Med-Palm,%20the%20all-new%20robotics%20model%20that%20promises%20to%20revolutionize%20automation.%20Join%20me%20on%20this%20exciting%20journey%20towards%20a%20smarter%20future.%20%23RT1%20%23Robotics%0A%0Ahttps%3A%2F%2Fgithub.com%2Fkyegomez%2FMed-Palm)

## 🚀 Getting Started
Want to play with Med-PaLM? Awesome! 🥳 Let's get you set up:
1. Grab your own copy:
```bash
pip install MedPalm
```
## 🧪 How to Use
It's easy-peasy! 🍋
```python
import torch
from med_palm import MedPalm
# Kick-start the model
model = MedPalm()
# Let's get some tokenized inputs going
text_tokens = torch.tensor([[1, 2, 3, 4, 5]]) # Just an example! Use your own data.
images = torch.randn(1, 3, 224, 224) # This too!
# Let Med-PaLM work its magic!
output = model(text_tokens, images)
# Voila! 🎉
print(output)
```
📝 Note: Modify the examples to suit your data and project needs.
## 📚 Datasets
- Wanna deep-dive? [Click here for a dive into dataset strategies](docs/DATASETS.md)
## 🏛️ System Architecture
Med-PaLM is here to be the bridge 🌉 between the vast world of medical data types. From text 📜 to images 🖼️ and even genomic data 🧬, we've got you covered!
Our superstar? A massive multimodal generative model! 🌟 Trained on the swanky MultiMedBench, it's geared to tackle diverse tasks like medical Q&A, mammography interpretation, and even genomic variant calling!
## 💼 Commercial Use-Cases
Med-PaLM isn't just fun, it's super useful! 🛍️
- **Clinical Diagnostics**: Combining medical imaging, patient tales 📖, and genes, we're aiming for top-notch diagnostic solutions.
- **Healthcare Research**: Dive deep into diverse datasets and discover something new with Med-PaLM by your side! 🤿
- **Telemedicine**: Quick, reliable, and remote! 🌍 Med-PaLM's here to revolutionize telehealth.
# Contributing to Med Palm 🤖🌟
First off, big high fives 🙌 and thank you for considering a contribution to Pali! Your help and enthusiasm can truly elevate this project. Whether you're fixing bugs 🐛, adding features 🎁, or just providing feedback, every bit matters! Here's a step-by-step guide to make your contribution journey smooth:
## 1. Set the Stage 🎬
**Fork the Repository:** Before you dive in, create a fork of the Pali repository. This gives you your own workspace where you can make changes without affecting the main project.
1. Go to the top right corner of the Pali repo.
2. Click on the "Fork" button.
Boom! You now have a copy on your GitHub account.
## 2. Clone & Set Up 🚀
**Clone Your Fork:**
```bash
git clone https://github.com/YOUR_USERNAME/pali.git
cd pali
```
**Connect with the Main Repo:** To fetch updates from the main Pali repository, set it up as a remote:
```bash
git remote add upstream https://github.com/original_pali_repo/pali.git
```
## 3. Make Your Magic ✨
Create a new branch for your feature, bugfix, or whatever you're looking to contribute:
```bash
git checkout -b feature/my-awesome-feature
```
Now, dive into the code and sprinkle your magic!
## 4. Stay Updated 🔄
While you're working, the main Pali repository might have updates. Keep your local copy in sync:
```bash
git fetch upstream
git merge upstream/main
```
## 5. Share Your Brilliance 🎁
Once you've made your changes:
1. **Stage & Commit:**
```bash
git add .
git commit -m "Add my awesome feature"
```
2. **Push to Your Fork:**
```bash
git push origin feature/my-awesome-feature
```
3. **Create a Pull Request:** Head back to your fork on GitHub, and you'll see a "New Pull Request" button. Click on it!
## 6. The Review Dance 💃🕺
Once your PR is submitted, our Pali team will review it. They might have questions or feedback. Stay engaged, discuss, and make any needed changes. Collaboration is key! 🤝
## 7. Celebrate & Wait 🎉
After review and any necessary tweaks, your contribution will be merged. Pat yourself on the back and celebrate! 🎊
## 8. Spread the Word 📢
Share about your contribution with your network. The more the merrier! Plus, it feels good to show off a bit, right? 😉
Remember, every contribution, no matter how small or large, is valued and appreciated. It's the collective effort that makes open-source so vibrant and impactful. Thanks for being a part of the Pali adventure! 🌟🚀
----
## 📜 License
Med-PaLM's chillin' under the MIT license. Check out the details [here](LICENSE.md).
## 🎉 A Big Thank You!
A thunderous applause 👏 for the amazing clinicians and data wizards who've made Med-PaLM what it is today. We're on a mission to reshape healthcare, and every bit of your expertise has been invaluable!
So, let's dive into the world of biomedicine with Med-PaLM! 🎈🥳 | [
"MEDICAL DATA"
] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1434 | Lots-of-LoRAs | null | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"arxiv:2407.00066",
"license:mit",
"region:us"
] | "2024-06-18T19:47:44Z" | 2024-07-03T20:11:16+00:00 | 0 | 0 | ---
language: en
library_name: pytorch
license: mit
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1434
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task1434_head_qa_classification
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
https://huggingface.co/datasets/Lots-of-LoRAs/task1434_head_qa_classification sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{brüelgabrielsson2024compressserveservingthousands,
title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead},
author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon},
year={2024},
eprint={2407.00066},
archivePrefix={arXiv},
primaryClass={cs.DC},
url={https://arxiv.org/abs/2407.00066},
}
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"HEAD-QA"
] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1482 | Lots-of-LoRAs | null | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"arxiv:2407.00066",
"license:mit",
"region:us"
] | "2024-06-18T19:48:10Z" | 2024-07-03T20:18:24+00:00 | 0 | 0 | ---
language: en
library_name: pytorch
license: mit
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1482
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task1482_gene_extraction_chemprot_dataset
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
https://huggingface.co/datasets/Lots-of-LoRAs/task1482_gene_extraction_chemprot_dataset sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{brüelgabrielsson2024compressserveservingthousands,
title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead},
author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon},
year={2024},
eprint={2407.00066},
archivePrefix={arXiv},
primaryClass={cs.DC},
url={https://arxiv.org/abs/2407.00066},
}
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"CHEMPROT"
] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1483 | Lots-of-LoRAs | null | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"arxiv:2407.00066",
"license:mit",
"region:us"
] | "2024-06-18T19:49:54Z" | 2024-07-03T20:16:22+00:00 | 0 | 0 | ---
language: en
library_name: pytorch
license: mit
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1483
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task1483_chemical_extraction_chemprot_dataset
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
https://huggingface.co/datasets/Lots-of-LoRAs/task1483_chemical_extraction_chemprot_dataset sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{brüelgabrielsson2024compressserveservingthousands,
title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead},
author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon},
year={2024},
eprint={2407.00066},
archivePrefix={arXiv},
primaryClass={cs.DC},
url={https://arxiv.org/abs/2407.00066},
}
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"CHEMPROT"
] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1486 | Lots-of-LoRAs | null | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"arxiv:2407.00066",
"license:mit",
"region:us"
] | "2024-06-18T20:03:10Z" | 2024-07-03T20:34:24+00:00 | 0 | 0 | ---
language: en
library_name: pytorch
license: mit
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1486
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task1486_cell_extraction_anem_dataset
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
https://huggingface.co/datasets/Lots-of-LoRAs/task1486_cell_extraction_anem_dataset sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{brüelgabrielsson2024compressserveservingthousands,
title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead},
author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon},
year={2024},
eprint={2407.00066},
archivePrefix={arXiv},
primaryClass={cs.DC},
url={https://arxiv.org/abs/2407.00066},
}
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"ANEM"
] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task846 | Lots-of-LoRAs | null | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"arxiv:2407.00066",
"license:mit",
"region:us"
] | "2024-06-18T20:10:15Z" | 2024-07-03T20:04:30+00:00 | 0 | 0 | ---
language: en
library_name: pytorch
license: mit
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task846
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task846_pubmedqa_classification
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
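An official snippet has not been added yet; the minimal sketch below assumes the standard Transformers + PEFT workflow (the prompt is only an illustrative placeholder).

```python
# Minimal sketch (not an official example): attach this LoRA adapter to its base model.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task846"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)

# Placeholder prompt: real inputs should follow the task846 (PubMedQA classification) format.
prompt = "Answer yes, no, or maybe to the research question given the abstract: ..."
inputs = tokenizer(prompt, return_tensors="pt").to(base_model.device)
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```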
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
https://huggingface.co/datasets/Lots-of-LoRAs/task846_pubmedqa_classification sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{brüelgabrielsson2024compressserveservingthousands,
title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead},
author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon},
year={2024},
eprint={2407.00066},
archivePrefix={arXiv},
primaryClass={cs.DC},
url={https://arxiv.org/abs/2407.00066},
}
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"PUBMEDQA"
] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task594 | Lots-of-LoRAs | null | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"arxiv:2407.00066",
"license:mit",
"region:us"
] | "2024-06-18T20:11:24Z" | 2024-07-03T20:04:00+00:00 | 0 | 0 | ---
language: en
library_name: pytorch
license: mit
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task594
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task594_sciq_question_generation
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
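An official snippet has not been added yet; the minimal sketch below assumes the standard Transformers + PEFT workflow (the prompt is only an illustrative placeholder).

```python
# Minimal sketch (not an official example): attach this LoRA adapter to its base model.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task594"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)

# Placeholder prompt: real inputs should follow the task594 (SciQ question generation) format.
prompt = "Generate a question whose answer is the following scientific fact: ..."
inputs = tokenizer(prompt, return_tensors="pt").to(base_model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```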
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
https://huggingface.co/datasets/Lots-of-LoRAs/task594_sciq_question_generation sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{brüelgabrielsson2024compressserveservingthousands,
title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead},
author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon},
year={2024},
eprint={2407.00066},
archivePrefix={arXiv},
primaryClass={cs.DC},
url={https://arxiv.org/abs/2407.00066},
}
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"SCIQ"
] |
frankmorales2020/Meta-Llama-3-8B-MEDAL-flash-attention-2-cosine | frankmorales2020 | null | [
"peft",
"tensorboard",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"dataset:generator",
"base_model:meta-llama/Meta-Llama-3-8B",
"base_model:adapter:meta-llama/Meta-Llama-3-8B",
"license:llama3",
"region:us"
] | "2024-06-19T15:44:15Z" | 2024-06-19T19:07:44+00:00 | 0 | 0 | ---
base_model: meta-llama/Meta-Llama-3-8B
datasets:
- generator
library_name: peft
license: llama3
tags:
- trl
- sft
- generated_from_trainer
model-index:
- name: Meta-Llama-3-8B-MEDAL-flash-attention-2-cosine
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Meta-Llama-3-8B-MEDAL-flash-attention-2-cosine
This model is a fine-tuned version of [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) on the generator dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 0.0002
- train_batch_size: 3
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 6
- total_train_batch_size: 18
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 10
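These values map roughly onto `transformers.TrainingArguments` as sketched below (illustrative only; the exact training script and PEFT/SFT configuration are not reproduced here). A per-device batch size of 3 with 6 gradient-accumulation steps yields the effective batch size of 18, and the Adam betas/epsilon listed above are the library defaults.

```python
# Illustrative mapping of the hyperparameters above onto transformers.TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Meta-Llama-3-8B-MEDAL-flash-attention-2-cosine",  # illustrative output path
    learning_rate=2e-4,
    per_device_train_batch_size=3,   # x 6 accumulation steps = effective batch size 18
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=6,
    lr_scheduler_type="cosine",
    warmup_ratio=0.03,
    num_train_epochs=10,
    seed=42,
)
```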
### Training results
### Framework versions
- PEFT 0.11.1
- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1 | [
"MEDAL"
] |
Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1449 | Lots-of-LoRAs | null | [
"pytorch",
"safetensors",
"en",
"arxiv:1910.09700",
"arxiv:2407.00066",
"license:mit",
"region:us"
] | "2024-06-20T15:18:30Z" | 2024-07-03T20:26:35+00:00 | 0 | 0 | ---
language: en
library_name: pytorch
license: mit
---
# Model Card for Mistral-7B-Instruct-v0.2-4b-r16-task1449
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
LoRA trained on task1449_disease_entity_extraction_bc5cdr_dataset
- **Developed by:** bruel
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** LoRA
- **Language(s) (NLP):** en
- **License:** mit
- **Finetuned from model [optional]:** mistralai/Mistral-7B-Instruct-v0.2
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** https://github.com/bruel-gabrielsson
- **Paper [optional]:** "Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead" (2024), Rickard Brüel Gabrielsson, Jiacheng Zhu, Onkar Bhardwaj, Leshem Choshen, Kristjan Greenewald, Mikhail Yurochkin and Justin Solomon
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
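An official snippet has not been added yet; the minimal sketch below assumes the standard Transformers + PEFT workflow (the prompt is only an illustrative placeholder).

```python
# Minimal sketch (not an official example): attach this LoRA adapter to its base model.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"
adapter_id = "Lots-of-LoRAs/Mistral-7B-Instruct-v0.2-4b-r16-task1449"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)

# Placeholder prompt: real inputs should follow the task1449 (BC5CDR disease extraction) format.
prompt = "List the disease entities mentioned in the following passage: ..."
inputs = tokenizer(prompt, return_tensors="pt").to(base_model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```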
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
https://huggingface.co/datasets/Lots-of-LoRAs/task1449_disease_entity_extraction_bc5cdr_dataset sourced from https://github.com/allenai/natural-instructions
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{brüelgabrielsson2024compressserveservingthousands,
title={Compress then Serve: Serving Thousands of LoRA Adapters with Little Overhead},
author={Rickard Brüel-Gabrielsson and Jiacheng Zhu and Onkar Bhardwaj and Leshem Choshen and Kristjan Greenewald and Mikhail Yurochkin and Justin Solomon},
year={2024},
eprint={2407.00066},
archivePrefix={arXiv},
primaryClass={cs.DC},
url={https://arxiv.org/abs/2407.00066},
}
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] | [
"BC5CDR"
] |
frankmorales2020/POC-Meta-Llama-3-8B-MEDAL-flash-attention-2-cosine-evaldata | frankmorales2020 | null | [
"peft",
"tensorboard",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"dataset:generator",
"base_model:meta-llama/Meta-Llama-3-8B",
"base_model:adapter:meta-llama/Meta-Llama-3-8B",
"license:llama3",
"region:us"
] | "2024-06-23T00:28:13Z" | 2024-06-23T20:45:19+00:00 | 0 | 0 | ---
base_model: meta-llama/Meta-Llama-3-8B
datasets:
- generator
library_name: peft
license: llama3
tags:
- trl
- sft
- generated_from_trainer
model-index:
- name: POC-Meta-Llama-3-8B-MEDAL-flash-attention-2-cosine-evaldata
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# POC-Meta-Llama-3-8B-MEDAL-flash-attention-2-cosine-evaldata
This model is a fine-tuned version of [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) on the generator dataset.
It achieves the following results on the evaluation set:
- Loss: 2.2897
## Model description
Article: https://medium.com/@frankmorales_91352/fine-tuning-meta-llama-3-8b-with-medal-a-refined-approach-for-enhanced-medical-language-b924d226b09d
## Training and evaluation data
Fine-Tuning: https://github.com/frank-morales2020/MLxDL/blob/main/FineTuning_LLM_Meta_Llama_3_8B_for_MEDAL_EVALDATA.ipynb
Evaluation: https://github.com/frank-morales2020/MLxDL/blob/main/Meta_Llama_3_8B_for_MEDAL_EVALUATOR_evaldata.ipynb
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.15
- num_epochs: 0.3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 2.7645 | 0.0069 | 100 | 2.6720 |
| 2.5917 | 0.0138 | 200 | 2.5243 |
| 2.5054 | 0.0207 | 300 | 2.4705 |
| 2.4406 | 0.0277 | 400 | 2.4379 |
| 2.4272 | 0.0346 | 500 | 2.4136 |
| 2.4171 | 0.0415 | 600 | 2.3942 |
| 2.3908 | 0.0484 | 700 | 2.3793 |
| 2.3808 | 0.0553 | 800 | 2.3664 |
| 2.3588 | 0.0622 | 900 | 2.3571 |
| 2.3595 | 0.0692 | 1000 | 2.3494 |
| 2.3411 | 0.0761 | 1100 | 2.3421 |
| 2.3308 | 0.0830 | 1200 | 2.3369 |
| 2.3358 | 0.0899 | 1300 | 2.3320 |
| 2.3295 | 0.0968 | 1400 | 2.3270 |
| 2.337 | 0.1037 | 1500 | 2.3228 |
| 2.3182 | 0.1106 | 1600 | 2.3195 |
| 2.3334 | 0.1176 | 1700 | 2.3161 |
| 2.3278 | 0.1245 | 1800 | 2.3128 |
| 2.3151 | 0.1314 | 1900 | 2.3101 |
| 2.3245 | 0.1383 | 2000 | 2.3075 |
| 2.3073 | 0.1452 | 2100 | 2.3053 |
| 2.3094 | 0.1521 | 2200 | 2.3036 |
| 2.3101 | 0.1590 | 2300 | 2.3013 |
| 2.3102 | 0.1660 | 2400 | 2.2995 |
| 2.3042 | 0.1729 | 2500 | 2.2980 |
| 2.2942 | 0.1798 | 2600 | 2.2965 |
| 2.2876 | 0.1867 | 2700 | 2.2951 |
| 2.3077 | 0.1936 | 2800 | 2.2941 |
| 2.2851 | 0.2005 | 2900 | 2.2931 |
| 2.2766 | 0.2075 | 3000 | 2.2923 |
| 2.2873 | 0.2144 | 3100 | 2.2916 |
| 2.2971 | 0.2213 | 3200 | 2.2910 |
| 2.2942 | 0.2282 | 3300 | 2.2906 |
| 2.2872 | 0.2351 | 3400 | 2.2903 |
| 2.2996 | 0.2420 | 3500 | 2.2901 |
| 2.2855 | 0.2489 | 3600 | 2.2899 |
| 2.2969 | 0.2559 | 3700 | 2.2898 |
| 2.2871 | 0.2628 | 3800 | 2.2898 |
| 2.2905 | 0.2697 | 3900 | 2.2897 |
| 2.2915 | 0.2766 | 4000 | 2.2897 |
| 2.2921 | 0.2835 | 4100 | 2.2897 |
| 2.3087 | 0.2904 | 4200 | 2.2897 |
| 2.3017 | 0.2974 | 4300 | 2.2897 |
### Framework versions
- PEFT 0.11.1
- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1 | [
"MEDAL"
] |
Kalray/densenet-121 | Kalray | image-classification | [
"onnx",
"image-classification",
"dataset:ILSVRC/imagenet-1k",
"arxiv:1608.06993",
"arxiv:1808.03570",
"license:apache-2.0",
"region:us"
] | "2024-06-25T06:33:34Z" | 2024-09-10T16:31:33+00:00 | 0 | 0 | ---
datasets:
- ILSVRC/imagenet-1k
license: apache-2.0
pipeline_tag: image-classification
---
# Introduction
This repository stores the model for Densenet-121, compatible with Kalray's neural network API. </br>
Please see www.github.com/kalray/kann-models-zoo for details and proper usage. </br>
# Contents
- ONNX: densenet121s.onnx
- Quantized ONNX (INT8): densenet121s-q.onnx
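For deployment on Kalray hardware, follow the kann-models-zoo instructions linked above. As a quick sanity check of the ONNX file itself, a generic ONNX Runtime session can be used (a minimal sketch assuming a standard 224x224 ImageNet-style NCHW input; this is not the Kalray KaNN API):

```python
# Minimal sanity-check sketch using plain ONNX Runtime (not the Kalray KaNN API).
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("densenet121s.onnx")
input_name = session.get_inputs()[0].name

# Assumes an ImageNet-style NCHW float input; adapt preprocessing to the actual export.
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
logits = session.run(None, {input_name: dummy})[0]
print("output shape:", logits.shape)  # expected: (1, 1000) class scores
```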
# Lecture note reference
- Densely Connected Convolutional Networks, https://arxiv.org/pdf/1608.06993.pdf
# Repository or links references
- [PyTorch | TorchVision](https://pytorch.org/vision/stable/models/generated/torchvision.models.densenet121.html#torchvision.models.densenet121)
# BibTeX entry and citation info
```
@article{DBLP:journals/corr/abs-1808-03570,
author = {Chia{-}Yu Li and
Ngoc Thang Vu},
title = {Densely Connected Convolutional Networks for Speech Recognition},
journal = {CoRR},
volume = {abs/1808.03570},
year = {2018},
url = {http://arxiv.org/abs/1808.03570},
eprinttype = {arXiv},
eprint = {1808.03570},
timestamp = {Sun, 02 Sep 2018 15:01:55 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1808-03570.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
Authors:
+ [email protected]
+ [email protected] | [
"CHIA"
] |
Kalray/densenet-169 | Kalray | image-classification | [
"onnx",
"image-classification",
"dataset:ILSVRC/imagenet-1k",
"arxiv:1608.06993",
"arxiv:1808.03570",
"license:apache-2.0",
"region:us"
] | "2024-06-25T06:34:38Z" | 2024-09-20T10:06:17+00:00 | 0 | 0 | ---
datasets:
- ILSVRC/imagenet-1k
license: apache-2.0
pipeline_tag: image-classification
---
# Introduction
This repository stores the model for Densenet-169, compatible with Kalray's neural network API. </br>
Please see www.github.com/kalray/kann-models-zoo for details and proper usage. </br>
# Contents
- ONNX: densenet169s.onnx
- Quantized ONNX (INT8): densenet169s-q.onnx
# Lecture note reference
- Densely Connected Convolutional Networks, https://arxiv.org/pdf/1608.06993.pdf
# Repository or links references
- [PyTorch | TorchVision](https://pytorch.org/vision/stable/models/generated/torchvision.models.densenet169.html#torchvision.models.densenet169)
# BibTeX entry and citation info
```
@article{DBLP:journals/corr/abs-1808-03570,
author = {Chia{-}Yu Li and
Ngoc Thang Vu},
title = {Densely Connected Convolutional Networks for Speech Recognition},
journal = {CoRR},
volume = {abs/1808.03570},
year = {2018},
url = {http://arxiv.org/abs/1808.03570},
eprinttype = {arXiv},
eprint = {1808.03570},
timestamp = {Sun, 02 Sep 2018 15:01:55 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1808-03570.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
Authors:
+ [email protected]
+ [email protected] | [
"CHIA"
] |
Darshan7575/librispeech_100_multiconvformer_ctcatt_conv_fusion | Darshan7575 | automatic-speech-recognition | [
"espnet",
"audio",
"automatic-speech-recognition",
"en",
"dataset:librispeech_100",
"arxiv:1804.00015",
"license:cc-by-4.0",
"region:us"
] | "2024-06-25T09:49:02Z" | 2024-07-22T14:37:33+00:00 | 0 | 0 | ---
datasets:
- librispeech_100
language: en
license: cc-by-4.0
tags:
- espnet
- audio
- automatic-speech-recognition
---
## ESPnet2 ASR model
### `Darshan7575/librispeech_100_multiconvformer_ctcatt_conv_fusion`
This model was trained by Darshan Prabhu using the librispeech_100 recipe in [espnet](https://github.com/espnet/espnet/).
### Demo: How to use in ESPnet2
Follow the [ESPnet installation instructions](https://espnet.github.io/espnet/installation.html)
if you haven't done that already.
```bash
cd espnet
git checkout b8b31c06da081ee1690c8dc1bf102a7811124529
pip install -e .
cd egs2/librispeech_100/asr1
./run.sh --skip_data_prep false --skip_train true --download_model Darshan7575/librispeech_100_multiconvformer_ctcatt_conv_fusion
```
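The packed model can also be used directly from Python via `espnet_model_zoo` (a minimal sketch; it assumes `espnet_model_zoo` and `soundfile` are installed and that the uploaded package works with `Speech2Text.from_pretrained`):

```python
# Minimal sketch: decode a single utterance with the pretrained model.
import soundfile as sf
from espnet2.bin.asr_inference import Speech2Text

speech2text = Speech2Text.from_pretrained(
    "Darshan7575/librispeech_100_multiconvformer_ctcatt_conv_fusion"
)

speech, rate = sf.read("sample.wav")  # 16 kHz mono audio expected
nbests = speech2text(speech)
text, *_ = nbests[0]
print(text)
```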
<!-- Generated by scripts/utils/show_asr_result.sh -->
# RESULTS
## Environments
- date: `Sun Jan 28 23:50:53 UTC 2024`
- python version: `3.9.16 (main, Mar 8 2023, 14:00:05) [GCC 11.2.0]`
- espnet version: `espnet 202304`
- pytorch version: `pytorch 2.1.2+cu118`
- Git hash: `3651c2e67126c4544820cf148407be7f2679866c`
- Commit date: `Sat Jul 1 14:46:46 2023 +0000`
## exp/librispeech_100_multiconvformer_conv_fusion
### WER
|dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err|
|---|---|---|---|---|---|---|---|---|
|decode_asr_lm_lm_train_en_bpe5000_valid.loss.ave_asr_model_valid.acc.ave/dev_clean|2703|54402|94.8|4.8|0.3|0.7|5.9|53.8|
|decode_asr_lm_lm_train_en_bpe5000_valid.loss.ave_asr_model_valid.acc.ave/dev_other|2864|50948|85.4|13.2|1.4|2.0|16.6|78.8|
|decode_asr_lm_lm_train_en_bpe5000_valid.loss.ave_asr_model_valid.acc.ave/test_clean|2620|52576|94.5|5.0|0.4|0.7|6.2|55.5|
|decode_asr_lm_lm_train_en_bpe5000_valid.loss.ave_asr_model_valid.acc.ave/test_other|2939|52343|85.0|13.6|1.5|2.0|17.0|80.5|
### CER
|dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err|
|---|---|---|---|---|---|---|---|---|
|decode_asr_lm_lm_train_en_bpe5000_valid.loss.ave_asr_model_valid.acc.ave/dev_clean|2703|288456|98.3|1.0|0.7|0.6|2.3|53.8|
|decode_asr_lm_lm_train_en_bpe5000_valid.loss.ave_asr_model_valid.acc.ave/dev_other|2864|265951|93.6|4.0|2.4|2.0|8.4|78.8|
|decode_asr_lm_lm_train_en_bpe5000_valid.loss.ave_asr_model_valid.acc.ave/test_clean|2620|281530|98.3|1.0|0.7|0.6|2.4|55.5|
|decode_asr_lm_lm_train_en_bpe5000_valid.loss.ave_asr_model_valid.acc.ave/test_other|2939|272758|93.6|3.8|2.6|1.9|8.2|80.5|
### TER
|dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err|
|---|---|---|---|---|---|---|---|---|
|decode_asr_lm_lm_train_en_bpe5000_valid.loss.ave_asr_model_valid.acc.ave/dev_clean|2703|69558|92.5|4.7|2.8|0.6|8.1|53.8|
|decode_asr_lm_lm_train_en_bpe5000_valid.loss.ave_asr_model_valid.acc.ave/dev_other|2864|64524|82.0|12.9|5.0|2.4|20.4|78.8|
|decode_asr_lm_lm_train_en_bpe5000_valid.loss.ave_asr_model_valid.acc.ave/test_clean|2620|66983|92.4|4.8|2.8|0.6|8.2|55.5|
|decode_asr_lm_lm_train_en_bpe5000_valid.loss.ave_asr_model_valid.acc.ave/test_other|2939|66650|81.6|12.9|5.5|2.2|20.6|80.5|
## ASR config
<details><summary>expand</summary>
```
config: conf/tuning/train_asr_multiconvformer_conv_fusion_linear1024.yaml
print_config: false
log_level: INFO
dry_run: false
iterator_type: sequence
output_dir: exp/librispeech_100_multiconvformer_conv_fusion
ngpu: 1
seed: 2022
num_workers: 4
num_att_plot: 3
dist_backend: nccl
dist_init_method: env://
dist_world_size: null
dist_rank: null
local_rank: 0
dist_master_addr: null
dist_master_port: null
dist_launcher: null
multiprocessing_distributed: false
unused_parameters: false
sharded_ddp: false
cudnn_enabled: true
cudnn_benchmark: false
cudnn_deterministic: true
collect_stats: false
write_collected_feats: false
max_epoch: 70
patience: null
val_scheduler_criterion:
- valid
- loss
early_stopping_criterion:
- valid
- loss
- min
best_model_criterion:
- - valid
- acc
- max
keep_nbest_models: 10
nbest_averaging_interval: 0
grad_clip: 5.0
grad_clip_type: 2.0
grad_noise: false
accum_grad: 4
no_forward_run: false
resume: true
train_dtype: float32
use_amp: false
log_interval: null
use_matplotlib: true
use_tensorboard: true
create_graph_in_tensorboard: false
use_wandb: false
wandb_project: null
wandb_id: null
wandb_entity: null
wandb_name: null
wandb_model_log_interval: -1
detect_anomaly: false
pretrain_path: null
init_param: []
ignore_init_mismatch: false
freeze_param: []
num_iters_per_epoch: null
batch_size: 20
valid_batch_size: null
batch_bins: 16000000
valid_batch_bins: null
train_shape_file:
- exp/asr_stats_raw_en_bpe5000_sp/train/speech_shape
- exp/asr_stats_raw_en_bpe5000_sp/train/text_shape.bpe
valid_shape_file:
- exp/asr_stats_raw_en_bpe5000_sp/valid/speech_shape
- exp/asr_stats_raw_en_bpe5000_sp/valid/text_shape.bpe
batch_type: numel
valid_batch_type: null
fold_length:
- 80000
- 150
sort_in_batch: descending
sort_batch: descending
multiple_iterator: false
chunk_length: 500
chunk_shift_ratio: 0.5
num_cache_chunks: 1024
chunk_excluded_key_prefixes: []
train_data_path_and_name_and_type:
- - dump/raw/train_clean_100_sp/wav.scp
- speech
- kaldi_ark
- - dump/raw/train_clean_100_sp/text
- text
- text
valid_data_path_and_name_and_type:
- - dump/raw/dev/wav.scp
- speech
- kaldi_ark
- - dump/raw/dev/text
- text
- text
allow_variable_data_keys: false
max_cache_size: 0.0
max_cache_fd: 32
valid_max_cache_size: null
exclude_weight_decay: false
exclude_weight_decay_conf: {}
optim: adam
optim_conf:
lr: 0.002
weight_decay: 1.0e-06
scheduler: warmuplr
scheduler_conf:
warmup_steps: 15000
token_list:
- <blank>
- <unk>
- ▁THE
- S
- ▁AND
- ▁OF
- ▁TO
- ▁A
- ▁IN
- ED
- ▁I
- ▁HE
- ▁WAS
- ▁THAT
- ING
- ▁IT
- ''''
- ▁HIS
- ▁HAD
- ▁WITH
- ▁YOU
- ▁FOR
- T
- ▁AS
- ▁HER
- LY
- ▁NOT
- ▁BUT
- ▁SHE
- ▁BE
- D
- E
- ▁IS
- ▁AT
- ▁ON
- ▁HIM
- ▁THEY
- ▁BY
- ▁HAVE
- Y
- ▁MY
- ▁SO
- ▁ALL
- ▁THIS
- ▁WERE
- ▁WHICH
- ▁ME
- ▁FROM
- ▁ONE
- ▁SAID
- ▁WE
- N
- ER
- ▁NO
- ▁THERE
- ▁WHEN
- ▁AN
- ▁THEIR
- ▁OR
- ▁WOULD
- ▁WHO
- ▁THEM
- R
- ▁IF
- ▁WHAT
- ▁ARE
- ▁BEEN
- ▁OUT
- ▁UP
- M
- ▁WILL
- ▁DO
- ▁MAN
- ▁COULD
- C
- ▁THEN
- ▁INTO
- ▁MORE
- ▁SOME
- ES
- P
- ▁VERY
- ▁NOW
- ▁YOUR
- ▁LITTLE
- ▁TIME
- ▁ABOUT
- ▁DID
- ▁THAN
- ▁LIKE
- ▁HAS
- L
- G
- AL
- IN
- ▁UPON
- ▁CAN
- ▁WELL
- ▁OTHER
- ▁OVER
- US
- ▁TWO
- ▁ONLY
- ▁ANY
- ▁OUR
- O
- EN
- RE
- ▁MADE
- U
- ▁AFTER
- ▁SEE
- ▁S
- ▁DOWN
- ▁BEFORE
- LL
- ST
- B
- ▁OLD
- ▁DAY
- ▁MISS
- ▁GREAT
- ▁US
- ▁KNOW
- OR
- ▁SUCH
- ▁GOOD
- ▁WAY
- A
- ▁THESE
- ▁CAME
- ▁UN
- ▁SHOULD
- ▁HOW
- ▁MISTER
- ▁GO
- ▁MUCH
- ▁WHERE
- ▁MUST
- ▁NEVER
- ▁COME
- ▁BACK
- ION
- 'ON'
- ▁LONG
- F
- ▁AGAIN
- ▁FIRST
- LE
- ▁MEN
- ▁EVEN
- NESS
- ▁MIGHT
- ▁OWN
- ▁MAY
- K
- ▁HIMSELF
- ▁SAY
- ▁JUST
- ▁THROUGH
- ▁RE
- ▁AM
- ▁ITS
- ▁WENT
- ▁THOUGHT
- ▁
- ▁DE
- ▁MAKE
- I
- ▁HAND
- ▁THINK
- ▁HOUSE
- ▁HERE
- IC
- H
- ATION
- ▁LIFE
- IT
- ▁EYES
- ▁MOST
- ▁WITHOUT
- ▁TOO
- ▁THOSE
- ABLE
- ▁EVERY
- ▁DON
- ▁MANY
- ▁AWAY
- ITY
- VE
- W
- ▁STILL
- ▁BEING
- ▁C
- ▁LAST
- ▁NIGHT
- ▁O
- ▁HEAD
- AN
- ▁FOUND
- ▁NOTHING
- ▁YOUNG
- ▁WHILE
- ▁TAKE
- ▁GET
- ▁PEOPLE
- RO
- ▁OFF
- ▁THOUGH
- EST
- ▁YET
- ▁THREE
- TH
- ▁RIGHT
- ▁UNDER
- AR
- ▁FACE
- IES
- ▁ROOM
- ▁NEW
- ▁SAW
- RA
- V
- ▁ASKED
- ▁TELL
- ERS
- ▁SAME
- MENT
- ▁HEART
- LESS
- ▁WORK
- ▁PLACE
- ▁ANOTHER
- ▁EVER
- ▁LEFT
- ▁SHALL
- ▁FATHER
- ▁PUT
- ▁ONCE
- ▁TOOK
- ▁LET
- ▁ALWAYS
- ▁SEEMED
- ▁PART
- IL
- UR
- ▁WHY
- ▁TOLD
- ▁GIVE
- ▁LOVE
- CE
- ▁MIND
- ▁LOOKED
- ▁HEARD
- ▁SOON
- ▁LOOK
- ▁MOTHER
- ▁FAR
- IVE
- ▁BECAUSE
- ▁HOME
- OUS
- ▁T
- EL
- ▁D
- ▁SOMETHING
- ▁SIDE
- ▁KING
- IS
- ATE
- ▁MOMENT
- ENT
- RY
- ▁THINGS
- ▁ST
- ▁LIGHT
- ▁FIND
- ▁GOING
- ▁THING
- ▁WORLD
- IR
- AT
- ▁WATER
- ▁END
- ▁DOOR
- ISH
- ▁KNEW
- ▁WOMAN
- ▁SIR
- ▁EACH
- RI
- ▁HAVING
- ▁AGAINST
- ▁FEW
- ▁E
- ▁BEGAN
- ▁BETTER
- ▁YES
- ▁NAME
- ▁ENOUGH
- ET
- ▁HARD
- ▁VOICE
- ▁YEARS
- ▁GOT
- ▁WHOLE
- ▁WHITE
- ▁WANT
- ▁GIRL
- ▁DONE
- ▁SEEN
- ▁HUNDRED
- ▁CALLED
- ▁BETWEEN
- ▁MORNING
- FUL
- AS
- ▁FELT
- TER
- ▁KIND
- X
- CH
- ▁HERSELF
- ANT
- ▁TOWARD
- ▁HALF
- ▁OH
- ▁AMONG
- ▁HOWEVER
- ▁TURNED
- ▁ALSO
- ▁BOTH
- ▁POOR
- ▁PERHAPS
- ▁REPLIED
- ▁COURSE
- UL
- ▁QUITE
- ▁REST
- ▁DOES
- ▁MYSELF
- NG
- LO
- ANCE
- ▁MA
- ▁SET
- ▁SMALL
- ▁B
- ▁SURE
- ▁F
- ▁GAVE
- ▁PRESENT
- ▁HIGH
- ▁ALMO
- ▁R
- CK
- ▁WHOM
- ▁NEAR
- ▁CARE
- ▁WAR
- ▁GOD
- ▁TOGETHER
- ▁SAT
- ▁SHOW
- TE
- NE
- ▁BEST
- ▁UNTIL
- ▁OPEN
- ▁W
- ▁FOUR
- ▁DEAR
- ▁HANDS
- ▁WORDS
- ▁SINCE
- ▁LAND
- ▁DIS
- MAN
- ▁ANYTHING
- ▁FEET
- ▁NEXT
- ▁GENERAL
- LING
- ▁LAY
- ▁NOR
- ▁STOOD
- ▁BLACK
- ▁POWER
- ▁BROUGHT
- Z
- IE
- ▁ROUND
- ▁BELIEVE
- ▁LARGE
- ▁ALONG
- ▁HELP
- ▁DAYS
- ▁FIVE
- ▁K
- ▁HOPE
- AM
- ▁CO
- ▁KEEP
- ▁FULL
- ▁WALK
- ▁MASTER
- ATED
- ▁NATURE
- ▁JOHN
- ▁POINT
- ▁DUR
- ▁MATTER
- ▁MONEY
- ▁CHILD
- ▁LOOKING
- ▁RATHER
- ▁AIR
- IA
- ▁P
- ▁TWENTY
- ▁FIRE
- OL
- ▁LESS
- ▁SHORT
- ▁PASSED
- ▁INDEED
- TY
- ▁CASE
- ▁WORD
- ▁WISH
- ▁COUNTRY
- LED
- ID
- ▁BOY
- ▁SOUND
- ▁FORM
- ▁CRIED
- LA
- ▁FRIEND
- TON
- ▁FACT
- ▁UNCLE
- ▁TAKEN
- ▁AL
- ▁TEN
- IAN
- ▁GONE
- ▁SEA
- ▁REASON
- TING
- ▁WHOSE
- ▁OTHERS
- AC
- ▁LI
- ▁DEATH
- ▁CERTAIN
- ▁ANSWERED
- ▁THEMSELVES
- ▁LADY
- ▁STATE
- ▁CAR
- ▁WIFE
- ▁THOUSAND
- ▁TRUE
- ▁BEHIND
- AGE
- ▁DOCTOR
- ▁FEAR
- ▁OFTEN
- OM
- ▁TILL
- ▁HA
- IOUS
- ▁AROUND
- IST
- ▁SENT
- ▁SPEAK
- ▁WOMEN
- ▁GROUND
- VER
- ENCE
- NA
- ▁TALK
- ▁CHILDREN
- TION
- CO
- MO
- ▁HEAR
- ▁ORDER
- ▁LEAVE
- ▁PRO
- ▁ALREADY
- ▁LA
- ▁FINE
- SE
- ▁BA
- PP
- ▁THUS
- AD
- ▁NEED
- ▁SIGHT
- ▁CALL
- ▁FELL
- ▁MANNER
- MP
- ▁BECAME
- UM
- ▁WATCH
- OW
- ▁FOOT
- ▁CANNOT
- ▁BODY
- ▁TOWN
- ▁LIVE
- INE
- ▁RETURNED
- ▁WONDER
- MA
- ▁G
- UT
- ▁CLOSE
- UN
- IM
- ▁ALONE
- ▁DIDN
- ▁LORD
- ▁RED
- ARY
- ▁GIVEN
- ▁SIX
- ▁EVERYTHING
- ▁DARK
- ▁DEAD
- ▁STRONG
- ▁SON
- ▁COMING
- URE
- ▁HELD
- ▁ABOVE
- ▁REALLY
- ▁BEAUTIFUL
- ▁SECOND
- ARD
- ▁EVENING
- ▁CON
- ▁HOUR
- ▁FELLOW
- ▁ROSE
- ▁PERSON
- ▁EX
- ▁CH
- ▁FORCE
- ▁MO
- ▁ARM
- ▁CAUSE
- ▁TURN
- ▁CITY
- ▁DOUBT
- ▁QUESTION
- TIC
- ▁DEEP
- ▁HAIR
- ICAL
- ▁MEAN
- ▁DI
- ▁CLEAR
- ▁SOMETIMES
- ▁STRANGE
- ▁FEEL
- ▁HO
- ▁IMP
- WARD
- AUGHT
- ▁CAPTAIN
- ▁USE
- ▁UNDERSTAND
- ▁KEPT
- ▁BR
- ▁WOOD
- ▁PRE
- ▁YEAR
- ▁TI
- ▁LEAST
- ▁BED
- ▁SA
- ▁TABLE
- ▁BECOME
- ▁FREE
- ▁FAMILY
- ME
- ▁EYE
- ▁WHETHER
- ▁MAKING
- ▁WITHIN
- ▁SORT
- ▁ANSWER
- ▁PO
- ▁SAYS
- ▁EARTH
- ▁RETURN
- ▁SUDDENLY
- ▁FRIENDS
- ▁GREEN
- ▁SUN
- ▁FAIR
- ▁TH
- ▁FALL
- ▁EITHER
- ▁BO
- ▁PRINCE
- ▁THOU
- ▁ITSELF
- ▁CHURCH
- ▁BIG
- ▁ABLE
- ▁DIFFERENT
- ▁SEVERAL
- ▁DAUGHTER
- ▁WON
- ▁WIND
- ▁BAD
- ▁LOST
- ▁READ
- ▁STORY
- ▁APPEARED
- DE
- ▁NUMBER
- ▁SP
- ▁LOW
- ▁ROAD
- ▁POSSIBLE
- ▁HUMAN
- ▁RIVER
- ▁STREET
- ▁GA
- ▁COLD
- ▁MET
- ▁ACT
- ▁BROTHER
- ▁AGE
- ▁KNOWN
- ▁CONTINUED
- ▁BRING
- ▁ILL
- ▁RUN
- ▁LAW
- ▁SUBJECT
- ▁CUT
- J
- PER
- ▁PA
- ▁TROUBLE
- ▁GLAD
- HE
- ▁SLEEP
- MEN
- ▁LATE
- ▁MEANS
- ▁ASK
- ▁REACHED
- ▁RAN
- AK
- ▁HORSE
- ▁USED
- WAY
- OP
- ▁WINDOW
- ▁SNOW
- ▁PAST
- ▁OBJECT
- ▁THEREFORE
- IONS
- ▁TREE
- ▁COMP
- ▁BLUE
- CA
- ▁VI
- ▁SIGN
- ▁EIGHTEEN
- ▁GARDEN
- ▁BUSINESS
- ▁PETER
- ▁FOLLOWED
- ▁SEEM
- ▁HOLD
- ▁HAPPY
- ▁LONGER
- ▁ACROSS
- ▁BU
- BE
- ▁ELSE
- ▁PLAY
- ▁SOUL
- ▁STAND
- ▁ARMS
- ▁SCHOOL
- ▁PRINCESS
- ▁CERTAINLY
- LT
- ▁ENGLISH
- ▁SEVEN
- ▁PER
- ▁IDEA
- ▁LE
- ▁BOOK
- ▁FEELING
- ▁HUSBAND
- ▁LINE
- PT
- THOUGH
- ▁OUGHT
- ▁RICH
- IP
- ▁VIEW
- ▁DREAM
- ▁SENSE
- ▁LO
- ▁READY
- ▁CARRIED
- ▁M
- ▁REGARD
- ▁CHANCE
- ▁WANTED
- ▁LIVED
- ▁LATER
- ▁INTEREST
- ▁EN
- ▁EFFECT
- ▁CLA
- ▁CHANGE
- ▁CA
- ▁REAL
- ▁SUPPOSE
- LES
- ▁ART
- ▁TIMES
- ▁MAR
- IF
- ▁WILD
- ▁ADDED
- ▁LETTER
- IAL
- ▁THANK
- ▁PARTY
- LAND
- ▁PAY
- ▁BREATH
- ▁TAKING
- ▁COURT
- ▁COUNT
- ILY
- ▁COMMON
- ▁PUBLIC
- ▁PURPOSE
- ▁PRETTY
- ▁TRUTH
- ▁STAY
- ▁EM
- NT
- ▁SH
- ▁REMEMBER
- ▁ENTERED
- ▁RECEIVED
- RED
- ▁SPOKE
- ▁USUAL
- ▁THY
- ▁FIGURE
- ▁LED
- ▁TREES
- ▁TRIED
- ▁FORWARD
- NED
- ▁HAT
- ▁BLOOD
- ▁BEYOND
- ▁BANK
- ▁LIVING
- ▁JOY
- ▁HOURS
- ▁ENGLAND
- ▁STONE
- VI
- GE
- ▁SWEET
- ▁POSITION
- ▁FRONT
- ▁GIRLS
- ▁VISIT
- ▁CHARACTER
- ▁SPIRIT
- ▁TA
- BO
- QUE
- QUI
- ▁OPENED
- ▁OCCASION
- ▁MEET
- ▁EIGHT
- ▁REMAIN
- ▁PASS
- TO
- ▁NORTH
- ▁SERVICE
- ▁SISTER
- ▁SE
- ▁BEAR
- ▁PLEASURE
- ▁CHIEF
- ▁FOREST
- ▁BELL
- ▁EXPERIENCE
- ▁STRUCK
- ▁CARRY
- ORY
- ▁WARM
- 'NO'
- ▁WORTH
- ▁SAYING
- ▁SILENCE
- ▁CROSS
- ▁JE
- ▁H
- ▁BEAUTY
- PH
- ▁DEAL
- KE
- ▁SECRET
- DY
- ▁MILES
- ▁LU
- ▁DOING
- ▁BOYS
- ▁CROWD
- ▁ACCOUNT
- REW
- ISM
- TI
- ▁FE
- ▁NONE
- ▁RO
- ▁NEARLY
- ▁CHA
- ▁YOUTH
- ▁CAP
- HA
- ▁BIT
- ▁LIE
- ▁ATTENTION
- ▁STANDING
- ▁STAR
- ▁RESPECT
- ▁FURTHER
- ATIONS
- ▁ROCK
- ▁BOW
- EM
- ▁EARLY
- ▁MOUTH
- ▁BOAT
- UB
- ▁IMMEDIATELY
- ▁EXCEPT
- SHIP
- ▁PICTURE
- ▁BRIGHT
- ▁WA
- ▁GREW
- ▁LEAD
- ▁CUR
- ▁TONE
- RRY
- RS
- ▁WIDE
- CHE
- ▁FORTH
- IG
- OS
- ▁NEITHER
- ▁YOURSELF
- ▁SMILE
- ▁DRESS
- ▁OPINION
- ▁HAPPENED
- ▁WAIT
- ▁SIT
- ▁SHIP
- ▁AH
- ▁DESIRE
- ▁THICK
- ▁THIRD
- ▁GRAND
- ▁FOLLOW
- ▁GATHER
- ▁HILL
- ALLY
- ▁COMPANY
- ▁CHAIR
- DER
- ▁TOP
- ▁PAR
- ▁LENGTH
- ▁THIRTY
- ▁MINE
- ▁MI
- ▁EAT
- ▁EQUAL
- ▁AFRAID
- ▁FRESH
- ▁TAIL
- ▁FILLED
- ▁SU
- ▁MINUTES
- ▁FAST
- BU
- ▁ENTER
- ▁QUEEN
- ▁UTTER
- AG
- ▁FLOOR
- ▁SHA
- DI
- ▁HEAVEN
- ▁STOPPED
- ▁GUARD
- ▁HALL
- ▁BAR
- ▁COMPLETE
- ▁NINE
- ▁WEEK
- ▁GOLD
- VA
- ▁FIFTY
- ▁BEAT
- ▁PRESS
- ▁ATTEMPT
- ▁EXCLAIMED
- DO
- ▁CONF
- ▁SEEMS
- ▁STARTED
- ▁EL
- ▁HAR
- ▁EXPRESSION
- ▁TRA
- ▁WONDERFUL
- ▁SAINT
- ▁APPEARANCE
- ▁GRAVE
- ▁OFFICE
- ▁INSTEAD
- ▁SILENT
- ▁SOUTH
- ▁AGO
- ▁CAMP
- ▁LOVED
- ▁PATH
- ▁LEARN
- ▁PLAN
- ▁GOVERNMENT
- OUR
- PPED
- ▁SITTING
- ▁SEAT
- TEN
- RESS
- SIDE
- ▁MOVED
- ▁DIE
- ▁RESULT
- ▁SPRING
- ▁PLEASE
- ▁RI
- ▁NATURAL
- ▁ANNE
- ▁STA
- ▁CORNER
- ▁WALL
- ▁IMPOSSIBLE
- ▁BROWN
- ▁SUIT
- ▁MUSIC
- PI
- ▁TRY
- ▁DIED
- ▁TEARS
- ▁JU
- ▁COMFORT
- ▁DANGER
- ▁MEASURE
- ▁PROPERTY
- ▁BORN
- CON
- ▁CR
- ▁BROKEN
- ▁MASS
- EVER
- IER
- ▁EXPRESS
- ▁POCKET
- ▁SCARCE
- ▁SELF
- NY
- ▁MADAME
- ▁LAUGHED
- ▁TOUCH
- ▁APPEAR
- ▁LONDON
- ▁SAFE
- ▁SHARP
- ▁ATTACK
- ▁JANE
- ▁COVERED
- ▁OUTSIDE
- ▁WHATEVER
- ▁PLACED
- ▁RACE
- ▁SHORE
- ▁LAID
- ▁ROMAN
- ▁PERSONAL
- UP
- AU
- ▁REMAINED
- ▁HAPPINESS
- ▁AFTERNOON
- ▁DISTANCE
- ▁STORM
- ▁MARRIED
- ▁FRANK
- ▁VALLEY
- ▁BOUND
- ▁TALKING
- ▁JO
- ▁QUICK
- ▁STEP
- AND
- ▁ARMY
- ▁EFFORT
- ▁FRENCH
- ▁V
- LEY
- ▁PARTICULAR
- ▁START
- ATING
- OO
- LU
- ▁TRANS
- ▁HAPPEN
- ▁HABIT
- ▁VILLAGE
- ▁BELOW
- ▁GENTLEMAN
- BLE
- ▁BILL
- ▁SAVE
- ACT
- ▁SOCIETY
- ▁MAJOR
- ▁QUARTER
- ▁SKY
- ▁GUESS
- CY
- ▁SAD
- ILE
- ▁SL
- ▁PLEASANT
- ▁STRAIGHT
- ▁STRENGTH
- ▁FORTUNE
- ▁WRONG
- ▁COMMAND
- ▁BOX
- ▁QUIET
- ISE
- ▁JA
- IBLE
- ▁TREAT
- ▁GLANCE
- ▁NECESSARY
- ▁FORGET
- ▁MOUNTAIN
- ▁WINTER
- ▁DREW
- ▁WAV
- ▁PLAIN
- ▁ENTIRELY
- ▁TEA
- ▁SOFT
- ▁QUICKLY
- ▁INFLUENCE
- ▁DINNER
- ▁FOOD
- ▁CHAPTER
- ▁YE
- ▁REACH
- ▁GETT
- ▁PAPER
- ▁GIVING
- ▁BEGINNING
- ▁SEND
- ▁FIGHT
- ▁SCENE
- ▁RUSH
- ▁PI
- ▁MARK
- ▁NA
- ▁BROKE
- ▁CLASS
- ▁BATTLE
- ▁EASY
- ▁GROUP
- BY
- ▁STOP
- ▁DIRECTION
- ▁BESIDE
- ▁MOR
- HAM
- UFF
- ▁WEST
- ▁OBLIG
- ▁COLOR
- ▁SINGLE
- ▁EASILY
- ▁PALE
- ▁ACTION
- ▁INTER
- ▁STRANGER
- ▁WI
- ▁CONVERSATION
- ▁BLOW
- ▁MARY
- ▁MU
- ▁TERRIBLE
- ▁THINKING
- ▁PULL
- ▁MOON
- AB
- ▁REP
- ▁ESPECIALLY
- ▁HEAVY
- ▁SICK
- ▁LUCK
- ▁TRAIN
- ▁GUN
- ▁GU
- ▁WAITING
- ▁TURNING
- ITIES
- ▁BREAD
- ▁BELONG
- ▁LOUD
- ▁REPORT
- ▁AMERICAN
- ▁JOURNEY
- ▁ANXIOUS
- ▁LIPS
- ▁KILLED
- IGHT
- GO
- ▁CONSIDER
- ▁PROBABLY
- ▁PALACE
- ▁HISTORY
- ▁LAKE
- ▁SHUT
- ▁SIMPLY
- WA
- ▁PAIN
- ▁HORSES
- ▁SEEING
- FULLY
- ▁EXPECTED
- ▁EVIL
- ▁BURN
- ▁SIMPLE
- ▁DIRECT
- IFIED
- HER
- ▁SLOWLY
- ▁LEG
- UGH
- ▁SAIL
- RIC
- ▁WISHED
- ▁RULE
- ▁LAD
- ▁MORAL
- ▁MOVE
- ▁FOLLOWING
- ▁SILVER
- ▁SEARCH
- ▁CHANGED
- ▁HANDSOME
- ▁COULDN
- ▁PASSION
- ▁HU
- ▁SMILED
- ▁STREAM
- ▁CONCERN
- ▁PRESENCE
- STER
- ▁CONTENT
- ▁BOARD
- ▁SHAPE
- ▁DECIDED
- ▁MARRY
- ▁PERFECT
- ▁STEPS
- ▁CLOSED
- ABLY
- DEN
- ▁WEAK
- ▁SUFFICIENT
- ▁SHADOW
- ▁EXPECT
- ▁SPOT
- ▁DUTY
- ▁SPEAKING
- ▁BESIDES
- ▁FIELD
- ▁ROLL
- ▁TRYING
- ▁EAR
- ▁VER
- ▁MARRIAGE
- ▁SHOT
- ▁SLAVE
- ▁MILL
- ▁NATION
- ▁NECK
- ▁ARRIVED
- ▁TALL
- ▁GRACE
- LIN
- ▁FORTY
- ▁BROAD
- ▁SUMMER
- ▁COUSIN
- ▁BEGIN
- ▁CATCH
- ▁FO
- ▁PE
- ▁MEANT
- ▁THIN
- IO
- ▁GROW
- ▁TRO
- ▁NOTICE
- ▁CRY
- ▁FISH
- ▁COM
- ▁DEGREE
- ▁HONOUR
- ▁UNDERSTOOD
- ▁SHOP
- ▁TRUST
- ▁CONDITION
- ▁FARM
- IZ
- ▁SUDDEN
- ▁SUCCESS
- ▁SURPRISE
- ORS
- ▁THOUGHTS
- UND
- ▁ALLOWED
- ITE
- ▁NARROW
- ▁GLASS
- ▁SERIOUS
- ▁STICK
- ▁GAME
- ▁SPENT
- ▁SELL
- ▁GRA
- ▁LOWER
- ▁RAISED
- ▁PIN
- ▁ALLOW
- ▁CALM
- FT
- ▁L
- ▁PU
- ▁FIT
- ACH
- ▁SUFFER
- ▁LEGS
- ▁SUPPORT
- ▁FRANCE
- ▁LATTER
- OV
- ▁TASTE
- ▁GATE
- ▁INSTANT
- ▁MINUTE
- ▁OFFER
- ▁GREATER
- ▁PORT
- ILL
- ▁INDIVIDUAL
- ▁AUNT
- ▁EAST
- ▁ADVANTAGE
- ▁FASHION
- ▁SWORD
- ▁TWELVE
- ▁HONOR
- ▁MOVEMENT
- ▁ISLAND
- ACK
- ▁WOODS
- NCH
- ▁PLEASED
- ▁ENEMY
- ▁RAIN
- ▁VARIOUS
- ▁OBSERVED
- ▁LADIES
- ▁BELIEVED
- ▁CAST
- ▁RISE
- ▁BALL
- ▁MONTHS
- ICE
- ▁MURDER
- ▁CONDUCT
- ▁SOCIAL
- ▁TENDER
- ▁LEARNED
- ▁FRA
- ▁FIRM
- CLOCK
- ▁PREVENT
- ▁RING
- LIE
- ▁GOLDEN
- ▁DECLARED
- ▁BUILDING
- ▁WRITE
- ▁ATTEND
- ▁CARRIAGE
- ▁SITUATION
- IDE
- ▁NOBLE
- ▁HUNG
- ▁RUNN
- ▁YELLOW
- ▁KNOWLEDGE
- ▁YORK
- ▁PUSH
- ▁LEAVING
- ▁POST
- ▁CIRCUMSTANCES
- ▁SEEK
- ▁FINALLY
- ▁MAIN
- ▁LETTERS
- ▁POL
- ▁ADD
- FE
- ▁ANCIENT
- ▁MARCH
- ▁WINE
- ▁STATES
- ▁WALLS
- ▁PRISONER
- ▁ISABEL
- ▁TEMPER
- ▁JUDGE
- ▁FAINT
- ▁POND
- ▁GRASS
- ▁FAM
- OUT
- ▁LAUGH
- ▁GRAY
- IGN
- ▁ESCAPE
- ▁KILL
- ▁PRAY
- ▁COMES
- ▁ABSOLUTE
- ▁BLIND
- ▁WIN
- ▁HOST
- ▁MERELY
- ▁RID
- ▁EVERYBODY
- ▁MATERIAL
- ▁STRETCH
- ▁DUE
- ▁ROW
- ▁TIN
- ▁PROMISE
- ▁LISTEN
- ▁WALKING
- ▁COMPANION
- ▁INDIAN
- ▁BREAK
- ▁BENEATH
- ▁RUIN
- ▁EDGE
- ▁WOR
- ▁FORMER
- ▁WORSE
- ▁EVIDENTLY
- ▁HARM
- ▁CENT
- ▁PIECE
- ▁LOT
- ▁PRESIDENT
- ▁SPECIAL
- ▁LABOR
- ▁HEALTH
- GA
- ▁PLACES
- ▁BEN
- ▁SOMEWHAT
- ▁DROPPED
- ▁AFFECTION
- ▁EXACTLY
- ▁DARKNESS
- ▁FALLEN
- ▁DRESSED
- ▁BILLY
- ▁ACCEPT
- ▁FL
- ▁HOT
- ▁REPEATED
- ▁MEETING
- PA
- ▁PERIOD
- ▁HONEST
- ▁INSTANCE
- ▁FLA
- ▁PASSAGE
- ▁NE
- ▁POSSESSION
- ▁WEAR
- ▁PEACE
- ▁COAT
- ▁HOUSES
- ▁MOUNTAINS
- ▁FIFTEEN
- ▁WELCOME
- ▁YARD
- ▁PROPER
- ▁MUS
- ADE
- ▁RECEIVE
- ▁SKIN
- ▁GROWN
- ▁AFTERWARDS
- ANG
- ▁DA
- ▁DIFFICULT
- ▁PERSONS
- ▁ACCORDING
- ▁FARMER
- ▁SPEECH
- ▁IMPORTANT
- PAR
- ▁PERFECTLY
- ▁MIN
- ▁CONSIDERED
- ▁NU
- ▁DEPEND
- ▁MORROW
- ▁MOUNT
- ▁KISS
- ▁LYING
- ▁SUFFERING
- ▁EXIST
- ERY
- OOK
- BA
- ▁PAINT
- AH
- ▁CAT
- ▁PURE
- ▁WISE
- ▁PRIVATE
- ▁REBECCA
- ▁VESSEL
- ▁CLEAN
- ▁GENTLEMEN
- ▁IRON
- ▁STORE
- ▁FUR
- ▁INDIANS
- ▁LOSE
- ▁BATH
- ▁NEWS
- ▁CHI
- ▁FA
- ▁CHARGE
- ▁PRIEST
- ▁WRITTEN
- ▁FORGOTTEN
- ▁TRAIL
- ▁CLOTHES
- ▁ALIVE
- ▁SUB
- ▁REPLY
- ▁THROW
- ▁AB
- ▁SOLDIERS
- ▁ISN
- ▁COTTAGE
- ▁COURAGE
- ▁CONTAIN
- ▁BUILT
- ▁PAID
- ▁HUNT
- ▁CASTLE
- HOOK
- ▁MERE
- GGED
- ▁NI
- ▁UNC
- ▁PREPARED
- ▁BARE
- ▁SMILING
- ▁SPREAD
- ▁WEATHER
- ▁EDWARD
- ▁GERMAN
- ▁CURIOUS
- ▁SERVANT
- ▁DISCOVERED
- ▁TRAVEL
- EY
- ▁DANCE
- ▁PEN
- BR
- GEN
- ▁BREAKFAST
- ▁CHAMBER
- ▁WILLIAM
- ▁TERROR
- ▁SPITE
- ▁TIRED
- ▁LOCK
- ▁CONSIDERABLE
- TLE
- ▁MANAG
- ▁DRY
- ▁FINISHED
- ▁MILLION
- ▁FRE
- ▁MIS
- ▁PASSING
- ▁DRAW
- ▁BON
- ▁VA
- ▁VEN
- ▁MAKES
- ▁VAIN
- ▁BOTTOM
- ▁DRINK
- ▁FUTURE
- ▁RACHEL
- ▁SORROW
- ▁SIXTEEN
- ▁KNIT
- ▁PROUD
- WI
- ▁TOBY
- ▁NOISE
- ▁SLIGHT
- ▁PROCEED
- ▁FER
- ▁COVER
- ▁DRAWING
- ▁FAVOR
- ▁CATHERINE
- ▁NEWSPAPER
- ▁NOBODY
- ▁ROOF
- ▁WEALTH
- ▁PROVE
- ▁DRAWN
- TTED
- OKE
- ▁DETERMINED
- ▁DOG
- ▁REMEMBERED
- ▁OPENING
- ▁FLOWERS
- ▁GENTLE
- ▁KNIGHT
- ▁RECOVER
- ▁DESERT
- ▁MOTION
- ▁NICE
- ▁INTENTION
- ▁GROWING
- ▁CLOUD
- ▁MONTH
- HOOD
- ▁POT
- UDE
- ▁PLANT
- ▁MAD
- ▁ENJOY
- ▁FAT
- ▁COR
- ▁KNOWING
- ▁IDEAS
- IZED
- ▁CHEEK
- ▁EUROPE
- ▁KNOCK
- ▁ALARM
- ▁TONGUE
- ▁SPACE
- ▁PATSY
- ▁MISTRESS
- ▁HENRY
- ▁JERRY
- ▁LIKED
- ▁PLAYED
- ▁BOOKS
- ▁MODER
- ▁CORN
- ▁ELIZABETH
- ▁CLUB
- ▁BRAIN
- ▁TROOP
- ▁COOK
- ▁DU
- ▁FUN
- DAY
- ▁QUA
- ▁FLOW
- ▁DARE
- ▁DELIGHT
- ▁WOUND
- ▁DESCEND
- ▁EVERYWHERE
- ▁FRIGHTENED
- ▁GEORGE
- ▁PECULIAR
- ▁MACHINE
- ▁PATIENT
- ▁MEADOW
- ▁PEASANT
- ▁BURST
- ▁ORDINAR
- ▁SONG
- ▁BRAVE
- ▁EXISTENCE
- ▁LUCY
- ▁J
- ▁CAREFULLY
- ▁PRESENTLY
- ▁GEN
- ▁COW
- LLY
- ▁PROMISED
- UOUS
- ▁LIFTED
- ▁MEANING
- ALL
- ▁FAIL
- NER
- ▁REGULAR
- ▁VIRTUE
- ▁STUDY
- ▁PROTECT
- ▁FOND
- ▁FANCY
- ▁STOCK
- ▁KEY
- ▁JUSTICE
- ▁PACK
- LET
- ▁AFFAIRS
- ▁DIFFICULTY
- ▁WORE
- ▁COST
- ▁HEAT
- ▁SHOULDER
- ▁OFFERED
- ▁MISTAKE
- ▁DOLLARS
- ▁LOOKS
- QUA
- ▁BREAST
- ▁PRINCIPLE
- ▁CHARLES
- ▁TEETH
- ▁OCCUPIED
- ▁DROP
- ▁PAPA
- ▁SHEEP
- ▁KNOWS
- ▁DECK
- ▁BORE
- ▁EXC
- ▁SURPRISED
- ▁STATION
- ▁PL
- ▁PR
- ▁OURSELVES
- ▁SYMPATHY
- ▁RUTH
- ▁EXCITED
- ▁CONTROL
- ▁ANGRY
- ▁IMAGINATION
- ▁WITNESS
- ▁HOLDING
- THER
- DA
- ▁TRADE
- ▁CREATURE
- ▁SISTERS
- ▁JOIN
- LAS
- ▁ALTOGETHER
- ▁CIVIL
- ▁EMPTY
- ▁LEAP
- ▁HURT
- ▁BOLD
- ▁TASK
- ▁POLICE
- ▁DRAGON
- ▁MAID
- ▁CLAIM
- ▁SHAME
- ▁PHYSICAL
- ▁CONC
- ▁SEIZED
- ▁OB
- ▁LIVES
- ▁HEIGHT
- ▁GI
- ▁PAL
- ▁CHARMING
- ▁FEELINGS
- ▁SERVANTS
- ▁DELIVER
- ▁FRUIT
- ▁SATISFIED
- ▁STRUGGLE
- ▁WROTE
- ▁CONCEAL
- ▁MOVING
- ▁FLASH
- ▁OPPOSITE
- ▁HURRY
- ▁ROUGH
- ▁PRICE
- ▁AWFUL
- ▁SAND
- ▁SLIPP
- ▁SHOWN
- ▁SPRA
- ▁AGREED
- ▁FIXED
- ▁PERCEIVED
- ▁UPPER
- ▁FINGER
- ▁FINGERS
- ▁EAGER
- LF
- ▁EARS
- LIGHT
- ▁IMAGINE
- ▁LIKELY
- ▁COAST
- ▁UNITED
- ▁VAN
- ▁EXPLAINED
- ▁TELLING
- ▁DANGEROUS
- ▁DICK
- ▁COOL
- ▁CAL
- ▁INSIST
- BI
- ▁SECURE
- ▁HILLS
- ▁SAN
- ▁CHEER
- ▁FILL
- ▁BUY
- ZA
- HI
- ▁CLOTH
- ▁POSSESSED
- ▁ADVANCE
- ▁METHOD
- ATIVE
- ▁GREATLY
- ▁SMOKE
- ▁HIGHER
- ▁COMPANIONS
- ▁ANIMALS
- ▁GALL
- ▁QUIETLY
- ▁TRAVELL
- ▁RESOLVED
- ▁FLEW
- ▁CARLYLE
- ▁MEMORY
- ▁RESIST
- ▁GRAHAM
- ▁LAUGHING
- ▁FAITH
- ▁BIRD
- CRI
- ▁LEAVES
- ▁AMERICA
- ▁DEMAND
- BOARD
- ▁AWAKE
- ▁CURIOSITY
- ▁LANGUAGE
- ▁VIOLENT
- ▁AWARE
- ▁DOUBLE
- ▁LOOSE
- LIKE
- ▁ADAM
- ▁RISING
- ▁HOTEL
- ▁BAND
- ▁ENGAGED
- ▁HEADS
- ▁LOG
- ▁FORMED
- ▁WINDOWS
- ▁PREFER
- RUS
- ▁THROWN
- ▁ARCH
- ▁PAUSE
- ▁SERVE
- KIN
- ▁FALLING
- ▁VO
- ▁WHISPERED
- ▁POWERFUL
- ▁ER
- ▁DEPART
- ▁CRUEL
- ▁EXAMPLE
- ▁SMOOTH
- ▁INTRODUC
- ▁RELIGION
- ▁SEVENTEEN
- ▁ABSENCE
- ▁PRINT
- ▁SHINING
- ▁ICE
- ▁POET
- ▁DREADFUL
- ▁REQUIRED
- ▁ORIGINAL
- ▁POINTED
- ▁INSIDE
- ▁BROTHERS
- ▁PRODUCED
- ▁SPOKEN
- ▁CREATURES
- ▁FLY
- ▁TOM
- ▁PURSU
- ▁SYSTEM
- ▁EXCELLENT
- ▁EXCITEMENT
- ▁MIDDLE
- ▁FALSE
- ▁REGRET
- ▁RAY
- ▁PHYSICIAN
- ▁COP
- ▁VALUE
- ▁TOUCHED
- ▁FLAT
- ▁OAK
- ▁SUM
- ▁LOSS
- ▁PAPERS
- ▁STEPP
- ▁REVER
- ▁SHADE
- SOME
- ▁LISTENED
- ▁N
- ▁DISCOVER
- ▁BITTER
- TERN
- ▁HOLE
- ▁ADVANCED
- ▁PICK
- ARTAGNAN
- ▁CORPORAL
- ▁ASLEEP
- ▁TEMPLE
- ▁INDICAT
- IUM
- ▁FARTHER
- ▁EXCUSE
- ▁FLU
- ▁NOSE
- ▁SIXTY
- ▁SUPPOSED
- ▁PROVED
- ▁RATE
- ▁SHOULDERS
- ▁AFFAIR
- ▁FIELDS
- ▁REMARKED
- AVE
- ▁WEEKS
- ▁ESTABLISH
- ▁PARIS
- ▁ADMIT
- ▁NEIGHBOR
- ▁ATTRACT
- ▁CUSTOM
- ▁DISTINGUISH
- ▁SURFACE
- ▁COUPLE
- ▁DEVIL
- ▁LIMIT
- ▁ROYAL
- ▁FOOL
- ▁RARE
- ▁PRIDE
- ▁PROFESSOR
- ▁SAKE
- ▁DALE
- ▁VAST
- ▁REFUSED
- ▁FAILED
- ▁BAG
- ▁ROB
- ▁WASH
- ▁FAIRY
- ▁FREQUENT
- ▁MARILLA
- ▁PROGRESS
- ▁RELIEF
- ▁DROVE
- ▁DOZEN
- ▁AHEAD
- ▁ADVENTURE
- ▁GRANT
- ▁PRIM
- ▁MENTAL
- ▁PAIR
- ▁IMPRESSION
- ▁WOUNDED
- ▁FULLY
- ▁DISAPPEARED
- ▁MILE
- ▁DRIVE
- ▁MUD
- ▁SIZE
- ▁ANIMAL
- ZE
- ▁GRE
- ▁REPRESENT
- ▁ACQUAINTANCE
- ▁INSTRUMENT
- ▁SPLENDID
- ▁UNKNOWN
- ▁CORONEL
- ▁EMPEROR
- ▁EARNEST
- ▁EXTEND
- ▁BRIEF
- ▁RENDER
- ▁PARENTS
- ▁GENTLY
- ▁CALLING
- ▁TRIBE
- ▁CHRISTIAN
- ▁INTERESTING
- ▁LAMP
- ▁JIMM
- ▁DIV
- ▁LOVER
- UCH
- ▁HID
- ▁NEEDED
- ▁ORDERED
- ▁MEAL
- ▁SLOW
- ▁DAM
- ▁CLOUDS
- ▁DAN
- ▁GAR
- ▁EXPLAIN
- ▁QUI
- ▁CLIMB
- ▁HURRIED
- ▁MURMUR
- ▁SWIFT
- ▁ARTHUR
- ▁JEFF
- ▁KINGDOM
- ▁MESSAGE
- ▁PROTEST
- ▁ORGAN
- ▁RISK
- ▁FORGIVE
- ▁OCCURRED
- ▁PEARL
- ▁ODD
- ▁INFORMATION
- ▁BUSY
- ▁TRI
- ▁LACK
- ▁BAY
- ▁FLEET
- ▁CROWN
- ▁WAITED
- ▁BIRDS
- ▁PITY
- ▁SUCCEEDED
- ▁INFORMED
- ▁WISHES
- ▁DIRECTLY
- ▁CABIN
- ▁AUGUST
- ▁COUNTENANCE
- ▁HORROR
- ▁PHILIP
- ▁POPULAR
- ▁PREVIOUS
- ▁CONTRARY
- ▁ARTICLE
- ▁DIFFERENCE
- ▁HIDDEN
- ▁HUGE
- ▁AUTHORITY
- ▁POUND
- ▁JUMP
- ▁SPI
- ▁SHAKE
- ▁EVENTS
- ▁FRO
- ▁LEAN
- ▁CRO
- ▁TRIM
- ▁SHARE
- ▁FISHER
- ▁SETTLED
- ▁QUESTIONS
- ▁SI
- ▁VAL
- ▁APPROACHED
- ▁SUGGESTED
- ▁CONTINU
- ▁PERFORM
- ▁ACKNOWLEDG
- ▁CLIFF
- ▁COLONEL
- ▁GHOST
- ▁MAJESTY
- ▁EMOTION
- ▁SUPPER
- ▁DISTANT
- ▁INTERESTED
- ▁JACK
- ▁HUM
- ▁TRAMP
- ▁BRI
- ▁POUR
- ▁SHIPS
- ▁CHAIN
- ▁DY
- ▁RANK
- ▁MATTERS
- ▁LOVELY
- AW
- ▁PAT
- ▁WORKING
- ▁CONSEIL
- ▁EVIDENCE
- ▁MERCHANT
- ▁SOLEMN
- ▁CONSTANT
- ▁MINISTER
- ▁OFFICIAL
- ▁SENTIMENT
- ▁CENTURY
- ▁DELAY
- ▁JAMES
- ▁MATCH
- ▁FOREIGN
- ▁AROSE
- ▁BEAST
- ▁BAB
- ▁WIT
- ▁REMARKABLE
- ▁THOR
- ▁COMPAR
- ▁MAL
- ▁NEARER
- ▁FOURTH
- ▁GREY
- ▁MENTION
- ▁RUBB
- ▁CHARM
- ▁BARON
- ▁DESIRED
- SCAR
- ▁HOPED
- ▁TEACHER
- ▁MON
- ITCH
- BEL
- ▁PARTS
- ▁EIGHTY
- LAC
- GGING
- ▁REFLECT
- ▁COLLECT
- ▁BULL
- ▁CONSCIOUS
- ▁MOMENTS
- ▁DISTURB
- ▁COLLEGE
- ▁EGGS
- ▁STUPID
- ▁YESTERDAY
- ▁EXAMINE
- ▁FAULT
- ▁DEPTH
- ▁ROOT
- ▁MOUSE
- ▁SOUGHT
- ▁TURTLE
- ▁NATIVE
- ▁CRACK
- ▁SOLD
- ▁INVIT
- ▁PICKED
- ▁CEASED
- ▁HEARING
- ▁MIDS
- ▁PLAYING
- ▁STAGE
- ▁UNTO
- ▁GAIN
- ▁MIST
- ▁ORDERS
- ▁KNEES
- ▁TALE
- ▁DISTINCT
- ▁BENT
- ▁DESPAIR
- ▁TRIUMPH
- ▁SQUARE
- ▁THROAT
- ▁BOUGHT
- ▁PERMIT
- ▁SPEND
- ▁TRIP
- ▁THREATEN
- ▁ROME
- INESS
- ▁EXPOS
- GON
- ▁WRITING
- ▁INCREASED
- ▁PORTION
- ▁TENT
- IUS
- ▁YO
- ▁INTENDED
- ▁NAMED
- RATION
- ▁NOTIC
- ▁PIPE
- ▁WILLING
- ▁INSTANTLY
- ▁SERVED
- ▁BAL
- ▁POSSESS
- ▁CRE
- ▁ADMIRATION
- ▁LIBERTY
- ▁OPPORTUNITY
- ▁SELDOM
- ▁BIRTH
- ▁GLOW
- ▁INCLUD
- ▁REQUEST
- ▁TYPE
- ▁SLEPT
- ▁CRIME
- ▁MOTIVE
- ▁ELSIE
- ▁BEGUN
- ▁CONSENT
- ▁ADMITTED
- ▁AVOID
- ▁ADDRESS
- ▁HATE
- ▁DEMANDED
- ▁APPARENTLY
- ▁SUGGESTION
- ▁CONSIDERATION
- ▁BLESS
- ▁PROCEEDED
- NCY
- ▁PRISON
- ▁CONT
- ▁SHOUTED
- ▁FACES
- ▁SPIRITS
- ▁DEVELOP
- ▁ACCIDENT
- ▁ADVICE
- ▁INNOCENT
- ▁INSTINCT
- ▁UNCONSCIOUS
- ▁MYSTERIOUS
- ▁PRETEND
- ▁PEEP
- ▁ANYONE
- ▁DUKE
- ▁PLUM
- VILLE
- ▁SEVERE
- ▁ALAS
- ▁DELIGHTED
- ▁ISSUE
- ▁ASKING
- ▁CROW
- ▁ACCEPTED
- ▁RIDE
- ▁DOORS
- ▁TAR
- ▁PREPAR
- ▁SUGGEST
- WOOD
- ▁CITIZEN
- ▁ENTRANCE
- ▁LINCOLN
- ▁POLITICAL
- ▁PRACTICAL
- ▁STIFF
- ▁WIDOW
- ▁CAPITAL
- ▁CLEVER
- ▁MAMMA
- ▁CREDIT
- ▁OBEY
- ▁STRING
- ▁DAILY
- ▁ARGUMENT
- ▁HEAP
- ▁APARTMENT
- ▁FLIGHT
- ▁ELDER
- ▁PUR
- ▁PAGE
- ▁DUST
- ▁GAZE
- ▁NATIONAL
- ▁BABY
- DDING
- ISTS
- ▁TEACH
- ▁STREETS
- CAL
- ▁GE
- AFF
- ▁GOES
- ▁POSSIBL
- UNG
- ▁LINES
- GUE
- ▁VOTE
- ▁HUNTING
- ▁QUO
- ▁RESEMBL
- ▁BASKET
- ▁CIRCLE
- ▁CONSEQUENCE
- ▁KITCHEN
- ▁TREASURE
- ▁NEVERTHELESS
- ▁FANCI
- ▁ASSEMBL
- ▁GRIEF
- ▁VEIL
- ▁SEASON
- ▁INVENT
- ▁VIRGINIA
- ▁HUT
- ▁GUEST
- ▁ROAR
- ▁BEHOLD
- ▁VICTORY
- ▁CAPABLE
- ▁DULL
- ▁SHOE
- ▁FLOAT
- ▁MERRY
- ▁IMMEDIATE
- ETH
- ▁ELEANOR
- ▁EXPLANATION
- ▁PARLIAMENT
- ▁PRINCIPAL
- ▁PROPORTION
- ▁RESOLUTION
- ▁UNUSUAL
- ▁BLUFF
- ▁NINETEEN
- ▁SENSATION
- ▁VISIBLE
- ▁INCOME
- ▁FATE
- ▁SUPER
- ▁LAUGHTER
- ▁EASE
- ▁LOAD
- ▁JEW
- ▁ZE
- ▁FEVER
- ▁WEDDING
- ▁JOINED
- ▁TRACE
- ▁LEADER
- ▁CLEARLY
- ▁FLOWER
- ▁TERMS
- ▁EMPLOYED
- OCK
- ▁PARTICULARLY
- ▁MEMBERS
- ▁CONFESS
- ▁GRO
- ▁ADDRESSED
- ▁CHRIST
- ▁ACCOMPANI
- ▁AFFORD
- ▁AMOUNT
- ▁BRILLIANT
- ▁COMMUNICAT
- ▁FIERCE
- ▁RECORD
- ▁SACRIFICE
- ▁TEMPT
- ▁CORDIAL
- ▁COLOUR
- ▁PROOF
- ▁ESTATE
- ▁PARDON
- ▁ADVIS
- ▁ATTITUDE
- ▁IMPORTANCE
- ▁BOOT
- ▁SHOCK
- ▁FIR
- ▁PLENT
- ▁HIT
- ▁MEMBER
- ▁SUR
- ▁SEATED
- ▁MAG
- AVING
- ▁FAVOUR
- ▁REMARK
- ▁DIM
- ▁FAITHFUL
- ▁SAVED
- CHI
- ▁SIN
- THE
- ▁CONFIDENCE
- ▁EXTRAORDINARY
- ▁FORTUNATE
- ▁MISFORTUNE
- ▁PATIENCE
- ▁RELIGIOUS
- ▁SATISFACTION
- ▁POSITIVE
- ▁SIMILAR
- ▁EXCHANG
- ▁RETREAT
- ▁FLESH
- ▁ADMIRE
- ▁SPIRITUAL
- ▁DAWN
- ▁BURIED
- ▁URGE
- ▁SUNDAY
- ▁FOX
- ▁EMMA
- ▁NURSE
- ▁SNAPP
- ▁PARK
- ▁OBTAIN
- ▁RECOGNIZED
- ▁SPEED
- ▁MAGIC
- ▁LAWS
- ▁REMOVED
- ▁HAM
- ▁PRESERV
- ▁AID
- HOUSE
- ▁MENTIONED
- ▁CONSCIENCE
- ▁CONTEMPT
- ▁DETAIL
- ▁IMMENSE
- ▁NERVOUS
- ▁PRISCILLA
- ▁UNFORTUNATE
- ▁UNHAPPY
- ▁COMPLAIN
- ▁TWICE
- ▁WHISTL
- ▁SNAKE
- ▁WASHINGTON
- ▁PIRATE
- ▁WICKED
- ▁BODIES
- ▁DESIGN
- ▁JASON
- ▁VAGUE
- ▁CONSIST
- ▁GIFT
- ▁ANGEL
- ▁RODE
- ▁FOLD
- ▁BRIDE
- ▁ANGER
- ▁BASE
- ITUDE
- ▁CONCLUDED
- ▁ALTER
- ▁FRI
- ▁PANT
- ▁BID
- ▁HIGHEST
- ▁SAILOR
- MPLE
- ▁OBSERV
- ▁CHEERFUL
- IFICATION
- RID
- ▁DESCRIBED
- ▁BIN
- ▁JEWEL
- ▁ARTIST
- ▁PEER
- ▁NORA
- ▁SKI
- ▁DIAMOND
- ▁ENCOURAGE
- ▁PRIVILEGE
- ▁PROJECT
- ▁ANYBODY
- ▁ENCOUNTER
- ▁HOLLOW
- ▁YIELD
- ▁BOBBY
- ▁SAVAGE
- ▁SOMEBODY
- ▁OTHERWISE
- ▁PRAISE
- ▁PROBLEM
- ▁DISTRESS
- ▁UGLY
- ▁WARRIOR
- ▁MOURN
- ▁RELIEV
- ▁DESK
- ▁FOOLISH
- ▁STARTLED
- ▁SKILL
- SHONE
- ▁LONE
- ▁OBSERVATION
- ▁DENI
- ▁NEST
- ▁SOLDIER
- ▁RELATION
- ▁TRULY
- ▁VISITOR
- ▁OFFICERS
- ERSON
- ▁YA
- ▁EVIDENT
- ▁DREAMS
- ▁KEEPING
- ▁PLAINLY
- ▁DRUNK
- ▁EMBRAC
- ▁INTELLIGENCE
- ▁LIEUTENANT
- ▁PERSUADE
- ▁SURROUNDING
- ▁UNIVERSAL
- ▁GLEAM
- ▁SUPERIOR
- ▁WHEEL
- ▁JEALOUS
- ▁QUEER
- ▁PIERRE
- ▁MILK
- ▁RAIL
- ▁FLUSH
- ▁STAIRS
- ▁JESUS
- ▁HORN
- ▁REGION
- ▁SAFETY
- ▁KA
- ▁GUIDE
- ▁CAKE
- ▁CUP
- ▁INQUIRED
- ▁DEFI
- ▁LESSON
- ▁WRETCHED
- ▁PACE
- ▁TEST
- ▁READING
- ▁ENTIRE
- ▁NET
- ▁DOGS
- ▁COMMANDER
- ▁PRODUCE
- ▁GAINED
- ▁ARRIVAL
- ▁FAMILIAR
- ▁MEANWHILE
- ▁SUSPICION
- ▁CHOICE
- ▁IMPULSE
- ▁THRUST
- ▁PROCESS
- ▁SUMMON
- ▁SHEPHERD
- ▁HASTILY
- ▁GRASP
- ▁COUNTESS
- ▁STYLE
- ▁DWELL
- ▁MERIT
- ▁PITCH
- ▁HUNGRY
- ▁SPORT
- ▁LOUISE
- ▁STERN
- ▁PROVIDED
- ▁ASSUME
- ▁EARLIE
- ▁RAGE
- ▁U
- ▁RAPIDLY
- PORT
- ▁SUCCESSFUL
- ▁FLED
- ▁AGREE
- ▁CONDITIONS
- ▁RELATIONS
- ▁DREAD
- ▁NATURALLY
- ▁EARL
- ▁GAY
- ▁HYPNOTI
- ▁PUTT
- ▁GAZ
- ▁JIM
- ▁PAUS
- ▁PROPOS
- ▁ADMINISTRATION
- ▁ELEVEN
- ▁HOSPITAL
- ▁MAGISTRATE
- ▁STRIKE
- ▁DIGNITY
- ▁GLORY
- ▁BOTTLE
- ▁THRONE
- ▁RECKON
- ▁COSETTE
- ▁MOREOVER
- ▁APPLI
- ▁HIND
- ▁PRODUCT
- ▁POOL
- ▁TRIAL
- HAN
- ▁ERIC
- ▁CUB
- ▁PIECES
- ▁EXCEPTION
- ▁ENJOYED
- ▁DARED
- ▁TRU
- ▁CLOSELY
- ▁RAPID
- ▁AFFECTED
- ▁REQUIRE
- ▁SOFTLY
- ▁BROW
- UCK
- ▁MARKED
- ▁SEVENT
- ▁ELECT
- ▁FORGOT
- ▁CORRECT
- ▁FRANCS
- ▁MARGUERITE
- ▁SCIENCE
- ▁UNEXPECTED
- ▁FOUGHT
- ▁MILITA
- ▁THUNDER
- ▁VOYAGE
- ▁GANEM
- ▁FREEDOM
- ▁NODDED
- ▁CAPTURE
- ▁MORTAL
- ▁OWNER
- ▁POLITE
- ▁VISION
- ▁EDUCATION
- ▁GOVERNOR
- ▁RAV
- ▁REWARD
- ▁HASTE
- ▁REPEAT
- ▁DETERMIN
- ▁PITI
- ▁KNEE
- LINE
- ▁DEVOTED
- ▁INTERRUPTED
- ▁FOLKS
- ▁EXTREME
- ▁APPROACH
- ▁CONTINUE
- ▁BEARING
- ▁CHAP
- ▁ACQUAINTED
- ▁GLIMPSE
- ▁GRADUALLY
- ▁SUNSHINE
- ▁PRACTICE
- ▁SUPPLI
- ▁DAVID
- ▁DRIFT
- ▁SHOWING
- ▁LEVEL
- ▁PROMPT
- ▁QUARREL
- ▁REPRESENTATIVE
- ▁PLUNG
- ▁GIANT
- FALL
- ▁STOUT
- CHA
- WEPT
- ▁GLANC
- ▁SALT
- ▁CHOSEN
- ▁BUCK
- ▁REALIZED
- ▁REALITY
- ▁TUR
- ▁DRIVEN
- ▁CARD
- ▁PRAYER
- ▁TERM
- AID
- ▁HOLY
- ▁ENDURE
- ▁RANGE
- ▁HANG
- ▁SAM
- LAN
- ▁CAVE
- INA
- ▁GRI
- ▁SIGH
- ▁NEIGHBOUR
- ▁COUNCIL
- ▁EXERCISE
- ▁NAUTILUS
- ▁SOMEWHERE
- ▁SYLVIA
- ▁THOROUGH
- ▁VICTIM
- ▁BRIDGE
- ▁COMPELLED
- ▁INCLINED
- ▁OVERCOME
- ▁RESERVE
- ▁ARREST
- ▁PRECIOUS
- ▁DUTCH
- ▁OCEAN
- ▁ACQUIR
- ▁RECALL
- ▁DESTIN
- ▁ATTACH
- ▁SLIM
- ▁WEEP
- ▁CONSCIOUSNESS
- ▁TIGHT
- ▁WAKE
- ▁COMFORTABLE
- ▁ACTIVE
- ▁WINGS
- ▁GRIN
- ▁AFFECT
- ▁WHIT
- ▁IDEAL
- ▁EASTER
- ▁APPROACHING
- ▁CREATED
- ▁PLANS
- ▁INCREASE
- ▁FLYING
- ▁SHOUT
- OES
- MISSION
- ▁ARMED
- ABILITY
- ▁BLUSH
- ▁CONNECTION
- ▁MATTHEW
- ▁MEDICINE
- ▁REMIND
- ▁EXHIBIT
- ▁BLOCK
- ▁DESERVE
- ▁LISTENING
- ▁TITLE
- ▁FLOUR
- ▁FLAME
- ▁AGENT
- ▁USEFUL
- ▁BRIG
- ▁BOIL
- ▁ASSURED
- ▁REFLECTION
- ▁PINE
- ▁WAG
- ▁YOUNGER
- ▁BEARD
- ▁KINDNESS
- CTUALLY
- ▁ACTUAL
- ▁WEIGHT
- ▁LILY
- ▁IMPRESS
- ▁DESCRIBE
- ▁BEHELD
- ▁COMMUNITY
- ▁DESPERATE
- ▁DISPLAY
- ▁ENEMIES
- ▁MELANCHOLY
- ▁MIRROR
- ▁RECOMMEND
- ▁SPANISH
- ▁BLAME
- ▁VOLUME
- ▁SHOOT
- ▁COMBIN
- ▁SHAKING
- ▁SOUTHERN
- ▁MYSTERY
- ▁EVERYONE
- ▁COMMISSION
- ▁COMPOSED
- ▁UDO
- ▁IMAGE
- ▁DECEIV
- ▁FAILURE
- ▁PATTY
- ▁ALICE
- ▁FRAME
- ▁MODEST
- ▁MAGNIFICENT
- ▁BRANCHES
- ▁REIGN
- ▁RAG
- ▁PARISH
- ▁KATE
- ▁AMID
- ▁SLEEPING
- ▁ANNOUNCED
- ▁EAGERLY
- ▁WIRE
- ▁LAP
- ▁ARAB
- ▁EATING
- ▁RUM
- ▁CAREFUL
- ▁DISCUSS
- WORTH
- ▁DISTRICT
- ▁FOREHEAD
- ▁FRANCIS
- ▁INCIDENT
- ▁APPEAL
- ▁EMBARRASS
- ▁MAINTAIN
- ▁PRONOUNC
- ▁FURNISH
- ▁STRAIN
- ▁ELEMENT
- ▁SILK
- ▁FEAST
- ▁RECENT
- ▁DANCING
- ▁LODGE
- ▁ASHAMED
- ▁TRICK
- ▁BOBO
- ▁STUFF
- ▁ET
- ▁ASSERT
- ▁SANK
- ▁TREATMENT
- ECI
- ▁SWIM
- ▁BECOMING
- ▁SINGING
- ▁PLATE
- ▁SCATTERED
- ▁EXTREMELY
- ▁GRIM
- ▁SANG
- ▁FIGHTING
- ▁FACTOR
- ▁PAINFUL
- ▁HIDE
- ▁FUNN
- ▁AFTERWARD
- ▁FROG
- ▁VENTURE
- ▁DISAPPOINT
- ▁COMRADE
- ▁MONSIEUR
- ▁OBVIOUS
- ▁PASSENGER
- ▁PROFOUND
- ▁PUBLISH
- ▁ACCUSTOM
- ▁BLOOM
- ▁SMITH
- ▁RELATIVE
- ▁ACCUSE
- ▁MANIFEST
- ▁SOLID
- ▁MONSTER
- ▁MARIUS
- ▁CANDLE
- ▁PROCUR
- ▁INTERFERE
- ▁HOUSEHOLD
- ▁DEVELOPMENT
- ▁AGREEABLE
- ▁HALT
- ▁NECESSITY
- FOLD
- ▁CITIES
- ▁REGI
- ▁GLOOMY
- BBL
- ▁SEPARATED
- ▁CHEST
- ▁STRIP
- ▁SPAR
- ▁DUN
- ▁SETTLE
- ▁STARED
- ▁HANGING
- ▁FEATURES
- ▁PILE
- ▁ORIGIN
- ARIES
- ▁LION
- ▁ALI
- ▁ASTONISHMENT
- ▁COMPLIMENT
- ▁DELICATE
- ▁COUNSEL
- ▁FIFTH
- ▁SUPPRESS
- ▁BURDEN
- ▁COMPLEX
- ▁ADDITION
- ▁CRUSH
- ▁TWIST
- ▁PIANO
- ▁BRUSH
- ▁CHECK
- ▁ANNIE
- ▁SHELTER
- ▁IMPROV
- ▁WESTERN
- ▁LOCAL
- ▁APPLE
- ▁GREET
- ▁MASK
- ▁RUSSIAN
- ▁TOWER
- ▁CREW
- ▁TIP
- ▁WANDERING
- ▁READER
- ▁WANDERED
- ▁DESTROY
- ▁OBSERVE
- MORE
- ▁ESCAPED
- ▁PET
- ▁BUILD
- ▁REAR
- ▁DESTROYED
- HIN
- ▁OWE
- ▁RANG
- ▁TEAR
- ▁NED
- ▁OFFICER
- ▁TRAP
- ▁OCCUR
- ▁APPOINTED
- ▁ATMOSPHERE
- ▁CHOOSE
- ▁CONCLUSION
- ▁CULTIVAT
- ▁DESCRIPTION
- ▁ENORMOUS
- ▁EXHAUSTED
- ▁LANDSCAPE
- ▁NATASHA
- ▁PROSPECT
- ▁REFRESH
- ▁SPECIES
- ▁SURROUNDED
- ▁WEAPON
- ▁BLANK
- ▁DEFEND
- ▁EDITH
- ▁HORRIBL
- ▁BETRAY
- ▁FERKO
- ▁LABOUR
- ▁NEGRO
- ▁RESUMED
- ▁LEAF
- ▁MUSKET
- ▁INTENSE
- ▁MERCY
- ▁ADOPT
- ▁SCORE
- ▁DASH
- ▁LAWYER
- ▁SLOPE
- ▁CHUCK
- ▁ASSISTANCE
- ▁BROOK
- ▁BREAKING
- ▁ASSIST
- ▁GROAN
- ▁HELEN
- ▁BEHAV
- ▁MAIDEN
- ▁CRIS
- ▁SHOUTING
- ▁NAY
- ▁PIG
- ▁ACCORDINGLY
- ETTE
- ▁DESIR
- ▁RUB
- ▁GRU
- ▁PIT
- ▁HEAVI
- ▁OBTAINED
- ▁SPARE
- ▁BRANCH
- ▁COUNTER
- ▁APART
- ▁AMBITION
- ▁ASTONISHED
- ▁CORRESPOND
- ▁DRIVING
- ▁ENERGY
- ▁HISTORIAN
- ▁REVOLUTION
- ▁SWEEP
- ▁TREMBLING
- ▁CRAFT
- ▁FAMILIES
- ▁LITERATURE
- SBURG
- ▁FEMALE
- ▁TILNEY
- ▁GENEROUS
- ▁SUBMIT
- ▁INTELLECTUAL
- ▁ORCHARD
- ▁STORIES
- ▁DIANA
- ▁VEIN
- ▁TRIFL
- ▁TWIN
- ▁WORSHIP
- ▁MARBLE
- ▁GALLANT
- ▁SENSIBLE
- ▁NEAT
- ▁BROWNIE
- ▁JUNE
- ▁SHAW
- ▁WORST
- ▁USELESS
- ▁FISHING
- ▁CRYING
- ▁MAYBE
- ▁VARI
- ▁PRESERVE
- ▁VOL
- ▁EMPLOY
- ▁INTERRUPT
- ▁SLIGHTLY
- ▁ACCOMPLISHED
- NEY
- ▁STEAM
- ▁BALANC
- ▁LEANING
- ▁SIGHED
- ▁REFUSE
- ▁IMAGINED
- ▁DATE
- GROUND
- ▁ENTERTAIN
- ▁PERCEIVE
- ▁ABROAD
- ▁CHEESE
- ▁DESTRUCTION
- ▁ESSENTIAL
- ▁EXPEDITION
- ▁GRANDFATHER
- ▁INFINITE
- ▁LIBRARY
- ▁MULTITUDE
- ▁NEGLECT
- ▁SWALLOW
- ▁VILLEFORT
- ▁BELOVED
- ▁COMMITTEE
- ▁CONFIDENT
- ▁PURPLE
- ▁PURCHAS
- ▁SCRAP
- ▁SPOIL
- ▁LIKEWISE
- ▁EXTRA
- ▁STRAW
- ▁SALUT
- ▁SOURCE
- ▁HASTENED
- ▁RESENT
- ▁FLOCK
- ▁LOFT
- ▁FLO
- ▁CLO
- ▁CONVINCED
- ▁GOODNESS
- ▁HYPNOTIZ
- ▁SETTING
- ▁HAIL
- ▁PHI
- ▁GROVE
- ▁DISCOVERY
- ▁DAMP
- ▁WHISPER
- ▁LIFT
- ▁HOP
- ▁SUSPECTED
- ▁SCR
- OLI
- ▁FAC
- ▁BUSH
- ▁FOREVER
- ▁BARRICADE
- ▁CONSTITUTION
- ▁ENDEAVOR
- ▁ENTHUSIASM
- ▁EXECUTION
- ▁HYACINTH
- ▁PERCEVAL
- ▁PSYCHE
- ▁REPROACH
- ▁THIRTEEN
- ▁ABSORB
- ▁GRATITUDE
- ▁MERCER
- ▁REPUTATION
- ▁SCREAM
- ▁PUPIL
- ▁RETIRED
- ▁STEEP
- ▁SUMMIT
- ▁MISERABLE
- ▁STRICT
- ▁MINGLED
- ▁DEFEAT
- ▁REVEAL
- ▁LOVING
- ▁GOOSE
- ▁ECHO
- ▁AWAIT
- ▁MOOD
- ▁CRAWLEY
- ▁CELL
- ▁ENGAGEMENT
- ▁PRECED
- ▁SOMEONE
- ▁ARRANGEMENT
- ▁PICKET
- ▁GASP
- ▁HUMOR
- ▁INVITATION
- ▁JOB
- WITHSTAND
- ▁LAMENT
- ▁CLASSES
- ▁HUNGER
- ▁DISPOSED
- ▁STEAMER
- ▁FEARFUL
- ▁GER
- ▁FINAL
- ▁FLAG
- ▁JULY
- ▁DIG
- WORK
- ▁OPPOS
- ▁ANXIETY
- ▁AUDIENCE
- ▁BACHELOR
- ▁COLUMN
- ▁HANDKERCHIEF
- ▁IMPATIENT
- ▁JUDGMENT
- ▁KNIFE
- ▁SOVEREIGN
- ▁STRIKING
- ▁THOMPSON
- ▁EMPIRE
- ▁FULFIL
- ▁CONSULT
- ▁JENNY
- ▁THENARDIER
- ▁POYSER
- ▁FOURTEEN
- ▁JAPANESE
- ▁INDULG
- ▁MARTIAN
- ▁COUNTRIES
- ▁FETCH
- ▁CRITIC
- ▁ROBBER
- ▁CROOK
- ▁DEPARTURE
- ▁MABEL
- ▁PREACH
- ESCENT
- ▁WHIP
- ▁NAIL
- ▁DELIGHTFUL
- ▁DISCUSSION
- ▁SENTENCE
- ▁LANE
- ▁ENGINEER
- ▁ARRANGED
- MMY
- ▁LEST
- ▁RENT
- MMED
- ▁LIST
- ▁ROBE
- ▁MISSION
- ▁GRACEFUL
- ▁LIGHTN
- STONE
- COURT
- ▁CONCEPTION
- ▁CONTRACT
- ▁DROWN
- ▁EXPERIMENT
- ▁HITHERTO
- ▁PLAGUE
- ▁PORTHOS
- ▁SHRIEK
- ▁DETECT
- ▁ACCENT
- ▁ERECT
- ▁SAZEN
- ▁PROFIT
- ▁VIVID
- ▁SQUIRE
- ▁OPERATION
- ▁SMELL
- ▁SIMON
- ▁EXTENT
- ▁KEEN
- ▁EMERG
- ▁REVIV
- ▁REGIMENT
- ▁DISAPPOINTMENT
- ▁STOLE
- ▁DIVINE
- ▁GUILTY
- ▁COWARD
- ▁EXPECTATION
- ▁SIGNOR
- ▁MODE
- ▁CENTRE
- ▁FIL
- HOW
- ▁WEARI
- ▁TOTAL
- ▁VICTOR
- ▁GOVERN
- ▁RAISE
- ▁ABANDON
- ▁ABSURD
- ▁ASPECT
- ▁CRIMINAL
- ▁DEFINITE
- ▁DELIBERAT
- ▁FEATHER
- ▁FLORINA
- ▁MIDNIGHT
- ▁RICHMOND
- ▁SATISFY
- ▁SINGULAR
- ▁STEADILY
- ▁SUPREME
- ▁TIMBER
- ▁PSYCHOLOG
- ▁GESTURE
- ▁VALUABLE
- ▁INTERVAL
- ▁CONFUSION
- ▁FLUTTER
- ▁SACRED
- ▁DISEASE
- ▁UNDERTAKE
- ▁PENETRAT
- ▁MARVEL
- ▁NORTHERN
- ▁GRIEV
- ▁GENIUS
- ▁SADDLE
- ▁NOVEL
- ▁MISERY
- ▁CONVICTION
- ▁SINK
- ▁WAGON
- ▁ARISE
- ▁COMMENT
- ▁BARN
- UPON
- ▁FENCE
- ▁ASSOCIATION
- ▁BONES
- ▁IDLE
- ▁DOUBTFUL
- ▁PREPARATION
- IZZ
- ▁RAIS
- ▁BITTERLY
- ▁JOE
- ▁RELI
- ADI
- ▁METAL
- ▁EXACT
- ▁GLOOM
- FIELD
- ▁DANGLARS
- ▁DISGRACE
- ▁EXAMINATION
- ▁FASCINAT
- ▁GLITTER
- ▁INCREASING
- ▁MESSENGER
- ▁PATRIOT
- ▁PLATFORM
- ▁PROVISION
- ▁QUALITIES
- ▁SELECT
- ▁STEADY
- ▁POVERTY
- ▁POWDER
- ▁PROPHET
- ▁HOLLAND
- ▁TRUNK
- ▁VARIETY
- ▁PLANCHET
- ▁CONQUER
- ▁CONCEIVE
- ▁COMBAT
- ▁STOOP
- ▁SHIRT
- ▁GENERATION
- ▁COMMITTED
- ▁INSULT
- ▁CONFUSED
- ▁RADIAN
- ▁DEBT
- ▁IMITAT
- ▁DART
- ▁CAROLINE
- ▁SWAM
- ▁WREN
- ▁CHILDHOOD
- ▁BRAND
- ▁JOKE
- ▁FRIENDSHIP
- ▁DIRT
- ▁JOLL
- ▁BUSHES
- ▁MINK
- ▁ROUT
- ▁EQUALITY
- ▁HESITATED
- ▁BARK
- ▁ANTI
- ▁STATEMENT
- PHER
- ▁SUNK
- ▁DAT
- ▁BACKWARD
- ▁SUSPECT
- ▁OBJECTION
- ▁RAP
- ▁CHIN
- ▁MATE
- ▁REDUC
- ▁GREGG
- ▁ACCOMPANY
- ▁ANYWHERE
- ▁BENEFIT
- ▁CLERK
- ▁EXPENSE
- ▁FETNAH
- ▁INTERPRET
- ▁LUKASHKA
- ▁NUMEROUS
- ▁SURGEON
- ▁PUZZL
- ▁RESCUE
- ▁GRATEFUL
- ▁APPROV
- ▁RIVAL
- ▁NIECE
- ▁FLOOD
- ▁VANISHED
- ▁ERROR
- ▁BLAZ
- ▁TUMBL
- ▁WENDY
- ▁PERSIST
- ▁CONSOL
- ▁SOAP
- ▁HUMOUR
- ▁FITTED
- ▁HOUSEKEEPER
- ▁ENABL
- ▁OCCASIONALLY
- ▁HATRED
- ▁SWELL
- ▁WORRY
- ▁RUST
- ▁PURSUIT
- ▁INTIMATE
- ▁SEAL
- ▁COLLECTION
- ▁TREMBLED
- ▁DENY
- ▁HUMANITY
- ▁FATAL
- ▁COCK
- ▁DRIVER
- ▁HOPELESS
- ▁MISTAKEN
- ▁LUC
- ▁ACCOMPLISH
- ▁COAL
- ▁ACCORD
- ▁PURSE
- ▁SEPARATE
- ▁ARRIVE
- ▁SMOK
- ▁MADAM
- ▁ASSOCIAT
- ▁INSTRUCT
- ▁CELEBR
- ▁CHANNEL
- ▁CIVILIZATION
- ▁DOCTRINE
- ▁ENDEAVOUR
- ▁GLACIER
- ▁INTELLIGENT
- ▁INVOLVE
- ▁LEATHER
- ▁MUTTERED
- ▁OLENIN
- ▁PENCROFT
- ▁PERPLEX
- ▁SPECTATOR
- ▁UNIVERSITY
- ▁ATTAIN
- ▁INEVITABL
- ▁YONDER
- ▁ENCHANT
- ▁REPAIR
- ▁CURRENT
- ▁ASCEND
- ▁CREEK
- ▁SPARKL
- ▁RUE
- ▁BEAVER
- ▁INFANT
- ▁CONTINUALLY
- ▁CLASP
- ▁IRISH
- ▁ROLLIN
- ▁PUNISHMENT
- ▁LUNCH
- ▁AGONY
- ▁RUDE
- ▁DRAGG
- ▁INQUIRI
- ▁SEX
- ▁TERRIFI
- ▁ROBIN
- ▁PROFESSIONAL
- ▁SPUR
- ▁GRAIN
- ▁VINE
- ▁PENN
- ▁ROC
- ▁CHASE
- ▁INFORM
- ▁WRITER
- ▁AVO
- ▁TAP
- ▁CREAT
- ▁WHIL
- ▁BARR
- ▁ASSURE
- ▁CIRCUMSTANCE
- ▁OIL
- ▁ROUSE
- ▁COLUMB
- ▁CUNNING
- ▁DOMESTIC
- ▁GLORIOUS
- ▁INDIGNATION
- ▁PRECISELY
- ▁PRUDENCE
- ▁RAILROAD
- ▁SATURDAY
- ▁UTMOST
- ▁VIOLENCE
- ▁WHIRL
- ▁CALCULAT
- ▁OVERWHELM
- ▁PERPETUAL
- ▁QUARLES
- ▁SLENDER
- ▁TELEGRAPH
- ▁ALOUD
- ▁OPPRESS
- ▁CROPPER
- ▁CANADIAN
- ▁HERBERT
- ▁TIMID
- ▁SUPPLY
- ▁STROLL
- ▁CREEP
- ▁OATH
- ▁DUSK
- ▁EXCESS
- ▁HUMBLE
- ▁FURIOUS
- ▁RIDGE
- ▁BULLET
- ▁PONY
- ▁STATU
- ▁ENJOYMENT
- ▁CONWAY
- ▁DIFFICULTIES
- ▁PATCH
- ▁JOYCE
- ▁CLOCK
- ▁RESTORED
- ▁ARGU
- ▁WIG
- ▁CHATT
- ▁PLAC
- ▁REMOVE
- ▁TORN
- ▁DISAPPEAR
- TIME
- WELL
- ▁RECOGNIZE
- ▁FISHE
- ▁DECLARE
- ISTIC
- ▁AUTHOR
- ▁WHISK
- ▁COFFEE
- ▁COMPREHEND
- ▁DISGUISE
- ▁ELZEVIR
- ▁ENTERPRISE
- ▁HOLIDAY
- ▁HORIZON
- ▁IGNORANT
- ▁INTERVIEW
- ▁OLIVER
- ▁RONICKY
- ▁CAPACITY
- ▁DISPOSITION
- ▁EXTERNAL
- ▁OPPOSITION
- ▁REPUBLIC
- ▁WHEAT
- ▁CORPSE
- ▁DARLING
- ▁THRILL
- ▁INHABITANTS
- ▁ORNAMENT
- ▁SHIFT
- ▁RECOGNISE
- ▁SHIVER
- ▁BOAST
- ▁HINT
- ▁BOSTON
- ▁MULTI
- IFYING
- ▁STEAL
- ▁INSTRUCTIONS
- ▁ELECTRIC
- ▁SWING
- ▁SOOTH
- ▁SCALE
- ▁MORLAND
- ▁DISLIKE
- ▁FLATTER
- ▁COACH
- ▁LEIF
- ▁STAMP
- ▁ANYHOW
- ▁MOTIONLESS
- ▁ANDREA
- ▁LOSING
- ▁PAUL
- ▁CAROL
- ▁ADVANC
- ▁IMAGIN
- ▁CENTER
- ▁JAR
- ▁SUCCEED
- ▁DISMISS
- CTOR
- ▁RECEIV
- ▁DRAG
- ▁INTENT
- ▁BARBAR
- ▁PUNISH
- ▁ABRUPTLY
- ▁BERNARD
- ▁DECISION
- ▁INDEPENDENT
- ▁PROVINCE
- ▁SLEEVE
- ▁TREMENDOUS
- ▁UNPLEASANT
- ▁LEISURE
- ▁THRONG
- ▁THUMB
- ▁BANNER
- ▁CONTRADICT
- ▁RESTRAIN
- ▁DIVIDED
- ▁WRAPPED
- ▁HAUNT
- ▁SNEER
- CHESTER
- ▁JULIA
- ▁MILD
- ▁CONTACT
- ▁MEANTIME
- ▁NEEDLE
- ▁BLOT
- ▁BARREL
- ▁ISABELLA
- ▁THEATRE
- ▁ESTABLISHMENT
- ▁MARKET
- ▁CHINA
- ▁FORBID
- ▁PERISH
- ▁DOORWAY
- ▁CARLING
- ▁PERIL
- ▁PRIZE
- ▁HATCH
- ▁CURL
- ▁REFER
- ▁DEVOT
- EMBER
- MONT
- ▁CANOE
- ▁PROFESSION
- ▁CONVICT
- ▁CRAWL
- ▁ACTIVITY
- ▁BEWILDER
- ▁BREEZE
- ▁CONTEMPLAT
- ▁DISGUST
- ▁FATIGUE
- ▁MERRICK
- ▁PRAIRIE
- ▁REFORM
- ▁SPECTACLE
- ▁STUDENT
- ▁TUMULT
- ▁UNIFORM
- ▁VIGOROUS
- ▁CONDEMN
- ▁GENUINE
- ▁THOMAS
- ▁ARROW
- ▁PILLOW
- ▁FEEBLE
- ▁RALPH
- ▁SCHEME
- ▁COLLAR
- ▁JUSTINIAN
- ▁NERVE
- ▁OYSTER
- ▁BENNET
- ▁DUTIES
- ▁BINGLEY
- ▁CHRISTMAS
- ▁CONVEY
- ▁DESPIS
- ▁RATTL
- ▁GARMENTS
- ▁GOWN
- ▁BERYL
- ▁BARRIER
- ▁CHARACTERISTIC
- ▁MEDITAT
- ▁DISCOURSE
- ▁STAFF
- ▁KARA
- ▁MONTE
- ▁READILY
- ▁VENTUR
- ▁HENCE
- ▁ROPE
- ▁CRIES
- ▁ANGLE
- ▁RESPECTABLE
- ▁MOAN
- ▁OUTLINE
- BORN
- ▁FIX
- ▁INTEND
- LIA
- ▁CHILL
- ▁CREP
- ▁CHOSE
- ▁SPECULAT
- ▁ATTRIBUT
- ▁BUFFALO
- ▁ENTREAT
- ▁ENVELOP
- ▁FREDERICK
- ▁IMPATIENCE
- ▁INDIFFERENCE
- ▁INDUSTRY
- ▁INSTITUTION
- ▁LYNDE
- ▁RETAIN
- ▁TROUTINA
- ▁UNCOMFORTABL
- ▁VENGEANCE
- ▁JENKS
- ▁CONGRESS
- ▁SMART
- ▁THITHER
- ▁DISAGREE
- ▁IMPROVEMENT
- ▁PISTOL
- ▁GOSSIP
- ▁ETERNAL
- ▁BELIEF
- ▁SLEDGE
- ▁AROUSED
- ▁ORANGE
- ▁FASTENED
- ▁MONKEY
- ▁WITHDREW
- ▁OFFEND
- ▁PIERC
- ▁MOONLIGHT
- ▁OARS
- ▁GROOM
- ▁FIDDLER
- ▁BARBARA
- SHIRE
- ▁ATTENDANT
- ▁DIVERS
- ▁DUCK
- ▁PROPOSAL
- ▁GROWTH
- ▁CURATE
- ▁STEWAR
- ▁MOCK
- ▁SUCCESSION
- ▁CREATION
- ▁PARTIAL
- ▁SWU
- ▁FROST
- ▁EIGHTH
- ▁AWE
- ▁PERCH
- ▁LACE
- SPOON
- ▁ARRANGE
- SERIES
- ▁FOG
- ▁SCU
- ▁ABRAHAM
- ▁ADMIRAL
- ▁BARBICANE
- ▁CAMPAIGN
- ▁CONSEQUENTLY
- ▁CULTURE
- ▁GRAMMONT
- ▁GWYNPLAINE
- ▁HAPPILY
- ▁HOOPDRIVER
- ▁INDEPENDENCE
- ▁LEOPOLD
- ▁MISCHIEF
- ▁MONTGOMERY
- ▁NECESSARILY
- ▁PSYCHIC
- ▁RABBIT
- ▁REFUGE
- ▁RESPONSIBILIT
- ▁SENATOR
- ▁UNCERTAIN
- ▁MENSTRUA
- ▁FANNY
- ▁SUBSTANCE
- ▁APRIL
- ▁ELBOW
- ▁QUALITY
- ▁BORDER
- ▁BRUTAL
- ▁CARPET
- ▁SOLITAR
- ▁FROWN
- ▁SCENT
- ▁ANNOY
- ▁NAKED
- ▁BOSOM
- ▁CONSUM
- ▁TIGER
- ▁ITALIAN
- ▁PARSON
- ▁DECLIN
- ▁NEIGHBORHOOD
- ▁GREGGORY
- ▁EXCEED
- ▁SILLY
- ▁ICELAND
- ▁HIDEOUS
- ▁STRU
- ▁ALTERNAT
- ▁CABINET
- ▁ABILITY
- ▁BEECH
- ▁SECRETARY
- ▁CONTEST
- ▁MONK
- ▁PADD
- ▁EVA
- ▁CREST
- ▁FINISH
- ▁APPARENT
- ▁MIX
- ▁SLIP
- ▁LUXURI
- ▁AUTUMN
- ▁CIRCULAR
- ▁COMPOSITION
- ▁DISPLEAS
- ▁EXCELLENC
- ▁FURNITURE
- ▁GRADUATE
- ▁INDIFFERENT
- ▁JOSEPH
- ▁OCCUPATION
- ▁POSSIBILITY
- ▁RENEWED
- ▁RESPONDED
- ▁PREVAIL
- ▁HOARSE
- ▁PRACTIS
- ▁FAREWELL
- ▁JULIET
- ▁OVERHEAD
- ▁THREAD
- ▁APPLICATION
- ▁SOLITUDE
- ▁ADAPT
- ▁FALK
- ▁LARK
- ▁COARSE
- ▁MANKIND
- ▁KICK
- ▁BATTER
- ▁SOLICIT
- ▁RESIGN
- ▁MOTOR
- ▁STEEL
- ▁CONTRIV
- ▁AUTHORITIES
- ▁HARSH
- ▁FAVORITE
- ▁TALENT
- ▁FLEECE
- ▁AGITATION
- ▁ABBE
- ▁STUCK
- ▁HEDGE
- ▁BIBLE
- ▁RECOLLECTION
- ▁PARTNER
- ▁DAMON
- ▁SHINE
- ▁HOOK
- ▁CONFESSION
- ▁ASSENT
- ▁ELDE
- ▁BIGGE
- ▁PEACEFUL
- SCRIBED
- ▁WEIGH
- CARLET
- ▁DECIDE
- ▁RECOLLECT
- ▁BOHEMIA
- ▁CALIFORNIA
- ▁CONSTRUCT
- ▁DEMONSTRAT
- ▁DISTRIBUT
- ▁FRIGHTFUL
- ▁GNOME
- ▁IGNORANCE
- ▁JANUARY
- ▁JULIUS
- ▁MEMORIES
- ▁OCCUPY
- ▁PHRASE
- ▁WHIRLWIND
- ▁WILMINGTON
- ▁CARLINI
- ▁CHAUVELIN
- ▁ESTEEM
- ▁GENZABURO
- ▁GLOBE
- ▁LECOQ
- ▁MARGARET
- ▁MONARCH
- ▁NAPOLEON
- ▁SCORN
- ▁STAGGER
- ▁SUSTAIN
- ▁TRADITION
- ▁ADJUST
- ▁FROZEN
- ▁IMPRISON
- ▁LANTERN
- ▁MICHEL
- ▁STOMACH
- ▁TORRENT
- ▁WITHDRAW
- ▁FRANZ
- ▁POISON
- ▁SURVEY
- ▁BRITISH
- ▁ELEVAT
- ▁AWOKE
- ▁ESTHER
- ▁INHERIT
- ▁TRAVERS
- ▁STOPPING
- ▁IRELAND
- ▁COMPARATIVE
- ▁SOBB
- ▁FAVOURITE
- ▁CANVAS
- ▁CLOAK
- ▁GLAR
- ▁ASSISTANT
- ▁DAMAGE
- ▁PEAK
- ▁DISTINCTION
- FARE
- ▁DOLLAR
- ▁BEGGAR
- LUSIVE
- ▁MODEL
- ▁SECUR
- ▁DISPOS
- ▁SLID
- ▁PEA
- ▁SPEEDI
- HOLD
- ▁SNAP
- ▁CIGAR
- ▁AFFLICT
- ▁AMAZEMENT
- ▁LAUNCELOT
- ▁LEAGUE
- ▁MARIPOSA
- ▁POPULATION
- ▁UNEASY
- ▁BLOSSOM
- ▁CATERPILLAR
- ▁INCLINATION
- ▁SUSPEND
- ▁SYNDIC
- ▁TAYLOR
- ▁WILSON
- ▁CONTRAST
- ▁PORTRAIT
- ▁CORONER
- ▁GREEK
- ▁BUNDLE
- ▁BLEW
- ▁THORPE
- ▁ORPHAN
- ▁MUSCLE
- ▁DEAF
- ▁SURVIV
- ▁EXCEEDINGLY
- ▁TENDENC
- ▁ISRAEL
- ▁QUANTIT
- ▁PENSION
- ▁DRIED
- TEXT
- ▁REFERENCE
- ▁REPOSE
- ▁FOLLY
- ▁REPLACE
- ▁TERR
- ▁ANKLE
- ▁SUNLIGHT
- ▁SECURITY
- ▁SHOV
- ▁RAW
- CULAR
- ▁JACKET
- ▁TUNE
- ▁HOBB
- ▁MARTIN
- DUCED
- ▁FIST
- ▁BEGG
- ▁CHOK
- ▁INQUIRE
- ▁INTELLECT
- ▁AMUSEMENT
- ▁APPROPRIATE
- ▁CONGRATULAT
- ▁CONVENTION
- ▁DISCOURAG
- ▁EXQUISITE
- ▁FOUNTAIN
- ▁JUNIOR
- ▁NONSENSE
- ▁OBSTACLE
- ▁SPECIMEN
- ▁SWEAR
- ▁TRANQUIL
- ▁VEHICLE
- ▁WISDOM
- ▁ASCERTAIN
- ▁CAUTIOUS
- ▁CENTURIES
- ▁CORRUPT
- ▁EXPLOR
- ▁TURKEY
- ▁BARGAIN
- ▁CONFOUND
- ▁FUNCTION
- ▁GRACIOUS
- ▁MONICA
- ▁ILLUSTRAT
- ▁CRUMB
- ▁REMEDY
- ▁REMOTE
- ▁REVENGE
- ▁BABYLON
- ▁CAUTION
- ▁INTERIOR
- ▁CRISTEL
- ▁BRAZ
- ▁THIRST
- ▁PROBABLE
- ▁HARMONY
- ▁CHARITY
- ▁DECAY
- ▁COLONI
- ▁AVAIL
- ▁REPULS
- ▁ABSENT
- ▁PULSE
- ▁PRESUM
- ▁CRANE
- ▁NEIGHBOURHOOD
- ▁SUNSET
- ▁CANNON
- ▁GRAPE
- ▁SOFA
- ▁DRANK
- MINOUS
- ▁DECLARATION
- ▁CLOSING
- ▁MEEK
- ▁STARV
- ▁BUNCH
- ▁PERFORMANCE
- ▁ENTERTAINMENT
- ▁STRIV
- ▁EMILY
- ▁VALET
- MPOSED
- ▁INTIMA
- ▁POLISH
- ▁HIRE
- POST
- ▁TREMBLE
- ▁CEASE
- ▁VIRGIN
- ▁RUSSIA
- COURSE
- ▁EDUCAT
- BOUND
- ▁INHABIT
- ▁SUPERINTEND
- ▁BISCUIT
- ▁CHICAGO
- ▁CHOKICHI
- ▁CONFLICT
- ▁ENCLOS
- ▁EXCLUSION
- ▁EXECUTIVE
- ▁GRANDMOTHER
- ▁HEADQUARTERS
- ▁INFERIOR
- ▁INVISIBLE
- ▁MUTUAL
- ▁OPPONENT
- ▁SENSITIVE
- ▁STUDIED
- ▁TEMPORARY
- ▁UNWILLING
- ▁PERMANENT
- ▁BEDROOM
- ▁NOVEMBER
- ▁COMPLICAT
- ▁DEVOUR
- ▁SCRAMBL
- ▁SECTION
- ▁PROPOSITION
- ▁DEPRIV
- ▁RYNCH
- ▁PLEAD
- ▁TORTURE
- ▁SCOUT
- ▁PILOT
- ▁CHERISH
- ▁SPEAR
- ▁SUGAR
- ▁JASPER
- ▁STRAY
- ▁RIFLE
- ▁NORMAL
- ▁JERK
- ▁HONEY
- ▁AWAKENED
- ▁QUIVER
- ▁PYE
- ▁APPLY
- LICK
- JA
- ▁ANNOUNC
- FORE
- ▁ENGINE
- ▁HESITATE
- ▁PROVIDE
- ▁REALIZE
- ▁SEIZE
- ▁RESTORE
- MOUTH
- FOOT
- ▁DIFFER
- ▁ULTIMATE
- ▁ABUNDANCE
- ▁APPRECIATE
- ▁APPREHENSION
- ▁AVENUE
- ▁AWKWARD
- ▁CETERA
- ▁CHIMNEY
- ▁CLUTCH
- ▁CONVENIENT
- ▁CORRIDOR
- ▁DISTRACT
- ▁ELEGANT
- ▁ELSEWHERE
- ▁ENTHUSIASTIC
- ▁EXECUTE
- ▁EXTREMIT
- ▁JERUSALEM
- ▁MIRACLE
- ▁MONSTROUS
- ▁OBEDIENCE
- ▁OBSCURE
- ▁PHENOMENA
- ▁RESIDENCE
- ▁RESOURCE
- ▁REVOLT
- ▁SCIENTIFIC
- ▁SHIELD
- ▁SIMPSON
- ▁UNIVERSE
- VOLUNTARY
- ▁ATTENTIVE
- ▁BRENDA
- ▁DEPOSIT
- ▁MAXIM
- ▁REJECT
- ▁STIRRED
- ▁DISORDER
- ▁SERENE
- ▁TOBACCO
- ▁MILTON
- ▁BALLOON
- ▁STEPHEN
- ▁STRAIT
- ▁CHINESE
- ▁COURTEOUS
- ▁RELEASE
- ▁RECESS
- ▁COTTON
- ▁STUMP
- ▁TANK
- ▁PROMOTE
- ▁DERIVE
- ▁LOYAL
- ▁GRANIT
- ▁DISMAL
- ▁CATTLE
- ▁DOONE
- ▁CUPID
- DIGNIFIED
- ▁RIPE
- ▁EXILE
- ▁ANTIQU
- UMINAT
- ▁SUPPOS
- ▁WRETCH
- ▁IDENTI
- ▁EASI
- ▁SERV
- ▁QUEST
- TOWN
- ▁ACHIEVEMENT
- ▁APPETITE
- ▁BUCCANEER
- ▁COMMENCED
- ▁DELAWARE
- ▁DISCERN
- ▁IMMORTAL
- ▁INDIGNANT
- ▁JOSIANA
- ▁MECHANICAL
- ▁MUSKRAT
- ▁REVIEW
- ▁ROBARTS
- ▁SIGNIFICANT
- ▁SUBSEQUENT
- ▁YOURSELVES
- ▁ANGRILY
- ▁BORROW
- ▁SUBLIME
- ▁AFRICA
- ▁CHICKEN
- ▁DEGRAD
- ▁GEORGI
- ▁HUMILIAT
- ▁LODGING
- ▁REDCOAT
- ▁VIOLET
- ▁HOPKINS
- ▁RAWDON
- ▁PRICK
- ▁WHALE
- ▁FUNERAL
- ▁GUINEA
- ▁DISMAY
- ▁PORCH
- ▁HARVEST
- ▁PARCEL
- ▁SUBDU
- ▁SYRIA
- ▁PANIC
- ▁BOUGHS
- ▁CIGARETTE
- ▁CHRON
- ▁INQUIRY
- ▁CRYSTAL
- ▁SPELL
- ▁PLUCK
- ▁PATTERN
- ▁DARING
- ▁CRITICISM
- ▁DAINT
- ▁DISTURBANCE
- ▁BUTCHER
- ▁LITERA
- ▁ABUSE
- IXTURE
- ▁ANIMAT
- ▁WRIT
- ▁BELIEV
- ▁INDUCE
- COMING
- ▁DRAMA
- ▁AGITAT
- SHAW
- ▁IMPERFECT
- ▁MANUFACTURE
- ▁AFFIRM
- ▁ANGUISH
- ▁ARTIFICIAL
- ▁BIBBS
- ▁CHARLOTTE
- ▁CIRCUS
- ▁CONNISTON
- ▁CONSTITUTE
- ▁DAZZL
- ▁DEFECT
- ▁DISCHARG
- ▁ESCORT
- ▁EXAGGERAT
- ▁GWENDOLEN
- ▁IRRESISTIBL
- ▁PHILOSOPHY
- ▁PHOTOGRAPH
- ▁PILGRIM
- ▁PLEASING
- ▁QUIXOTE
- ▁RESPONSE
- ▁SCRATCH
- ▁SERGEANT
- ▁SHERIFF
- ▁SHUDDER
- ▁STRUCTURE
- ▁SUFFRAGE
- ▁SURRENDER
- ▁SWORE
- ▁VILLAIN
- ▁HESITATING
- ▁FLORENCE
- ▁IRRITAT
- ▁RIGID
- ▁SINISTER
- ▁STUDIO
- ▁RAFT
- ▁CHAMPION
- ▁PAVEMENT
- ▁WOLF
- ▁DEVICE
- ▁WRECK
- ▁HESITATION
- ▁LAZY
- ▁ADJO
- ▁DECENT
- ▁INTERVEN
- ▁WOOL
- ▁ILLUSION
- ▁HAWK
- ▁IMPART
- ▁LUNGS
- ▁WINNING
- ▁VITAL
- ▁CONSPI
- ▁SUBTLE
- ▁CONSTANC
- ▁HURL
- ▁AMIABL
- ▁FOLK
- GGY
- ▁NECESSIT
- ▁PROFESS
- WASH
- ▁ADMIRING
- ▁AMBITIOUS
- ▁ANTHONY
- ▁CEREMONY
- ▁CONTRIBUTE
- ▁CRAGGS
- ▁DETAIN
- ▁DISCLOS
- ▁DWELT
- ▁EGYPT
- ▁FELIX
- ▁JOURNAL
- ▁KWAIRYO
- ▁LIBERAL
- ▁LUMBER
- ▁OCTOBER
- ▁ORGANIZATION
- ▁POPULACE
- ▁PRECAUTION
- ▁PREJUDICE
- ▁PROCLAIM
- ▁PROPRIETOR
- ▁RESPONSIBLE
- ▁RHYTHM
- ▁RIDICULOUS
- ▁SCHOLAR
- ▁SQUEEZ
- ▁SUBSTITUTE
- ▁SURPASS
- ▁THRESHOLD
- ▁WHARTON
- ▁FLICKER
- ▁AMAZED
- ▁BRONZE
- ▁COSSACK
- ▁SPILETT
- ▁CASUAL
- ▁DARCY
- ▁PARLOUR
- ▁SEXUAL
- ▁INSECT
- ▁NATHAN
- ▁EMINENT
- ▁PENCIL
- ▁PETITION
- ▁ROTTEN
- ▁VIGIL
- ▁CAESAR
- ▁EAGLE
- ▁TREAD
- ▁REACTION
- ▁TACIT
- ▁PARLOR
- ▁SPAIN
- ▁WILDERNESS
- ▁DICTAT
- ▁GRATIFY
- ▁STOVE
- ▁SKIRT
- ▁UTILI
- ▁CONCERT
- ▁GORGE
- ▁DECORAT
- ▁LATIN
- ▁ANCHOR
- ▁KNOT
- ▁MONDAY
- ▁GABLES
- ▁TOLERABL
- ▁ROGER
- BERRIES
- ▁INVAD
- IMMER
- OMETER
- ▁PRODUC
- OBIL
- ▁PERMISSI
- FICIENCY
- ▁WANDER
- RREL
- PIECE
- HORN
- ▁COMMIT
- ▁ACCUMULAT
- ▁JAPAN
- ▁ABUNDANT
- ▁ACADEMY
- ▁ALBERT
- ▁BANQUET
- ▁DELICIOUS
- ▁DOCUMENT
- ▁EXCLAMATION
- ▁FEBRUARY
- ▁GROTESQUE
- ▁HEATHERSTONE
- ▁HUMPHREY
- ▁HURSTWOOD
- ▁MOHAMMED
- ▁MOSCOW
- ▁NICHOLAS
- ▁OBSTINATE
- ▁PHANTOM
- ▁PHILOSOPHER
- ▁RECEPTION
- ▁SPANIARD
- ▁SWOLLEN
- ▁TELEPHONE
- ▁TRIBUTE
- ▁TUNNEL
- ▁UNREASONABL
- ▁WIGWAM
- ▁BUTTERFLY
- ▁COLLINS
- ▁DISPATCH
- ▁EDITOR
- ▁CONTINENT
- ▁DIMINISH
- ▁HORRID
- ▁KEATS
- ▁PROVIDENCE
- ▁BEHALF
- ▁CHARLEY
- ▁DRAKE
- ▁LAUNCH
- ▁SALOON
- ▁GIGANT
- ▁DISPUTE
- ▁HYSTERI
- ▁DEFENCE
- ▁SCREEN
- ▁VAULT
- ▁NINTH
- ▁HARBOR
- ▁FLANK
- ▁SPECK
- ▁UPRIGHT
- ▁KEMP
- ▁CANADA
- ▁STALK
- ▁OWL
- ▁BRUTE
- ▁FERRIS
- ▁DECREE
- ▁HABITUAL
- ▁BRISK
- ▁INSPIRE
- ▁HUSH
- ▁CROUCH
- ▁FRIDAY
- ▁MOUNTAINEER
- ▁HISTORIC
- ▁BATES
- ▁RUSK
- ▁SEMI
- DICTION
- ▁BUSI
- ▁REMOV
- MMI
- ▁SUFFIC
- ▁FLEE
- ▁LOUIS
- NLEA
- ▁IMPORT
- OLOGY
- ▁CLERGY
- ▁ADVERTISEMENT
- ▁BENEVOLEN
- ▁BORODINO
- ▁CATHOLIC
- ▁COMMERCIAL
- ▁CONJECTURE
- ▁CURTAIN
- ▁CUTHBERT
- ▁DEMOCRACY
- ▁GUARANTEE
- ▁HYPNOSIS
- ▁INDEFINITE
- ▁INVESTIGATION
- ▁IRREGULAR
- ▁KOYO
- ▁MERRIWIG
- ▁MIRANDA
- ▁NICHOLL
- ▁ONLOOKER
- ▁PERSECUT
- ▁RECOGNITION
- ▁REJOICE
- ▁REMEMBRANCE
- ▁REVELATION
- ▁SCOLD
- ▁SENIOR
- ▁SQUIRREL
- ▁SYMPATHETIC
- ▁TEMPEST
- ▁TREACHER
- ▁UNDERNEATH
- ▁UNEASINESS
- ▁UNNECESSARY
- ▁UPSTAIRS
- ▁VEXATION
- ▁ACCESS
- ▁CHEAP
- ▁ESTIMATE
- ▁HAZARD
- ▁HORSEBACK
- ▁PLUNDER
- ▁RASCAL
- ▁ROSTOV
- ▁ACCUR
- ▁GRAVITY
- ▁SITUATED
- ▁INVARIABL
- ▁PLENTIFUL
- ▁SPENCER
- ▁WALLACE
- ▁POLICY
- ▁WARRANT
- ▁ENVY
- ▁LAMB
- ▁EXTRACT
- ▁CORRAL
- ▁PANEL
- ▁LINK
- ▁LILIES
- ▁BECKON
- ▁SENOR
- ▁BORG
- ▁DEBATE
- ▁STEER
- COGNI
- COMB
- ▁SETTL
- ▁VENERA
- ▁FEATURE
- ▁TERRIBL
- CAPABLE
- OLOGICAL
- ▁INCESSANT
- ▁RESOLUTE
- SHAUGHNESSY
- ▁ABOLITION
- ▁ASSASSIN
- ▁BEHAVIOUR
- ▁BLUNT
- ▁COMMERCE
- ▁CONSTANTINOPLE
- ▁CRICKET
- ▁DISCIPLINE
- ▁DROUET
- ▁DWARF
- ▁INJUSTICE
- ▁LUXURY
- ▁MANUSCRIPT
- ▁MISUNDERSTAND
- ▁POLITICIAN
- ▁REDOUBT
- ▁SALVATION
- ▁SERMON
- ▁STRUGGLING
- ▁SURPRISING
- ▁TRIGGER
- ▁TUESDAY
- ▁TWILIGHT
- ▁UNDOUBTEDLY
- ▁VEGETABLE
- ▁VULGAR
- ▁WAISTCOAT
- ▁WRINKLE
- ▁ALEXANDER
- ▁CEILING
- ▁ECONOMIC
- ▁EVERLASTING
- ▁INFLICT
- ▁LEVISON
- ▁LOBSTER
- ▁OVERFLOW
- ▁SNATCH
- ▁TRAGEDY
- ▁DEASEY
- ▁ENLIGHTEN
- ▁FRIGATE
- ▁INSPECT
- ▁MARVELLOUS
- ▁ATLANTIC
- ▁LUFTON
- ▁BLADE
- ▁CRASH
- ▁SLAUGHTER
- ▁ANNUAL
- ▁CONFERENCE
- ▁TWIG
- ▁REASSUR
- ▁UNIQUE
- ▁WRATH
- ▁CRADLE
- ▁HULLO
- ▁LIQUID
- ▁MIRTH
- ▁EXPERT
- ▁HARVEY
- ▁RESTORATION
- ▁PRETTI
- ▁APOLOGY
- ▁SLAIN
- ▁BARBER
- ▁UPROAR
- ▁SCANT
- ▁BADGER
- ▁GROCER
- ▁ACRES
- ▁BRIDLE
- ▁SPECIFI
- ▁TANGLE
- ▁FERTIL
- ▁PATRON
- WIXT
- LAMOUR
- ▁DARN
- ▁POPE
- ▁PERCEIV
- ▁CONCLUDE
- ▁SIMPL
- ▁GUILT
- ▁CARRIE
- EFFICIENT
- SGIVING
- ▁APPOINTMENT
- ▁APPRECIATION
- ▁CARTRIDGE
- ▁CHALLENGE
- ▁CRAYFISH
- ▁CRIMSON
- ▁CUCUMETTO
- ▁ENERGETIC
- ▁EPOCH
- ▁EXAMINING
- ▁EXTENSIVE
- ▁EXTINGUISH
- ▁GLOODY
- ▁INSIGNIFICANT
- ▁LANDLORD
- ▁LANGUID
- ▁LEGISLATURE
- ▁MAJESTIC
- ▁PACIFIC
- ▁PASTRINI
- ▁PHRONSIE
- ▁RECONCIL
- ▁SIMULTANEOUS
- ▁SKELETON
- ▁SKETCH
- ▁TRANSFORM
- ▁UNJUST
- ▁VEXED
- ▁ASYLUM
- ▁CLUSTER
- ▁ERRAND
- ▁EXPEND
- ▁NEGATIVE
- ▁NORHALA
- ▁SCANDAL
- ▁STIMULAT
- ▁SWEAT
- ▁COMPOUND
- ▁DECEMBER
- ▁EXPAND
- ▁PROLONG
- ▁PURITAN
- ▁CONQUEST
- ▁MAGUA
- ▁SANCHO
- ▁TRENCH
- ▁ENTITLE
- ▁PEPPER
- ▁DISASTER
- ▁REGAIN
- ▁SHREWD
- ▁SULLEN
- ▁CLAVIER
- ▁COLOSS
- ▁SHILLING
- ▁ETHEL
- ▁MYSTERIES
- ▁BULK
- ▁GRANDEUR
- ▁AGNES
- ▁CONVERT
- ▁WRIST
- ▁GLID
- ▁TERRACE
- ▁SONYA
- ▁DANTES
- ▁MOULD
- ▁MAGNET
- ▁PLOT
- RANK
- ▁CAVIT
- ▁SUBSID
- ▁SLAP
- TURNED
- ▁THREAT
- BREAK
- ▁ANCESTORS
- ▁ANTICIPATED
- ▁APPLAUSE
- ▁ASSAULT
- ▁ATTORNEY
- ▁AUTOMATIC
- ▁CARAVAN
- ▁CATASTROPHE
- ▁CAVALCANTI
- ▁CROMWELL
- ▁ENVOY
- ▁EXHAUSTION
- ▁FIEND
- ▁GENEROSITY
- ▁GIMBLET
- ▁HARDQUANONNE
- ▁HOUARN
- ▁INJURY
- ▁MACKINSON
- ▁OGLETHORPE
- ▁PETTICOAT
- ▁RASPBERR
- ▁REHNHJELM
- ▁REJOICING
- ▁REMNANT
- ▁SCOTLAND
- ▁SHRINK
- ▁STANDPOINT
- ▁TESTIMONY
- ▁THEREAFTER
- ▁THIRTIETH
- ▁TWENTIETH
- ▁TYRANT
- ▁VENTNOR
- ▁VETERAN
- ▁WHITTAKER
- ▁ZVERKOV
- ▁ARCHITECTUR
- ▁BLUNDER
- ▁DENSHER
- ▁FORTNIGHT
- ▁JUDITH
- ▁MARIANNE
- ▁MEMORABLE
- ▁REFINED
- ▁REVOLV
- ▁UNDERTAKING
- ▁CLUMP
- ▁GRUMBLE
- ▁SYMPATHI
- ▁TICKET
- ▁TWITCH
- ▁EDITION
- ▁FALANDER
- ▁CARTHAGE
- ▁ORLEANS
- ▁POSSUM
- ▁SWITCH
- ▁CLUNG
- ▁CARDINAL
- ▁GNAW
- ▁LOCATED
- ▁HARROW
- ▁RASH
- ▁SIEGE
- ▁LOAF
- ▁BRUISE
- ▁REGULAT
- ▁RESORT
- ▁SARAH
- ▁LEVIN
- ▁NAVY
- ▁MOOSE
- ▁STOOL
- ▁CHANCELLOR
- ▁INGENIOUS
- ▁CHALK
- ▁PRETENCE
- ▁REPAY
- ▁ROAST
- ▁PLUTO
- ▁BAFFL
- ▁STUMBL
- ▁SPHERE
- ▁PLEDGE
- ▁SPRAWL
- ▁WRAP
- ▁FRINGE
- ▁DREAR
- ARRINGTON
- ▁FEDERA
- KEEPER
- ▁PHYSIC
- ▁ADVENT
- HUMAN
- OLOGIST
- ▁ALEXANDR
- ▁APPARITION
- ▁BARTHOLEMY
- ▁CITOYEN
- ▁CLIMATE
- ▁CONTEMPORAR
- ▁DESOLATE
- ▁DISCONTENT
- ▁ELEPHANT
- ▁FERNANDO
- ▁FERRALTI
- ▁FOLIAGE
- ▁FUGITIVE
- ▁GAMBLING
- ▁INVOLUNTARILY
- ▁LABYRINTH
- ▁LEGITIMATE
- ▁MILLIONAIRE
- ▁PERCEPTION
- ▁PROPRIETY
- ▁REBELLION
- ▁REFRAIN
- ▁RUGGLES
- ▁SCRIPTURE
- ▁SPLENDOR
- ▁SQUADRON
- ▁STRICKEN
- ▁SWARM
- ▁THEODORA
- ▁TOMORROW
- ▁VELVET
- ▁WOLVES
- ▁DISREGARD
- ▁GLIMMER
- ▁SHROUD
- ▁TWINKLING
- ▁UNEQUAL
- ▁CHANNING
- ▁CLUMS
- ▁ENIGMA
- ▁NAVIGAT
- ▁TARKAS
- ▁TEMPERATURE
- ▁DIVISION
- ▁GRATIFICATION
- ▁MONUMENT
- ▁SQUEAK
- ▁KAVIN
- ▁INTERPOSE
- ▁THORNTON
- ▁SOLUTION
- ▁STREAK
- ▁SHRILL
- ▁APRON
- ▁PITEOUS
- ▁HAUGHTY
- ▁RECKLESS
- ▁EMPTI
- ▁WADMAN
- ▁BONNET
- ▁MARTHA
- ▁DUMB
- ▁SHATTER
- ▁ACUTE
- ▁BRINK
- ▁CAPRICE
- ▁HURON
- ▁INFERN
- ▁FOWL
- ▁ENRAGE
- ▁ADORN
- ▁CRUIS
- ▁PROBABILIT
- ▁EXPIR
- ▁IMPETU
- ▁OVERHEAR
- BURTON
- ▁TRANSLAT
- ▁ENGAGE
- ▁CONVINCE
- ▁ABNORMAL
- ▁GESTICULAT
- ▁ABOMINABL
- ▁ADVERSARY
- ▁ADVERTISER
- ▁ADVERTISING
- ▁ANNIHILAT
- ▁ARTILLERY
- ▁CATHEDRAL
- ▁COMPETITOR
- ▁COULSON
- ▁CREVICE
- ▁CUSHION
- ▁DEBRAY
- ▁DEJECT
- ▁DIETRICH
- ▁DISADVANTAGE
- ▁ELLISON
- ▁EMPHASIS
- ▁EXCURSION
- ▁FANTASTIC
- ▁HYPOTHES
- ▁INCONVENIENCE
- ▁INDESCRIBABLE
- ▁INDUSTRI
- ▁INVALID
- ▁MERCILESS
- ▁MESOPOTAMIA
- ▁MOSQUITO
- ▁NARRATIVE
- ▁NOWADAYS
- ▁OPPORTUNITIES
- ▁PROMISING
- ▁RECTANGLE
- ▁REMONSTRANCE
- ▁RESTAURANT
- ▁RIBBON
- ▁SCIENTIST
- ▁SHALMANESER
- ▁SKULL
- ▁SPRUCE
- ▁SUBSTANTIAL
- ▁SYMBOL
- ▁TEAPOT
- ▁TERRITORY
- ▁TRAFFIC
- ▁TREASON
- ▁TRUMPET
- ▁TYRANN
- ▁UNANIMOUS
- ▁UNAWARE
- ▁VICINITY
- ▁WREATH
- ▁ZADIG
- ▁CHATEAU
- ▁CONFRONT
- ▁DUCHESS
- ▁EMBODI
- ▁FEMININ
- ▁FURNACE
- ▁MONTONI
- ▁RENOWN
- ▁SMASH
- ▁HARVARD
- ▁NEWBERRY
- ▁PERFUME
- ▁SIGNATURE
- ▁SPLASH
- ▁SUPPOSITION
- ▁HARBOUR
- ▁ASSURANCE
- ▁BRISTOL
- ▁BUCKINGHAM
- ▁DUDLEY
- ▁INTENSITY
- ▁CHOPIN
- ▁ENLIST
- Q
- <sos/eos>
init: null
input_size: null
ctc_conf:
dropout_rate: 0.0
ctc_type: builtin
reduce: true
ignore_nan_grad: null
zero_infinity: true
joint_net_conf: null
use_preprocessor: true
token_type: bpe
bpemodel: data/en_token_list/bpe_unigram5000/bpe.model
non_linguistic_symbols: null
cleaner: null
g2p: null
speech_volume_normalize: null
rir_scp: null
rir_apply_prob: 1.0
noise_scp: null
noise_apply_prob: 1.0
noise_db_range: '13_15'
short_noise_thres: 0.5
aux_ctc_tasks: []
frontend: default
frontend_conf:
n_fft: 512
win_length: 400
hop_length: 160
fs: 16k
specaug: specaug
specaug_conf:
apply_time_warp: true
time_warp_window: 5
time_warp_mode: bicubic
apply_freq_mask: true
freq_mask_width_range:
- 0
- 27
num_freq_mask: 2
apply_time_mask: true
time_mask_width_ratio_range:
- 0.0
- 0.05
num_time_mask: 5
normalize: global_mvn
normalize_conf:
stats_file: exp/asr_stats_raw_en_bpe5000_sp/train/feats_stats.npz
model: espnet
model_conf:
ctc_weight: 0.3
lsm_weight: 0.1
length_normalized_loss: false
preencoder: null
preencoder_conf: {}
encoder: multiconv_conformer
encoder_conf:
output_size: 256
attention_heads: 4
selfattention_layer_type: rel_selfattn
pos_enc_layer_type: rel_pos
rel_pos_type: latest
cgmlp_linear_units: 1024
multicgmlp_type: concat_fusion
multicgmlp_kernel_sizes: 7,15,23,31
multicgmlp_merge_conv_kernel: 31
use_linear_after_conv: false
gate_activation: identity
num_blocks: 12
dropout_rate: 0.1
positional_dropout_rate: 0.1
attention_dropout_rate: 0.1
input_layer: conv2d
layer_drop_rate: 0.0
linear_units: 1024
positionwise_layer_type: linear
macaron_style: true
use_cnn_module: true
postencoder: null
postencoder_conf: {}
decoder: transformer
decoder_conf:
attention_heads: 4
linear_units: 2048
num_blocks: 6
dropout_rate: 0.1
positional_dropout_rate: 0.1
self_attention_dropout_rate: 0.1
src_attention_dropout_rate: 0.1
layer_drop_rate: 0.0
preprocessor: default
preprocessor_conf: {}
required:
- output_dir
- token_list
version: '202304'
distributed: false
```
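
The `frontend_conf` values above fix the feature frame rate. As a quick sanity check (a sketch, not part of the recipe; anything beyond the four numbers shown in the config is assumed to be an ESPnet default):

```python
# Relate the frontend_conf entries above to time units at fs = 16 kHz.
fs = 16_000                              # frontend_conf.fs: 16k
n_fft, win_length, hop_length = 512, 400, 160

print(win_length / fs * 1e3)   # 25.0 ms analysis window
print(hop_length / fs * 1e3)   # 10.0 ms frame shift -> ~100 feature frames per second
print(n_fft // 2 + 1)          # 257 linear-frequency bins before any mel projection
```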
</details>
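
The configuration above describes a MultiConv-Conformer encoder with a Transformer decoder trained in ESPnet2 (version 202304). As a minimal usage sketch, assuming the checkpoint is published on the Hub and that `espnet` plus `espnet_model_zoo` are installed, decoding a 16 kHz waveform would look roughly like this; the model tag below is a placeholder, not the real repository id:

```python
# Hedged sketch: decode one utterance with an ESPnet2 ASR checkpoint built from
# the config shown above. Replace the placeholder tag with the actual repo id.
import soundfile as sf
from espnet2.bin.asr_inference import Speech2Text

speech2text = Speech2Text.from_pretrained(
    "espnet/<this-asr-model>",  # placeholder model tag (assumption)
    ctc_weight=0.3,             # mirrors model_conf.ctc_weight in the config
    beam_size=10,
)

speech, rate = sf.read("utt.wav")   # expects 16 kHz mono audio, per frontend_conf.fs
nbests = speech2text(speech)
text, tokens, token_ids, hyp = nbests[0]
print(text)
```

The hypotheses come back best-first; the returned tokens are pieces from the 5000-unit unigram model listed under `bpemodel`, i.e. the token list shown above.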
### Citing ESPnet
```bibtex
@inproceedings{watanabe2018espnet,
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
title={{ESPnet}: End-to-End Speech Processing Toolkit},
year={2018},
booktitle={Proceedings of Interspeech},
pages={2207--2211},
doi={10.21437/Interspeech.2018-1456},
url={http://dx.doi.org/10.21437/Interspeech.2018-1456}
}
```
or arXiv:
```bibtex
@misc{watanabe2018espnet,
title={ESPnet: End-to-End Speech Processing Toolkit},
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
year={2018},
eprint={1804.00015},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
| [
"BEAR",
"CRAFT"
] |
Intel/neural-embedding-v1 | Intel | null | [
"mteb",
"model-index",
"region:us"
] | "2024-06-26T03:18:29Z" | 2024-06-26T08:05:33+00:00 | 0 | 13 | ---
tags:
- mteb
model-index:
- name: neural-embedding-v1
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: mteb/amazon_counterfactual
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 93.10447761194031
- type: ap
value: 72.52673607512206
- type: f1
value: 89.6752355529259
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: mteb/amazon_polarity
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 97.536525
- type: ap
value: 96.46802431780014
- type: f1
value: 97.53623627430422
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: mteb/amazon_reviews_multi
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 61.17399999999999
- type: f1
value: 60.40485236445537
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: mteb/arguana
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 44.452000000000005
- type: map_at_10
value: 59.563
- type: map_at_100
value: 60.014
- type: map_at_1000
value: 60.016000000000005
- type: map_at_20
value: 59.923
- type: map_at_3
value: 55.915000000000006
- type: map_at_5
value: 58.056
- type: mrr_at_1
value: 45.804
- type: mrr_at_10
value: 60.089999999999996
- type: mrr_at_100
value: 60.541
- type: mrr_at_1000
value: 60.543
- type: mrr_at_20
value: 60.45
- type: mrr_at_3
value: 56.294
- type: mrr_at_5
value: 58.54899999999999
- type: ndcg_at_1
value: 44.452000000000005
- type: ndcg_at_10
value: 67.208
- type: ndcg_at_100
value: 69.074
- type: ndcg_at_1000
value: 69.122
- type: ndcg_at_20
value: 68.474
- type: ndcg_at_3
value: 59.758
- type: ndcg_at_5
value: 63.621
- type: precision_at_1
value: 44.452000000000005
- type: precision_at_10
value: 9.125
- type: precision_at_100
value: 0.993
- type: precision_at_1000
value: 0.1
- type: precision_at_20
value: 4.808
- type: precision_at_3
value: 23.637
- type: precision_at_5
value: 16.06
- type: recall_at_1
value: 44.452000000000005
- type: recall_at_10
value: 91.252
- type: recall_at_100
value: 99.289
- type: recall_at_1000
value: 99.644
- type: recall_at_20
value: 96.15899999999999
- type: recall_at_3
value: 70.91
- type: recall_at_5
value: 80.29899999999999
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: mteb/arxiv-clustering-p2p
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 53.445166004781356
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: mteb/arxiv-clustering-s2s
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 48.82101672589653
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: mteb/askubuntudupquestions-reranking
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 68.44973588765177
- type: mrr
value: 80.355274150288
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: mteb/biosses-sts
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 90.06690068909613
- type: cos_sim_spearman
value: 87.97730582741434
- type: euclidean_pearson
value: 86.04185393610108
- type: euclidean_spearman
value: 85.91340337831018
- type: manhattan_pearson
value: 86.05913485565931
- type: manhattan_spearman
value: 85.70195277713228
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: mteb/banking77
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 88.73376623376623
- type: f1
value: 88.67733851784945
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: mteb/biorxiv-clustering-p2p
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 48.822459956481474
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: mteb/biorxiv-clustering-s2s
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 45.068617486695764
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: mteb/cqadupstack-android
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 36.980000000000004
- type: map_at_10
value: 51.151
- type: map_at_100
value: 52.852
- type: map_at_1000
value: 52.957
- type: map_at_20
value: 52.196
- type: map_at_3
value: 46.537
- type: map_at_5
value: 49.025999999999996
- type: mrr_at_1
value: 45.494
- type: mrr_at_10
value: 56.765
- type: mrr_at_100
value: 57.483
- type: mrr_at_1000
value: 57.508
- type: mrr_at_20
value: 57.255
- type: mrr_at_3
value: 53.815000000000005
- type: mrr_at_5
value: 55.725
- type: ndcg_at_1
value: 45.494
- type: ndcg_at_10
value: 58.435
- type: ndcg_at_100
value: 63.318
- type: ndcg_at_1000
value: 64.498
- type: ndcg_at_20
value: 60.88
- type: ndcg_at_3
value: 52.307
- type: ndcg_at_5
value: 55.103
- type: precision_at_1
value: 45.494
- type: precision_at_10
value: 11.488
- type: precision_at_100
value: 1.735
- type: precision_at_1000
value: 0.215
- type: precision_at_20
value: 6.8309999999999995
- type: precision_at_3
value: 25.513
- type: precision_at_5
value: 18.282999999999998
- type: recall_at_1
value: 36.980000000000004
- type: recall_at_10
value: 72.82300000000001
- type: recall_at_100
value: 91.525
- type: recall_at_1000
value: 98.44800000000001
- type: recall_at_20
value: 81.345
- type: recall_at_3
value: 55.044000000000004
- type: recall_at_5
value: 63.441
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: mteb/cqadupstack-english
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 37.489
- type: map_at_10
value: 50.708
- type: map_at_100
value: 52.101
- type: map_at_1000
value: 52.22
- type: map_at_20
value: 51.514
- type: map_at_3
value: 46.915
- type: map_at_5
value: 49.185
- type: mrr_at_1
value: 47.643
- type: mrr_at_10
value: 56.806
- type: mrr_at_100
value: 57.369
- type: mrr_at_1000
value: 57.399
- type: mrr_at_20
value: 57.141
- type: mrr_at_3
value: 54.437000000000005
- type: mrr_at_5
value: 55.955999999999996
- type: ndcg_at_1
value: 47.643
- type: ndcg_at_10
value: 56.989000000000004
- type: ndcg_at_100
value: 60.995999999999995
- type: ndcg_at_1000
value: 62.668
- type: ndcg_at_20
value: 58.63699999999999
- type: ndcg_at_3
value: 52.26499999999999
- type: ndcg_at_5
value: 54.684999999999995
- type: precision_at_1
value: 47.643
- type: precision_at_10
value: 10.879
- type: precision_at_100
value: 1.6320000000000001
- type: precision_at_1000
value: 0.211
- type: precision_at_20
value: 6.338000000000001
- type: precision_at_3
value: 25.52
- type: precision_at_5
value: 18.229
- type: recall_at_1
value: 37.489
- type: recall_at_10
value: 68.10300000000001
- type: recall_at_100
value: 84.497
- type: recall_at_1000
value: 94.402
- type: recall_at_20
value: 73.849
- type: recall_at_3
value: 53.925
- type: recall_at_5
value: 60.878
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: mteb/cqadupstack-gaming
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 45.044000000000004
- type: map_at_10
value: 59.804
- type: map_at_100
value: 60.841
- type: map_at_1000
value: 60.870999999999995
- type: map_at_20
value: 60.478
- type: map_at_3
value: 56.169000000000004
- type: map_at_5
value: 58.331999999999994
- type: mrr_at_1
value: 51.849999999999994
- type: mrr_at_10
value: 63.249
- type: mrr_at_100
value: 63.786
- type: mrr_at_1000
value: 63.797000000000004
- type: mrr_at_20
value: 63.592999999999996
- type: mrr_at_3
value: 60.721000000000004
- type: mrr_at_5
value: 62.251
- type: ndcg_at_1
value: 51.849999999999994
- type: ndcg_at_10
value: 66.122
- type: ndcg_at_100
value: 69.614
- type: ndcg_at_1000
value: 70.12
- type: ndcg_at_20
value: 67.805
- type: ndcg_at_3
value: 60.348
- type: ndcg_at_5
value: 63.33800000000001
- type: precision_at_1
value: 51.849999999999994
- type: precision_at_10
value: 10.539
- type: precision_at_100
value: 1.327
- type: precision_at_1000
value: 0.13999999999999999
- type: precision_at_20
value: 5.865
- type: precision_at_3
value: 27.084999999999997
- type: precision_at_5
value: 18.483
- type: recall_at_1
value: 45.044000000000004
- type: recall_at_10
value: 81.192
- type: recall_at_100
value: 95.597
- type: recall_at_1000
value: 98.97200000000001
- type: recall_at_20
value: 87.139
- type: recall_at_3
value: 65.713
- type: recall_at_5
value: 73.213
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: mteb/cqadupstack-gis
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 29.834
- type: map_at_10
value: 40.363
- type: map_at_100
value: 41.559000000000005
- type: map_at_1000
value: 41.626000000000005
- type: map_at_20
value: 41.160999999999994
- type: map_at_3
value: 36.958999999999996
- type: map_at_5
value: 38.897999999999996
- type: mrr_at_1
value: 32.429
- type: mrr_at_10
value: 42.604
- type: mrr_at_100
value: 43.54
- type: mrr_at_1000
value: 43.59
- type: mrr_at_20
value: 43.247
- type: mrr_at_3
value: 39.528999999999996
- type: mrr_at_5
value: 41.36
- type: ndcg_at_1
value: 32.429
- type: ndcg_at_10
value: 46.39
- type: ndcg_at_100
value: 51.561
- type: ndcg_at_1000
value: 53.071
- type: ndcg_at_20
value: 48.951
- type: ndcg_at_3
value: 39.796
- type: ndcg_at_5
value: 43.07
- type: precision_at_1
value: 32.429
- type: precision_at_10
value: 7.277
- type: precision_at_100
value: 1.038
- type: precision_at_1000
value: 0.11900000000000001
- type: precision_at_20
value: 4.249
- type: precision_at_3
value: 17.024
- type: precision_at_5
value: 12.113
- type: recall_at_1
value: 29.834
- type: recall_at_10
value: 62.808
- type: recall_at_100
value: 85.47200000000001
- type: recall_at_1000
value: 96.503
- type: recall_at_20
value: 72.246
- type: recall_at_3
value: 45.059
- type: recall_at_5
value: 52.907000000000004
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: mteb/cqadupstack-mathematica
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 22.121
- type: map_at_10
value: 33.217
- type: map_at_100
value: 34.671
- type: map_at_1000
value: 34.77
- type: map_at_20
value: 34.039
- type: map_at_3
value: 29.389
- type: map_at_5
value: 31.749
- type: mrr_at_1
value: 27.114
- type: mrr_at_10
value: 37.730999999999995
- type: mrr_at_100
value: 38.673
- type: mrr_at_1000
value: 38.725
- type: mrr_at_20
value: 38.279
- type: mrr_at_3
value: 34.494
- type: mrr_at_5
value: 36.609
- type: ndcg_at_1
value: 27.114
- type: ndcg_at_10
value: 39.723000000000006
- type: ndcg_at_100
value: 45.847
- type: ndcg_at_1000
value: 47.879
- type: ndcg_at_20
value: 42.129
- type: ndcg_at_3
value: 33.194
- type: ndcg_at_5
value: 36.763
- type: precision_at_1
value: 27.114
- type: precision_at_10
value: 7.575
- type: precision_at_100
value: 1.218
- type: precision_at_1000
value: 0.15
- type: precision_at_20
value: 4.527
- type: precision_at_3
value: 16.252
- type: precision_at_5
value: 12.363
- type: recall_at_1
value: 22.121
- type: recall_at_10
value: 54.726
- type: recall_at_100
value: 80.662
- type: recall_at_1000
value: 94.645
- type: recall_at_20
value: 62.977000000000004
- type: recall_at_3
value: 37.348
- type: recall_at_5
value: 46.163
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: mteb/cqadupstack-physics
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 36.346000000000004
- type: map_at_10
value: 50.034
- type: map_at_100
value: 51.37500000000001
- type: map_at_1000
value: 51.464
- type: map_at_20
value: 50.739999999999995
- type: map_at_3
value: 45.948
- type: map_at_5
value: 48.421
- type: mrr_at_1
value: 45.043
- type: mrr_at_10
value: 55.642
- type: mrr_at_100
value: 56.335
- type: mrr_at_1000
value: 56.355999999999995
- type: mrr_at_20
value: 56.027
- type: mrr_at_3
value: 53.224000000000004
- type: mrr_at_5
value: 54.798
- type: ndcg_at_1
value: 45.043
- type: ndcg_at_10
value: 56.627
- type: ndcg_at_100
value: 61.751
- type: ndcg_at_1000
value: 62.873999999999995
- type: ndcg_at_20
value: 58.521
- type: ndcg_at_3
value: 50.995999999999995
- type: ndcg_at_5
value: 54.049
- type: precision_at_1
value: 45.043
- type: precision_at_10
value: 10.51
- type: precision_at_100
value: 1.521
- type: precision_at_1000
value: 0.179
- type: precision_at_20
value: 5.958
- type: precision_at_3
value: 24.703
- type: precision_at_5
value: 17.671
- type: recall_at_1
value: 36.346000000000004
- type: recall_at_10
value: 69.95
- type: recall_at_100
value: 91.449
- type: recall_at_1000
value: 98.021
- type: recall_at_20
value: 76.522
- type: recall_at_3
value: 54.348
- type: recall_at_5
value: 62.271
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: mteb/cqadupstack-programmers
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 28.754
- type: map_at_10
value: 42.921
- type: map_at_100
value: 44.440000000000005
- type: map_at_1000
value: 44.516
- type: map_at_20
value: 43.815
- type: map_at_3
value: 38.592999999999996
- type: map_at_5
value: 41.138999999999996
- type: mrr_at_1
value: 36.416
- type: mrr_at_10
value: 48.284
- type: mrr_at_100
value: 49.086
- type: mrr_at_1000
value: 49.116
- type: mrr_at_20
value: 48.741
- type: mrr_at_3
value: 45.301
- type: mrr_at_5
value: 47.104
- type: ndcg_at_1
value: 36.416
- type: ndcg_at_10
value: 50.257
- type: ndcg_at_100
value: 55.931
- type: ndcg_at_1000
value: 57.188
- type: ndcg_at_20
value: 52.607000000000006
- type: ndcg_at_3
value: 43.787
- type: ndcg_at_5
value: 46.941
- type: precision_at_1
value: 36.416
- type: precision_at_10
value: 9.783
- type: precision_at_100
value: 1.465
- type: precision_at_1000
value: 0.173
- type: precision_at_20
value: 5.713
- type: precision_at_3
value: 21.804000000000002
- type: precision_at_5
value: 16.05
- type: recall_at_1
value: 28.754
- type: recall_at_10
value: 66.31099999999999
- type: recall_at_100
value: 90.034
- type: recall_at_1000
value: 98.058
- type: recall_at_20
value: 74.411
- type: recall_at_3
value: 48.332
- type: recall_at_5
value: 56.548
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 30.579583333333332
- type: map_at_10
value: 42.696333333333335
- type: map_at_100
value: 44.078583333333334
- type: map_at_1000
value: 44.176333333333325
- type: map_at_20
value: 43.499833333333335
- type: map_at_3
value: 38.953583333333334
- type: map_at_5
value: 41.125583333333324
- type: mrr_at_1
value: 36.52666666666667
- type: mrr_at_10
value: 46.925666666666665
- type: mrr_at_100
value: 47.74333333333334
- type: mrr_at_1000
value: 47.78266666666667
- type: mrr_at_20
value: 47.42483333333333
- type: mrr_at_3
value: 44.068083333333334
- type: mrr_at_5
value: 45.82383333333333
- type: ndcg_at_1
value: 36.52666666666667
- type: ndcg_at_10
value: 49.1145
- type: ndcg_at_100
value: 54.340583333333335
- type: ndcg_at_1000
value: 55.90625
- type: ndcg_at_20
value: 51.32724999999999
- type: ndcg_at_3
value: 43.146
- type: ndcg_at_5
value: 46.146166666666666
- type: precision_at_1
value: 36.52666666666667
- type: precision_at_10
value: 8.866916666666667
- type: precision_at_100
value: 1.3526666666666667
- type: precision_at_1000
value: 0.1670833333333333
- type: precision_at_20
value: 5.199416666666667
- type: precision_at_3
value: 20.278333333333332
- type: precision_at_5
value: 14.614999999999997
- type: recall_at_1
value: 30.579583333333332
- type: recall_at_10
value: 63.618416666666675
- type: recall_at_100
value: 85.86858333333332
- type: recall_at_1000
value: 96.24825
- type: recall_at_20
value: 71.52533333333334
- type: recall_at_3
value: 47.18050000000001
- type: recall_at_5
value: 54.90683333333334
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: mteb/cqadupstack-stats
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 26.180999999999997
- type: map_at_10
value: 36.4
- type: map_at_100
value: 37.464999999999996
- type: map_at_1000
value: 37.556
- type: map_at_20
value: 36.984
- type: map_at_3
value: 33.354
- type: map_at_5
value: 35.214
- type: mrr_at_1
value: 29.601
- type: mrr_at_10
value: 39.328
- type: mrr_at_100
value: 40.113
- type: mrr_at_1000
value: 40.176
- type: mrr_at_20
value: 39.751999999999995
- type: mrr_at_3
value: 36.58
- type: mrr_at_5
value: 38.313
- type: ndcg_at_1
value: 29.601
- type: ndcg_at_10
value: 42.037
- type: ndcg_at_100
value: 46.946
- type: ndcg_at_1000
value: 49.075
- type: ndcg_at_20
value: 43.827
- type: ndcg_at_3
value: 36.473
- type: ndcg_at_5
value: 39.482
- type: precision_at_1
value: 29.601
- type: precision_at_10
value: 7.009
- type: precision_at_100
value: 1.0290000000000001
- type: precision_at_1000
value: 0.129
- type: precision_at_20
value: 4.018
- type: precision_at_3
value: 16.36
- type: precision_at_5
value: 11.779
- type: recall_at_1
value: 26.180999999999997
- type: recall_at_10
value: 56.275
- type: recall_at_100
value: 78.61200000000001
- type: recall_at_1000
value: 93.887
- type: recall_at_20
value: 62.798
- type: recall_at_3
value: 41.157
- type: recall_at_5
value: 48.49
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: mteb/cqadupstack-tex
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 20.205000000000002
- type: map_at_10
value: 29.947000000000003
- type: map_at_100
value: 31.342
- type: map_at_1000
value: 31.458000000000002
- type: map_at_20
value: 30.741000000000003
- type: map_at_3
value: 26.568
- type: map_at_5
value: 28.372999999999998
- type: mrr_at_1
value: 24.742
- type: mrr_at_10
value: 33.941
- type: mrr_at_100
value: 34.92
- type: mrr_at_1000
value: 34.981
- type: mrr_at_20
value: 34.509
- type: mrr_at_3
value: 31.097
- type: mrr_at_5
value: 32.631
- type: ndcg_at_1
value: 24.742
- type: ndcg_at_10
value: 35.884
- type: ndcg_at_100
value: 41.839999999999996
- type: ndcg_at_1000
value: 44.162
- type: ndcg_at_20
value: 38.273
- type: ndcg_at_3
value: 30.073
- type: ndcg_at_5
value: 32.617000000000004
- type: precision_at_1
value: 24.742
- type: precision_at_10
value: 6.958
- type: precision_at_100
value: 1.155
- type: precision_at_1000
value: 0.154
- type: precision_at_20
value: 4.202
- type: precision_at_3
value: 14.568
- type: precision_at_5
value: 10.757
- type: recall_at_1
value: 20.205000000000002
- type: recall_at_10
value: 49.603
- type: recall_at_100
value: 75.77000000000001
- type: recall_at_1000
value: 91.767
- type: recall_at_20
value: 58.309
- type: recall_at_3
value: 33.353
- type: recall_at_5
value: 39.947
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: mteb/cqadupstack-unix
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 31.543
- type: map_at_10
value: 43.895
- type: map_at_100
value: 45.233000000000004
- type: map_at_1000
value: 45.314
- type: map_at_20
value: 44.707
- type: map_at_3
value: 40.165
- type: map_at_5
value: 42.353
- type: mrr_at_1
value: 37.5
- type: mrr_at_10
value: 47.814
- type: mrr_at_100
value: 48.701
- type: mrr_at_1000
value: 48.74
- type: mrr_at_20
value: 48.378
- type: mrr_at_3
value: 45.04
- type: mrr_at_5
value: 46.729
- type: ndcg_at_1
value: 37.5
- type: ndcg_at_10
value: 50.312999999999995
- type: ndcg_at_100
value: 55.696999999999996
- type: ndcg_at_1000
value: 57.135000000000005
- type: ndcg_at_20
value: 52.734
- type: ndcg_at_3
value: 44.263000000000005
- type: ndcg_at_5
value: 47.268
- type: precision_at_1
value: 37.5
- type: precision_at_10
value: 8.871
- type: precision_at_100
value: 1.278
- type: precision_at_1000
value: 0.149
- type: precision_at_20
value: 5.117
- type: precision_at_3
value: 20.709
- type: precision_at_5
value: 14.832
- type: recall_at_1
value: 31.543
- type: recall_at_10
value: 65.694
- type: recall_at_100
value: 88.105
- type: recall_at_1000
value: 97.38
- type: recall_at_20
value: 74.307
- type: recall_at_3
value: 49.254999999999995
- type: recall_at_5
value: 56.85
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: mteb/cqadupstack-webmasters
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 29.325000000000003
- type: map_at_10
value: 40.653
- type: map_at_100
value: 42.568
- type: map_at_1000
value: 42.782
- type: map_at_20
value: 41.638999999999996
- type: map_at_3
value: 36.726
- type: map_at_5
value: 38.911
- type: mrr_at_1
value: 34.98
- type: mrr_at_10
value: 45.281
- type: mrr_at_100
value: 46.255
- type: mrr_at_1000
value: 46.29
- type: mrr_at_20
value: 45.94
- type: mrr_at_3
value: 41.831
- type: mrr_at_5
value: 44.045
- type: ndcg_at_1
value: 34.98
- type: ndcg_at_10
value: 47.629
- type: ndcg_at_100
value: 53.912000000000006
- type: ndcg_at_1000
value: 55.48
- type: ndcg_at_20
value: 50.281
- type: ndcg_at_3
value: 41.211999999999996
- type: ndcg_at_5
value: 44.529
- type: precision_at_1
value: 34.98
- type: precision_at_10
value: 9.229
- type: precision_at_100
value: 1.854
- type: precision_at_1000
value: 0.258
- type: precision_at_20
value: 5.8500000000000005
- type: precision_at_3
value: 19.631
- type: precision_at_5
value: 14.506
- type: recall_at_1
value: 29.325000000000003
- type: recall_at_10
value: 61.894000000000005
- type: recall_at_100
value: 88.684
- type: recall_at_1000
value: 97.83800000000001
- type: recall_at_20
value: 71.758
- type: recall_at_3
value: 44.265
- type: recall_at_5
value: 53.051
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: mteb/cqadupstack-wordpress
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 23.133
- type: map_at_10
value: 33.263
- type: map_at_100
value: 34.496
- type: map_at_1000
value: 34.582
- type: map_at_20
value: 33.983999999999995
- type: map_at_3
value: 30.12
- type: map_at_5
value: 31.906000000000002
- type: mrr_at_1
value: 25.507999999999996
- type: mrr_at_10
value: 35.663
- type: mrr_at_100
value: 36.659000000000006
- type: mrr_at_1000
value: 36.714
- type: mrr_at_20
value: 36.236000000000004
- type: mrr_at_3
value: 32.748
- type: mrr_at_5
value: 34.365
- type: ndcg_at_1
value: 25.507999999999996
- type: ndcg_at_10
value: 38.968
- type: ndcg_at_100
value: 44.674
- type: ndcg_at_1000
value: 46.725
- type: ndcg_at_20
value: 41.282000000000004
- type: ndcg_at_3
value: 33.038000000000004
- type: ndcg_at_5
value: 35.909
- type: precision_at_1
value: 25.507999999999996
- type: precision_at_10
value: 6.285
- type: precision_at_100
value: 0.98
- type: precision_at_1000
value: 0.128
- type: precision_at_20
value: 3.7249999999999996
- type: precision_at_3
value: 14.171
- type: precision_at_5
value: 10.314
- type: recall_at_1
value: 23.133
- type: recall_at_10
value: 54.042
- type: recall_at_100
value: 80.01599999999999
- type: recall_at_1000
value: 95.05799999999999
- type: recall_at_20
value: 62.643
- type: recall_at_3
value: 38.367000000000004
- type: recall_at_5
value: 45.123000000000005
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: mteb/climate-fever
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 13.923
- type: map_at_10
value: 23.415
- type: map_at_100
value: 25.389
- type: map_at_1000
value: 25.539
- type: map_at_20
value: 24.462
- type: map_at_3
value: 19.719
- type: map_at_5
value: 21.75
- type: mrr_at_1
value: 31.205
- type: mrr_at_10
value: 43.196
- type: mrr_at_100
value: 44.039
- type: mrr_at_1000
value: 44.071
- type: mrr_at_20
value: 43.744
- type: mrr_at_3
value: 40.033
- type: mrr_at_5
value: 41.967
- type: ndcg_at_1
value: 31.205
- type: ndcg_at_10
value: 32.304
- type: ndcg_at_100
value: 39.717
- type: ndcg_at_1000
value: 42.559999999999995
- type: ndcg_at_20
value: 35.166
- type: ndcg_at_3
value: 26.955000000000002
- type: ndcg_at_5
value: 28.967
- type: precision_at_1
value: 31.205
- type: precision_at_10
value: 9.948
- type: precision_at_100
value: 1.7870000000000001
- type: precision_at_1000
value: 0.233
- type: precision_at_20
value: 6.205
- type: precision_at_3
value: 20.108999999999998
- type: precision_at_5
value: 15.453
- type: recall_at_1
value: 13.923
- type: recall_at_10
value: 37.885000000000005
- type: recall_at_100
value: 63.352
- type: recall_at_1000
value: 79.372
- type: recall_at_20
value: 45.954
- type: recall_at_3
value: 24.511
- type: recall_at_5
value: 30.451
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: mteb/dbpedia
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 9.584
- type: map_at_10
value: 23.432
- type: map_at_100
value: 32.513
- type: map_at_1000
value: 34.27
- type: map_at_20
value: 27.18
- type: map_at_3
value: 16.145
- type: map_at_5
value: 19.405
- type: mrr_at_1
value: 74.5
- type: mrr_at_10
value: 81.233
- type: mrr_at_100
value: 81.463
- type: mrr_at_1000
value: 81.46900000000001
- type: mrr_at_20
value: 81.394
- type: mrr_at_3
value: 79.958
- type: mrr_at_5
value: 80.808
- type: ndcg_at_1
value: 62.125
- type: ndcg_at_10
value: 48.047000000000004
- type: ndcg_at_100
value: 52.251999999999995
- type: ndcg_at_1000
value: 59.353
- type: ndcg_at_20
value: 47.264
- type: ndcg_at_3
value: 52.891999999999996
- type: ndcg_at_5
value: 50.766999999999996
- type: precision_at_1
value: 74.5
- type: precision_at_10
value: 38.15
- type: precision_at_100
value: 11.51
- type: precision_at_1000
value: 2.183
- type: precision_at_20
value: 28.749999999999996
- type: precision_at_3
value: 56.25
- type: precision_at_5
value: 49.1
- type: recall_at_1
value: 9.584
- type: recall_at_10
value: 29.215999999999998
- type: recall_at_100
value: 57.914
- type: recall_at_1000
value: 80.67699999999999
- type: recall_at_20
value: 37.358000000000004
- type: recall_at_3
value: 17.422
- type: recall_at_5
value: 22.345000000000002
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: mteb/emotion
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 91.36000000000001
- type: f1
value: 87.72724279223316
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: mteb/fever
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 78.81700000000001
- type: map_at_10
value: 86.392
- type: map_at_100
value: 86.6
- type: map_at_1000
value: 86.611
- type: map_at_20
value: 86.521
- type: map_at_3
value: 85.31
- type: map_at_5
value: 86.047
- type: mrr_at_1
value: 84.878
- type: mrr_at_10
value: 90.359
- type: mrr_at_100
value: 90.426
- type: mrr_at_1000
value: 90.427
- type: mrr_at_20
value: 90.405
- type: mrr_at_3
value: 89.761
- type: mrr_at_5
value: 90.191
- type: ndcg_at_1
value: 84.878
- type: ndcg_at_10
value: 89.459
- type: ndcg_at_100
value: 90.171
- type: ndcg_at_1000
value: 90.349
- type: ndcg_at_20
value: 89.788
- type: ndcg_at_3
value: 87.908
- type: ndcg_at_5
value: 88.844
- type: precision_at_1
value: 84.878
- type: precision_at_10
value: 10.639
- type: precision_at_100
value: 1.123
- type: precision_at_1000
value: 0.11499999999999999
- type: precision_at_20
value: 5.427
- type: precision_at_3
value: 33.333
- type: precision_at_5
value: 20.696
- type: recall_at_1
value: 78.81700000000001
- type: recall_at_10
value: 94.959
- type: recall_at_100
value: 97.72800000000001
- type: recall_at_1000
value: 98.791
- type: recall_at_20
value: 96.036
- type: recall_at_3
value: 90.727
- type: recall_at_5
value: 93.12899999999999
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: mteb/fiqa
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 29.596
- type: map_at_10
value: 50.833
- type: map_at_100
value: 53.034000000000006
- type: map_at_1000
value: 53.135
- type: map_at_20
value: 52.195
- type: map_at_3
value: 44.247
- type: map_at_5
value: 48.107
- type: mrr_at_1
value: 57.87
- type: mrr_at_10
value: 65.566
- type: mrr_at_100
value: 66.15299999999999
- type: mrr_at_1000
value: 66.168
- type: mrr_at_20
value: 65.923
- type: mrr_at_3
value: 63.55499999999999
- type: mrr_at_5
value: 64.727
- type: ndcg_at_1
value: 57.87
- type: ndcg_at_10
value: 58.943999999999996
- type: ndcg_at_100
value: 65.283
- type: ndcg_at_1000
value: 66.706
- type: ndcg_at_20
value: 61.778999999999996
- type: ndcg_at_3
value: 54.554
- type: ndcg_at_5
value: 56.159000000000006
- type: precision_at_1
value: 57.87
- type: precision_at_10
value: 16.435
- type: precision_at_100
value: 2.307
- type: precision_at_1000
value: 0.256
- type: precision_at_20
value: 9.522
- type: precision_at_3
value: 36.986000000000004
- type: precision_at_5
value: 27.16
- type: recall_at_1
value: 29.596
- type: recall_at_10
value: 66.705
- type: recall_at_100
value: 89.45
- type: recall_at_1000
value: 97.758
- type: recall_at_20
value: 75.13300000000001
- type: recall_at_3
value: 49.689
- type: recall_at_5
value: 57.701
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: mteb/hotpotqa
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 42.532
- type: map_at_10
value: 71.931
- type: map_at_100
value: 72.623
- type: map_at_1000
value: 72.662
- type: map_at_20
value: 72.355
- type: map_at_3
value: 68.72200000000001
- type: map_at_5
value: 70.813
- type: mrr_at_1
value: 85.064
- type: mrr_at_10
value: 89.69500000000001
- type: mrr_at_100
value: 89.792
- type: mrr_at_1000
value: 89.795
- type: mrr_at_20
value: 89.759
- type: mrr_at_3
value: 89.129
- type: mrr_at_5
value: 89.5
- type: ndcg_at_1
value: 85.064
- type: ndcg_at_10
value: 78.86999999999999
- type: ndcg_at_100
value: 81.134
- type: ndcg_at_1000
value: 81.862
- type: ndcg_at_20
value: 79.888
- type: ndcg_at_3
value: 74.579
- type: ndcg_at_5
value: 77.086
- type: precision_at_1
value: 85.064
- type: precision_at_10
value: 16.433
- type: precision_at_100
value: 1.818
- type: precision_at_1000
value: 0.191
- type: precision_at_20
value: 8.545
- type: precision_at_3
value: 48.508
- type: precision_at_5
value: 31.084
- type: recall_at_1
value: 42.532
- type: recall_at_10
value: 82.167
- type: recall_at_100
value: 90.905
- type: recall_at_1000
value: 95.699
- type: recall_at_20
value: 85.449
- type: recall_at_3
value: 72.762
- type: recall_at_5
value: 77.711
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: mteb/imdb
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 96.91919999999999
- type: ap
value: 95.88443935380744
- type: f1
value: 96.91873838978964
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: mteb/msmarco
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 21.747
- type: map_at_10
value: 34.764
- type: map_at_100
value: 35.981
- type: map_at_1000
value: 36.027
- type: map_at_20
value: 35.557
- type: map_at_3
value: 30.770999999999997
- type: map_at_5
value: 33.07
- type: mrr_at_1
value: 22.421
- type: mrr_at_10
value: 35.417
- type: mrr_at_100
value: 36.57
- type: mrr_at_1000
value: 36.61
- type: mrr_at_20
value: 36.174
- type: mrr_at_3
value: 31.516
- type: mrr_at_5
value: 33.783
- type: ndcg_at_1
value: 22.421
- type: ndcg_at_10
value: 42.003
- type: ndcg_at_100
value: 47.674
- type: ndcg_at_1000
value: 48.783
- type: ndcg_at_20
value: 44.789
- type: ndcg_at_3
value: 33.918
- type: ndcg_at_5
value: 38.011
- type: precision_at_1
value: 22.421
- type: precision_at_10
value: 6.712
- type: precision_at_100
value: 0.9520000000000001
- type: precision_at_1000
value: 0.105
- type: precision_at_20
value: 3.9309999999999996
- type: precision_at_3
value: 14.632000000000001
- type: precision_at_5
value: 10.845
- type: recall_at_1
value: 21.747
- type: recall_at_10
value: 64.2
- type: recall_at_100
value: 90.04100000000001
- type: recall_at_1000
value: 98.41499999999999
- type: recall_at_20
value: 74.982
- type: recall_at_3
value: 42.303000000000004
- type: recall_at_5
value: 52.11
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: mteb/mtop_domain
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 99.02872777017784
- type: f1
value: 98.8785703018425
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: mteb/mtop_intent
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 90.93935248518011
- type: f1
value: 75.46510480635821
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: mteb/amazon_massive_intent
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 82.49831876260927
- type: f1
value: 79.43439001730579
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: mteb/amazon_massive_scenario
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 84.50235373234702
- type: f1
value: 84.03906668934695
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: mteb/medrxiv-clustering-p2p
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 42.634572576984716
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: mteb/medrxiv-clustering-s2s
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 40.96861872930255
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: mteb/mind_small
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 31.986207669202933
- type: mrr
value: 33.11375583060012
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: mteb/nfcorpus
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 6.783
- type: map_at_10
value: 16.276
- type: map_at_100
value: 21.324
- type: map_at_1000
value: 23.166
- type: map_at_20
value: 18.383
- type: map_at_3
value: 11.296000000000001
- type: map_at_5
value: 13.504
- type: mrr_at_1
value: 53.559999999999995
- type: mrr_at_10
value: 61.589000000000006
- type: mrr_at_100
value: 62.11600000000001
- type: mrr_at_1000
value: 62.158
- type: mrr_at_20
value: 61.976
- type: mrr_at_3
value: 59.855999999999995
- type: mrr_at_5
value: 60.877
- type: ndcg_at_1
value: 50.15500000000001
- type: ndcg_at_10
value: 42.598
- type: ndcg_at_100
value: 39.15
- type: ndcg_at_1000
value: 47.888999999999996
- type: ndcg_at_20
value: 39.956
- type: ndcg_at_3
value: 46.836
- type: ndcg_at_5
value: 45.001000000000005
- type: precision_at_1
value: 52.322
- type: precision_at_10
value: 32.601
- type: precision_at_100
value: 10.145999999999999
- type: precision_at_1000
value: 2.358
- type: precision_at_20
value: 24.025
- type: precision_at_3
value: 44.169000000000004
- type: precision_at_5
value: 39.628
- type: recall_at_1
value: 6.783
- type: recall_at_10
value: 21.175
- type: recall_at_100
value: 40.097
- type: recall_at_1000
value: 71.65
- type: recall_at_20
value: 26.465
- type: recall_at_3
value: 12.589
- type: recall_at_5
value: 15.867999999999999
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: mteb/nq
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 43.376
- type: map_at_10
value: 60.968
- type: map_at_100
value: 61.614999999999995
- type: map_at_1000
value: 61.626000000000005
- type: map_at_20
value: 61.441
- type: map_at_3
value: 56.858
- type: map_at_5
value: 59.476
- type: mrr_at_1
value: 48.841
- type: mrr_at_10
value: 63.366
- type: mrr_at_100
value: 63.79
- type: mrr_at_1000
value: 63.797000000000004
- type: mrr_at_20
value: 63.682
- type: mrr_at_3
value: 60.535000000000004
- type: mrr_at_5
value: 62.348000000000006
- type: ndcg_at_1
value: 48.841
- type: ndcg_at_10
value: 68.362
- type: ndcg_at_100
value: 70.799
- type: ndcg_at_1000
value: 71.004
- type: ndcg_at_20
value: 69.804
- type: ndcg_at_3
value: 61.251
- type: ndcg_at_5
value: 65.28500000000001
- type: precision_at_1
value: 48.841
- type: precision_at_10
value: 10.588000000000001
- type: precision_at_100
value: 1.194
- type: precision_at_1000
value: 0.121
- type: precision_at_20
value: 5.646
- type: precision_at_3
value: 27.298000000000002
- type: precision_at_5
value: 18.841
- type: recall_at_1
value: 43.376
- type: recall_at_10
value: 88.053
- type: recall_at_100
value: 98.194
- type: recall_at_1000
value: 99.67200000000001
- type: recall_at_20
value: 93.318
- type: recall_at_3
value: 70.281
- type: recall_at_5
value: 79.28
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: mteb/quora
config: default
split: test
revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
metrics:
- type: map_at_1
value: 71.477
- type: map_at_10
value: 85.548
- type: map_at_100
value: 86.187
- type: map_at_1000
value: 86.199
- type: map_at_20
value: 85.971
- type: map_at_3
value: 82.50999999999999
- type: map_at_5
value: 84.447
- type: mrr_at_1
value: 82.35
- type: mrr_at_10
value: 88.039
- type: mrr_at_100
value: 88.14699999999999
- type: mrr_at_1000
value: 88.14699999999999
- type: mrr_at_20
value: 88.12100000000001
- type: mrr_at_3
value: 87.048
- type: mrr_at_5
value: 87.73100000000001
- type: ndcg_at_1
value: 82.35
- type: ndcg_at_10
value: 89.024
- type: ndcg_at_100
value: 90.18599999999999
- type: ndcg_at_1000
value: 90.245
- type: ndcg_at_20
value: 89.67399999999999
- type: ndcg_at_3
value: 86.167
- type: ndcg_at_5
value: 87.779
- type: precision_at_1
value: 82.35
- type: precision_at_10
value: 13.565
- type: precision_at_100
value: 1.544
- type: precision_at_1000
value: 0.157
- type: precision_at_20
value: 7.2010000000000005
- type: precision_at_3
value: 37.773
- type: precision_at_5
value: 24.924
- type: recall_at_1
value: 71.477
- type: recall_at_10
value: 95.821
- type: recall_at_100
value: 99.737
- type: recall_at_1000
value: 99.98599999999999
- type: recall_at_20
value: 97.90100000000001
- type: recall_at_3
value: 87.61
- type: recall_at_5
value: 92.135
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: mteb/reddit-clustering
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 66.43811157811552
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: mteb/reddit-clustering-p2p
config: default
split: test
revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
metrics:
- type: v_measure
value: 69.56403346330322
- task:
type: Retrieval
dataset:
name: MTEB SCIDOCS
type: mteb/scidocs
config: default
split: test
revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
metrics:
- type: map_at_1
value: 6.253
- type: map_at_10
value: 17.379
- type: map_at_100
value: 20.51
- type: map_at_1000
value: 20.881
- type: map_at_20
value: 18.983
- type: map_at_3
value: 12.061
- type: map_at_5
value: 14.546000000000001
- type: mrr_at_1
value: 30.8
- type: mrr_at_10
value: 43.814
- type: mrr_at_100
value: 44.883
- type: mrr_at_1000
value: 44.906
- type: mrr_at_20
value: 44.555
- type: mrr_at_3
value: 40.416999999999994
- type: mrr_at_5
value: 42.482
- type: ndcg_at_1
value: 30.8
- type: ndcg_at_10
value: 27.694999999999997
- type: ndcg_at_100
value: 38.248
- type: ndcg_at_1000
value: 43.547000000000004
- type: ndcg_at_20
value: 31.573
- type: ndcg_at_3
value: 26.239
- type: ndcg_at_5
value: 22.817999999999998
- type: precision_at_1
value: 30.8
- type: precision_at_10
value: 14.540000000000001
- type: precision_at_100
value: 2.9690000000000003
- type: precision_at_1000
value: 0.422
- type: precision_at_20
value: 9.5
- type: precision_at_3
value: 24.967
- type: precision_at_5
value: 20.22
- type: recall_at_1
value: 6.253
- type: recall_at_10
value: 29.465000000000003
- type: recall_at_100
value: 60.28
- type: recall_at_1000
value: 85.712
- type: recall_at_20
value: 38.578
- type: recall_at_3
value: 15.201999999999998
- type: recall_at_5
value: 20.507
- task:
type: STS
dataset:
name: MTEB SICK-R
type: mteb/sickr-sts
config: default
split: test
revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
metrics:
- type: cos_sim_pearson
value: 86.88263045065128
- type: cos_sim_spearman
value: 83.2199052396249
- type: euclidean_pearson
value: 83.89316748784084
- type: euclidean_spearman
value: 82.80089923470608
- type: manhattan_pearson
value: 83.79340504513027
- type: manhattan_spearman
value: 82.57647453394455
- task:
type: STS
dataset:
name: MTEB STS12
type: mteb/sts12-sts
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 87.23417612553622
- type: cos_sim_spearman
value: 79.40077017685032
- type: euclidean_pearson
value: 82.98069591415172
- type: euclidean_spearman
value: 77.72626690650102
- type: manhattan_pearson
value: 83.2549008896714
- type: manhattan_spearman
value: 77.97517379409553
- task:
type: STS
dataset:
name: MTEB STS13
type: mteb/sts13-sts
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 89.94319057478221
- type: cos_sim_spearman
value: 89.57673217959568
- type: euclidean_pearson
value: 88.52164819479393
- type: euclidean_spearman
value: 89.28792930444656
- type: manhattan_pearson
value: 88.63748131889201
- type: manhattan_spearman
value: 89.5337354128652
- task:
type: STS
dataset:
name: MTEB STS14
type: mteb/sts14-sts
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 87.0002285020644
- type: cos_sim_spearman
value: 84.85558709405255
- type: euclidean_pearson
value: 85.76743275817024
- type: euclidean_spearman
value: 84.7900299161083
- type: manhattan_pearson
value: 85.81372778099167
- type: manhattan_spearman
value: 84.88975144080597
- task:
type: STS
dataset:
name: MTEB STS15
type: mteb/sts15-sts
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 89.90992300865088
- type: cos_sim_spearman
value: 89.89952259773258
- type: euclidean_pearson
value: 88.95472170794739
- type: euclidean_spearman
value: 89.79840257558794
- type: manhattan_pearson
value: 89.00903847816028
- type: manhattan_spearman
value: 89.99292271664685
- task:
type: STS
dataset:
name: MTEB STS16
type: mteb/sts16-sts
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 85.75044299994977
- type: cos_sim_spearman
value: 86.31137676221347
- type: euclidean_pearson
value: 85.03198959400133
- type: euclidean_spearman
value: 85.62611072515675
- type: manhattan_pearson
value: 85.11681545306745
- type: manhattan_spearman
value: 85.75766564037835
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: mteb/sts17-crosslingual-sts
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 91.85595137809975
- type: cos_sim_spearman
value: 91.19454529401669
- type: euclidean_pearson
value: 90.88727698604517
- type: euclidean_spearman
value: 90.93184869101279
- type: manhattan_pearson
value: 90.79591587599141
- type: manhattan_spearman
value: 90.75783237234161
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: mteb/sts22-crosslingual-sts
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 71.60579944207497
- type: cos_sim_spearman
value: 70.08286575049202
- type: euclidean_pearson
value: 71.83195353568124
- type: euclidean_spearman
value: 70.3030975376705
- type: manhattan_pearson
value: 71.80222200714064
- type: manhattan_spearman
value: 70.04005646739672
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: mteb/stsbenchmark-sts
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 89.65716781275425
- type: cos_sim_spearman
value: 89.90701888074334
- type: euclidean_pearson
value: 88.50498754631819
- type: euclidean_spearman
value: 88.88763469318933
- type: manhattan_pearson
value: 88.58398429591064
- type: manhattan_spearman
value: 89.0138386837653
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: mteb/scidocs-reranking
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 89.26199160020026
- type: mrr
value: 96.86981772766087
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: mteb/scifact
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 62.827999999999996
- type: map_at_10
value: 74.028
- type: map_at_100
value: 74.264
- type: map_at_1000
value: 74.274
- type: map_at_20
value: 74.18599999999999
- type: map_at_3
value: 70.787
- type: map_at_5
value: 72.87
- type: mrr_at_1
value: 66.333
- type: mrr_at_10
value: 74.894
- type: mrr_at_100
value: 75.09599999999999
- type: mrr_at_1000
value: 75.105
- type: mrr_at_20
value: 75.024
- type: mrr_at_3
value: 72.833
- type: mrr_at_5
value: 73.917
- type: ndcg_at_1
value: 66.333
- type: ndcg_at_10
value: 78.82000000000001
- type: ndcg_at_100
value: 79.95
- type: ndcg_at_1000
value: 80.207
- type: ndcg_at_20
value: 79.324
- type: ndcg_at_3
value: 73.87899999999999
- type: ndcg_at_5
value: 76.399
- type: precision_at_1
value: 66.333
- type: precision_at_10
value: 10.5
- type: precision_at_100
value: 1.11
- type: precision_at_1000
value: 0.11299999999999999
- type: precision_at_20
value: 5.367
- type: precision_at_3
value: 29.110999999999997
- type: precision_at_5
value: 19.333
- type: recall_at_1
value: 62.827999999999996
- type: recall_at_10
value: 92.667
- type: recall_at_100
value: 98.0
- type: recall_at_1000
value: 100.0
- type: recall_at_20
value: 94.5
- type: recall_at_3
value: 79.5
- type: recall_at_5
value: 85.739
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: mteb/sprintduplicatequestions-pairclassification
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.85940594059406
- type: cos_sim_ap
value: 96.72065839344104
- type: cos_sim_f1
value: 92.85714285714286
- type: cos_sim_precision
value: 93.42105263157895
- type: cos_sim_recall
value: 92.30000000000001
- type: dot_accuracy
value: 99.84752475247525
- type: dot_ap
value: 96.41536649209695
- type: dot_f1
value: 92.24572004028197
- type: dot_precision
value: 92.90060851926978
- type: dot_recall
value: 91.60000000000001
- type: euclidean_accuracy
value: 99.86039603960396
- type: euclidean_ap
value: 96.63078081708719
- type: euclidean_f1
value: 92.87518948964124
- type: euclidean_precision
value: 93.87129724208376
- type: euclidean_recall
value: 91.9
- type: manhattan_accuracy
value: 99.86435643564356
- type: manhattan_ap
value: 96.71272943532432
- type: manhattan_f1
value: 93.05625950329447
- type: manhattan_precision
value: 94.34737923946557
- type: manhattan_recall
value: 91.8
- type: max_accuracy
value: 99.86435643564356
- type: max_ap
value: 96.72065839344104
- type: max_f1
value: 93.05625950329447
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: mteb/stackexchange-clustering
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 75.95483275621876
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: mteb/stackexchange-clustering-p2p
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 46.20364113200157
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: mteb/stackoverflowdupquestions-reranking
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 56.2577092438525
- type: mrr
value: 57.40251782531194
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: mteb/summeval
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.492357459875645
- type: cos_sim_spearman
value: 30.868968719156825
- type: dot_pearson
value: 29.44619482351129
- type: dot_spearman
value: 31.295984532577215
- task:
type: Retrieval
dataset:
name: MTEB TRECCOVID
type: mteb/trec-covid
config: default
split: test
revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
metrics:
- type: map_at_1
value: 0.22200000000000003
- type: map_at_10
value: 1.9290000000000003
- type: map_at_100
value: 12.435
- type: map_at_1000
value: 32.352
- type: map_at_20
value: 3.496
- type: map_at_3
value: 0.637
- type: map_at_5
value: 1.016
- type: mrr_at_1
value: 86.0
- type: mrr_at_10
value: 92.333
- type: mrr_at_100
value: 92.333
- type: mrr_at_1000
value: 92.333
- type: mrr_at_20
value: 92.333
- type: mrr_at_3
value: 92.0
- type: mrr_at_5
value: 92.0
- type: ndcg_at_1
value: 81.0
- type: ndcg_at_10
value: 75.32900000000001
- type: ndcg_at_100
value: 62.756
- type: ndcg_at_1000
value: 59.232
- type: ndcg_at_20
value: 73.393
- type: ndcg_at_3
value: 78.469
- type: ndcg_at_5
value: 76.953
- type: precision_at_1
value: 86.0
- type: precision_at_10
value: 79.4
- type: precision_at_100
value: 64.94
- type: precision_at_1000
value: 26.332
- type: precision_at_20
value: 77.3
- type: precision_at_3
value: 82.667
- type: precision_at_5
value: 80.4
- type: recall_at_1
value: 0.22200000000000003
- type: recall_at_10
value: 2.113
- type: recall_at_100
value: 16.02
- type: recall_at_1000
value: 57.227
- type: recall_at_20
value: 4.036
- type: recall_at_3
value: 0.6689999999999999
- type: recall_at_5
value: 1.076
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: mteb/touche2020
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 2.83
- type: map_at_10
value: 8.981
- type: map_at_100
value: 14.796000000000001
- type: map_at_1000
value: 16.451999999999998
- type: map_at_20
value: 11.361
- type: map_at_3
value: 5.143
- type: map_at_5
value: 6.537
- type: mrr_at_1
value: 36.735
- type: mrr_at_10
value: 50.99399999999999
- type: mrr_at_100
value: 51.775000000000006
- type: mrr_at_1000
value: 51.775000000000006
- type: mrr_at_20
value: 51.39
- type: mrr_at_3
value: 47.278999999999996
- type: mrr_at_5
value: 49.626
- type: ndcg_at_1
value: 34.694
- type: ndcg_at_10
value: 24.061
- type: ndcg_at_100
value: 35.832
- type: ndcg_at_1000
value: 47.875
- type: ndcg_at_20
value: 25.022
- type: ndcg_at_3
value: 27.939999999999998
- type: ndcg_at_5
value: 25.246000000000002
- type: precision_at_1
value: 36.735
- type: precision_at_10
value: 20.204
- type: precision_at_100
value: 7.224
- type: precision_at_1000
value: 1.516
- type: precision_at_20
value: 15.714
- type: precision_at_3
value: 27.211000000000002
- type: precision_at_5
value: 23.265
- type: recall_at_1
value: 2.83
- type: recall_at_10
value: 14.564
- type: recall_at_100
value: 45.251000000000005
- type: recall_at_1000
value: 81.849
- type: recall_at_20
value: 22.31
- type: recall_at_3
value: 6.065
- type: recall_at_5
value: 8.588
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: mteb/toxic_conversations_50k
config: default
split: test
revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
metrics:
- type: accuracy
value: 91.17320000000001
- type: ap
value: 41.18509354980418
- type: f1
value: 77.77470860794351
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: mteb/tweet_sentiment_extraction
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 77.92869269949067
- type: f1
value: 78.23271267071486
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: mteb/twentynewsgroups-clustering
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 59.59735858830448
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: mteb/twittersemeval2015-pairclassification
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 87.81069321094355
- type: cos_sim_ap
value: 79.1522153826438
- type: cos_sim_f1
value: 72.83363802559415
- type: cos_sim_precision
value: 67.678369195923
- type: cos_sim_recall
value: 78.83905013192613
- type: dot_accuracy
value: 87.369613160875
- type: dot_ap
value: 78.51617049363121
- type: dot_f1
value: 71.89735998026153
- type: dot_precision
value: 67.516218721038
- type: dot_recall
value: 76.88654353562005
- type: euclidean_accuracy
value: 87.72724563390356
- type: euclidean_ap
value: 78.45799796334607
- type: euclidean_f1
value: 72.7159880834161
- type: euclidean_precision
value: 68.65916549460853
- type: euclidean_recall
value: 77.28232189973615
- type: manhattan_accuracy
value: 87.57823210347499
- type: manhattan_ap
value: 78.24705251626389
- type: manhattan_f1
value: 72.34365129500948
- type: manhattan_precision
value: 69.4060606060606
- type: manhattan_recall
value: 75.54089709762533
- type: max_accuracy
value: 87.81069321094355
- type: max_ap
value: 79.1522153826438
- type: max_f1
value: 72.83363802559415
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: mteb/twitterurlcorpus-pairclassification
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 89.76597974152986
- type: cos_sim_ap
value: 87.15717597277224
- type: cos_sim_f1
value: 79.71815316150567
- type: cos_sim_precision
value: 76.89776060671103
- type: cos_sim_recall
value: 82.75331074838313
- type: dot_accuracy
value: 89.49237396670159
- type: dot_ap
value: 86.69824401657353
- type: dot_f1
value: 79.39796433985418
- type: dot_precision
value: 74.7316211441772
- type: dot_recall
value: 84.68586387434554
- type: euclidean_accuracy
value: 89.65149221872937
- type: euclidean_ap
value: 86.98932847862545
- type: euclidean_f1
value: 79.65759212314929
- type: euclidean_precision
value: 76.17876466868105
- type: euclidean_recall
value: 83.46935632891899
- type: manhattan_accuracy
value: 89.63402802033609
- type: manhattan_ap
value: 86.99550128469285
- type: manhattan_f1
value: 79.61443655494647
- type: manhattan_precision
value: 76.23476361586697
- type: manhattan_recall
value: 83.30766861718509
- type: max_accuracy
value: 89.76597974152986
- type: max_ap
value: 87.15717597277224
- type: max_f1
value: 79.71815316150567
---
## Model Details:
This embedding model is a 10.7B parameter LLM fine-tuned from [upstage/SOLAR-10.7B-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-v1.0) on the Intel Gaudi 2 processor.
## Date
July, 2024
## Training Details
Two-stage training:
- General Text Embedding Training
- Domain-Specific Embedding Training
More technical details will be updated later.
## Evaluation
The results of [MTEB](https://huggingface.co/spaces/mteb/leaderboard) (English):

| Model Name | MTEB(56) |
|:----:|:---------:|
| [bge-base-en-1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 64.23 |
| [bge-large-en-1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 63.55 |
| [gte-large-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-large-en-v1.5) | 65.39 |
| [gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) | 64.11 |
| [mxbai-embed-large-v1](https://huggingface.co/mixedbread-ai/mxbai-embed-large-v1) | 64.68 |
| [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 59.45 |
| [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 61.50 |
| [e5-mistral-7b-instruct](https://huggingface.co/intfloat/e5-mistral-7b-instruct) | 66.63 |
| [gte-Qwen1.5-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct) | 67.34 |
| [NV-Embed-v1](https://huggingface.co/nvidia/NV-Embed-v1) | 69.32 |
| [**neural-embedding-v1**](https://huggingface.co/Intel/neural-embedding-v1) | **69.94** |
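
As a usage illustration, the snippet below embeds two sentences and compares them. Whether this checkpoint loads through the `sentence-transformers` API is an assumption of this sketch, not something the card states; the example sentences and similarity computation are likewise illustrative.

```python
# A minimal usage sketch. Compatibility with the sentence-transformers
# loader is an assumption of this example, not something stated in the card.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Intel/neural-embedding-v1")

sentences = [
    "Intel Gaudi 2 accelerates large-scale deep learning training.",
    "Training large models is sped up by Intel Gaudi 2 hardware.",
]
embeddings = model.encode(sentences, normalize_embeddings=True)

# Cosine similarity between the two sentence embeddings.
print(util.cos_sim(embeddings[0], embeddings[1]))
```
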
| [
"BIOSSES",
"SCIFACT"
] |
Smabbler/Multiclass-Disease-Diagnosis-Model | Smabbler | text-classification | [
"text-classification",
"en",
"dataset:duxprajapati/symptom-disease-dataset",
"region:us"
] | "2024-07-01T14:53:25Z" | 2024-07-05T06:21:13+00:00 | 0 | 1 | ---
datasets:
- duxprajapati/symptom-disease-dataset
language:
- en
metrics:
- accuracy
- f1
pipeline_tag: text-classification
---
# Model Card for Multiclass-Disease-Diagnosis-Model
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
A predictive machine learning model was developed that classifies data points into distinct disease categories based on patient symptoms.
- **Developed by:** Priyanka Kamila
- **Model type:** RandomForestClassifier, SVC
- **Language(s) (NLP):** EN
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
This model can be used directly for disease diagnosis based on binary encoded medical features. By inputting patient symptoms as binary vectors, the model predicts the likely medical condition. To use the model:

1. **Prepare the input data.** Format the input as a binary matrix in which each row represents a patient and each column represents a symptom or feature. The target variable is a categorical label representing the medical condition.
2. **Load the model.** Load the trained Random Forest Classifier or SVM Classifier from the repository, for example with `joblib` or `pickle` in Python.
3. **Make predictions.** Use the loaded model to make predictions on new input data. For instance, in Python:

   ```python
   import joblib

   # Load the pre-trained classifier and predict on a binary symptom matrix.
   model = joblib.load('path_to_model.pkl')
   predictions = model.predict(new_input_data)  # new_input_data: one row per patient, binary symptom columns
   ```

4. **Interpret the results.** The model outputs the predicted medical condition for each input row. These predictions can assist healthcare professionals in diagnosing patients.

This model is intended for direct use in clinical decision support systems or healthcare applications where quick and accurate disease diagnosis is critical. It can be integrated into electronic health records (EHR) systems or patient management software, or used as a standalone diagnostic tool.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
This model is designed specifically for diagnosing diseases based on binary encoded medical features. It is important to recognize its limitations and potential misuse:

- **Non-medical applications:** The model is not suitable for non-medical applications or any use case outside of healthcare diagnostics. Using it for unrelated classification tasks will yield inaccurate and irrelevant results.
- **Incomplete or inaccurate input data:** The model relies on precise binary encoding of medical symptoms. Providing incomplete, inaccurate, or improperly formatted data can lead to incorrect diagnoses. Ensure that input data is complete and correctly formatted according to the binary encoding schema used during model training.
- **Real-time critical decisions:** While the model can aid in diagnosis, it should not be solely relied upon for real-time critical medical decisions without human oversight. Healthcare professionals should verify the model's predictions and consider additional clinical information and diagnostics before making final decisions.
- **Malicious use:** The model should not be used to intentionally misdiagnose or manipulate medical diagnoses for fraudulent purposes. Ethical use is paramount; the model should only be used to assist in improving patient care.
- **Diagnostic scope limitation:** The model is trained on the specific diseases included in the dataset and may not perform well on conditions outside the scope of its training data. For diseases not represented in the training data, the model might default to predicting "other," which should be interpreted with caution.
- **General population screening:** The model is not intended for general population screening or for predicting disease prevalence in broad, non-clinical populations. It is designed for patients already presenting symptoms or those in a clinical setting.

By understanding these limitations and potential misuse scenarios, users can ensure that the model is applied appropriately and ethically in relevant healthcare contexts.
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
The training data used for this model consists of a custom dataset with binary encoded medical features. Each row in the dataset represents a patient's symptoms encoded as binary values, and the corresponding label represents the diagnosed disease. The dataset includes a wide range of medical conditions, with the aim of providing a comprehensive diagnostic tool.
**Source of Data:** The dataset was compiled from the [duxprajapati/symptom-disease-dataset](https://huggingface.co/datasets/duxprajapati/symptom-disease-dataset) on Hugging Face and then relabeled using Smabbler's QueryLab platform, ensuring an accurate representation of labels for both common and rare diseases.
### Pre-processing
Pre-processing is crucial for building an accurate machine learning model and for ensuring its reliability in the medical domain. It involves a data cleaning process that is labor-intensive, requiring extensive manual consistency checks and iterative validation to retain a high-quality final dataset. These processes are particularly complex when dealing with medical data.

Here the data was pre-processed to ensure consistency and accuracy. This involved cleaning the data, handling missing values, and normalizing the binary encoding. Each symptom was converted into a binary feature (0 or 1), indicating its absence or presence respectively. The labels were mapped to specific diseases using a detailed mapping file to ensure accurate representation.

Smabbler made pre-processing easier by providing automated labeling, reducing manual effort, ensuring consistency, and maintaining high accuracy in the pre-processed dataset, making it a crucial asset in building a reliable disease diagnostic model. The data cleaning process, which would otherwise have been labor-intensive and time-consuming, was significantly expedited by Smabbler's tools and features. The platform's automation, standardization, and validation capabilities ensured that the pre-processing was not only quicker but also more reliable and accurate.

**Label Mapping:** The labels in the dataset correspond to various diseases. A mapping file (`mapping.json`) was used to translate encoded labels to human-readable disease names. Top labels include diseases such as Psoriasis, Malaria, Bronchial Asthma, Dengue, Arthritis, Heart Attack, and many more.
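
As an illustration of the binary encoding and label mapping described above, the sketch below builds a one-row binary feature vector and translates a predicted label back to a disease name. The column order and the exact structure of `mapping.json` are assumptions of this example, not taken from the card; consult the dataset card for the actual schema.

```python
# A minimal sketch of the encoding and label-mapping step described above.
# The feature order and the structure of mapping.json are illustrative
# assumptions; consult the dataset card for the actual schema.
import json

import joblib
import numpy as np

symptom_columns = ["fever", "cough", "joint_pain", "skin_rash"]  # hypothetical column order

def encode_symptoms(present_symptoms):
    """Convert a list of reported symptoms into a 1-row binary feature matrix."""
    return np.array([[1 if col in present_symptoms else 0 for col in symptom_columns]])

model = joblib.load("path_to_model.pkl")  # trained Random Forest or SVM classifier
with open("mapping.json") as f:
    label_map = json.load(f)              # assumed: {"0": "Psoriasis", "1": "Malaria", ...}

x = encode_symptoms(["fever", "joint_pain"])
predicted_label = model.predict(x)[0]
print(label_map[str(predicted_label)])    # human-readable disease name
```
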
**Additional Documentation:** Detailed documentation on the data pre-processing and filtering steps is provided to ensure reproducibility and transparency. The dataset card includes information on the data sources, pre-processing steps, and any additional filtering or transformations applied.
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
The training procedure involves several key steps to ensure robust and accurate disease diagnosis using the Random Forest and SVM classifiers:

1. **Data splitting:** The dataset was split into training and testing sets using an 80-20 ratio. The training set was used to train the classifiers, and the testing set was used to evaluate performance.
2. **Feature selection:** Binary encoded features representing the presence or absence of symptoms were selected as inputs. The target variable was the disease label, mapped from encoded integers to human-readable disease names.
3. **Model initialization:** Two classifiers were initialized, a Random Forest Classifier and a Support Vector Machine (SVM) Classifier, both with default parameters and a fixed random state to ensure reproducibility.
4. **Training the models:**
   - *Random Forest Classifier:* trained on the training data using the `fit` method; hyperparameters such as the number of trees and tree depth were tuned to optimize performance.
   - *SVM Classifier:* trained in the same way using the `fit` method; the kernel type, regularization parameter, and other hyperparameters were adjusted for optimal classification.
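
The snippet below is a compact sketch of these steps, assuming a standard scikit-learn workflow; the synthetic `X`/`y` arrays stand in for the prepared binary symptom matrix and disease labels, and the parameter values shown are illustrative rather than the exact tuned settings.

```python
# A compact sketch of the training steps above, assuming a scikit-learn
# workflow. The synthetic X/y below stand in for the prepared binary
# symptom matrix and disease labels; they are not the real dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X = rng.integers(0, 2, size=(1000, 100))   # binary symptom features (stand-in)
y = rng.integers(0, 24, size=1000)         # encoded disease labels (stand-in)

# 80-20 train/test split with a fixed random state for reproducibility.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

rf_clf = RandomForestClassifier(random_state=42)
svm_clf = SVC(random_state=42)

rf_clf.fit(X_train, y_train)
svm_clf.fit(X_train, y_train)
```
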
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
The performance of both models was evaluated on the testing set.
Metrics such as accuracy, precision, recall, and f1-score were calculated to assess model performance.
Confusion matrices were generated to visualize the performance of each classifier in predicting the correct disease labels.
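
Continuing the sketch from the Training Procedure section, the evaluation step described above could look roughly like this; the scikit-learn metrics used here are an assumed, illustrative implementation rather than the card's exact evaluation code.

```python
# Continuing the training sketch above: evaluate both classifiers on the
# held-out test split (assumed scikit-learn metrics API).
from sklearn.metrics import classification_report, confusion_matrix

for name, clf in [("Random Forest", rf_clf), ("SVM", svm_clf)]:
    y_pred = clf.predict(X_test)
    print(f"=== {name} ===")
    print(classification_report(y_test, y_pred))  # per-class precision, recall, f1, accuracy
    print(confusion_matrix(y_test, y_pred))
```
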


### Results

#### Summary
This model utilizes both Random Forest and SVM classifiers to accurately diagnose a variety of diseases based on binary encoded medical features.
The training involved data pre-processing, feature selection, model training,
and extensive evaluation to ensure reliability. Designed for healthcare applications,
it aids professionals in making informed diagnostic decisions efficiently.
## Model Card Authors
Priyanka Kamila
| [
"MEDICAL DATA"
] |
zhan1993/mbc_library_phi2_icml | zhan1993 | null | [
"region:us"
] | "2024-07-06T15:04:40Z" | 2024-07-06T15:06:18+00:00 | 0 | 0 | ---
{}
---
Number of experts present in the library: 10
| Expert Name | Base Model | Trained on | Adapter Type |
| --- | --- | --- | --- |
| phi2_joint_3epoch_sim_cluster_3 | phi-2 | sordonia/flan-10k-flat/wiki_qa_found_on_google,app_reviews_categorize_rating_using_review,race_middle_Is_this_the_right_answer,super_glue_cb_1_0_2,wiki_qa_Topic_Prediction_Answer_Only,wiki_qa_Direct_Answer_to_Question,super_glue_wsc_fixed_1_0_2,cot_gsm8k_ii,unified_qa_science_inst,race_high_Is_this_the_right_answer,cot_strategyqa,cot_ecqa_ii,quarel_do_not_use,wiki_qa_exercise,wiki_qa_automatic_system,cot_creak_ii,quarel_heres_a_story,quarel_choose_between,stream_qed_ii,wiki_qa_Topic_Prediction_Question_Only,glue_qnli_2_0_0,cot_sensemaking_ii,super_glue_copa_1_0_2,social_i_qa_Generate_the_question_from_the_answer,social_i_qa_Show_choices_and_generate_index,quarel_testing_students,wiki_qa_Topic_Prediction_Question_and_Answer_Pair,wiki_qa_Decide_good_answer,wiki_qa_Jeopardy_style,wiki_qa_Generate_Question_from_Topic,definite_pronoun_resolution_1_1_0,wiqa_effect_with_label_answer,glue_wnli_2_0_0,cot_qasc,cot_strategyqa_ii,quarel_logic_test,stream_aqua_ii | lora |
| phi2_joint_3epoch_sim_cluster_1 | phi-2 | sordonia/flan-10k-flat/natural_questions_open_1_0_0,web_questions_whats_the_answer,web_questions_question_answer,dbpedia_14_pick_one_category_for_the_following_text,kilt_tasks_hotpotqa_combining_facts,web_questions_short_general_knowledge_q,kilt_tasks_hotpotqa_straighforward_qa,adversarial_qa_dbidaf_generate_question,adversarial_qa_droberta_based_on,web_questions_get_the_answer,kilt_tasks_hotpotqa_complex_question,web_questions_potential_correct_answer,trivia_qa_rc_1_1_0,kilt_tasks_hotpotqa_formulate,adversarial_qa_dbert_based_on,adversarial_qa_dbidaf_based_on,squad_v1_1_3_0_0 | lora |
| phi2_joint_3epoch_sim_cluster_4 | phi-2 | sordonia/flan-10k-flat/wiqa_what_might_be_the_first_step_of_the_process,wiqa_what_is_the_final_step_of_the_following_process,wmt16_translate_ro_en_1_0_0,wiqa_what_might_be_the_last_step_of_the_process,wiki_bio_key_content,gem_common_gen_1_1_0,duorc_SelfRC_build_story_around_qa,app_reviews_generate_review,wiki_bio_what_content,wiki_bio_who,gem_e2e_nlg_1_1_0,cot_esnli_ii,wmt16_translate_tr_en_1_0_0,wiqa_what_is_the_missing_first_step,wiki_bio_comprehension,coqa_1_0_0,duorc_ParaphraseRC_build_story_around_qa,multi_news_1_0_0 | lora |
| phi2_joint_3epoch_sim_cluster_7 | phi-2 | sordonia/flan-10k-flat/glue_sst2_2_0_0,adversarial_qa_droberta_generate_question,true_case,stream_qed,huggingface_xsum,cot_esnli,cot_gsm8k,trec_1_0_0,yelp_polarity_reviews_0_2_0,lambada_1_0_0,glue_cola_2_0_0,ag_news_subset_1_0_0,gem_dart_1_1_0,math_dataset_algebra__linear_1d_1_0_0,cnn_dailymail_3_4_0,wiki_hop_original_explain_relation,dbpedia_14_given_list_what_category_does_the_paragraph_belong_to,gem_wiki_lingua_english_en_1_1_0,fix_punct,imdb_reviews_plain_text_1_0_0,race_middle_Write_a_multi_choice_question_for_the_following_article,gigaword_1_2_0,dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to,gem_web_nlg_en_1_1_0,word_segment,race_high_Write_a_multi_choice_question_for_the_following_article,wmt16_translate_de_en_1_0_0,cot_ecqa,aeslc_1_0_0,dream_generate_first_utterance,wmt16_translate_fi_en_1_0_0,dream_answer_to_dialogue,para_crawl_enes,adversarial_qa_dbert_generate_question,race_middle_Write_a_multi_choice_question_options_given_,wmt14_translate_fr_en_1_0_0 | lora |
| phi2_joint_3epoch_sim_cluster_2 | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_question_context_answer,super_glue_record_1_0_2,wiki_hop_original_generate_object,adversarial_qa_droberta_tell_what_it_is,dbpedia_14_given_a_choice_of_categories_,wiki_hop_original_choose_best_object_affirmative_3,quac_1_0_0,wiki_hop_original_choose_best_object_interrogative_1,wiki_hop_original_choose_best_object_affirmative_1,adversarial_qa_dbert_answer_the_following_q,wiki_hop_original_choose_best_object_interrogative_2,adversarial_qa_droberta_question_context_answer,squad_v2_0_3_0_0,wiki_hop_original_generate_subject,wiki_bio_guess_person,adversarial_qa_dbidaf_answer_the_following_q,adversarial_qa_droberta_answer_the_following_q,adversarial_qa_dbert_tell_what_it_is,race_high_Write_a_multi_choice_question_options_given_,wiki_hop_original_choose_best_object_affirmative_2,wiki_hop_original_generate_subject_and_object,drop_2_0_0,adversarial_qa_dbert_question_context_answer,adversarial_qa_dbidaf_tell_what_it_is | lora |
| phi2_joint_3epoch_sim_cluster_5 | phi-2 | sordonia/flan-10k-flat/race_middle_Read_the_article_and_answer_the_question_no_option_,race_high_Select_the_best_answer,quail_description_context_question_answer_id,quail_context_question_description_text,race_high_Read_the_article_and_answer_the_question_no_option_,race_high_Select_the_best_answer_no_instructions_,quail_context_description_question_answer_id,race_high_Taking_a_test,super_glue_multirc_1_0_2,race_middle_Select_the_best_answer,quail_context_question_description_answer_id,quail_description_context_question_answer_text,quail_context_question_answer_description_text,race_high_Select_the_best_answer_generate_span_,race_middle_Select_the_best_answer_generate_span_,quail_context_question_answer_description_id,quail_context_description_question_answer_text,quail_context_description_question_text,quail_context_question_description_answer_text,quail_description_context_question_text,race_middle_Taking_a_test,quail_no_prompt_id,quail_no_prompt_text,race_middle_Select_the_best_answer_no_instructions_ | lora |
| phi2_joint_3epoch_sim_cluster_10 | phi-2 | sordonia/flan-10k-flat/dream_read_the_following_conversation_and_answer_the_question,app_reviews_convert_to_star_rating,cos_e_v1_11_question_option_description_text,social_i_qa_Show_choices_and_generate_answer,quartz_answer_question_based_on,sciq_Direct_Question_Closed_Book_,qasc_qa_with_separated_facts_3,quartz_given_the_fact_answer_the_q,quartz_answer_question_below,kilt_tasks_hotpotqa_final_exam,sciq_Multiple_Choice,wiqa_does_the_supposed_perturbation_have_an_effect,cos_e_v1_11_question_description_option_text,wiki_qa_Is_This_True_,quartz_use_info_from_question_paragraph,sciq_Direct_Question,qasc_qa_with_separated_facts_2,wiqa_which_of_the_following_is_the_supposed_perturbation,app_reviews_convert_to_rating,cos_e_v1_11_question_option_description_id,wiqa_effect_with_string_answer,qasc_qa_with_separated_facts_5,dream_baseline,quartz_having_read_above_passage,cos_e_v1_11_question_description_option_id,qasc_qa_with_separated_facts_1,cos_e_v1_11_description_question_option_text,qasc_qa_with_combined_facts_1,qasc_is_correct_1,cos_e_v1_11_description_question_option_id,social_i_qa_Check_if_a_random_answer_is_valid_or_not,sciq_Multiple_Choice_Closed_Book_,quartz_use_info_from_paragraph_question,qasc_is_correct_2,qasc_qa_with_separated_facts_4,quartz_read_passage_below_choose,quartz_paragraph_question_plain_concat,sciq_Multiple_Choice_Question_First | lora |
| phi2_joint_3epoch_sim_cluster_9 | phi-2 | sordonia/flan-10k-flat/super_glue_rte_1_0_2,cot_sensemaking,super_glue_wic_1_0_2,cos_e_v1_11_rationale,anli_r3_0_1_0,dream_generate_last_utterance,paws_wiki_1_1_0,cos_e_v1_11_generate_explanation_given_text,cot_creak,stream_aqua,snli_1_1_0,cos_e_v1_11_i_think,glue_qqp_2_0_0,cos_e_v1_11_explain_why_human,anli_r2_0_1_0,anli_r1_0_1_0,glue_stsb_2_0_0,cos_e_v1_11_aligned_with_common_sense,glue_mnli_2_0_0,social_i_qa_I_was_wondering,cosmos_qa_1_0_0,glue_mrpc_2_0_0,social_i_qa_Generate_answer | lora |
| phi2_joint_3epoch_sim_cluster_8 | phi-2 | sordonia/flan-10k-flat/ropes_background_new_situation_answer,ropes_prompt_bottom_no_hint,ropes_plain_background_situation,ropes_new_situation_background_answer,ropes_given_background_situation,ropes_prompt_bottom_hint_beginning,ropes_prompt_beginning,ropes_read_background_situation,ropes_plain_bottom_hint,ropes_plain_no_background,ropes_prompt_mix,ropes_background_situation_middle | lora |
| phi2_joint_3epoch_sim_cluster_6 | phi-2 | sordonia/flan-10k-flat/quoref_Context_Contains_Answer,duorc_SelfRC_generate_question_by_answer,quoref_Find_Answer,duorc_ParaphraseRC_movie_director,duorc_ParaphraseRC_answer_question,quoref_Found_Context_Online,quoref_Read_And_Extract_,duorc_ParaphraseRC_title_generation,duorc_ParaphraseRC_decide_worth_it,quoref_What_Is_The_Answer,duorc_ParaphraseRC_generate_question,quoref_Guess_Title_For_Context,quoref_Answer_Test,duorc_SelfRC_question_answering,duorc_SelfRC_title_generation,duorc_ParaphraseRC_generate_question_by_answer,duorc_ParaphraseRC_extract_answer,duorc_SelfRC_answer_question,duorc_SelfRC_decide_worth_it,duorc_ParaphraseRC_question_answering,quoref_Answer_Question_Given_Context,duorc_SelfRC_extract_answer,quoref_Guess_Answer,quoref_Answer_Friend_Question,duorc_SelfRC_movie_director,duorc_SelfRC_generate_question,quoref_Given_Context_Answer_Question | lora |
Last updated on: 2024-07-06 15:04:40+00:00
| [
"SCIQ"
] |
zhan1993/private_library_phi2_icml | zhan1993 | null | [
"region:us"
] | "2024-07-06T21:37:35Z" | 2024-07-07T08:27:47+00:00 | 0 | 0 | ---
{}
---
Number of experts present in the library: 263
| Expert Name | Base Model | Trained on | Adapter Type |
| --- | --- | --- | --- |
| adversarial_qa_dbidaf_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_answer_the_following_q | lora |
| adversarial_qa_dbert_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_based_on | lora |
| adversarial_qa_dbidaf_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_based_on | lora |
| adversarial_qa_dbert_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_question_context_answer | lora |
| adversarial_qa_dbert_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_generate_question | lora |
| adversarial_qa_dbidaf_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_generate_question | lora |
| adversarial_qa_dbidaf_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_question_context_answer | lora |
| adversarial_qa_dbidaf_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbidaf_tell_what_it_is | lora |
| adversarial_qa_droberta_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_answer_the_following_q | lora |
| adversarial_qa_droberta_based_on | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_based_on | lora |
| adversarial_qa_dbert_answer_the_following_q | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_answer_the_following_q | lora |
| adversarial_qa_dbert_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_dbert_tell_what_it_is | lora |
| adversarial_qa_droberta_generate_question | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_generate_question | lora |
| adversarial_qa_droberta_question_context_answer | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_question_context_answer | lora |
| aeslc_1_0_0 | phi-2 | sordonia/flan-10k-flat/aeslc_1_0_0 | lora |
| anli_r1_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r1_0_1_0 | lora |
| ai2_arc_ARC_Easy_1_0_0 | phi-2 | sordonia/flan-10k-flat/ai2_arc_ARC_Easy_1_0_0 | lora |
| ai2_arc_ARC_Challenge_1_0_0 | phi-2 | sordonia/flan-10k-flat/ai2_arc_ARC_Challenge_1_0_0 | lora |
| ag_news_subset_1_0_0 | phi-2 | sordonia/flan-10k-flat/ag_news_subset_1_0_0 | lora |
| adversarial_qa_droberta_tell_what_it_is | phi-2 | sordonia/flan-10k-flat/adversarial_qa_droberta_tell_what_it_is | lora |
| anli_r2_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r2_0_1_0 | lora |
| anli_r3_0_1_0 | phi-2 | sordonia/flan-10k-flat/anli_r3_0_1_0 | lora |
| app_reviews_categorize_rating_using_review | phi-2 | sordonia/flan-10k-flat/app_reviews_categorize_rating_using_review | lora |
| coqa_1_0_0 | phi-2 | sordonia/flan-10k-flat/coqa_1_0_0 | lora |
| app_reviews_convert_to_star_rating | phi-2 | sordonia/flan-10k-flat/app_reviews_convert_to_star_rating | lora |
| cnn_dailymail_3_4_0 | phi-2 | sordonia/flan-10k-flat/cnn_dailymail_3_4_0 | lora |
| cos_e_v1_11_aligned_with_common_sense | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_aligned_with_common_sense | lora |
| bool_q_1_0_0 | phi-2 | sordonia/flan-10k-flat/bool_q_1_0_0 | lora |
| app_reviews_generate_review | phi-2 | sordonia/flan-10k-flat/app_reviews_generate_review | lora |
| app_reviews_convert_to_rating | phi-2 | sordonia/flan-10k-flat/app_reviews_convert_to_rating | lora |
| cos_e_v1_11_i_think | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_i_think | lora |
| cos_e_v1_11_explain_why_human | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_explain_why_human | lora |
| cos_e_v1_11_description_question_option_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_id | lora |
| cos_e_v1_11_description_question_option_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_text | lora |
| cos_e_v1_11_question_description_option_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_id | lora |
| cos_e_v1_11_question_description_option_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_text | lora |
| cos_e_v1_11_generate_explanation_given_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_generate_explanation_given_text | lora |
| cot_creak | phi-2 | sordonia/flan-10k-flat/cot_creak | lora |
| cos_e_v1_11_question_option_description_text | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_text | lora |
| cot_creak_ii | phi-2 | sordonia/flan-10k-flat/cot_creak_ii | lora |
| cos_e_v1_11_question_option_description_id | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_id | lora |
| cosmos_qa_1_0_0 | phi-2 | sordonia/flan-10k-flat/cosmos_qa_1_0_0 | lora |
| cos_e_v1_11_rationale | phi-2 | sordonia/flan-10k-flat/cos_e_v1_11_rationale | lora |
| cot_ecqa | phi-2 | sordonia/flan-10k-flat/cot_ecqa | lora |
| cot_gsm8k | phi-2 | sordonia/flan-10k-flat/cot_gsm8k | lora |
| cot_qasc | phi-2 | sordonia/flan-10k-flat/cot_qasc | lora |
| cot_sensemaking_ii | phi-2 | sordonia/flan-10k-flat/cot_sensemaking_ii | lora |
| cot_sensemaking | phi-2 | sordonia/flan-10k-flat/cot_sensemaking | lora |
| cot_strategyqa | phi-2 | sordonia/flan-10k-flat/cot_strategyqa | lora |
| cot_gsm8k_ii | phi-2 | sordonia/flan-10k-flat/cot_gsm8k_ii | lora |
| cot_esnli_ii | phi-2 | sordonia/flan-10k-flat/cot_esnli_ii | lora |
| cot_esnli | phi-2 | sordonia/flan-10k-flat/cot_esnli | lora |
| dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | lora |
| cot_ecqa_ii | phi-2 | sordonia/flan-10k-flat/cot_ecqa_ii | lora |
| cot_strategyqa_ii | phi-2 | sordonia/flan-10k-flat/cot_strategyqa_ii | lora |
| dream_answer_to_dialogue | phi-2 | sordonia/flan-10k-flat/dream_answer_to_dialogue | lora |
| dream_generate_first_utterance | phi-2 | sordonia/flan-10k-flat/dream_generate_first_utterance | lora |
| definite_pronoun_resolution_1_1_0 | phi-2 | sordonia/flan-10k-flat/definite_pronoun_resolution_1_1_0 | lora |
| dbpedia_14_pick_one_category_for_the_following_text | phi-2 | sordonia/flan-10k-flat/dbpedia_14_pick_one_category_for_the_following_text | lora |
| dream_generate_last_utterance | phi-2 | sordonia/flan-10k-flat/dream_generate_last_utterance | lora |
| dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | lora |
| dream_baseline | phi-2 | sordonia/flan-10k-flat/dream_baseline | lora |
| dream_read_the_following_conversation_and_answer_the_question | phi-2 | sordonia/flan-10k-flat/dream_read_the_following_conversation_and_answer_the_question | lora |
| duorc_ParaphraseRC_answer_question | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_answer_question | lora |
| duorc_ParaphraseRC_build_story_around_qa | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_build_story_around_qa | lora |
| duorc_ParaphraseRC_extract_answer | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_extract_answer | lora |
| duorc_ParaphraseRC_decide_worth_it | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_decide_worth_it | lora |
| duorc_ParaphraseRC_generate_question | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question | lora |
| dbpedia_14_given_a_choice_of_categories_ | phi-2 | sordonia/flan-10k-flat/dbpedia_14_given_a_choice_of_categories_ | lora |
| duorc_SelfRC_answer_question | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_answer_question | lora |
| duorc_ParaphraseRC_generate_question_by_answer | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question_by_answer | lora |
| duorc_ParaphraseRC_title_generation | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_title_generation | lora |
| drop_2_0_0 | phi-2 | sordonia/flan-10k-flat/drop_2_0_0 | lora |
| duorc_SelfRC_decide_worth_it | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_decide_worth_it | lora |
| duorc_ParaphraseRC_movie_director | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_movie_director | lora |
| duorc_SelfRC_extract_answer | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_extract_answer | lora |
| duorc_SelfRC_movie_director | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_movie_director | lora |
| duorc_SelfRC_generate_question | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_generate_question | lora |
| duorc_SelfRC_question_answering | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_question_answering | lora |
| duorc_SelfRC_generate_question_by_answer | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_generate_question_by_answer | lora |
| duorc_SelfRC_title_generation | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_title_generation | lora |
| fix_punct | phi-2 | sordonia/flan-10k-flat/fix_punct | lora |
| gem_dart_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_dart_1_1_0 | lora |
| duorc_SelfRC_build_story_around_qa | phi-2 | sordonia/flan-10k-flat/duorc_SelfRC_build_story_around_qa | lora |
| gem_web_nlg_en_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_web_nlg_en_1_1_0 | lora |
| gem_common_gen_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_common_gen_1_1_0 | lora |
| gem_wiki_lingua_english_en_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_wiki_lingua_english_en_1_1_0 | lora |
| gem_e2e_nlg_1_1_0 | phi-2 | sordonia/flan-10k-flat/gem_e2e_nlg_1_1_0 | lora |
| glue_cola_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_cola_2_0_0 | lora |
| duorc_ParaphraseRC_question_answering | phi-2 | sordonia/flan-10k-flat/duorc_ParaphraseRC_question_answering | lora |
| glue_mnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_mnli_2_0_0 | lora |
| glue_stsb_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_stsb_2_0_0 | lora |
| glue_qqp_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_qqp_2_0_0 | lora |
| glue_qnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_qnli_2_0_0 | lora |
| glue_mrpc_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_mrpc_2_0_0 | lora |
| glue_sst2_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_sst2_2_0_0 | lora |
| glue_wnli_2_0_0 | phi-2 | sordonia/flan-10k-flat/glue_wnli_2_0_0 | lora |
| gigaword_1_2_0 | phi-2 | sordonia/flan-10k-flat/gigaword_1_2_0 | lora |
| kilt_tasks_hotpotqa_formulate | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_formulate | lora |
| hellaswag_1_1_0 | phi-2 | sordonia/flan-10k-flat/hellaswag_1_1_0 | lora |
| imdb_reviews_plain_text_1_0_0 | phi-2 | sordonia/flan-10k-flat/imdb_reviews_plain_text_1_0_0 | lora |
| kilt_tasks_hotpotqa_combining_facts | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_combining_facts | lora |
| huggingface_xsum | phi-2 | sordonia/flan-10k-flat/huggingface_xsum | lora |
| kilt_tasks_hotpotqa_complex_question | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_complex_question | lora |
| kilt_tasks_hotpotqa_final_exam | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_final_exam | lora |
| kilt_tasks_hotpotqa_straighforward_qa | phi-2 | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_straighforward_qa | lora |
| natural_questions_open_1_0_0 | phi-2 | sordonia/flan-10k-flat/natural_questions_open_1_0_0 | lora |
| para_crawl_enes | phi-2 | sordonia/flan-10k-flat/para_crawl_enes | lora |
| paws_wiki_1_1_0 | phi-2 | sordonia/flan-10k-flat/paws_wiki_1_1_0 | lora |
| lambada_1_0_0 | phi-2 | sordonia/flan-10k-flat/lambada_1_0_0 | lora |
| piqa_1_0_0 | phi-2 | sordonia/flan-10k-flat/piqa_1_0_0 | lora |
| openbookqa_0_1_0 | phi-2 | sordonia/flan-10k-flat/openbookqa_0_1_0 | lora |
| multi_news_1_0_0 | phi-2 | sordonia/flan-10k-flat/multi_news_1_0_0 | lora |
| math_dataset_algebra__linear_1d_1_0_0 | phi-2 | sordonia/flan-10k-flat/math_dataset_algebra__linear_1d_1_0_0 | lora |
| qasc_qa_with_combined_facts_1 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_combined_facts_1 | lora |
| qasc_qa_with_separated_facts_1 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_1 | lora |
| qasc_is_correct_2 | phi-2 | sordonia/flan-10k-flat/qasc_is_correct_2 | lora |
| qasc_qa_with_separated_facts_2 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_2 | lora |
| qasc_is_correct_1 | phi-2 | sordonia/flan-10k-flat/qasc_is_correct_1 | lora |
| qasc_qa_with_separated_facts_5 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_5 | lora |
| qasc_qa_with_separated_facts_3 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_3 | lora |
| qasc_qa_with_separated_facts_4 | phi-2 | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_4 | lora |
| quail_context_question_answer_description_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_answer_description_text | lora |
| quail_context_description_question_answer_text | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_answer_text | lora |
| quail_context_description_question_text | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_text | lora |
| quac_1_0_0 | phi-2 | sordonia/flan-10k-flat/quac_1_0_0 | lora |
| quail_context_question_answer_description_id | phi-2 | sordonia/flan-10k-flat/quail_context_question_answer_description_id | lora |
| quail_context_question_description_answer_id | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_answer_id | lora |
| quail_context_question_description_answer_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_answer_text | lora |
| quail_context_description_question_answer_id | phi-2 | sordonia/flan-10k-flat/quail_context_description_question_answer_id | lora |
| quail_description_context_question_answer_id | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_answer_id | lora |
| quail_context_question_description_text | phi-2 | sordonia/flan-10k-flat/quail_context_question_description_text | lora |
| quarel_choose_between | phi-2 | sordonia/flan-10k-flat/quarel_choose_between | lora |
| quail_description_context_question_text | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_text | lora |
| quail_no_prompt_id | phi-2 | sordonia/flan-10k-flat/quail_no_prompt_id | lora |
| quarel_do_not_use | phi-2 | sordonia/flan-10k-flat/quarel_do_not_use | lora |
| quail_no_prompt_text | phi-2 | sordonia/flan-10k-flat/quail_no_prompt_text | lora |
| quarel_heres_a_story | phi-2 | sordonia/flan-10k-flat/quarel_heres_a_story | lora |
| quarel_logic_test | phi-2 | sordonia/flan-10k-flat/quarel_logic_test | lora |
| quartz_given_the_fact_answer_the_q | phi-2 | sordonia/flan-10k-flat/quartz_given_the_fact_answer_the_q | lora |
| quartz_having_read_above_passage | phi-2 | sordonia/flan-10k-flat/quartz_having_read_above_passage | lora |
| quartz_answer_question_below | phi-2 | sordonia/flan-10k-flat/quartz_answer_question_below | lora |
| quartz_answer_question_based_on | phi-2 | sordonia/flan-10k-flat/quartz_answer_question_based_on | lora |
| quail_description_context_question_answer_text | phi-2 | sordonia/flan-10k-flat/quail_description_context_question_answer_text | lora |
| quartz_paragraph_question_plain_concat | phi-2 | sordonia/flan-10k-flat/quartz_paragraph_question_plain_concat | lora |
| quarel_testing_students | phi-2 | sordonia/flan-10k-flat/quarel_testing_students | lora |
| quartz_read_passage_below_choose | phi-2 | sordonia/flan-10k-flat/quartz_read_passage_below_choose | lora |
| quartz_use_info_from_paragraph_question | phi-2 | sordonia/flan-10k-flat/quartz_use_info_from_paragraph_question | lora |
| quoref_Answer_Friend_Question | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Friend_Question | lora |
| quartz_use_info_from_question_paragraph | phi-2 | sordonia/flan-10k-flat/quartz_use_info_from_question_paragraph | lora |
| quoref_Found_Context_Online | phi-2 | sordonia/flan-10k-flat/quoref_Found_Context_Online | lora |
| quoref_Context_Contains_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Context_Contains_Answer | lora |
| quoref_Answer_Test | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Test | lora |
| quoref_Answer_Question_Given_Context | phi-2 | sordonia/flan-10k-flat/quoref_Answer_Question_Given_Context | lora |
| quoref_Find_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Find_Answer | lora |
| quoref_Guess_Answer | phi-2 | sordonia/flan-10k-flat/quoref_Guess_Answer | lora |
| quoref_Guess_Title_For_Context | phi-2 | sordonia/flan-10k-flat/quoref_Guess_Title_For_Context | lora |
| quoref_Given_Context_Answer_Question | phi-2 | sordonia/flan-10k-flat/quoref_Given_Context_Answer_Question | lora |
| race_high_Is_this_the_right_answer | phi-2 | sordonia/flan-10k-flat/race_high_Is_this_the_right_answer | lora |
| quoref_What_Is_The_Answer | phi-2 | sordonia/flan-10k-flat/quoref_What_Is_The_Answer | lora |
| race_high_Read_the_article_and_answer_the_question_no_option_ | phi-2 | sordonia/flan-10k-flat/race_high_Read_the_article_and_answer_the_question_no_option_ | lora |
| race_high_Select_the_best_answer_generate_span_ | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer_generate_span_ | lora |
| quoref_Read_And_Extract_ | phi-2 | sordonia/flan-10k-flat/quoref_Read_And_Extract_ | lora |
| race_high_Select_the_best_answer_no_instructions_ | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer_no_instructions_ | lora |
| race_middle_Is_this_the_right_answer | phi-2 | sordonia/flan-10k-flat/race_middle_Is_this_the_right_answer | lora |
| race_high_Write_a_multi_choice_question_for_the_following_article | phi-2 | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_for_the_following_article | lora |
| race_high_Write_a_multi_choice_question_options_given_ | phi-2 | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_options_given_ | lora |
| race_middle_Select_the_best_answer | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer | lora |
| race_middle_Read_the_article_and_answer_the_question_no_option_ | phi-2 | sordonia/flan-10k-flat/race_middle_Read_the_article_and_answer_the_question_no_option_ | lora |
| race_high_Taking_a_test | phi-2 | sordonia/flan-10k-flat/race_high_Taking_a_test | lora |
| race_middle_Taking_a_test | phi-2 | sordonia/flan-10k-flat/race_middle_Taking_a_test | lora |
| race_middle_Select_the_best_answer_no_instructions_ | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_no_instructions_ | lora |
| race_middle_Select_the_best_answer_generate_span_ | phi-2 | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_generate_span_ | lora |
| race_high_Select_the_best_answer | phi-2 | sordonia/flan-10k-flat/race_high_Select_the_best_answer | lora |
| race_middle_Write_a_multi_choice_question_for_the_following_article | phi-2 | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_for_the_following_article | lora |
| race_middle_Write_a_multi_choice_question_options_given_ | phi-2 | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_options_given_ | lora |
| ropes_background_new_situation_answer | phi-2 | sordonia/flan-10k-flat/ropes_background_new_situation_answer | lora |
| ropes_plain_bottom_hint | phi-2 | sordonia/flan-10k-flat/ropes_plain_bottom_hint | lora |
| ropes_plain_no_background | phi-2 | sordonia/flan-10k-flat/ropes_plain_no_background | lora |
| ropes_new_situation_background_answer | phi-2 | sordonia/flan-10k-flat/ropes_new_situation_background_answer | lora |
| ropes_given_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_given_background_situation | lora |
| ropes_plain_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_plain_background_situation | lora |
| ropes_prompt_beginning | phi-2 | sordonia/flan-10k-flat/ropes_prompt_beginning | lora |
| ropes_background_situation_middle | phi-2 | sordonia/flan-10k-flat/ropes_background_situation_middle | lora |
| ropes_prompt_bottom_hint_beginning | phi-2 | sordonia/flan-10k-flat/ropes_prompt_bottom_hint_beginning | lora |
| sciq_Direct_Question_Closed_Book_ | phi-2 | sordonia/flan-10k-flat/sciq_Direct_Question_Closed_Book_ | lora |
| sciq_Multiple_Choice | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice | lora |
| ropes_prompt_bottom_no_hint | phi-2 | sordonia/flan-10k-flat/ropes_prompt_bottom_no_hint | lora |
| ropes_prompt_mix | phi-2 | sordonia/flan-10k-flat/ropes_prompt_mix | lora |
| sciq_Direct_Question | phi-2 | sordonia/flan-10k-flat/sciq_Direct_Question | lora |
| ropes_read_background_situation | phi-2 | sordonia/flan-10k-flat/ropes_read_background_situation | lora |
| sciq_Multiple_Choice_Closed_Book_ | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice_Closed_Book_ | lora |
| social_i_qa_Check_if_a_random_answer_is_valid_or_not | phi-2 | sordonia/flan-10k-flat/social_i_qa_Check_if_a_random_answer_is_valid_or_not | lora |
| snli_1_1_0 | phi-2 | sordonia/flan-10k-flat/snli_1_1_0 | lora |
| sciq_Multiple_Choice_Question_First | phi-2 | sordonia/flan-10k-flat/sciq_Multiple_Choice_Question_First | lora |
| social_i_qa_Generate_the_question_from_the_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Generate_the_question_from_the_answer | lora |
| social_i_qa_Generate_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Generate_answer | lora |
| social_i_qa_Show_choices_and_generate_answer | phi-2 | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_answer | lora |
| social_i_qa_Show_choices_and_generate_index | phi-2 | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_index | lora |
| squad_v1_1_3_0_0 | phi-2 | sordonia/flan-10k-flat/squad_v1_1_3_0_0 | lora |
| social_i_qa_I_was_wondering | phi-2 | sordonia/flan-10k-flat/social_i_qa_I_was_wondering | lora |
| stream_aqua | phi-2 | sordonia/flan-10k-flat/stream_aqua | lora |
| squad_v2_0_3_0_0 | phi-2 | sordonia/flan-10k-flat/squad_v2_0_3_0_0 | lora |
| stream_aqua_ii | phi-2 | sordonia/flan-10k-flat/stream_aqua_ii | lora |
| stream_qed_ii | phi-2 | sordonia/flan-10k-flat/stream_qed_ii | lora |
| stream_qed | phi-2 | sordonia/flan-10k-flat/stream_qed | lora |
| super_glue_copa_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_copa_1_0_2 | lora |
| super_glue_multirc_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_multirc_1_0_2 | lora |
| super_glue_cb_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_cb_1_0_2 | lora |
| super_glue_record_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_record_1_0_2 | lora |
| super_glue_rte_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_rte_1_0_2 | lora |
| super_glue_wsc_fixed_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_wsc_fixed_1_0_2 | lora |
| super_glue_wic_1_0_2 | phi-2 | sordonia/flan-10k-flat/super_glue_wic_1_0_2 | lora |
| true_case | phi-2 | sordonia/flan-10k-flat/true_case | lora |
| unified_qa_science_inst | phi-2 | sordonia/flan-10k-flat/unified_qa_science_inst | lora |
| web_questions_get_the_answer | phi-2 | sordonia/flan-10k-flat/web_questions_get_the_answer | lora |
| web_questions_question_answer | phi-2 | sordonia/flan-10k-flat/web_questions_question_answer | lora |
| web_questions_potential_correct_answer | phi-2 | sordonia/flan-10k-flat/web_questions_potential_correct_answer | lora |
| trivia_qa_rc_1_1_0 | phi-2 | sordonia/flan-10k-flat/trivia_qa_rc_1_1_0 | lora |
| wiki_bio_comprehension | phi-2 | sordonia/flan-10k-flat/wiki_bio_comprehension | lora |
| web_questions_whats_the_answer | phi-2 | sordonia/flan-10k-flat/web_questions_whats_the_answer | lora |
| wiki_bio_guess_person | phi-2 | sordonia/flan-10k-flat/wiki_bio_guess_person | lora |
| wiki_bio_key_content | phi-2 | sordonia/flan-10k-flat/wiki_bio_key_content | lora |
| wiki_bio_who | phi-2 | sordonia/flan-10k-flat/wiki_bio_who | lora |
| wiki_bio_what_content | phi-2 | sordonia/flan-10k-flat/wiki_bio_what_content | lora |
| web_questions_short_general_knowledge_q | phi-2 | sordonia/flan-10k-flat/web_questions_short_general_knowledge_q | lora |
| wiki_hop_original_choose_best_object_affirmative_1 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_1 | lora |
| wiki_hop_original_choose_best_object_affirmative_2 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_2 | lora |
| trec_1_0_0 | phi-2 | sordonia/flan-10k-flat/trec_1_0_0 | lora |
| wiki_hop_original_choose_best_object_affirmative_3 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_3 | lora |
| wiki_hop_original_choose_best_object_interrogative_1 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_1 | lora |
| wiki_hop_original_explain_relation | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_explain_relation | lora |
| wiki_hop_original_choose_best_object_interrogative_2 | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_2 | lora |
| wiki_hop_original_generate_subject_and_object | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_subject_and_object | lora |
| wiki_hop_original_generate_object | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_object | lora |
| wiki_qa_Decide_good_answer | phi-2 | sordonia/flan-10k-flat/wiki_qa_Decide_good_answer | lora |
| wiki_qa_Direct_Answer_to_Question | phi-2 | sordonia/flan-10k-flat/wiki_qa_Direct_Answer_to_Question | lora |
| wiki_hop_original_generate_subject | phi-2 | sordonia/flan-10k-flat/wiki_hop_original_generate_subject | lora |
| wiki_qa_Generate_Question_from_Topic | phi-2 | sordonia/flan-10k-flat/wiki_qa_Generate_Question_from_Topic | lora |
| wiki_qa_Topic_Prediction_Answer_Only | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Answer_Only | lora |
| wiki_qa_Jeopardy_style | phi-2 | sordonia/flan-10k-flat/wiki_qa_Jeopardy_style | lora |
| word_segment | phi-2 | sordonia/flan-10k-flat/word_segment | lora |
| wiki_qa_Is_This_True_ | phi-2 | sordonia/flan-10k-flat/wiki_qa_Is_This_True_ | lora |
| wiki_qa_automatic_system | phi-2 | sordonia/flan-10k-flat/wiki_qa_automatic_system | lora |
| wiki_qa_Topic_Prediction_Question_Only | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_Only | lora |
| wiki_qa_exercise | phi-2 | sordonia/flan-10k-flat/wiki_qa_exercise | lora |
| wiki_qa_found_on_google | phi-2 | sordonia/flan-10k-flat/wiki_qa_found_on_google | lora |
| wiki_qa_Topic_Prediction_Question_and_Answer_Pair | phi-2 | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_and_Answer_Pair | lora |
| winogrande_1_1_0 | phi-2 | sordonia/flan-10k-flat/winogrande_1_1_0 | lora |
| wiqa_effect_with_string_answer | phi-2 | sordonia/flan-10k-flat/wiqa_effect_with_string_answer | lora |
| wiqa_what_is_the_missing_first_step | phi-2 | sordonia/flan-10k-flat/wiqa_what_is_the_missing_first_step | lora |
| wiqa_effect_with_label_answer | phi-2 | sordonia/flan-10k-flat/wiqa_effect_with_label_answer | lora |
| wiqa_does_the_supposed_perturbation_have_an_effect | phi-2 | sordonia/flan-10k-flat/wiqa_does_the_supposed_perturbation_have_an_effect | lora |
| wiqa_which_of_the_following_is_the_supposed_perturbation | phi-2 | sordonia/flan-10k-flat/wiqa_which_of_the_following_is_the_supposed_perturbation | lora |
| wiqa_what_might_be_the_last_step_of_the_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_might_be_the_last_step_of_the_process | lora |
| wiqa_what_is_the_final_step_of_the_following_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_is_the_final_step_of_the_following_process | lora |
| wmt14_translate_fr_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt14_translate_fr_en_1_0_0 | lora |
| wmt16_translate_ro_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_ro_en_1_0_0 | lora |
| wmt16_translate_tr_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_tr_en_1_0_0 | lora |
| yelp_polarity_reviews_0_2_0 | phi-2 | sordonia/flan-10k-flat/yelp_polarity_reviews_0_2_0 | lora |
| wmt16_translate_fi_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_fi_en_1_0_0 | lora |
| wiqa_what_might_be_the_first_step_of_the_process | phi-2 | sordonia/flan-10k-flat/wiqa_what_might_be_the_first_step_of_the_process | lora |
| wmt16_translate_de_en_1_0_0 | phi-2 | sordonia/flan-10k-flat/wmt16_translate_de_en_1_0_0 | lora |
Last updated on: 2024-07-06 21:37:35+00:00
| [
"SCIQ"
] |
GrainsPolito/DeepMedicalColorization | GrainsPolito | image-classification | [
"image-classification",
"license:mit",
"region:us"
] | "2024-07-10T09:47:25Z" | 2024-07-10T13:05:16+00:00 | 0 | 1 | ---
license: mit
pipeline_tag: image-classification
---
# Deep Colorization modules for Medical Image Analysis
This project aims to bridge the gap between natural and medical image analysis by introducing a lightweight colorization module. Three different modules are proposed and implemented (DECONV, PixelShuffle and ColorU).

The modules are trained jointly with a backbone pre-trained on ImageNet. A multi-stage transfer learning pipeline is summarized here.
First, the colorization module is trained from scratch together with the classifier, while the pre-trained CNN backbone is kept frozen, to learn the mapping which maximizes classification accuracy.
Then, the entire network is fine-tuned to learn useful features for the target task, while simultaneously adjusting the colorization mapping. The figure below shows the output of each colorization module when only the colorization module is trained, and after the entire network is fine-tuned.

## Dependencies
+ Linux
+ Python 3.7
+ PyTorch 1.4.0
## Download
Trained models with DenseNet121 and ResNet18 backbones are available [here](https://drive.google.com/drive/folders/1uwLd-rzkt7Fcph6RqR1Eq41aTh85XGb-?usp=sharing)
A detailed list of models is available [here](README_FILES.md)
All models were trained on [CheXpert](https://stanfordmlgroup.github.io/competitions/chexpert/) to predict the presence/absence of 5 labels:
+ Atelectasis
+ Cardiomegaly
+ Consolidation
+ Edema
+ Pleural Effusion
## Image normalization
If you wish to use the above models, please bear in mind that images were normalized with statistics calculated on the CheXpert dataset (a minimal preprocessing example follows the list):
+ mean: [0.5028, 0.5028, 0.5028]
+ std: [0.2902, 0.2902, 0.2902]
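For convenience, applying these statistics with torchvision looks like the following. This is a minimal sketch: the 224x224 resize and the grayscale-to-3-channel replication are assumptions, not necessarily the exact preprocessing used during training.
```python
from torchvision import transforms

# Normalization with the CheXpert statistics listed above
chexpert_preprocess = transforms.Compose([
    transforms.Resize((224, 224)),                 # assumed input resolution
    transforms.Grayscale(num_output_channels=3),   # replicate the X-ray to 3 channels
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.5028, 0.5028, 0.5028],
                         std=[0.2902, 0.2902, 0.2902]),
])
```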
# Citation
If you use the models in your research, please cite our paper:
```
@article{morra2020bridging,
title="Bridging the gap between Natural and Medical Images through Deep Colorization",
author="Morra, Lia and Piano, Luca and Lamberti, Fabrizio and Tommasi, Tatiana",
year="2020"
}
``` | [
"BEAR"
] |
Addax-Data-Science/Deepfaune_v1.1 | Addax-Data-Science | null | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | "2024-07-13T06:46:28Z" | 2024-11-08T05:36:56+00:00 | 0 | 0 | ---
license: cc-by-nc-sa-4.0
---
This model was trained by [the Deepfaune initiative](https://www.deepfaune.cnrs.fr/en/). The official location of the model files is [the Deepfaune GitLab repository](https://plmlab.math.cnrs.fr/deepfaune/software/-/tree/master).
These files are copied to this Hugging Face repo for easy integration with [EcoAssist](https://addaxdatascience.com/ecoassist).
**Developer**
The DeepFaune initiative
**Model version**
v1.1
**Description**
The Deepfaune initiative aims to develop 'artificial intelligence' models that automatically classify species in images and videos collected using camera traps. The initiative is led by a core academic team from the French 'Centre National de la Recherche Scientifique' (CNRS), in collaboration with more than 50 European partners involved in wildlife research, conservation and management. The Deepfaune models can be run through custom software freely available on the website, or through other software packages and platforms such as EcoAssist. New versions of the model are published regularly, increasing classification accuracy or adding new species to the list of species that can be recognized. More information is available at: https://www.deepfaune.cnrs.fr.
**Classes**
* badger
* ibex
* red deer
* chamois
* cat
* goat
* roe deer
* dog
* squirrel
* equid
* genet
* hedgehog
* lagomorph
* wolf
* lynx
* marmot
* micromammal
* mouflon
* sheep
* mustelid
* bird
* bear
* nutria
* fox
* wild boar
* cow
**Links**
[Learn more](https://www.deepfaune.cnrs.fr/en/)
[Cite](https://link.springer.com/article/10.1007/s10344-023-01742-7)
[License](https://creativecommons.org/licenses/by-nc-sa/4.0/) | [
"BEAR"
] |
SamySam0/LT3 | SamySam0 | null | [
"arxiv:2310.19727",
"region:us"
] | "2024-07-14T14:32:49Z" | 2024-07-14T14:46:50+00:00 | 0 | 1 | ---
{}
---
# LT3
LT3 is a novel Conditional Transformer designed to generate synthetic medical instructions as an alternative to real medical data, addressing data privacy restrictions and data scarcity issues. It has demonstrated better generation quality and diversity than Large Language Models (LLMs), and its synthetic data can effectively train an NER model to performance comparable to that achieved with real data. On top of that, our research proposes a new Beam Search Decoding algorithm (B2SD), which outperformed state-of-the-art methods on our task.
This work was presented at *NeurIPS 2023's Workshop on Synthetic Data Generation with Generative AI*.
Our pre-print can be found here: https://arxiv.org/abs/2310.19727.
*Authors: Samuel Belkadi, Nicolo Micheletti, Lifeng Han, Warren Del-Pinto, Goran Nenadic.*
## Usage
To generate synthetic data, follow the instructions given in our GitHub repository: https://github.com/SamySam0/LT3.
## Evaluation results
### Lexical Similarity Evaluation against References
The results below show that LT3’s generations are the closest match to the reference samples. We used multi-reference evaluation to consolidate our results; a minimal scoring sketch follows the table. Higher scores are better.
| Models | BLEU | ROUGE-1 | ROUGE-2 | ROUGE-L | BERTScore |
|----------|-------|---------|---------|---------|-----------|
| T5 Small | 71.75 | 76.16 | 66.24 | 75.55 | 0.70 |
| T5 Base | 71.98 | 76.28 | 66.30 | 75.45 | 0.70 |
| T5 Large | 69.89 | 75.07 | 65.19 | 74.22 | 0.68 |
| LT3 | **78.52** | **78.16** | **68.72** | **77.55** | **0.72** |
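For reference, corpus-level multi-reference scoring of this kind can be reproduced with `sacrebleu`. The snippet below is a minimal sketch with toy data and is not our exact evaluation script.
```python
import sacrebleu

# One hypothesis per sample, plus several aligned reference sets
hypotheses = ["take one tablet by mouth twice daily"]
references = [
    ["take 1 tablet by mouth twice a day"],   # reference set 1
    ["take one tablet orally twice daily"],   # reference set 2
]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"Multi-reference BLEU: {bleu.score:.2f}")
```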
### Lexical Diversity Evaluation within Generated Outputs
The results below measure the diversity between models' outputs. For each label, we measured the Jaccard similarity score of our models' generations; a sketch of this metric follows the table. A higher Jaccard score indicates more similarity between the two populations, while a lower score indicates better diversity for our task.
| | Median Jaccard Score | Average Jaccard Score |
|-------|-----------------------|-----------------------|
| LT3 | **0.650** | **0.652** |
| T5 Base | 0.658 | 0.660 |
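A minimal sketch of the token-level Jaccard similarity used as the diversity measure (the exact tokenization and pairwise aggregation in our evaluation may differ):
```python
def jaccard_similarity(text_a: str, text_b: str) -> float:
    """Jaccard similarity between the token sets of two generations."""
    tokens_a, tokens_b = set(text_a.lower().split()), set(text_b.lower().split())
    if not tokens_a and not tokens_b:
        return 1.0
    return len(tokens_a & tokens_b) / len(tokens_a | tokens_b)

# Example: two generations produced for the same label
print(jaccard_similarity("take one tablet daily", "take one capsule daily"))  # 0.6
```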
### Downstream NER Evaluation
The results below demonstrate how effectively our generated synthetic dataset trains an NER model compared to training on real data.

## Thank you
Feel free to use LT3 for any research purpose.
Please contact us if you have any questions, and **cite our work** whenever used. | [
"MEDICAL DATA"
] |
thewhatifproject/gaIA | thewhatifproject | unconditional-image-generation | [
"pytorch",
"gan",
"nvidia",
"stylegan",
"stylegan3",
"unconditional-image-generation",
"it",
"doi:10.57967/hf/2733",
"license:apache-2.0",
"region:us"
] | "2024-07-14T15:56:00Z" | 2024-07-26T11:02:49+00:00 | 0 | 1 | ---
language:
- it
library_name: pytorch
license: apache-2.0
pipeline_tag: unconditional-image-generation
tags:
- nvidia
- gan
- stylegan
- stylegan3
extra_gated_prompt: You agree to not use the model to conduct experiments that cause
harm to human subjects. You agree to cite this model for every usage using its DOI.
extra_gated_fields:
Company: text
Country: country
I want to use this model for:
type: select
options:
- Research
- Education
- Art & Exhibitions
- label: Other
value: other
I agree to use this model for non-commercial use ONLY: checkbox
extra_gated_heading: Acknowledge license and conditions to accept the repository
extra_gated_description: Our team may take 1-2 days to process your request
extra_gated_button_content: I accept
---
# gaIA: Italian Landscape GAN Model
gaIA is the first Italian GAN model trained on satellite images of a selection of Italy's main glaciers, forests, lakes, rivers, and coasts that are most affected by climate change. It is usable for scientific and artistic purposes.

## Dataset
- **Images**: 12k
- **Image Format**: 1024x1024
- **Source**: Copernicus Sentinel 2A
- **Reference Years**: 2017 – June 2024

- **29 Covered Areas**:
- **Glaciers**: Adamello, Gran Paradiso, Marmolada, Presena, Forni, Belvedere
- **Lakes**: Bracciano, Garda, Maggiore, Trasimeno, Iseo, Como
- **Rivers**: Tiber, Adige, Arno, etc.
- **Islands/Coasts**: Chia, Marina di Pisa, Venezia, Stromboli, Rosolina Mare, Costiera Amalfitana
- **Parks**: Abruzzo, Casentinesi, Pollino, Sila, Gargano, Aspromonte

## Training
- **Framework**: StyleGAN3-T
- **GPUs**: 1 - NVIDIA A100 80GB
- **Batch**: 32
- **Gamma**: 32
- **Kimg**: 5152.0
- **Augmentations**: 38,040
- **Time**: ~220 hours

## Requirements
Please refer to Official NVIDIA Paper [Requirements](https://github.com/NVlabs/stylegan3?tab=readme-ov-file#requirements)
## How to Start
```python
import torch
from PIL import Image
import numpy as np
import pickle
# Set the device to GPU
device = torch.device('cuda')
# Load the model
with open('/thewhatifproject/gaIA_v1.pkl', 'rb') as f:
G = pickle.load(f)['G_ema'].cuda() # torch.nn.Module
# Set the model to evaluation mode
G.eval()
# Set the seed for reproducibility
seed = 28
# Generate latent codes using the specified seed
z = torch.from_numpy(np.random.RandomState(seed).randn(1, G.z_dim)).to(device)
# Generate the image using the generator
with torch.no_grad():
img = G(z, None, truncation_psi=1, noise_mode='const')
# Process the image for saving
# - Change dimensions order from NCHW to NHWC
# - Scale from range [-1, +1] to [0, 255]
# - Clamp values to ensure they are within [0, 255]
# - Convert to uint8
img = (img.permute(0, 2, 3, 1) * 127.5 + 128).clamp(0, 255).to(torch.uint8)
# Save the image using PIL
Image.fromarray(img[0].cpu().numpy(), 'RGB').save('generated_image.png')
print("Image saved as 'generated_image.png'")
```
The above code requires `torch_utils` and `dnnlib` to be accessible via `PYTHONPATH`. It does not need source code for the networks themselves; their class definitions are loaded from the pickle via `torch_utils.persistence`.
The pickle contains three networks. `G` and `D` are instantaneous snapshots taken during training, and `G_ema` represents a moving average of the generator weights over several training steps. The networks are regular instances of `torch.nn.Module`, with all of their parameters and buffers placed on the CPU at import and gradient computation disabled by default.
The generator consists of two submodules, `G.mapping` and `G.synthesis`, that can be executed separately.
See [NVIDIA Repo](https://github.com/NVlabs/stylegan3?tab=readme-ov-file#using-networks-from-python) for additional information.
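For finer control (e.g. truncation or latent interpolation in W space) the two submodules can be called separately, mirroring the StyleGAN3 reference code. A minimal sketch, reusing `G` and `z` from the example above; the truncation value is just an illustration:
```python
# Split generation into mapping (z -> w) and synthesis (w -> image)
with torch.no_grad():
    w = G.mapping(z, None, truncation_psi=0.7)   # intermediate latents, shape [1, num_ws, w_dim]
    img = G.synthesis(w, noise_mode='const')     # NCHW image tensor in [-1, +1]
```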
**A dedicated Repo for gaIA inference with ready-to-use scripts is on the way! Stay tuned!**
## Inference Samples

## Uses
### Scientific
- Transfer Learning
- Synthetic data generation
- Future scenario simulations *
- Comparative analysis *
*It is necessary to integrate external predictive climate models to generate future scenario simulations
### Artistic
- Art installations & exhibitions
- Public awareness campaigns
- Multimedia performances
## License
This project and repository contains two licenses:
1. **Apache 2.0 License**: Applies to the model and any modifications or additions made by The "What If" Project.
2. **NVIDIA Source Code License for StyleGAN3**: Applies to the original StyleGAN3 software used for training the model.
Please see the LICENSE files in the repository for more details.
## How to Contribute
Join us in using our model to make a difference! For more information and updates, visit [gaIA spotlight](https://share.thewhatifproject.com/gaia).
## Contact
For any questions or support, contact us through our [website](https://thewhatifproject.com) and follow us on [Instagram](https://www.instagram.com/the.whatifproject/). | [
"CHIA"
] |
TensorStack/AirtistPhoto-onnx | TensorStack | null | [
"onnx",
"region:us"
] | "2024-07-14T19:23:07Z" | 2024-07-14T19:33:35+00:00 | 0 | 1 | ---
{}
---
# AIrtist Photo MAL Realistic - Onnx Olive DirectML Optimized
## Original Model
https://civitai.com/models/229332/airtist-photo-mal-realistic
## C# Inference Demo
https://github.com/TensorStack-AI/OnnxStack
```csharp
// Create Pipeline
var pipeline = StableDiffusionPipeline.CreatePipeline("D:\\Models\\AirtistPhoto-onnx");
// Prompt
var promptOptions = new PromptOptions
{
Prompt = "Craft an image of a sleek motorbike racing down a winding road, with mountains and forests in the background."
};
// Scheduler options (assumed defaults here; see the OnnxStack documentation for the available settings)
var schedulerOptions = new SchedulerOptions();
// Run pipeline
var result = await pipeline.GenerateImageAsync(promptOptions, schedulerOptions);
// Save Image Result
await result.SaveAsync("Result.png");
```
## Inference Result
 | [
"CRAFT"
] |
rama0519/DiabeticLogistic123 | rama0519 | tabular-regression | [
"sklearn",
"joblib",
"medical",
"tabular-regression",
"license:mit",
"region:us"
] | "2024-07-15T10:30:42Z" | 2024-07-16T12:59:29+00:00 | 0 | 0 | ---
library_name: sklearn
license: mit
metrics:
- accuracy
pipeline_tag: tabular-regression
tags:
- medical
---
# Logistic Regression Diabetes Prediction Model
## Instructions for Users
### DiabeticLogistic Model
This model predicts the likelihood of diabetes based on medical data using logistic regression.
### Dataset
The model is trained on a dataset with features including:
- Pregnancies
- Glucose
- BloodPressure
- SkinThickness
- Insulin
- BMI
- DiabetesPedigreeFunction
- Age
### Preprocessing
Features are normalized using `StandardScaler`.
### Usage
#### Downloading the Model
```python
!pip install pandas scikit-learn joblib huggingface_hub
from huggingface_hub import hf_hub_download
import joblib
import pandas as pd
# Your Hugging Face token
token = "put your token here"
# Download the model and scaler from the Hugging Face Hub using the token
model_path = hf_hub_download(repo_id="rama0519/DiabeticLogistic123", filename="logistic_regression_model.joblib", use_auth_token=token)
scaler_path = hf_hub_download(repo_id="rama0519/DiabeticLogistic123", filename="scaler.joblib", use_auth_token=token)
# Load the model and scaler
model = joblib.load(model_path)
scaler = joblib.load(scaler_path)
# Example data
data = pd.DataFrame({
'Pregnancies': [6, 1],
'Glucose': [148, 85],
'BloodPressure': [72, 66],
'SkinThickness': [35, 29],
'Insulin': [0, 0],
'BMI': [33.6, 26.6],
'DiabetesPedigreeFunction': [0.627, 0.351],
'Age': [50, 31]
})
# Normalize the data
data_scaled = scaler.transform(data)
# Make predictions
predictions = model.predict(data_scaled)
print("Predictions:", predictions)
```
#### Fine-Tuning the Model
To fine-tune the model, follow these steps:
##### Load the Model and Data
```python
from huggingface_hub import hf_hub_download
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
import joblib
# Your Hugging Face token
token = "put your token here"
# Download the model and scaler from the Hugging Face Hub using the token
model_path = hf_hub_download(repo_id="rama0519/DiabeticLogistic123", filename="logistic_regression_model.joblib", use_auth_token=token)
scaler_path = hf_hub_download(repo_id="rama0519/DiabeticLogistic123", filename="scaler.joblib", use_auth_token=token)
# Load the model and scaler
model = joblib.load(model_path)
scaler = joblib.load(scaler_path)
# Load your dataset
data = pd.read_csv('/content/Healthcare-Diabetes.csv')
# Drop the 'Id' column if it exists
if 'Id' in data.columns:
data = data.drop(columns=['Id'])
X = data.drop(columns=['Outcome'])
y = data['Outcome']
# Normalize the features
X_scaled = scaler.transform(X)
# Split the dataset into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X_scaled, y, test_size=0.2, random_state=42)
```
##### Fine-Tune the Model
```python
# Fine-tune the model
model.fit(X_train, y_train)
# Evaluate the fine-tuned model
y_pred = model.predict(X_test)
accuracy = accuracy_score(y_test, y_pred)
print(f'Fine-tuned Accuracy: {accuracy:.2f}')
```
##### Save the Fine-Tuned Model
```python
joblib.dump(model, 'fine_tuned_logistic_regression_model.joblib')
```
| [
"MEDICAL DATA"
] |
zhan1993/private_library_phi3-4k | zhan1993 | null | [
"region:us"
] | "2024-07-18T17:12:01Z" | 2024-07-18T17:12:02+00:00 | 0 | 0 | ---
{}
---
Number of experts present in the library: 256
| Expert Name | Base Model | Trained on | Adapter Type |
| --- | --- | --- | --- |
| super_glue_wic_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_wic_1_0_2 | lora |
| quarel_logic_test | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quarel_logic_test | lora |
| ropes_prompt_beginning | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_prompt_beginning | lora |
| duorc_SelfRC_build_story_around_qa | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_build_story_around_qa | lora |
| lambada_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/lambada_1_0_0 | lora |
| dream_answer_to_dialogue | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dream_answer_to_dialogue | lora |
| glue_qnli_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_qnli_2_0_0 | lora |
| cos_e_v1_11_i_think | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_i_think | lora |
| wiki_hop_original_choose_best_object_affirmative_3 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_3 | lora |
| app_reviews_convert_to_star_rating | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/app_reviews_convert_to_star_rating | lora |
| duorc_SelfRC_decide_worth_it | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_decide_worth_it | lora |
| race_middle_Select_the_best_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Select_the_best_answer | lora |
| ag_news_subset_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ag_news_subset_1_0_0 | lora |
| quoref_Find_Answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Find_Answer | lora |
| social_i_qa_Show_choices_and_generate_index | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_index | lora |
| dream_generate_first_utterance | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dream_generate_first_utterance | lora |
| qasc_qa_with_separated_facts_4 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_4 | lora |
| web_questions_question_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/web_questions_question_answer | lora |
| wiki_qa_Jeopardy_style | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Jeopardy_style | lora |
| quoref_Guess_Title_For_Context | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Guess_Title_For_Context | lora |
| unified_qa_science_inst | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/unified_qa_science_inst | lora |
| app_reviews_generate_review | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/app_reviews_generate_review | lora |
| wiki_bio_who | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_bio_who | lora |
| race_high_Taking_a_test | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Taking_a_test | lora |
| duorc_SelfRC_movie_director | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_movie_director | lora |
| cot_creak_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_creak_ii | lora |
| wmt16_translate_tr_en_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wmt16_translate_tr_en_1_0_0 | lora |
| glue_wnli_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_wnli_2_0_0 | lora |
| super_glue_multirc_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_multirc_1_0_2 | lora |
| quail_description_context_question_answer_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_description_context_question_answer_id | lora |
| duorc_ParaphraseRC_question_answering | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_question_answering | lora |
| sciq_Multiple_Choice_Closed_Book_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/sciq_Multiple_Choice_Closed_Book_ | lora |
| wiki_qa_found_on_google | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_found_on_google | lora |
| cos_e_v1_11_question_description_option_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_id | lora |
| ropes_given_background_situation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_given_background_situation | lora |
| ropes_background_new_situation_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_background_new_situation_answer | lora |
| glue_mnli_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_mnli_2_0_0 | lora |
| duorc_SelfRC_answer_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_answer_question | lora |
| cos_e_v1_11_description_question_option_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_id | lora |
| glue_cola_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_cola_2_0_0 | lora |
| wiki_qa_Generate_Question_from_Topic | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Generate_Question_from_Topic | lora |
| wiqa_does_the_supposed_perturbation_have_an_effect | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_does_the_supposed_perturbation_have_an_effect | lora |
| ropes_plain_bottom_hint | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_plain_bottom_hint | lora |
| wiki_hop_original_choose_best_object_interrogative_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_2 | lora |
| cot_sensemaking | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_sensemaking | lora |
| ropes_new_situation_background_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_new_situation_background_answer | lora |
| glue_mrpc_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_mrpc_2_0_0 | lora |
| cot_gsm8k_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_gsm8k_ii | lora |
| imdb_reviews_plain_text_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/imdb_reviews_plain_text_1_0_0 | lora |
| huggingface_xsum | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/huggingface_xsum | lora |
| squad_v2_0_3_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/squad_v2_0_3_0_0 | lora |
| wmt16_translate_ro_en_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wmt16_translate_ro_en_1_0_0 | lora |
| quail_no_prompt_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_no_prompt_text | lora |
| social_i_qa_Generate_the_question_from_the_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/social_i_qa_Generate_the_question_from_the_answer | lora |
| social_i_qa_Check_if_a_random_answer_is_valid_or_not | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/social_i_qa_Check_if_a_random_answer_is_valid_or_not | lora |
| para_crawl_enes | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/para_crawl_enes | lora |
| adversarial_qa_dbidaf_tell_what_it_is | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbidaf_tell_what_it_is | lora |
| adversarial_qa_droberta_question_context_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_droberta_question_context_answer | lora |
| wiki_bio_key_content | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_bio_key_content | lora |
| social_i_qa_Generate_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/social_i_qa_Generate_answer | lora |
| cos_e_v1_11_question_option_description_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_text | lora |
| wiki_hop_original_generate_subject_and_object | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_generate_subject_and_object | lora |
| duorc_SelfRC_question_answering | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_question_answering | lora |
| cot_creak | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_creak | lora |
| cos_e_v1_11_rationale | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_rationale | lora |
| adversarial_qa_dbert_question_context_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbert_question_context_answer | lora |
| stream_aqua_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/stream_aqua_ii | lora |
| cos_e_v1_11_aligned_with_common_sense | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_aligned_with_common_sense | lora |
| trec_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/trec_1_0_0 | lora |
| quoref_Answer_Friend_Question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Answer_Friend_Question | lora |
| qasc_qa_with_separated_facts_1 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_1 | lora |
| adversarial_qa_dbert_answer_the_following_q | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbert_answer_the_following_q | lora |
| duorc_ParaphraseRC_extract_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_extract_answer | lora |
| quail_context_question_answer_description_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_question_answer_description_id | lora |
| ropes_prompt_bottom_no_hint | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_prompt_bottom_no_hint | lora |
| anli_r1_0_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/anli_r1_0_1_0 | lora |
| dream_read_the_following_conversation_and_answer_the_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dream_read_the_following_conversation_and_answer_the_question | lora |
| social_i_qa_I_was_wondering | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/social_i_qa_I_was_wondering | lora |
| glue_stsb_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_stsb_2_0_0 | lora |
| cos_e_v1_11_description_question_option_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_text | lora |
| wiqa_effect_with_string_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_effect_with_string_answer | lora |
| cot_qasc | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_qasc | lora |
| web_questions_whats_the_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/web_questions_whats_the_answer | lora |
| paws_wiki_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/paws_wiki_1_1_0 | lora |
| gem_dart_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/gem_dart_1_1_0 | lora |
| ropes_plain_background_situation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_plain_background_situation | lora |
| cosmos_qa_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cosmos_qa_1_0_0 | lora |
| quoref_Answer_Question_Given_Context | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Answer_Question_Given_Context | lora |
| quartz_read_passage_below_choose | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_read_passage_below_choose | lora |
| quail_context_description_question_answer_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_description_question_answer_text | lora |
| quoref_Answer_Test | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Answer_Test | lora |
| quartz_answer_question_below | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_answer_question_below | lora |
| quail_context_question_answer_description_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_question_answer_description_text | lora |
| wiki_qa_automatic_system | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_automatic_system | lora |
| gigaword_1_2_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/gigaword_1_2_0 | lora |
| wiqa_effect_with_label_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_effect_with_label_answer | lora |
| wiqa_what_is_the_final_step_of_the_following_process | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_what_is_the_final_step_of_the_following_process | lora |
| race_high_Is_this_the_right_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Is_this_the_right_answer | lora |
| duorc_ParaphraseRC_answer_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_answer_question | lora |
| multi_news_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/multi_news_1_0_0 | lora |
| wiki_hop_original_generate_subject | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_generate_subject | lora |
| squad_v1_1_3_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/squad_v1_1_3_0_0 | lora |
| wiki_hop_original_choose_best_object_interrogative_1 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_interrogative_1 | lora |
| wiqa_what_might_be_the_first_step_of_the_process | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_what_might_be_the_first_step_of_the_process | lora |
| anli_r3_0_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/anli_r3_0_1_0 | lora |
| super_glue_rte_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_rte_1_0_2 | lora |
| gem_e2e_nlg_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/gem_e2e_nlg_1_1_0 | lora |
| adversarial_qa_droberta_answer_the_following_q | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_droberta_answer_the_following_q | lora |
| adversarial_qa_droberta_tell_what_it_is | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_droberta_tell_what_it_is | lora |
| qasc_qa_with_separated_facts_3 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_3 | lora |
| quartz_having_read_above_passage | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_having_read_above_passage | lora |
| stream_qed_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/stream_qed_ii | lora |
| super_glue_record_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_record_1_0_2 | lora |
| wiki_qa_Topic_Prediction_Question_and_Answer_Pair | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_and_Answer_Pair | lora |
| qasc_qa_with_separated_facts_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_2 | lora |
| wmt16_translate_fi_en_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wmt16_translate_fi_en_1_0_0 | lora |
| adversarial_qa_dbidaf_question_context_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbidaf_question_context_answer | lora |
| coqa_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/coqa_1_0_0 | lora |
| wiki_bio_what_content | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_bio_what_content | lora |
| stream_qed | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/stream_qed | lora |
| quail_description_context_question_answer_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_description_context_question_answer_text | lora |
| quoref_Found_Context_Online | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Found_Context_Online | lora |
| cos_e_v1_11_generate_explanation_given_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_generate_explanation_given_text | lora |
| dbpedia_14_pick_one_category_for_the_following_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dbpedia_14_pick_one_category_for_the_following_text | lora |
| super_glue_wsc_fixed_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_wsc_fixed_1_0_2 | lora |
| math_dataset_algebra__linear_1d_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/math_dataset_algebra__linear_1d_1_0_0 | lora |
| duorc_ParaphraseRC_title_generation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_title_generation | lora |
| ropes_background_situation_middle | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_background_situation_middle | lora |
| word_segment | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/word_segment | lora |
| wmt16_translate_de_en_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wmt16_translate_de_en_1_0_0 | lora |
| cnn_dailymail_3_4_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cnn_dailymail_3_4_0 | lora |
| race_high_Read_the_article_and_answer_the_question_no_option_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Read_the_article_and_answer_the_question_no_option_ | lora |
| ropes_prompt_bottom_hint_beginning | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_prompt_bottom_hint_beginning | lora |
| wiqa_what_might_be_the_last_step_of_the_process | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_what_might_be_the_last_step_of_the_process | lora |
| quail_no_prompt_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_no_prompt_id | lora |
| cot_gsm8k | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_gsm8k | lora |
| duorc_ParaphraseRC_generate_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question | lora |
| quoref_What_Is_The_Answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_What_Is_The_Answer | lora |
| duorc_SelfRC_extract_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_extract_answer | lora |
| duorc_SelfRC_title_generation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_title_generation | lora |
| wiki_hop_original_explain_relation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_explain_relation | lora |
| dream_baseline | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dream_baseline | lora |
| definite_pronoun_resolution_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/definite_pronoun_resolution_1_1_0 | lora |
| quail_context_description_question_answer_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_description_question_answer_id | lora |
| quarel_do_not_use | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quarel_do_not_use | lora |
| quartz_paragraph_question_plain_concat | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_paragraph_question_plain_concat | lora |
| cot_ecqa | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_ecqa | lora |
| qasc_is_correct_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_is_correct_2 | lora |
| adversarial_qa_dbert_generate_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbert_generate_question | lora |
| wiki_hop_original_choose_best_object_affirmative_1 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_1 | lora |
| wiki_qa_Direct_Answer_to_Question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Direct_Answer_to_Question | lora |
| sciq_Multiple_Choice_Question_First | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/sciq_Multiple_Choice_Question_First | lora |
| wiki_qa_Decide_good_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Decide_good_answer | lora |
| duorc_ParaphraseRC_movie_director | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_movie_director | lora |
| race_middle_Write_a_multi_choice_question_options_given_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_options_given_ | lora |
| drop_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/drop_2_0_0 | lora |
| kilt_tasks_hotpotqa_complex_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_complex_question | lora |
| natural_questions_open_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/natural_questions_open_1_0_0 | lora |
| duorc_ParaphraseRC_decide_worth_it | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_decide_worth_it | lora |
| cot_sensemaking_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_sensemaking_ii | lora |
| kilt_tasks_hotpotqa_straighforward_qa | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_straighforward_qa | lora |
| sciq_Direct_Question_Closed_Book_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/sciq_Direct_Question_Closed_Book_ | lora |
| adversarial_qa_dbidaf_based_on | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbidaf_based_on | lora |
| race_high_Select_the_best_answer_generate_span_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Select_the_best_answer_generate_span_ | lora |
| cot_esnli | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_esnli | lora |
| cos_e_v1_11_question_option_description_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_question_option_description_id | lora |
| quartz_use_info_from_paragraph_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_use_info_from_paragraph_question | lora |
| race_high_Select_the_best_answer_no_instructions_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Select_the_best_answer_no_instructions_ | lora |
| wiki_qa_Topic_Prediction_Answer_Only | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Answer_Only | lora |
| wmt14_translate_fr_en_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wmt14_translate_fr_en_1_0_0 | lora |
| trivia_qa_rc_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/trivia_qa_rc_1_1_0 | lora |
| kilt_tasks_hotpotqa_final_exam | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_final_exam | lora |
| ropes_prompt_mix | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_prompt_mix | lora |
| snli_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/snli_1_1_0 | lora |
| quarel_testing_students | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quarel_testing_students | lora |
| true_case | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/true_case | lora |
| dream_generate_last_utterance | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dream_generate_last_utterance | lora |
| quoref_Context_Contains_Answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Context_Contains_Answer | lora |
| gem_web_nlg_en_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/gem_web_nlg_en_1_1_0 | lora |
| cot_strategyqa_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_strategyqa_ii | lora |
| quartz_answer_question_based_on | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_answer_question_based_on | lora |
| quail_context_question_description_answer_id | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_question_description_answer_id | lora |
| social_i_qa_Show_choices_and_generate_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/social_i_qa_Show_choices_and_generate_answer | lora |
| yelp_polarity_reviews_0_2_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/yelp_polarity_reviews_0_2_0 | lora |
| adversarial_qa_dbidaf_answer_the_following_q | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbidaf_answer_the_following_q | lora |
| wiqa_which_of_the_following_is_the_supposed_perturbation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_which_of_the_following_is_the_supposed_perturbation | lora |
| duorc_ParaphraseRC_generate_question_by_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_generate_question_by_answer | lora |
| race_middle_Is_this_the_right_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Is_this_the_right_answer | lora |
| qasc_qa_with_combined_facts_1 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_qa_with_combined_facts_1 | lora |
| duorc_ParaphraseRC_build_story_around_qa | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_ParaphraseRC_build_story_around_qa | lora |
| wiki_qa_exercise | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_exercise | lora |
| dbpedia_14_given_a_choice_of_categories_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dbpedia_14_given_a_choice_of_categories_ | lora |
| quartz_given_the_fact_answer_the_q | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_given_the_fact_answer_the_q | lora |
| ropes_read_background_situation | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_read_background_situation | lora |
| quail_context_question_description_answer_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_question_description_answer_text | lora |
| dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dbpedia_14_given_list_what_category_does_the_paragraph_belong_to | lora |
| race_middle_Taking_a_test | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Taking_a_test | lora |
| web_questions_short_general_knowledge_q | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/web_questions_short_general_knowledge_q | lora |
| race_middle_Select_the_best_answer_generate_span_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_generate_span_ | lora |
| wiki_bio_comprehension | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_bio_comprehension | lora |
| wiki_hop_original_choose_best_object_affirmative_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_2 | lora |
| kilt_tasks_hotpotqa_combining_facts | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_combining_facts | lora |
| duorc_SelfRC_generate_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_generate_question | lora |
| cos_e_v1_11_explain_why_human | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_explain_why_human | lora |
| quartz_use_info_from_question_paragraph | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quartz_use_info_from_question_paragraph | lora |
| glue_sst2_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_sst2_2_0_0 | lora |
| race_high_Write_a_multi_choice_question_options_given_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_options_given_ | lora |
| race_high_Write_a_multi_choice_question_for_the_following_article | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Write_a_multi_choice_question_for_the_following_article | lora |
| duorc_SelfRC_generate_question_by_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/duorc_SelfRC_generate_question_by_answer | lora |
| anli_r2_0_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/anli_r2_0_1_0 | lora |
| web_questions_potential_correct_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/web_questions_potential_correct_answer | lora |
| quarel_heres_a_story | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quarel_heres_a_story | lora |
| wiki_qa_Is_This_True_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Is_This_True_ | lora |
| adversarial_qa_droberta_generate_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_droberta_generate_question | lora |
| wiki_qa_Topic_Prediction_Question_Only | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_qa_Topic_Prediction_Question_Only | lora |
| race_high_Select_the_best_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_high_Select_the_best_answer | lora |
| quac_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quac_1_0_0 | lora |
| quail_context_description_question_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_description_question_text | lora |
| adversarial_qa_dbert_tell_what_it_is | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbert_tell_what_it_is | lora |
| super_glue_copa_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_copa_1_0_2 | lora |
| adversarial_qa_droberta_based_on | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_droberta_based_on | lora |
| aeslc_1_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/aeslc_1_0_0 | lora |
| cot_strategyqa | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_strategyqa | lora |
| race_middle_Read_the_article_and_answer_the_question_no_option_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Read_the_article_and_answer_the_question_no_option_ | lora |
| quail_description_context_question_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_description_context_question_text | lora |
| adversarial_qa_dbert_based_on | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbert_based_on | lora |
| wiki_bio_guess_person | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_bio_guess_person | lora |
| app_reviews_convert_to_rating | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/app_reviews_convert_to_rating | lora |
| wiki_hop_original_generate_object | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiki_hop_original_generate_object | lora |
| quail_context_question_description_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quail_context_question_description_text | lora |
| sciq_Direct_Question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/sciq_Direct_Question | lora |
| adversarial_qa_dbidaf_generate_question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/adversarial_qa_dbidaf_generate_question | lora |
| quoref_Read_And_Extract_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Read_And_Extract_ | lora |
| race_middle_Select_the_best_answer_no_instructions_ | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Select_the_best_answer_no_instructions_ | lora |
| stream_aqua | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/stream_aqua | lora |
| glue_qqp_2_0_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/glue_qqp_2_0_0 | lora |
| web_questions_get_the_answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/web_questions_get_the_answer | lora |
| app_reviews_categorize_rating_using_review | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/app_reviews_categorize_rating_using_review | lora |
| cot_esnli_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_esnli_ii | lora |
| quoref_Guess_Answer | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Guess_Answer | lora |
| quarel_choose_between | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quarel_choose_between | lora |
| gem_wiki_lingua_english_en_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/gem_wiki_lingua_english_en_1_1_0 | lora |
| qasc_qa_with_separated_facts_5 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_qa_with_separated_facts_5 | lora |
| ropes_plain_no_background | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/ropes_plain_no_background | lora |
| sciq_Multiple_Choice | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/sciq_Multiple_Choice | lora |
| fix_punct | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/fix_punct | lora |
| cos_e_v1_11_question_description_option_text | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cos_e_v1_11_question_description_option_text | lora |
| quoref_Given_Context_Answer_Question | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/quoref_Given_Context_Answer_Question | lora |
| gem_common_gen_1_1_0 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/gem_common_gen_1_1_0 | lora |
| race_middle_Write_a_multi_choice_question_for_the_following_article | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/race_middle_Write_a_multi_choice_question_for_the_following_article | lora |
| wiqa_what_is_the_missing_first_step | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/wiqa_what_is_the_missing_first_step | lora |
| dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to | lora |
| cot_ecqa_ii | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/cot_ecqa_ii | lora |
| kilt_tasks_hotpotqa_formulate | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/kilt_tasks_hotpotqa_formulate | lora |
| qasc_is_correct_1 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/qasc_is_correct_1 | lora |
| super_glue_cb_1_0_2 | microsoft/Phi-3-mini-4k-instruct | sordonia/flan-10k-flat/super_glue_cb_1_0_2 | lora |
Last updated on: 2024-05-10 21:22:47+00:00
| [
"SCIQ"
] |
TensorStack/EpicRealism_v5-onnx | TensorStack | null | [
"onnx",
"region:us"
] | "2024-07-24T00:12:54Z" | 2024-07-24T00:16:22+00:00 | 0 | 1 | ---
{}
---
# EpicRealism v5 - Onnx Olive DirectML Optimized
## Original Model
https://civitai.com/models/25694?modelVersionId=134065
## C# Inference Demo
https://github.com/TensorStack-AI/OnnxStack
```csharp
// Create Pipeline
var pipeline = StableDiffusionPipeline.CreatePipeline("D:\\Models\\EpicRealism_v5-onnx");
// Prompt
var promptOptions = new PromptOptions
{
Prompt = "Craft an image of a gallant prince, with a charming smile and a sword at his side, ready to embark on a quest"
};
// Scheduler options (assumed default construction; this variable was not defined in the original snippet, tune for your setup)
var schedulerOptions = new SchedulerOptions();
// Run pipeline
var result = await pipeline.GenerateImageAsync(promptOptions, schedulerOptions);
// Save Image Result
await result.SaveAsync("Result.png");
```
## Inference Result
 | [
"CRAFT"
] |
VATSAL1729/NewBERTopicCEEW | VATSAL1729 | text-classification | [
"bertopic",
"text-classification",
"region:us"
] | "2024-07-24T17:37:03Z" | 2024-07-30T18:56:16+00:00 | 0 | 0 | ---
library_name: bertopic
pipeline_tag: text-classification
tags:
- bertopic
---
# NewBERTopicCEEW
This is a [BERTopic](https://github.com/MaartenGr/BERTopic) model.
BERTopic is a flexible and modular topic modeling framework that allows for the generation of easily interpretable topics from large datasets.
## Usage
To use this model, please install BERTopic:
```
pip install -U bertopic
```
You can use the model as follows:
```python
from bertopic import BERTopic
topic_model = BERTopic.load("VATSAL1729/NewBERTopicCEEW")
topic_model.get_topic_info()
```
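Beyond `get_topic_info()`, a loaded model can also assign topics to new documents. A minimal sketch (not part of the original card; `new_docs` is a placeholder list you replace with your own text):

```python
from bertopic import BERTopic

# Load the fitted topic model from the Hugging Face Hub
topic_model = BERTopic.load("VATSAL1729/NewBERTopicCEEW")

# Placeholder documents; replace with your own text data
new_docs = [
    "Wear the recommended personal protective equipment before starting work.",
    "Report any machine malfunction to the supervisor immediately.",
]

# Predict the closest topic for each document; probabilities may be None or approximate
# because calculate_probabilities was False during training (see hyperparameters below)
topics, probs = topic_model.transform(new_docs)
print(topics)
```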
## Topic overview
* Number of topics: 430
* Number of training documents: 92948
<details>
<summary>Click here for an overview of all topics.</summary>
| Topic ID | Topic Keywords | Topic Frequency | Label |
|----------|----------------|-----------------|-------|
| -1 | tools - equipment - vehicle - materials - check | 42 | -1_tools_equipment_vehicle_materials |
| 0 | yarn - fabric - silver - making - loom | 46813 | 0_yarn_fabric_silver_making |
| 1 | tank - transfer - fruits - start - pulp | 2242 | 1_tank_transfer_fruits_start |
| 2 | solar - charging - power - discuss - battery | 1221 | 2_solar_charging_power_discuss |
| 3 | customers - types - event - di - identify | 608 | 3_customers_types_event_di |
| 4 | documentation - records - corrective - record - deviations | 508 | 4_documentation_records_corrective_record |
| 5 | students - clients - expedition - leader - tree | 466 | 5_students_clients_expedition_leader |
| 6 | guest - tourist - travel - room - hotel | 409 | 6_guest_tourist_travel_room |
| 7 | renders - communicate - appropriately - - | 344 | 7_renders_communicate_appropriately_ |
| 8 | design - nations - creating - model - product | 341 | 8_design_nations_creating_model |
| 9 | sustainable - environmental - century - st - professional | 315 | 9_sustainable_environmental_century_st |
| 10 | select - services - requirement - products - financial | 302 | 10_select_services_requirement_products |
| 11 | english - basic - kills - times - user | 280 | 11_english_basic_kills_times |
| 12 | promotion - considering - model - marketing - price | 275 | 12_promotion_considering_model_marketing |
| 13 | clarity - selection - politely - recruitment - contract | 262 | 13_clarity_selection_politely_recruitment |
| 14 | ready - getting - apprenticeship - hygienic - standards | 260 | 14_ready_getting_apprenticeship_hygienic |
| 15 | sales - leads - customer - customers - market | 259 | 15_sales_leads_customer_customers |
| 16 | read - notes - routine - understand - information | 259 | 16_read_notes_routine_understand |
| 17 | common - expenses - components - identify - disturbance | 259 | 17_common_expenses_components_identify |
| 18 | enterprises - research - entrepreneurship - assess - opportunities | 258 | 18_enterprises_research_entrepreneurship_assess |
| 19 | opportunity - bundles - mitigate - sources - service | 258 | 19_opportunity_bundles_mitigate_sources |
| 20 | learn - awareness - learning - critical - practice | 257 | 20_learn_awareness_learning_critical |
| 21 | write - messages - notes - short - recorded | 257 | 21_write_messages_notes_short |
| 22 | beverage - menu - serving - serve - guest | 257 | 22_beverage_menu_serving_serve |
| 23 | inclusion - diversity - team - work - user | 257 | 23_inclusion_diversity_team_work |
| 24 | register - apprenticeship - opportunities - guideline - requirements | 257 | 24_register_apprenticeship_opportunities_guideline |
| 25 | curriculum - create - professional - - | 256 | 25_curriculum_create_professional_ |
| 26 | opening - methods - online - apply - identity | 256 | 26_opening_methods_online_apply |
| 27 | replace - bearings - defects - worn - repair | 256 | 27_replace_bearings_defects_worn |
| 28 | assess - potential - opportunities - business - identify | 254 | 28_assess_potential_opportunities_business |
| 29 | laws - legal - essential - relevant - rights | 251 | 29_laws_legal_essential_relevant |
| 30 | assembly - assemble - measuring - assembling - instruments | 249 | 30_assembly_assemble_measuring_assembling |
| 31 | address - needs - customer - appropriately - identify | 247 | 31_address_needs_customer_appropriately |
| 32 | sexual - harassment - literacy - act - post | 247 | 32_sexual_harassment_literacy_act |
| 33 | internet - devices - operate - digital - operations | 247 | 33_internet_devices_operate_digital |
| 34 | features - entrepreneurship - basic - privacy - use | 247 | 34_features_entrepreneurship_basic_privacy |
| 35 | coal - career - getting - development - english | 246 | 35_coal_career_getting_development |
| 36 | employment - recognize - cancer - sign - century | 246 | 36_employment_recognize_cancer_sign |
| 37 | scrap - dispose - waste - aggregate - collection | 242 | 37_scrap_dispose_waste_aggregate |
| 38 | test - vehicle - electric - drag - engine | 238 | 38_test_vehicle_electric_drag |
| 39 | categories - aggregate - event - di - waste | 235 | 39_categories_aggregate_event_di |
| 40 | arranging - challenges - associated - sources - service | 235 | 40_arranging_challenges_associated_sources |
| 41 | disposal - hazardous - processes - ed - specie | 233 | 41_disposal_hazardous_processes_ed |
| 42 | savings - calculate - expenses - estimate - property | 232 | 42_savings_calculate_expenses_estimate |
| 43 | approach - authorities - laws - essential - rights | 231 | 43_approach_authorities_laws_essential |
| 44 | applications - features - devices - operate - digital | 231 | 44_applications_features_devices_operate |
| 45 | register - apprenticeship - opportunities - requirement - identify | 231 | 45_register_apprenticeship_opportunities_requirement |
| 46 | meeting - citizenship - cancer - employability - skill | 230 | 46_meeting_citizenship_cancer_employability |
| 47 | attitude - awareness - positive - behavior - creative | 229 | 47_attitude_awareness_positive_behavior |
| 48 | create - basic - tackle - damage - site | 229 | 48_create_basic_tackle_damage |
| 49 | contents - conversation - person - english - basic | 228 | 49_contents_conversation_person_english |
| 50 | search - jobs - suitable - apply - satisfaction | 226 | 50_search_jobs_suitable_apply |
| 51 | verbal - listening - settings - etiquette - active | 225 | 51_verbal_listening_settings_etiquette |
| 52 | terence - career - understand - di - job | 221 | 52_terence_career_understand_di |
| 53 | transactions - online - securely - safely - financial | 217 | 53_transactions_online_securely_safely |
| 54 | term - long - short - career - goals | 216 | 54_term_long_short_career |
| 55 | portal - citizenship - employability - constitutional - learning | 216 | 55_portal_citizenship_employability_constitutional |
| 56 | collaboration - platforms - media - social - actively | 216 | 56_collaboration_platforms_media_social |
| 57 | sanctioning - rectify - machine - required - check | 215 | 57_sanctioning_rectify_machine_required |
| 58 | ready - getting - apprenticeship - hygienic - standards | 210 | 58_ready_getting_apprenticeship_hygienic |
| 59 | pills - leakage - tasks - processes - activities | 195 | 59_pills_leakage_tasks_processes |
| 60 | causes - accidents - possible - risks - hazardous | 195 | 60_causes_accidents_possible_risks |
| 61 | inclusion - diversity - team - initiative - capacity | 193 | 61_inclusion_diversity_team_initiative |
| 62 | services - securely - safely - products - financial | 190 | 62_services_securely_safely_products |
| 63 | lab - sample - analysis - samples - appearance | 189 | 63_lab_sample_analysis_samples |
| 64 | agencies - search - recruitment - reliable - jobs | 188 | 64_agencies_search_recruitment_reliable |
| 65 | platforms - internet - media - entrepreneurship - social | 185 | 65_platforms_internet_media_entrepreneurship |
| 66 | transformer - inverted - diagram - wiring - installation | 178 | 66_transformer_inverted_diagram_wiring |
| 67 | guests - manner - language - professional - tone | 172 | 67_guests_manner_language_professional |
| 68 | malfunctioning - vibration - noise - maintenance - report | 167 | 68_malfunctioning_vibration_noise_maintenance |
| 69 | wearing - day - clothes - healthy - te | 165 | 69_wearing_day_clothes_healthy |
| 70 | wipe - dirt - vacuum - furniture - agent | 165 | 70_wipe_dirt_vacuum_furniture |
| 71 | sexual - harassment - literacy - issues - legal | 162 | 71_sexual_harassment_literacy_issues |
| 72 | dealing - safe - hazards - working - self | 159 | 72_dealing_safe_hazards_working |
| 73 | es - wear - regularly - pp - dispose | 158 | 73_es_wear_regularly_pp |
| 74 | optimism - ways - usage - tasks - processes | 158 | 74_optimism_ways_usage_tasks |
| 75 | clothing - protective - tasks - specie - disabilities | 158 | 75_clothing_protective_tasks_specie |
| 76 | advanced - sanitation - authority - issues - hygienic | 155 | 76_advanced_sanitation_authority_issues |
| 77 | deposit - reliable - location - identity - recyclable | 150 | 77_deposit_reliable_location_identity |
| 78 | respecting - integrity - ethics - values - personal | 147 | 78_respecting_integrity_ethics_values |
| 79 | respectively - portal - job - cleaned - regularly | 145 | 79_respectively_portal_job_cleaned |
| 80 | consumption - public - verbal - gather - sustainable | 143 | 80_consumption_public_verbal_gather |
| 81 | ill - similar - situation - contact - people | 143 | 81_ill_similar_situation_contact |
| 82 | generate - recyclable - hazardous - non - protocol | 141 | 82_generate_recyclable_hazardous_non |
| 83 | budget - revenue - actual - forecast - variance | 139 | 83_budget_revenue_actual_forecast |
| 84 | errors - freedom - rectify - posts - respective | 137 | 84_errors_freedom_rectify_posts |
| 85 | requests - respond - needs - manner - professional | 135 | 85_requests_respond_needs_manner |
| 86 | shift - previous - operator - oncoming - doesn | 135 | 86_shift_previous_operator_oncoming |
| 87 | designate - dispose - location - arrest - safely | 134 | 87_designate_dispose_location_arrest |
| 88 | iso - productivity - result - achieving - quality | 133 | 88_iso_productivity_result_achieving |
| 89 | completed - report - work - manager - led | 133 | 89_completed_report_work_manager |
| 90 | style - sensitivity - persons - disability - act | 132 | 90_style_sensitivity_persons_disability |
| 91 | handle - transactions - entire - log - methods | 131 | 91_handle_transactions_entire_log |
| 92 | thedesignated - breaches - emergencies - rescue - risk | 130 | 92_thedesignated_breaches_emergencies_rescue |
| 93 | soap - alcohol - hands - regularly - based | 129 | 93_soap_alcohol_hands_regularly |
| 94 | industries - jobs - employability - skill - various | 128 | 94_industries_jobs_employability_skill |
| 95 | face - telephone - written - mr - allergens | 124 | 95_face_telephone_written_mr |
| 96 | receiving - reporting - instructions - supervisor - requirements | 123 | 96_receiving_reporting_instructions_supervisor |
| 97 | rep - future - reference - load - es | 122 | 97_rep_future_reference_load |
| 98 | identifying - routine - potential - hazards - carry | 122 | 98_identifying_routine_potential_hazards |
| 99 | extinguishes - interesting - display - avoid - listening | 122 | 99_extinguishes_interesting_display_avoid |
| 100 | merits - performance - target - executive - designed | 120 | 100_merits_performance_target_executive |
| 101 | recyclable - hazardous - non - waste - identify | 120 | 101_recyclable_hazardous_non_waste |
| 102 | emergencies - accidents - listen - ideas - based | 119 | 102_emergencies_accidents_listen_ideas |
| 103 | unable - plug - pills - rectify - leakage | 119 | 103_unable_plug_pills_rectify |
| 104 | housekeeping - prevent - export - order - accidents | 118 | 104_housekeeping_prevent_export_order |
| 105 | respectively - portal - job - specialists - repeat | 118 | 105_respectively_portal_job_specialists |
| 106 | values - civic - ethics - sustainable - duties | 117 | 106_values_civic_ethics_sustainable |
| 107 | respecting - integrity - ethics - values - personal | 117 | 107_respecting_integrity_ethics_values |
| 108 | values - civic - ethics - sustainable - duties | 115 | 108_values_civic_ethics_sustainable |
| 109 | turned - connected - appliances - properly - electrical | 113 | 109_turned_connected_appliances_properly |
| 110 | supplier - date - document - raw - details | 113 | 110_supplier_date_document_raw |
| 111 | timeline - requirements - ed - specie - work | 109 | 111_timeline_requirements_ed_specie |
| 112 | decide - market - access - transportation - means | 109 | 112_decide_market_access_transportation |
| 113 | guard - healthy - maintain - drinking - type | 109 | 113_guard_healthy_maintain_drinking |
| 114 | comply - anti - disability - gender - people | 107 | 114_comply_anti_disability_gender |
| 115 | written - english - - - | 106 | 115_written_english__ |
| 116 | sexual - harassment - literacy - listening - active | 106 | 116_sexual_harassment_literacy_listening |
| 117 | delivery - vendor - suppliers - purchase - order | 106 | 117_delivery_vendor_suppliers_purchase |
| 118 | corps - sobbing - trolley - cop - moving | 105 | 118_corps_sobbing_trolley_cop |
| 119 | sta - training - assessment - progress - operators | 104 | 119_sta_training_assessment_progress |
| 120 | organise - security - sanitize - sanitizers - current | 104 | 120_organise_security_sanitize_sanitizers |
| 121 | dispose - non - recyclable - appropriately - waste | 103 | 121_dispose_non_recyclable_appropriately |
| 122 | protective - pp - personal - purpose - thespeci | 103 | 122_protective_pp_personal_purpose |
| 123 | clair - authorized - personnel - supervisors - seek | 102 | 123_clair_authorized_personnel_supervisors |
| 124 | electrolysis - manufacturing - discuss - pictures - video | 102 | 124_electrolysis_manufacturing_discuss_pictures |
| 125 | legislation - functions - solutions - accordance - organizational | 102 | 125_legislation_functions_solutions_accordance |
| 126 | increase - improve - science - existing - technique | 101 | 126_increase_improve_science_existing |
| 127 | organized - drills - participate - evacuation - procedures | 101 | 127_organized_drills_participate_evacuation |
| 128 | grease - dressing - lubricant - oil - oils | 100 | 128_grease_dressing_lubricant_oil |
| 129 | footgear - glasses - gorges - pp - safe | 99 | 129_footgear_glasses_gorges_pp |
| 130 | requests - respond - needs - manner - professional | 99 | 130_requests_respond_needs_manner |
| 131 | tourism - hospitality - persons - suggested - disability | 99 | 131_tourism_hospitality_persons_suggested |
| 132 | aid - rest - kit - administer - access | 99 | 132_aid_rest_kit_administer |
| 133 | statutory - regulatory - regulations - legislation - approval | 98 | 133_statutory_regulatory_regulations_legislation |
| 134 | passed - protocol - heating - timely - essential | 98 | 134_passed_protocol_heating_timely |
| 135 | walk - obstruction - periodic - assigned - free | 97 | 135_walk_obstruction_periodic_assigned |
| 136 | isolation - near - positive - traveling - distance | 96 | 136_isolation_near_positive_traveling |
| 137 | injury - unattended - denial - leaving - workstation | 94 | 137_injury_unattended_denial_leaving |
| 138 | price - market - tends - lines - best | 93 | 138_price_market_tends_lines |
| 139 | labelled - sealed - bag - plastic - es | 92 | 139_labelled_sealed_bag_plastic |
| 140 | authorized - personnel - supervisors - risks - potential | 92 | 140_authorized_personnel_supervisors_risks |
| 141 | period - position - protocol - assigned - correct | 91 | 141_period_position_protocol_assigned |
| 142 | handicrafts - carpet - sector - council - kill | 90 | 142_handicrafts_carpet_sector_council |
| 143 | observed - subordinates - immediate - sure - organisation | 90 | 143_observed_subordinates_immediate_sure |
| 144 | civic - recognize - responsibility - duties - citizenship | 89 | 144_civic_recognize_responsibility_duties |
| 145 | sanitizers - machineries - clean - recommended - agents | 89 | 145_sanitizers_machineries_clean_recommended |
| 146 | dishes - recipe - food - ingredient - prepared | 88 | 146_dishes_recipe_food_ingredient |
| 147 | scope - duties - problems - escapade - authority | 88 | 147_scope_duties_problems_escapade |
| 148 | agreed - schedules - running - victims - administer | 87 | 148_agreed_schedules_running_victims |
| 149 | tolerable - trenches - elevated - terms - shape | 86 | 149_tolerable_trenches_elevated_terms |
| 150 | crockery - established - articles - ascertain - superiors | 86 | 150_crockery_established_articles_ascertain |
| 151 | space - interact - superior - respect - customers | 86 | 151_space_interact_superior_respect |
| 152 | undertake - assessment - risk - view - point | 85 | 152_undertake_assessment_risk_view |
| 153 | yes - receive - orders - reporting - metals | 85 | 153_yes_receive_orders_reporting |
| 154 | involving - tends - issue - industries - research | 84 | 154_involving_tends_issue_industries |
| 155 | environment - related - management - procedures - follow | 84 | 155_environment_related_management_procedures |
| 156 | hydrogen - green - india - successful - generation | 84 | 156_hydrogen_green_india_successful |
| 157 | applicable - waste - aluminium - cardboard - bottles | 83 | 157_applicable_waste_aluminium_cardboard |
| 158 | extinguishes - comply - security - regulations - guideline | 83 | 158_extinguishes_comply_security_regulations |
| 159 | build - relationship - active - guests - personal | 83 | 159_build_relationship_active_guests |
| 160 | dusting - aggregate - event - di - waste | 83 | 160_dusting_aggregate_event_di |
| 161 | sanitizers - wash - alcohol - running - availability | 82 | 161_sanitizers_wash_alcohol_running |
| 162 | trash - cleanliness - regularly - schedule - following | 82 | 162_trash_cleanliness_regularly_schedule |
| 163 | preference - goals - given - active - management | 82 | 163_preference_goals_given_active |
| 164 | emergency - natural - participate - yes - promptly | 81 | 164_emergency_natural_participate_yes |
| 165 | gender - consideration - taking - came - certain | 81 | 165_gender_consideration_taking_came |
| 166 | communication - english - basic - kills - using | 81 | 166_communication_english_basic_kills |
| 167 | scope - company - regard - ha - possible | 81 | 167_scope_company_regard_ha |
| 168 | soap - alcohol - washing - hands - hand | 81 | 168_soap_alcohol_washing_hands |
| 169 | greet - promptly - organization - procedure - guests | 80 | 169_greet_promptly_organization_procedure |
| 170 | utilization - electricity - accurately - clearly - energy | 80 | 170_utilization_electricity_accurately_clearly |
| 171 | supervisor - received - week - completion - clair | 80 | 171_supervisor_received_week_completion |
| 172 | anxiety - stress - technique - perform - management | 80 | 172_anxiety_stress_technique_perform |
| 173 | regular - checks - intervals - preventive - organized | 80 | 173_regular_checks_intervals_preventive |
| 174 | policy - evacuation - times - comply - organizational | 79 | 174_policy_evacuation_times_comply |
| 175 | recyclable - tourism - hospitality - aggregate - hazardous | 79 | 175_recyclable_tourism_hospitality_aggregate |
| 176 | civic - recognize - responsibility - duties - citizenship | 79 | 176_civic_recognize_responsibility_duties |
| 177 | wastes - recyclable - anxiety - stress - hazardous | 79 | 177_wastes_recyclable_anxiety_stress |
| 178 | recalling - concerned - active - person - workplace | 78 | 178_recalling_concerned_active_person |
| 179 | adopt - car - cards - behavior - gender | 78 | 179_adopt_car_cards_behavior |
| 180 | recognise - alarms - measures - signs - interpret | 78 | 180_recognise_alarms_measures_signs |
| 181 | handled - supervisors - repairs - problems - escapade | 77 | 181_handled_supervisors_repairs_problems |
| 182 | doctor - treatment - illness - frothy - case | 77 | 182_doctor_treatment_illness_frothy |
| 183 | celebration - trials - attained - testing - tests | 76 | 183_celebration_trials_attained_testing |
| 184 | video - excessive - recording - consumption - reports | 76 | 184_video_excessive_recording_consumption |
| 185 | shows - way - respect - colleagues - work | 76 | 185_shows_way_respect_colleagues |
| 186 | programs - group - participate - organized - drills | 76 | 186_programs_group_participate_organized |
| 187 | involved - improving - friendly - response - organization | 76 | 187_involved_improving_friendly_response |
| 188 | productivity - fuel - conservation - material - machine | 75 | 188_productivity_fuel_conservation_material |
| 189 | guide - motivate - organisation - operational - members | 75 | 189_guide_motivate_organisation_operational |
| 190 | kitchen - table - cleanliness - utensils - glassware | 75 | 190_kitchen_table_cleanliness_utensils |
| 191 | aid - rest - external - automatic - provide | 75 | 191_aid_rest_external_automatic |
| 192 | risks - potential - monitor - processes - workplace | 75 | 192_risks_potential_monitor_processes |
| 193 | task - tools - machine - work - panel | 75 | 193_task_tools_machine_work |
| 194 | disposing - collecting - storing - lubricant - battery | 74 | 194_disposing_collecting_storing_lubricant |
| 195 | unable - plug - pills - rectify - leakage | 74 | 195_unable_plug_pills_rectify |
| 196 | tourism - hospitality - leave - walk - revenue | 74 | 196_tourism_hospitality_leave_walk |
| 197 | multi - supporting - share - task - necessary | 73 | 197_multi_supporting_share_task |
| 198 | absentees - loss - productivity - illness - risk | 73 | 198_absentees_loss_productivity_illness |
| 199 | compliance - guideline - regulatory - board - group | 73 | 199_compliance_guideline_regulatory_board |
| 200 | means - needed - clear - customers - colleagues | 73 | 200_means_needed_clear_customers |
| 201 | network - automatic - devices - fro - bad | 73 | 201_network_automatic_devices_fro |
| 202 | minimize - interests - attributes - damage - duties | 73 | 202_minimize_interests_attributes_damage |
| 203 | implementing - recognise - programs - existing - planning | 72 | 203_implementing_recognise_programs_existing |
| 204 | wedding - saw - accessories - technicians - operators | 72 | 204_wedding_saw_accessories_technicians |
| 205 | age - group - facilities - needs - diverse | 72 | 205_age_group_facilities_needs |
| 206 | sessions - organise - faced - drills - challenges | 72 | 206_sessions_organise_faced_drills |
| 207 | impact - steps - environmental - operations - related | 72 | 207_impact_steps_environmental_operations |
| 208 | malfunctions - notify - stoppage - supervisor - desk | 72 | 208_malfunctions_notify_stoppage_supervisor |
| 209 | consult - resolve - science - diversity - minimize | 71 | 209_consult_resolve_science_diversity |
| 210 | going - asked - undertake - response - training | 71 | 210_going_asked_undertake_response |
| 211 | adhere - damage - workplace - standards - material | 71 | 211_adhere_damage_workplace_standards |
| 212 | organisation - banking - policies - working - mobile | 71 | 212_organisation_banking_policies_working |
| 213 | departments - resolve - communicating - disposing - interact | 70 | 213_departments_resolve_communicating_disposing |
| 214 | politely - colleagues - members - team - encourage | 70 | 214_politely_colleagues_members_team |
| 215 | shift - duty - log - previous - operator | 70 | 215_shift_duty_log_previous |
| 216 | thedesignated - breaches - security - health - identity | 70 | 216_thedesignated_breaches_security_health |
| 217 | recyclable - conservation - reliable - energy - location | 70 | 217_recyclable_conservation_reliable_energy |
| 218 | involving - stay - craft - updated - resources | 70 | 218_involving_stay_craft_updated |
| 219 | site - hygienic - maintain - work - leaving | 69 | 219_site_hygienic_maintain_work |
| 220 | doubts - clarify - interact - compliance - usage | 69 | 220_doubts_clarify_interact_compliance |
| 221 | aspects - clearly - department - communicate - actively | 69 | 221_aspects_clearly_department_communicate |
| 222 | communicating - good - follow - - | 69 | 222_communicating_good_follow_ |
| 223 | personaland - life - handicrafts - carpet - sector | 69 | 223_personaland_life_handicrafts_carpet |
| 224 | incorporate - experience - improve - feedback - guest | 69 | 224_incorporate_experience_improve_feedback |
| 225 | formats - recording - pass - timely - essential | 68 | 225_formats_recording_pass_timely |
| 226 | language - responsible - behavior - etiquette - demonstrate | 68 | 226_language_responsible_behavior_etiquette |
| 227 | registration - food - policy - certain - pertaining | 68 | 227_registration_food_policy_certain |
| 228 | coordinating - tender - outside - include - input | 68 | 228_coordinating_tender_outside_include |
| 229 | reporting - act - person - condition - concerned | 68 | 229_reporting_act_person_condition |
| 230 | incentive - indicator - output - target - performance | 67 | 230_incentive_indicator_output_target |
| 231 | come - minutes - adopt - spot - continuous | 67 | 231_come_minutes_adopt_spot |
| 232 | proof - visitors - residual - boots - aprons | 67 | 232_proof_visitors_residual_boots |
| 233 | hygienic - members - gap - personal - team | 67 | 233_hygienic_members_gap_personal |
| 234 | civic - recognize - responsibility - duties - citizenship | 67 | 234_civic_recognize_responsibility_duties |
| 235 | line - guideline - activities - carry - procedures | 67 | 235_line_guideline_activities_carry |
| 236 | workstation - regularly - clean - cleaned - equipment | 66 | 236_workstation_regularly_clean_cleaned |
| 237 | years - know - reasons - protective - gorges | 66 | 237_years_know_reasons_protective |
| 238 | track - wished - audit - verify - documents | 66 | 238_track_wished_audit_verify |
| 239 | eye - kit - extinguished - phone - wash | 66 | 239_eye_kit_extinguished_phone |
| 240 | free - person - minor - demonstrate - attend | 65 | 240_free_person_minor_demonstrate |
| 241 | comply - applicable - instructions - related - health | 65 | 241_comply_applicable_instructions_related |
| 242 | times - sure - data - guest - make | 65 | 242_times_sure_data_guest |
| 243 | personaland - life - english - professional - basic | 65 | 243_personaland_life_english_professional |
| 244 | gloves - gear - mass - slip - gorges | 65 | 244_gloves_gear_mass_slip |
| 245 | state - location - names - responsible - refer | 64 | 245_state_location_names_responsible |
| 246 | prevention - sexual - harassment - adhere - company | 64 | 246_prevention_sexual_harassment_adhere |
| 247 | immediately - superior - etiquette - authority - concerned | 64 | 247_immediately_superior_etiquette_authority |
| 248 | handicrafts - carpet - sector - council - kill | 63 | 248_handicrafts_carpet_sector_council |
| 249 | recognise - incidents - harassment - names - authority | 63 | 249_recognise_incidents_harassment_names |
| 250 | objects - noise - ages - resolved - large | 63 | 250_objects_noise_ages_resolved |
| 251 | situations - adjust - event - di - work | 63 | 251_situations_adjust_event_di |
| 252 | seek - assistance - performing - superiors - display | 63 | 252_seek_assistance_performing_superiors |
| 253 | care - way - handle - machinery - correct | 63 | 253_care_way_handle_machinery |
| 254 | roles - responsibility - perform - situations - handle | 62 | 254_roles_responsibility_perform_situations |
| 255 | lines - range - unique - develop - high | 62 | 255_lines_range_unique_develop |
| 256 | aid - rest - corrective - accident - appropriately | 62 | 256_aid_rest_corrective_accident |
| 257 | illness - immediately - concerned - regarding - authorities | 61 | 257_illness_immediately_concerned_regarding |
| 258 | wastewater - treatment - water - pumps - showcase | 61 | 258_wastewater_treatment_water_pumps |
| 259 | compression - suction - thing - valves - piping | 61 | 259_compression_suction_thing_valves |
| 260 | ideas - new - develop - procedures - work | 61 | 260_ideas_new_develop_procedures |
| 261 | measures - taking - health - early - user | 61 | 261_measures_taking_health_early |
| 262 | organisational - repair - protocol - risks - prevent | 60 | 262_organisational_repair_protocol_risks |
| 263 | fundamentals - management - waste - follow - | 60 | 263_fundamentals_management_waste_follow |
| 264 | shortage - need - raw - time - materials | 60 | 264_shortage_need_raw_time |
| 265 | communication - english - basic - kills - using | 60 | 265_communication_english_basic_kills |
| 266 | minimize - actions - risks - self - health | 60 | 266_minimize_actions_risks_self |
| 267 | rescue - hazard - apply - technique - introduction | 60 | 267_rescue_hazard_apply_technique |
| 268 | pp - requirement - appropriate - use - jacques | 60 | 268_pp_requirement_appropriate_use |
| 269 | way - respect - customers - shows - members | 60 | 269_way_respect_customers_shows |
| 270 | strong - protect - sensitive - basis - change | 59 | 270_strong_protect_sensitive_basis |
| 271 | anxiety - stress - employees - support - technique | 59 | 271_anxiety_stress_employees_support |
| 272 | assistance - persons - disability - provide - asked | 59 | 272_assistance_persons_disability_provide |
| 273 | superiors - etiquette - interesting - colleagues - proper | 59 | 273_superiors_etiquette_interesting_colleagues |
| 274 | daily - submit - formats - performance - prescribed | 58 | 274_daily_submit_formats_performance |
| 275 | lights - switch - textile - disinfectant - response | 58 | 275_lights_switch_textile_disinfectant |
| 276 | advance - secured - issue - understanding - inform | 58 | 276_advance_secured_issue_understanding |
| 277 | agencies - search - recruitment - reliable - jobs | 57 | 277_agencies_search_recruitment_reliable |
| 278 | personaland - life - tourism - hospitality - comply | 57 | 278_personaland_life_tourism_hospitality |
| 279 | gap - relating - ones - manufacturing - accurately | 57 | 279_gap_relating_ones_manufacturing |
| 280 | platforms - internet - media - entrepreneurship - social | 57 | 280_platforms_internet_media_entrepreneurship |
| 281 | payment - cash - present - voice - advance | 57 | 281_payment_cash_present_voice |
| 282 | direction - accordingly - environmental - guideline - hazards | 56 | 282_direction_accordingly_environmental_guideline |
| 283 | hazard - warning - displayed - harmful - explosive | 56 | 283_hazard_warning_displayed_harmful |
| 284 | recommended - damage - handling - procedure - material | 56 | 284_recommended_damage_handling_procedure |
| 285 | responsible - uses - behaviour - adopt - resources | 56 | 285_responsible_uses_behaviour_adopt |
| 286 | overcome - asked - challenges - help - members | 56 | 286_overcome_asked_challenges_help |
| 287 | signals - notify - cleaning - sanitation - wet | 56 | 287_signals_notify_cleaning_sanitation |
| 288 | face - telephone - clearly - written - including | 56 | 288_face_telephone_clearly_written |
| 289 | zero - accident - workplace - ensure - characteristics | 56 | 289_zero_accident_workplace_ensure |
| 290 | elevated - trenches - places - handicrafts - ned | 55 | 290_elevated_trenches_places_handicrafts |
| 291 | communication - english - basic - kills - using | 55 | 291_communication_english_basic_kills |
| 292 | whipped - broken - metals - surfaces - cylinders | 54 | 292_whipped_broken_metals_surfaces |
| 293 | deals - plans - guest - professional - team | 54 | 293_deals_plans_guest_professional |
| 294 | oil - level - engine - levels - lubricant | 54 | 294_oil_level_engine_levels |
| 295 | actions - responsibility - email - send - account | 54 | 295_actions_responsibility_email_send |
| 296 | layer - footgear - glasses - gorges - mass | 54 | 296_layer_footgear_glasses_gorges |
| 297 | scheme - schemes - government - reach - mrs | 54 | 297_scheme_schemes_government_reach |
| 298 | hand - technique - place - hygienic - practices | 54 | 298_hand_technique_place_hygienic |
| 299 | strategics - pricking - revenue - leak - plans | 54 | 299_strategics_pricking_revenue_leak |
| 300 | agra - residue - trading - supply - procurement | 53 | 300_agra_residue_trading_supply |
| 301 | breaches - thedesignated - security - health - encourage | 53 | 301_breaches_thedesignated_security_health |
| 302 | industries - jobs - employability - skill - various | 53 | 302_industries_jobs_employability_skill |
| 303 | aspects - accounts - book - compile - business | 53 | 303_aspects_accounts_book_compile |
| 304 | fall - failure - emergency - ground - response | 53 | 304_fall_failure_emergency_ground |
| 305 | concerns - interact - reporting - behaviour - possible | 53 | 305_concerns_interact_reporting_behaviour |
| 306 | immediate - case - action - appropriate - formulate | 53 | 306_immediate_case_action_appropriate |
| 307 | hazard - free - working - clean - area | 53 | 307_hazard_free_working_clean |
| 308 | role - assigned - duties - job - drinking | 53 | 308_role_assigned_duties_job |
| 309 | handled - supervisors - problems - escapade - | 53 | 309_handled_supervisors_problems_escapade |
| 310 | protection - information - ensure - les - inappropriate | 52 | 310_protection_information_ensure_les |
| 311 | competitors - leak - plans - prevent - new | 52 | 311_competitors_leak_plans_prevent |
| 312 | communicating - good - follow - - | 51 | 312_communicating_good_follow_ |
| 313 | controlling - operational - risk - achieve - company | 51 | 313_controlling_operational_risk_achieve |
| 314 | maintenance - preventive - addressed - schedule - support | 51 | 314_maintenance_preventive_addressed_schedule |
| 315 | victims - aid - rest - procedure - provide | 51 | 315_victims_aid_rest_procedure |
| 316 | drills - attend - promptly - respond - training | 51 | 316_drills_attend_promptly_respond |
| 317 | pitch - tone - politely - care - language | 51 | 317_pitch_tone_politely_care |
| 318 | ill - similar - contact - situation - housekeeping | 51 | 318_ill_similar_contact_situation |
| 319 | dealing - yes - emergencies - type - follow | 51 | 319_dealing_yes_emergencies_type |
| 320 | maintaining - productivity - achieve - health - work | 51 | 320_maintaining_productivity_achieve_health |
| 321 | people - disability - barriers - rope - menu | 51 | 321_people_disability_barriers_rope |
| 322 | ones - responsibility - cleaning - carry - maintenance | 51 | 322_ones_responsibility_cleaning_carry |
| 323 | renders - communicate - appropriately - - | 51 | 323_renders_communicate_appropriately_ |
| 324 | extinguishes - correctly - yes - accurate - types | 51 | 324_extinguishes_correctly_yes_accurate |
| 325 | risks - potential - processes - monitor - workplace | 50 | 325_risks_potential_processes_monitor |
| 326 | disciplinary - attach - implementation - rules - clothing | 50 | 326_disciplinary_attach_implementation_rules |
| 327 | wear - department - protective - personal - production | 50 | 327_wear_department_protective_personal |
| 328 | role - shed - multiple - prevention - utilize | 50 | 328_role_shed_multiple_prevention |
| 329 | section - mask - printing - glasses - gloves | 50 | 329_section_mask_printing_glasses |
| 330 | written - english - - - | 50 | 330_written_english__ |
| 331 | substances - glass - liquid - gases - cooking | 50 | 331_substances_glass_liquid_gases |
| 332 | company - properly - communicate - policies - workplace | 50 | 332_company_properly_communicate_policies |
| 333 | cause - allergens - separately - cross - items | 50 | 333_cause_allergens_separately_cross |
| 334 | regular - place - cleaning - practices - using | 50 | 334_regular_place_cleaning_practices |
| 335 | duty - problems - identity - assigned - superior | 50 | 335_duty_problems_identity_assigned |
| 336 | shoes - unit - wear - damage - avoid | 50 | 336_shoes_unit_wear_damage |
| 337 | interesting - etiquette - members - maintain - team | 50 | 337_interesting_etiquette_members_maintain |
| 338 | denial - privacy - sta - followed - members | 50 | 338_denial_privacy_sta_followed |
| 339 | close - contact - illness - regarding - people | 50 | 339_close_contact_illness_regarding |
| 340 | causes - prevention - associated - risks - giving | 49 | 340_causes_prevention_associated_risks |
| 341 | roof - mines - straight - exposed - scale | 49 | 341_roof_mines_straight_exposed |
| 342 | freedom - acting - situations - creative - avoid | 49 | 342_freedom_acting_situations_creative |
| 343 | badges - caps - shoes - plans - mass | 49 | 343_badges_caps_shoes_plans |
| 344 | help - disability - people - workplace - required | 49 | 344_help_disability_people_workplace |
| 345 | drainage - lighting - transportation - distribution - facilities | 49 | 345_drainage_lighting_transportation_distribution |
| 346 | harness - ear - shoes - gorges - mass | 49 | 346_harness_ear_shoes_gorges |
| 347 | laid - india - list - fossa - establish | 49 | 347_laid_india_list_fossa |
| 348 | extinguishes - types - operate - control - event | 49 | 348_extinguishes_types_operate_control |
| 349 | survey - site - map - sensibility - soil | 49 | 349_survey_site_map_sensibility |
| 350 | medical - real - situation - promptly - accident | 49 | 350_medical_real_situation_promptly |
| 351 | discharge - illness - shift - report - | 48 | 351_discharge_illness_shift_report |
| 352 | workers - respect - colleagues - - | 48 | 352_workers_respect_colleagues_ |
| 353 | ethical - integrity - behaviour - personal - maintain | 48 | 353_ethical_integrity_behaviour_personal |
| 354 | extinguished - type - appropriate - use - near | 48 | 354_extinguished_type_appropriate_use |
| 355 | communicate - plan - safety - plans - smudge | 48 | 355_communicate_plan_safety_plans |
| 356 | motivate - planning - employees - industry - product | 48 | 356_motivate_planning_employees_industry |
| 357 | crockery - established - articles - clean - standards | 48 | 357_crockery_established_articles_clean |
| 358 | display - media - responsible - platforms - behaviour | 48 | 358_display_media_responsible_platforms |
| 359 | danger - incidents - reduce - immediate - accidents | 48 | 359_danger_incidents_reduce_immediate |
| 360 | driving - road - tra - mr - excessive | 47 | 360_driving_road_tra_mr |
| 361 | prevent - comply - accidents - procedures - safety | 47 | 361_prevent_comply_accidents_procedures |
| 362 | bleeding - choking - burns - shock - administer | 47 | 362_bleeding_choking_burns_shock |
| 363 | le - language - convert - code - needed | 47 | 363_le_language_convert_code |
| 364 | moving - lifting - supplies - equipment - recommendations | 47 | 364_moving_lifting_supplies_equipment |
| 365 | cent - active - communication - workplace - user | 47 | 365_cent_active_communication_workplace |
| 366 | approval - ideas - superior - improving - adopt | 47 | 366_approval_ideas_superior_improving |
| 367 | undertaking - supporting - roof - rules - support | 47 | 367_undertaking_supporting_roof_rules |
| 368 | deal - protocol - organisational - hazards - appropriately | 47 | 368_deal_protocol_organisational_hazards |
| 369 | turned - connected - appliances - electrical - properly | 47 | 369_turned_connected_appliances_electrical |
| 370 | role - process - desired - responsibility - performance | 47 | 370_role_process_desired_responsibility |
| 371 | economic - moving - lifting - supplies - practice | 46 | 371_economic_moving_lifting_supplies |
| 372 | alcohol - washing - hands - regularly - pp | 46 | 372_alcohol_washing_hands_regularly |
| 373 | dimension - square - depth - controlled - maximum | 46 | 373_dimension_square_depth_controlled |
| 374 | implement - performance - applications - features - devices | 46 | 374_implement_performance_applications_features |
| 375 | authority - concerned - case - report - eye | 46 | 375_authority_concerned_case_report |
| 376 | bleeding - choking - burns - shock - electric | 46 | 376_bleeding_choking_burns_shock |
| 377 | textile - sector - rules - role - assigned | 46 | 377_textile_sector_rules_role |
| 378 | lifting - handling - correct - procedures - use | 46 | 378_lifting_handling_correct_procedures |
| 379 | consider - superior - task - resolve - terence | 46 | 379_consider_superior_task_resolve |
| 380 | posts - problems - document - management - report | 46 | 380_posts_problems_document_management |
| 381 | angle - engineering - terence - mechanical - drawing | 46 | 381_angle_engineering_terence_mechanical |
| 382 | cylinder - bones - shapes - interpret - visitors | 45 | 382_cylinder_bones_shapes_interpret |
| 383 | nail - temple - sample - collect - inspection | 45 | 383_nail_temple_sample_collect |
| 384 | handled - superior - problems - escapade - | 45 | 384_handled_superior_problems_escapade |
| 385 | handling - promote - followed - acids - frame | 45 | 385_handling_promote_followed_acids |
| 386 | disinfectant - solution - recommended - cleaning - clean | 45 | 386_disinfectant_solution_recommended_cleaning |
| 387 | terms - communicating - policy - display - industry | 45 | 387_terms_communicating_policy_display |
| 388 | foundation - mounting - civil - structural - decide | 45 | 388_foundation_mounting_civil_structural |
| 389 | cook - advantages - discuss - portable - traditional | 45 | 389_cook_advantages_discuss_portable |
| 390 | interact - superior - supervisor - gets - blocked | 44 | 390_interact_superior_supervisor_gets |
| 391 | easily - accessible - avoiding - moisture - drawings | 44 | 391_easily_accessible_avoiding_moisture |
| 392 | geometric - drawings - interpret - departments - resolve | 44 | 392_geometric_drawings_interpret_departments |
| 393 | important - shapes - dimensions - engineering - interpret | 44 | 393_important_shapes_dimensions_engineering |
| 394 | risks - hazards - driving - breaches - prevention | 44 | 394_risks_hazards_driving_breaches |
| 395 | stacked - separately - carried - surface - coal | 44 | 395_stacked_separately_carried_surface |
| 396 | traveling - incise - roads - post - practices | 44 | 396_traveling_incise_roads_post |
| 397 | comply - security - health - requirements - healthy | 44 | 397_comply_security_health_requirements |
| 398 | occupational - gases - dust - water - hazards | 44 | 398_occupational_gases_dust_water |
| 399 | view - working - point - importance - build | 44 | 399_view_working_point_importance |
| 400 | complaints - address - guest - reasons - actively | 44 | 400_complaints_address_guest_reasons |
| 401 | date - manufacture - expire - secondary - primary | 44 | 401_date_manufacture_expire_secondary |
| 402 | future - meeting - current - market - cancer | 44 | 402_future_meeting_current_market |
| 403 | wash - intervals - alcohol - washing - hands | 44 | 403_wash_intervals_alcohol_washing |
| 404 | personaland - life - council - english - professional | 44 | 404_personaland_life_council_english |
| 405 | shapes - terence - di - identify - | 44 | 405_shapes_terence_di_identify |
| 406 | ear - body - plugs - eye - mask | 43 | 406_ear_body_plugs_eye |
| 407 | isolated - sealed - near - working - operating | 43 | 407_isolated_sealed_near_working |
| 408 | time - report - early - haste - work | 43 | 408_time_report_early_haste |
| 409 | aware - sanitation - regulations - following - hygienic | 43 | 409_aware_sanitation_regulations_following |
| 410 | drums - sequence - selection - operations - identify | 43 | 410_drums_sequence_selection_operations |
| 411 | hanover - deliver - reasons - completed - supervisor | 43 | 411_hanover_deliver_reasons_completed |
| 412 | electricity - optimism - ways - usage - energy | 43 | 412_electricity_optimism_ways_usage |
| 413 | applying - application - loan - medium - term | 43 | 413_applying_application_loan_medium |
| 414 | half - weekly - annual - monthly - periodic | 43 | 414_half_weekly_annual_monthly |
| 415 | existing - initiative - methods - meat - trimmed | 43 | 415_existing_initiative_methods_meat |
| 416 | etiquette - feedback - provide - members - professional | 43 | 416_etiquette_feedback_provide_members |
| 417 | checklist - audit - conduct - hygienic - workplace | 43 | 417_checklist_audit_conduct_hygienic |
| 418 | resolved - malfunctioning - vibration - noise - actively | 43 | 418_resolved_malfunctioning_vibration_noise |
| 419 | recyclable - conservation - reliable - energy - location | 42 | 419_recyclable_conservation_reliable_energy |
| 420 | cleanliness - hygienic - personal - human - high | 42 | 420_cleanliness_hygienic_personal_human |
| 421 | maintenance - post - activities - supplier - concerning | 42 | 421_maintenance_post_activities_supplier |
| 422 | services - securely - safely - products - financial | 42 | 422_services_securely_safely_products |
| 423 | arrangement - came - group - nishedproducts - inspecting | 42 | 423_arrangement_came_group_nishedproducts |
| 424 | manage - control - user - competent - able | 42 | 424_manage_control_user_competent |
| 425 | processing - food - safe - area - hygienic | 42 | 425_processing_food_safe_area |
| 426 | term - long - short - career - goals | 42 | 426_term_long_short_career |
| 427 | notices - refer - government - point - health | 42 | 427_notices_refer_government_point |
| 428 | chart - refer - produced - product - process | 42 | 428_chart_refer_produced_product |
</details>
## Training hyperparameters
* calculate_probabilities: False
* language: None
* low_memory: False
* min_topic_size: 10
* n_gram_range: (1, 1)
* nr_topics: None
* seed_topic_list: None
* top_n_words: 10
* verbose: True
* zeroshot_min_similarity: 0.7
* zeroshot_topic_list: None
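For context, these settings map directly onto the `BERTopic` constructor, as sketched below. This is a hedged illustration rather than the original training script: the sentence-transformers backbone shown is an assumption, since the card does not record which embedding model was used.

```python
from bertopic import BERTopic
from sentence_transformers import SentenceTransformer

# Assumption: the embedding backbone is not documented in this card;
# "all-MiniLM-L6-v2" is used here purely for illustration.
embedding_model = SentenceTransformer("all-MiniLM-L6-v2")

topic_model = BERTopic(
    embedding_model=embedding_model,
    calculate_probabilities=False,
    language=None,
    low_memory=False,
    min_topic_size=10,
    n_gram_range=(1, 1),
    nr_topics=None,
    seed_topic_list=None,
    top_n_words=10,
    verbose=True,
    zeroshot_min_similarity=0.7,
    zeroshot_topic_list=None,
)

# topics, probs = topic_model.fit_transform(docs)  # docs: list of training documents
```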
## Framework versions
* Numpy: 1.26.4
* HDBSCAN: 0.8.37
* UMAP: 0.5.6
* Pandas: 2.2.2
* Scikit-Learn: 1.2.2
* Sentence-transformers: 3.0.1
* Transformers: 4.42.3
* Numba: 0.60.0
* Plotly: 5.18.0
* Python: 3.10.13
| [
"CRAFT"
] |
KrithikV/phi-3-mini-LoRA-MEDQA-V2 | KrithikV | null | [
"peft",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"base_model:microsoft/Phi-3-mini-4k-instruct",
"base_model:adapter:microsoft/Phi-3-mini-4k-instruct",
"license:mit",
"region:us"
] | "2024-07-30T16:40:07Z" | 2024-07-30T16:40:11+00:00 | 0 | 0 | ---
base_model: microsoft/Phi-3-mini-4k-instruct
library_name: peft
license: mit
tags:
- trl
- sft
- generated_from_trainer
model-index:
- name: phi-3-mini-LoRA-MEDQA-V2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# phi-3-mini-LoRA-MEDQA-V2
This model is a fine-tuned version of [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6687
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
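For reference, these settings correspond to a `transformers.TrainingArguments` configuration roughly as sketched below. This is a hedged reconstruction, not the published training script; the dataset, tokenizer, and LoRA configuration are omitted because the card does not document them.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; output_dir is illustrative.
training_args = TrainingArguments(
    output_dir="phi-3-mini-LoRA-MEDQA-V2",
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,  # effective batch size: 32 * 4 = 128
    num_train_epochs=3,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the default AdamW optimizer.
)
```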
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.7097 | 1.4493 | 100 | 0.6728 |
| 0.667 | 2.8986 | 200 | 0.6687 |
### Framework versions
- PEFT 0.12.0
- Transformers 4.43.3
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1 | [
"MEDQA"
] |
KrithikV/phi-3-mini-LoRA-MEDQA-V3 | KrithikV | null | [
"peft",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"base_model:microsoft/Phi-3-mini-4k-instruct",
"base_model:adapter:microsoft/Phi-3-mini-4k-instruct",
"license:mit",
"region:us"
] | "2024-08-01T02:25:16Z" | 2024-08-01T02:25:26+00:00 | 0 | 0 | ---
base_model: microsoft/Phi-3-mini-4k-instruct
library_name: peft
license: mit
tags:
- trl
- sft
- generated_from_trainer
model-index:
- name: phi-3-mini-LoRA-MEDQA-V3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# phi-3-mini-LoRA-MEDQA-V3
This model is a fine-tuned version of [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6086
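Because this repository ships a LoRA adapter rather than merged weights, inference typically means loading the base model and attaching the adapter with PEFT. The sketch below is an illustration, not part of the original card; it assumes a GPU and that `accelerate` is installed for `device_map="auto"`.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-4k-instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-4k-instruct")

# Attach this LoRA adapter on top of the frozen base model.
model = PeftModel.from_pretrained(base, "KrithikV/phi-3-mini-LoRA-MEDQA-V3")

# Example prompt (hypothetical); Phi-3 expects its chat template.
messages = [{"role": "user", "content": "What are common symptoms of iron-deficiency anemia?"}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(base.device)
output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```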
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.6898 | 0.3633 | 100 | 0.6195 |
| 0.6113 | 0.7266 | 200 | 0.6134 |
| 0.6095 | 1.0899 | 300 | 0.6110 |
| 0.6034 | 1.4532 | 400 | 0.6103 |
| 0.6037 | 1.8165 | 500 | 0.6093 |
| 0.6043 | 2.1798 | 600 | 0.6089 |
| 0.5986 | 2.5431 | 700 | 0.6089 |
| 0.5993 | 2.9064 | 800 | 0.6086 |
### Framework versions
- PEFT 0.12.0
- Transformers 4.43.3
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1 | [
"MEDQA"
] |
nawhs/fast-ack | nawhs | null | [
"tensorboard",
"safetensors",
"distilbert",
"region:us"
] | "2024-08-02T13:52:20Z" | 2024-08-02T17:20:01+00:00 | 0 | 1 | ---
{}
---
Fast Acknowledgment Model for Real-time Conversation.
Trained on synthetic data generated with GPT-4 and Sonnet 3.5.
Please find the labels below:
{'It happens.': 0,
'Sounds busy.': 1,
'You seem upset.': 2,
'That sounds hard.': 3,
'I hear you.': 4,
'I understand.': 5,
'Absolutely!': 6,
'Understood.': 7,
'You feel unappreciated.': 8,
"Let's check then.": 9,
'Perhaps so.': 10,
"That's frustrating.": 11,
"I can see why you're concerned.": 12,
'That does sound challenging.': 13,
'I appreciate you sharing that.': 14,
"That's a valid point.": 15,
'That must be difficult.': 16,
"I'm listening.": 17,
"Let's explore that further.": 18,
'Your perspective is valuable.': 19,
"I'm here to help.": 20,
'I see the opportunity there.': 21,
"Let's discuss how we can address that.": 22,
"That's an interesting challenge.": 23,
'I appreciate your candor.': 24,
'Your needs are important to us.': 25,
"That's a great question.": 26,
'I can see how that would be a concern.': 27,
'Let me make sure I understand correctly.': 28,
"That's an important point you've raised.": 29,
"I'd be happy to elaborate on that.": 30,
'I see what you mean.': 31,
"That's a fair point.": 32,
'Thanks for bringing that up.': 33,
"I hadn't thought of that.": 34,
'Let me check on that for you.': 35,
"I'm not sure I follow.": 36,
"Let's take a step back.": 37,
'I apologize for the confusion.': 38,
"That's not quite what I meant.": 39,
'Can you bear with me for a moment?': 40,
"I'm afraid there's been a misunderstanding.": 41,
'Let me rephrase that.': 42,
"I'm having trouble hearing you.": 43,
"That's not what I was expecting to hear.": 44,
'Can we circle back to that later?': 45}
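A hedged usage sketch (not part of the original card): the checkpoint is tagged as a DistilBERT text classifier, so it should load with the standard `transformers` pipeline. Depending on how the label config was exported, predictions may come back as the phrases above or as generic `LABEL_<id>` strings that need the mapping listed here.

```python
from transformers import pipeline

# Load this repository as a text-classification pipeline.
clf = pipeline("text-classification", model="nawhs/fast-ack")

pred = clf("I've been stuck in back-to-back meetings all day.")[0]
print(pred["label"], round(pred["score"], 3))  # the predicted acknowledgment class and its confidence
```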
---
tags:
- autotrain
- text-classification
base_model: distilbert/distilbert-base-uncased
widget:
- text: "I love AutoTrain"
---
# Model Trained Using AutoTrain
- Problem type: Text Classification
## Validation Metrics
loss: 2.386875629425049
f1_macro: 0.5316382395666281
f1_micro: 0.5842105263157895
f1_weighted: 0.5344721463606912
precision_macro: 0.5635071450288841
precision_micro: 0.5842105263157895
precision_weighted: 0.5630806036069195
recall_macro: 0.5760869565217391
recall_micro: 0.5842105263157895
recall_weighted: 0.5842105263157895
accuracy: 0.5842105263157895
| [
"BEAR"
] |
GAD-cell/llama3_Bryan_model | GAD-cell | null | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"llama",
"trl",
"en",
"base_model:unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit",
"base_model:finetune:unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | "2024-08-03T13:23:43Z" | 2024-08-03T13:23:51+00:00 | 0 | 0 | ---
base_model: unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
---
# Uploaded model
- **Developed by:** GAD-cell
- **License:** apache-2.0
- **Finetuned from model :** unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
| [
"GAD"
] |
Changg/lora-sdxl-waterpainting4 | Changg | text-to-image | [
"diffusers",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"text-to-image",
"lora",
"template:sd-lora",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0",
"license:openrail++",
"region:us"
] | "2024-08-05T22:23:16Z" | 2024-08-06T02:40:20+00:00 | 0 | 0 | ---
base_model: stabilityai/stable-diffusion-xl-base-1.0
license: openrail++
tags:
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
- text-to-image
- diffusers
- lora
- template:sd-lora
widget:
- text: a toy bear in waterpainting style
output:
url: image_0.png
- text: a toy bear in waterpainting style
output:
url: image_1.png
- text: a toy bear in waterpainting style
output:
url: image_2.png
- text: a toy bear in waterpainting style
output:
url: image_3.png
instance_prompt: Flowers in waterpainting style
---
# SDXL LoRA DreamBooth - Changg/lora-sdxl-waterpainting4
<Gallery />
## Model description
These are Changg/lora-sdxl-waterpainting4 LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0.
The weights were trained using [DreamBooth](https://dreambooth.github.io/).
LoRA for the text encoder was enabled: False.
Special VAE used for training: None.
## Trigger words
You should use Flowers in waterpainting style to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](Changg/lora-sdxl-waterpainting4/tree/main) them in the Files & versions tab.
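A minimal usage sketch with `diffusers` (an illustration, not part of the original card; it assumes a CUDA GPU and that the LoRA weights follow the standard diffusers export format):

```python
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")

# Attach the LoRA adaptation weights from this repository.
pipe.load_lora_weights("Changg/lora-sdxl-waterpainting4")

# Prompt with the trigger phrase (or the widget prompts above).
image = pipe("Flowers in waterpainting style", num_inference_steps=30).images[0]
image.save("waterpainting_flowers.png")
```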
| [
"BEAR"
] |
twadada/tst2 | twadada | null | [
"mteb",
"license:mit",
"model-index",
"region:us"
] | "2024-08-06T04:59:23Z" | 2024-08-06T05:04:36+00:00 | 0 | 0 | ---
license: mit
tags:
- mteb
model-index:
- name: mpnet_3_300_bi_normalise_main
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: None
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 65.6865671641791
- type: ap
value: 28.04126838012943
- type: f1
value: 59.309850663425564
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: None
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 67.007025
- type: ap
value: 61.85427490258544
- type: f1
value: 66.8488072684233
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: None
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 33.23
- type: f1
value: 32.75315515971823
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: None
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 23.400000000000002
- type: map_at_10
value: 37.995000000000005
- type: map_at_100
value: 39.093
- type: map_at_1000
value: 39.108
- type: map_at_3
value: 33.345
- type: map_at_5
value: 35.894999999999996
- type: mrr_at_1
value: 24.04
- type: mrr_at_10
value: 38.217
- type: mrr_at_100
value: 39.315
- type: mrr_at_1000
value: 39.33
- type: mrr_at_3
value: 33.582
- type: mrr_at_5
value: 36.120999999999995
- type: ndcg_at_1
value: 23.400000000000002
- type: ndcg_at_10
value: 46.367999999999995
- type: ndcg_at_100
value: 51.459999999999994
- type: ndcg_at_1000
value: 51.849000000000004
- type: ndcg_at_3
value: 36.683
- type: ndcg_at_5
value: 41.311
- type: precision_at_1
value: 23.400000000000002
- type: precision_at_10
value: 7.3260000000000005
- type: precision_at_100
value: 0.964
- type: precision_at_1000
value: 0.1
- type: precision_at_3
value: 15.458
- type: precision_at_5
value: 11.536
- type: recall_at_1
value: 23.400000000000002
- type: recall_at_10
value: 73.257
- type: recall_at_100
value: 96.444
- type: recall_at_1000
value: 99.502
- type: recall_at_3
value: 46.373
- type: recall_at_5
value: 57.681000000000004
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: None
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 35.100967823968105
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: None
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 25.746974843516373
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: None
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 54.20457407639654
- type: mrr
value: 67.88429406850459
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: None
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 77.98350876563916
- type: cos_sim_spearman
value: 74.97339506665477
- type: euclidean_pearson
value: 77.11245136181245
- type: euclidean_spearman
value: 74.97339506665477
- type: manhattan_pearson
value: 77.07256110422662
- type: manhattan_spearman
value: 75.23846134733367
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: None
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 72.46753246753246
- type: f1
value: 71.53921122667349
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: None
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 32.828680696715345
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: None
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 24.08897068127219
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: None
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 21.455
- type: map_at_10
value: 28.976000000000003
- type: map_at_100
value: 30.09
- type: map_at_1000
value: 30.237000000000002
- type: map_at_3
value: 26.124000000000002
- type: map_at_5
value: 27.788
- type: mrr_at_1
value: 26.753
- type: mrr_at_10
value: 34.041
- type: mrr_at_100
value: 34.814
- type: mrr_at_1000
value: 34.885
- type: mrr_at_3
value: 31.545
- type: mrr_at_5
value: 33.011
- type: ndcg_at_1
value: 26.753
- type: ndcg_at_10
value: 34.188
- type: ndcg_at_100
value: 39.204
- type: ndcg_at_1000
value: 42.175000000000004
- type: ndcg_at_3
value: 29.787999999999997
- type: ndcg_at_5
value: 31.929999999999996
- type: precision_at_1
value: 26.753
- type: precision_at_10
value: 6.781
- type: precision_at_100
value: 1.173
- type: precision_at_1000
value: 0.178
- type: precision_at_3
value: 14.449000000000002
- type: precision_at_5
value: 10.815
- type: recall_at_1
value: 21.455
- type: recall_at_10
value: 43.883
- type: recall_at_100
value: 66.625
- type: recall_at_1000
value: 87.004
- type: recall_at_3
value: 31.039
- type: recall_at_5
value: 36.929
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: None
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 14.31
- type: map_at_10
value: 19.832
- type: map_at_100
value: 20.825
- type: map_at_1000
value: 20.948
- type: map_at_3
value: 18.102
- type: map_at_5
value: 19.062
- type: mrr_at_1
value: 17.962
- type: mrr_at_10
value: 23.319000000000003
- type: mrr_at_100
value: 24.201
- type: mrr_at_1000
value: 24.276
- type: mrr_at_3
value: 21.497
- type: mrr_at_5
value: 22.548000000000002
- type: ndcg_at_1
value: 17.962
- type: ndcg_at_10
value: 23.413999999999998
- type: ndcg_at_100
value: 28.021
- type: ndcg_at_1000
value: 31.069999999999997
- type: ndcg_at_3
value: 20.339
- type: ndcg_at_5
value: 21.75
- type: precision_at_1
value: 17.962
- type: precision_at_10
value: 4.465
- type: precision_at_100
value: 0.843
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 9.83
- type: precision_at_5
value: 7.1209999999999996
- type: recall_at_1
value: 14.31
- type: recall_at_10
value: 30.576999999999998
- type: recall_at_100
value: 50.807
- type: recall_at_1000
value: 71.946
- type: recall_at_3
value: 21.866
- type: recall_at_5
value: 25.573
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: None
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 24.893
- type: map_at_10
value: 32.838
- type: map_at_100
value: 34.038000000000004
- type: map_at_1000
value: 34.142
- type: map_at_3
value: 30.171999999999997
- type: map_at_5
value: 31.813000000000002
- type: mrr_at_1
value: 28.84
- type: mrr_at_10
value: 36.015
- type: mrr_at_100
value: 37.0
- type: mrr_at_1000
value: 37.068
- type: mrr_at_3
value: 33.584
- type: mrr_at_5
value: 35.120000000000005
- type: ndcg_at_1
value: 28.84
- type: ndcg_at_10
value: 37.489
- type: ndcg_at_100
value: 42.986999999999995
- type: ndcg_at_1000
value: 45.363
- type: ndcg_at_3
value: 32.659
- type: ndcg_at_5
value: 35.295
- type: precision_at_1
value: 28.84
- type: precision_at_10
value: 6.176
- type: precision_at_100
value: 0.9809999999999999
- type: precision_at_1000
value: 0.126
- type: precision_at_3
value: 14.524999999999999
- type: precision_at_5
value: 10.445
- type: recall_at_1
value: 24.893
- type: recall_at_10
value: 48.411
- type: recall_at_100
value: 72.831
- type: recall_at_1000
value: 90.19800000000001
- type: recall_at_3
value: 35.562
- type: recall_at_5
value: 42.081
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: None
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 12.203
- type: map_at_10
value: 16.617
- type: map_at_100
value: 17.427
- type: map_at_1000
value: 17.538
- type: map_at_3
value: 15.364
- type: map_at_5
value: 16.085
- type: mrr_at_1
value: 13.333
- type: mrr_at_10
value: 17.737
- type: mrr_at_100
value: 18.567
- type: mrr_at_1000
value: 18.666
- type: mrr_at_3
value: 16.478
- type: mrr_at_5
value: 17.218
- type: ndcg_at_1
value: 13.333
- type: ndcg_at_10
value: 19.207
- type: ndcg_at_100
value: 23.669
- type: ndcg_at_1000
value: 27.002
- type: ndcg_at_3
value: 16.68
- type: ndcg_at_5
value: 17.977
- type: precision_at_1
value: 13.333
- type: precision_at_10
value: 2.96
- type: precision_at_100
value: 0.557
- type: precision_at_1000
value: 0.089
- type: precision_at_3
value: 7.119000000000001
- type: precision_at_5
value: 5.04
- type: recall_at_1
value: 12.203
- type: recall_at_10
value: 26.096000000000004
- type: recall_at_100
value: 47.372
- type: recall_at_1000
value: 73.515
- type: recall_at_3
value: 19.322
- type: recall_at_5
value: 22.439
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: None
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 5.106999999999999
- type: map_at_10
value: 8.903
- type: map_at_100
value: 9.844
- type: map_at_1000
value: 9.968
- type: map_at_3
value: 7.654
- type: map_at_5
value: 8.369
- type: mrr_at_1
value: 6.841
- type: mrr_at_10
value: 11.274000000000001
- type: mrr_at_100
value: 12.173
- type: mrr_at_1000
value: 12.276
- type: mrr_at_3
value: 9.701
- type: mrr_at_5
value: 10.585
- type: ndcg_at_1
value: 6.841
- type: ndcg_at_10
value: 11.635
- type: ndcg_at_100
value: 16.616
- type: ndcg_at_1000
value: 20.149
- type: ndcg_at_3
value: 9.115
- type: ndcg_at_5
value: 10.366
- type: precision_at_1
value: 6.841
- type: precision_at_10
value: 2.4
- type: precision_at_100
value: 0.582
- type: precision_at_1000
value: 0.10200000000000001
- type: precision_at_3
value: 4.768
- type: precision_at_5
value: 3.682
- type: recall_at_1
value: 5.106999999999999
- type: recall_at_10
value: 17.662
- type: recall_at_100
value: 40.219
- type: recall_at_1000
value: 66.349
- type: recall_at_3
value: 11.0
- type: recall_at_5
value: 14.097999999999999
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: None
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 18.086
- type: map_at_10
value: 24.15
- type: map_at_100
value: 25.358999999999998
- type: map_at_1000
value: 25.497999999999998
- type: map_at_3
value: 21.761
- type: map_at_5
value: 23.067999999999998
- type: mrr_at_1
value: 22.233
- type: mrr_at_10
value: 28.584
- type: mrr_at_100
value: 29.526999999999997
- type: mrr_at_1000
value: 29.603
- type: mrr_at_3
value: 26.243
- type: mrr_at_5
value: 27.61
- type: ndcg_at_1
value: 22.233
- type: ndcg_at_10
value: 28.704
- type: ndcg_at_100
value: 34.58
- type: ndcg_at_1000
value: 37.639
- type: ndcg_at_3
value: 24.542
- type: ndcg_at_5
value: 26.456000000000003
- type: precision_at_1
value: 22.233
- type: precision_at_10
value: 5.390000000000001
- type: precision_at_100
value: 1.006
- type: precision_at_1000
value: 0.149
- type: precision_at_3
value: 11.55
- type: precision_at_5
value: 8.527
- type: recall_at_1
value: 18.086
- type: recall_at_10
value: 38.080999999999996
- type: recall_at_100
value: 64.474
- type: recall_at_1000
value: 85.397
- type: recall_at_3
value: 26.228
- type: recall_at_5
value: 31.127
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: None
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 12.522
- type: map_at_10
value: 18.201
- type: map_at_100
value: 19.339000000000002
- type: map_at_1000
value: 19.471
- type: map_at_3
value: 16.059
- type: map_at_5
value: 17.166
- type: mrr_at_1
value: 15.639
- type: mrr_at_10
value: 21.688
- type: mrr_at_100
value: 22.666
- type: mrr_at_1000
value: 22.75
- type: mrr_at_3
value: 19.559
- type: mrr_at_5
value: 20.677
- type: ndcg_at_1
value: 15.639
- type: ndcg_at_10
value: 22.232
- type: ndcg_at_100
value: 27.803
- type: ndcg_at_1000
value: 31.186999999999998
- type: ndcg_at_3
value: 18.288
- type: ndcg_at_5
value: 19.962
- type: precision_at_1
value: 15.639
- type: precision_at_10
value: 4.338
- type: precision_at_100
value: 0.853
- type: precision_at_1000
value: 0.132
- type: precision_at_3
value: 8.828
- type: precision_at_5
value: 6.644
- type: recall_at_1
value: 12.522
- type: recall_at_10
value: 31.137999999999998
- type: recall_at_100
value: 55.745999999999995
- type: recall_at_1000
value: 79.945
- type: recall_at_3
value: 20.105
- type: recall_at_5
value: 24.315
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 13.380416666666664
- type: map_at_10
value: 18.699916666666667
- type: map_at_100
value: 19.68725
- type: map_at_1000
value: 19.815916666666666
- type: map_at_3
value: 16.937416666666664
- type: map_at_5
value: 17.940250000000002
- type: mrr_at_1
value: 16.226000000000003
- type: mrr_at_10
value: 21.661833333333334
- type: mrr_at_100
value: 22.536916666666666
- type: mrr_at_1000
value: 22.625166666666665
- type: mrr_at_3
value: 19.896250000000002
- type: mrr_at_5
value: 20.922416666666667
- type: ndcg_at_1
value: 16.226000000000003
- type: ndcg_at_10
value: 22.22533333333333
- type: ndcg_at_100
value: 27.107083333333332
- type: ndcg_at_1000
value: 30.357916666666668
- type: ndcg_at_3
value: 19.040083333333328
- type: ndcg_at_5
value: 20.576083333333337
- type: precision_at_1
value: 16.226000000000003
- type: precision_at_10
value: 4.0544166666666674
- type: precision_at_100
value: 0.7731666666666667
- type: precision_at_1000
value: 0.12425000000000001
- type: precision_at_3
value: 8.94275
- type: precision_at_5
value: 6.5365
- type: recall_at_1
value: 13.380416666666664
- type: recall_at_10
value: 29.85875
- type: recall_at_100
value: 52.161500000000004
- type: recall_at_1000
value: 75.80958333333334
- type: recall_at_3
value: 20.929416666666665
- type: recall_at_5
value: 24.88216666666667
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: None
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 9.693
- type: map_at_10
value: 13.938999999999998
- type: map_at_100
value: 14.77
- type: map_at_1000
value: 14.87
- type: map_at_3
value: 12.542
- type: map_at_5
value: 13.331000000000001
- type: mrr_at_1
value: 11.503
- type: mrr_at_10
value: 15.722
- type: mrr_at_100
value: 16.566
- type: mrr_at_1000
value: 16.649
- type: mrr_at_3
value: 14.289
- type: mrr_at_5
value: 15.110000000000001
- type: ndcg_at_1
value: 11.503
- type: ndcg_at_10
value: 16.66
- type: ndcg_at_100
value: 20.779
- type: ndcg_at_1000
value: 23.660999999999998
- type: ndcg_at_3
value: 13.933000000000002
- type: ndcg_at_5
value: 15.232999999999999
- type: precision_at_1
value: 11.503
- type: precision_at_10
value: 2.883
- type: precision_at_100
value: 0.534
- type: precision_at_1000
value: 0.08499999999999999
- type: precision_at_3
value: 6.493
- type: precision_at_5
value: 4.662999999999999
- type: recall_at_1
value: 9.693
- type: recall_at_10
value: 23.388
- type: recall_at_100
value: 41.955999999999996
- type: recall_at_1000
value: 64.044
- type: recall_at_3
value: 15.775
- type: recall_at_5
value: 19.067
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: None
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 7.007
- type: map_at_10
value: 10.462
- type: map_at_100
value: 11.134
- type: map_at_1000
value: 11.26
- type: map_at_3
value: 9.379
- type: map_at_5
value: 9.997
- type: mrr_at_1
value: 9.119
- type: mrr_at_10
value: 12.950000000000001
- type: mrr_at_100
value: 13.63
- type: mrr_at_1000
value: 13.735
- type: mrr_at_3
value: 11.757
- type: mrr_at_5
value: 12.411
- type: ndcg_at_1
value: 9.119
- type: ndcg_at_10
value: 12.862000000000002
- type: ndcg_at_100
value: 16.544
- type: ndcg_at_1000
value: 20.183
- type: ndcg_at_3
value: 10.850999999999999
- type: ndcg_at_5
value: 11.802
- type: precision_at_1
value: 9.119
- type: precision_at_10
value: 2.4570000000000003
- type: precision_at_100
value: 0.516
- type: precision_at_1000
value: 0.099
- type: precision_at_3
value: 5.322
- type: precision_at_5
value: 3.923
- type: recall_at_1
value: 7.007
- type: recall_at_10
value: 17.72
- type: recall_at_100
value: 34.971999999999994
- type: recall_at_1000
value: 62.0
- type: recall_at_3
value: 12.124
- type: recall_at_5
value: 14.580000000000002
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: None
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 11.741999999999999
- type: map_at_10
value: 16.122
- type: map_at_100
value: 17.069000000000003
- type: map_at_1000
value: 17.19
- type: map_at_3
value: 14.687
- type: map_at_5
value: 15.49
- type: mrr_at_1
value: 14.179
- type: mrr_at_10
value: 19.064
- type: mrr_at_100
value: 19.997999999999998
- type: mrr_at_1000
value: 20.086000000000002
- type: mrr_at_3
value: 17.522
- type: mrr_at_5
value: 18.398999999999997
- type: ndcg_at_1
value: 14.179
- type: ndcg_at_10
value: 19.200999999999997
- type: ndcg_at_100
value: 24.243000000000002
- type: ndcg_at_1000
value: 27.589000000000002
- type: ndcg_at_3
value: 16.394000000000002
- type: ndcg_at_5
value: 17.678
- type: precision_at_1
value: 14.179
- type: precision_at_10
value: 3.2840000000000003
- type: precision_at_100
value: 0.654
- type: precision_at_1000
value: 0.106
- type: precision_at_3
value: 7.587000000000001
- type: precision_at_5
value: 5.41
- type: recall_at_1
value: 11.741999999999999
- type: recall_at_10
value: 25.962000000000003
- type: recall_at_100
value: 49.189
- type: recall_at_1000
value: 73.886
- type: recall_at_3
value: 17.977
- type: recall_at_5
value: 21.298000000000002
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: None
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 14.015
- type: map_at_10
value: 20.263
- type: map_at_100
value: 21.39
- type: map_at_1000
value: 21.586
- type: map_at_3
value: 18.796
- type: map_at_5
value: 19.494
- type: mrr_at_1
value: 17.589
- type: mrr_at_10
value: 23.97
- type: mrr_at_100
value: 24.879
- type: mrr_at_1000
value: 24.973
- type: mrr_at_3
value: 22.563
- type: mrr_at_5
value: 23.235
- type: ndcg_at_1
value: 17.589
- type: ndcg_at_10
value: 24.297
- type: ndcg_at_100
value: 29.496
- type: ndcg_at_1000
value: 33.336
- type: ndcg_at_3
value: 21.978
- type: ndcg_at_5
value: 22.746
- type: precision_at_1
value: 17.589
- type: precision_at_10
value: 4.7829999999999995
- type: precision_at_100
value: 1.047
- type: precision_at_1000
value: 0.194
- type: precision_at_3
value: 10.804
- type: precision_at_5
value: 7.51
- type: recall_at_1
value: 14.015
- type: recall_at_10
value: 31.61
- type: recall_at_100
value: 56.559000000000005
- type: recall_at_1000
value: 82.733
- type: recall_at_3
value: 23.963
- type: recall_at_5
value: 26.512
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: None
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 9.532
- type: map_at_10
value: 14.096
- type: map_at_100
value: 14.962
- type: map_at_1000
value: 15.082999999999998
- type: map_at_3
value: 12.609
- type: map_at_5
value: 13.62
- type: mrr_at_1
value: 10.721
- type: mrr_at_10
value: 15.578
- type: mrr_at_100
value: 16.422
- type: mrr_at_1000
value: 16.535
- type: mrr_at_3
value: 14.017
- type: mrr_at_5
value: 15.145
- type: ndcg_at_1
value: 10.721
- type: ndcg_at_10
value: 16.814999999999998
- type: ndcg_at_100
value: 21.343
- type: ndcg_at_1000
value: 24.941
- type: ndcg_at_3
value: 13.914000000000001
- type: ndcg_at_5
value: 15.717999999999998
- type: precision_at_1
value: 10.721
- type: precision_at_10
value: 2.7359999999999998
- type: precision_at_100
value: 0.532
- type: precision_at_1000
value: 0.094
- type: precision_at_3
value: 6.038
- type: precision_at_5
value: 4.658
- type: recall_at_1
value: 9.532
- type: recall_at_10
value: 23.777
- type: recall_at_100
value: 45.188
- type: recall_at_1000
value: 72.698
- type: recall_at_3
value: 16.192
- type: recall_at_5
value: 20.567
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: None
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 6.7299999999999995
- type: map_at_10
value: 12.158
- type: map_at_100
value: 13.63
- type: map_at_1000
value: 13.846
- type: map_at_3
value: 9.788
- type: map_at_5
value: 10.949
- type: mrr_at_1
value: 14.918999999999999
- type: mrr_at_10
value: 24.043999999999997
- type: mrr_at_100
value: 25.139
- type: mrr_at_1000
value: 25.206
- type: mrr_at_3
value: 20.630000000000003
- type: mrr_at_5
value: 22.441
- type: ndcg_at_1
value: 14.918999999999999
- type: ndcg_at_10
value: 18.223
- type: ndcg_at_100
value: 24.981
- type: ndcg_at_1000
value: 29.218
- type: ndcg_at_3
value: 13.672999999999998
- type: ndcg_at_5
value: 15.293000000000001
- type: precision_at_1
value: 14.918999999999999
- type: precision_at_10
value: 6.1240000000000006
- type: precision_at_100
value: 1.344
- type: precision_at_1000
value: 0.211
- type: precision_at_3
value: 10.270999999999999
- type: precision_at_5
value: 8.391
- type: recall_at_1
value: 6.7299999999999995
- type: recall_at_10
value: 23.587
- type: recall_at_100
value: 47.636
- type: recall_at_1000
value: 71.951
- type: recall_at_3
value: 12.895999999999999
- type: recall_at_5
value: 16.967
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: None
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 3.1919999999999997
- type: map_at_10
value: 8.308
- type: map_at_100
value: 11.503
- type: map_at_1000
value: 12.409
- type: map_at_3
value: 5.695
- type: map_at_5
value: 6.779
- type: mrr_at_1
value: 35.75
- type: mrr_at_10
value: 46.782000000000004
- type: mrr_at_100
value: 47.58
- type: mrr_at_1000
value: 47.612
- type: mrr_at_3
value: 43.625
- type: mrr_at_5
value: 45.638
- type: ndcg_at_1
value: 25.5
- type: ndcg_at_10
value: 21.609
- type: ndcg_at_100
value: 24.111
- type: ndcg_at_1000
value: 30.723
- type: ndcg_at_3
value: 23.761
- type: ndcg_at_5
value: 22.74
- type: precision_at_1
value: 35.75
- type: precision_at_10
value: 19.775000000000002
- type: precision_at_100
value: 6.18
- type: precision_at_1000
value: 1.379
- type: precision_at_3
value: 28.749999999999996
- type: precision_at_5
value: 25.5
- type: recall_at_1
value: 3.1919999999999997
- type: recall_at_10
value: 13.405000000000001
- type: recall_at_100
value: 29.939
- type: recall_at_1000
value: 52.425999999999995
- type: recall_at_3
value: 7.087000000000001
- type: recall_at_5
value: 9.263
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: None
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 46.51499999999999
- type: f1
value: 42.15882226016995
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: None
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 12.620999999999999
- type: map_at_10
value: 19.061
- type: map_at_100
value: 19.911
- type: map_at_1000
value: 19.980999999999998
- type: map_at_3
value: 16.908
- type: map_at_5
value: 18.166
- type: mrr_at_1
value: 13.501
- type: mrr_at_10
value: 20.278
- type: mrr_at_100
value: 21.134
- type: mrr_at_1000
value: 21.198
- type: mrr_at_3
value: 18.022
- type: mrr_at_5
value: 19.344
- type: ndcg_at_1
value: 13.501
- type: ndcg_at_10
value: 23.04
- type: ndcg_at_100
value: 27.532
- type: ndcg_at_1000
value: 29.616999999999997
- type: ndcg_at_3
value: 18.622
- type: ndcg_at_5
value: 20.887
- type: precision_at_1
value: 13.501
- type: precision_at_10
value: 3.734
- type: precision_at_100
value: 0.619
- type: precision_at_1000
value: 0.082
- type: precision_at_3
value: 8.031
- type: precision_at_5
value: 6.022
- type: recall_at_1
value: 12.620999999999999
- type: recall_at_10
value: 34.391
- type: recall_at_100
value: 55.645
- type: recall_at_1000
value: 72.03399999999999
- type: recall_at_3
value: 22.445
- type: recall_at_5
value: 27.881
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: None
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 7.031
- type: map_at_10
value: 11.713999999999999
- type: map_at_100
value: 12.864999999999998
- type: map_at_1000
value: 13.088
- type: map_at_3
value: 9.976
- type: map_at_5
value: 10.945
- type: mrr_at_1
value: 14.043
- type: mrr_at_10
value: 20.127
- type: mrr_at_100
value: 21.171
- type: mrr_at_1000
value: 21.276
- type: mrr_at_3
value: 18.081
- type: mrr_at_5
value: 19.216
- type: ndcg_at_1
value: 14.043
- type: ndcg_at_10
value: 16.070999999999998
- type: ndcg_at_100
value: 21.863
- type: ndcg_at_1000
value: 26.691
- type: ndcg_at_3
value: 13.617
- type: ndcg_at_5
value: 14.585999999999999
- type: precision_at_1
value: 14.043
- type: precision_at_10
value: 4.552
- type: precision_at_100
value: 1.042
- type: precision_at_1000
value: 0.188
- type: precision_at_3
value: 9.259
- type: precision_at_5
value: 7.005999999999999
- type: recall_at_1
value: 7.031
- type: recall_at_10
value: 20.589
- type: recall_at_100
value: 43.425999999999995
- type: recall_at_1000
value: 73.048
- type: recall_at_3
value: 12.812999999999999
- type: recall_at_5
value: 16.205
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: None
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 13.072000000000001
- type: map_at_10
value: 18.454
- type: map_at_100
value: 19.220000000000002
- type: map_at_1000
value: 19.325
- type: map_at_3
value: 16.906
- type: map_at_5
value: 17.757
- type: mrr_at_1
value: 26.144000000000002
- type: mrr_at_10
value: 32.269999999999996
- type: mrr_at_100
value: 33.001999999999995
- type: mrr_at_1000
value: 33.07
- type: mrr_at_3
value: 30.526999999999997
- type: mrr_at_5
value: 31.505
- type: ndcg_at_1
value: 26.144000000000002
- type: ndcg_at_10
value: 24.024
- type: ndcg_at_100
value: 27.79
- type: ndcg_at_1000
value: 30.493
- type: ndcg_at_3
value: 20.944
- type: ndcg_at_5
value: 22.431
- type: precision_at_1
value: 26.144000000000002
- type: precision_at_10
value: 5.383
- type: precision_at_100
value: 0.8410000000000001
- type: precision_at_1000
value: 0.12
- type: precision_at_3
value: 13.306000000000001
- type: precision_at_5
value: 9.167
- type: recall_at_1
value: 13.072000000000001
- type: recall_at_10
value: 26.913999999999998
- type: recall_at_100
value: 42.046
- type: recall_at_1000
value: 60.182
- type: recall_at_3
value: 19.959
- type: recall_at_5
value: 22.917
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: None
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 68.328
- type: ap
value: 62.53568270720666
- type: f1
value: 68.13780615543227
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: None
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 5.678
- type: map_at_10
value: 10.087
- type: map_at_100
value: 10.913
- type: map_at_1000
value: 11.017000000000001
- type: map_at_3
value: 8.476
- type: map_at_5
value: 9.411
- type: mrr_at_1
value: 5.86
- type: mrr_at_10
value: 10.34
- type: mrr_at_100
value: 11.179
- type: mrr_at_1000
value: 11.279
- type: mrr_at_3
value: 8.718
- type: mrr_at_5
value: 9.648
- type: ndcg_at_1
value: 5.845000000000001
- type: ndcg_at_10
value: 12.801000000000002
- type: ndcg_at_100
value: 17.405
- type: ndcg_at_1000
value: 20.534
- type: ndcg_at_3
value: 9.479
- type: ndcg_at_5
value: 11.152
- type: precision_at_1
value: 5.845000000000001
- type: precision_at_10
value: 2.2089999999999996
- type: precision_at_100
value: 0.461
- type: precision_at_1000
value: 0.073
- type: precision_at_3
value: 4.222
- type: precision_at_5
value: 3.3640000000000003
- type: recall_at_1
value: 5.678
- type: recall_at_10
value: 21.240000000000002
- type: recall_at_100
value: 43.875
- type: recall_at_1000
value: 69.011
- type: recall_at_3
value: 12.141
- type: recall_at_5
value: 16.178
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: None
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 87.56041951664386
- type: f1
value: 87.00152417080298
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: None
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 55.54491564067489
- type: f1
value: 38.08722415882758
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: None
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 62.094821788836576
- type: f1
value: 60.45065028350709
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: None
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 71.30127774041695
- type: f1
value: 69.75836812124443
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: None
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 29.54283175725888
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: None
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 27.32735432360623
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: None
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 30.245394641935835
- type: mrr
value: 31.220929230846338
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: None
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 3.601
- type: map_at_10
value: 7.33
- type: map_at_100
value: 9.408
- type: map_at_1000
value: 10.718
- type: map_at_3
value: 5.641
- type: map_at_5
value: 6.424
- type: mrr_at_1
value: 31.579
- type: mrr_at_10
value: 41.708
- type: mrr_at_100
value: 42.57
- type: mrr_at_1000
value: 42.631
- type: mrr_at_3
value: 39.577
- type: mrr_at_5
value: 40.861999999999995
- type: ndcg_at_1
value: 29.412
- type: ndcg_at_10
value: 23.651
- type: ndcg_at_100
value: 22.66
- type: ndcg_at_1000
value: 32.440999999999995
- type: ndcg_at_3
value: 27.916999999999998
- type: ndcg_at_5
value: 26.144000000000002
- type: precision_at_1
value: 31.579
- type: precision_at_10
value: 17.554
- type: precision_at_100
value: 6.409
- type: precision_at_1000
value: 1.973
- type: precision_at_3
value: 27.348
- type: precision_at_5
value: 23.158
- type: recall_at_1
value: 3.601
- type: recall_at_10
value: 11.393
- type: recall_at_100
value: 24.736
- type: recall_at_1000
value: 60.032
- type: recall_at_3
value: 6.834999999999999
- type: recall_at_5
value: 8.635
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: None
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 8.07
- type: map_at_10
value: 13.408999999999999
- type: map_at_100
value: 14.584
- type: map_at_1000
value: 14.692
- type: map_at_3
value: 11.337
- type: map_at_5
value: 12.426
- type: mrr_at_1
value: 9.299
- type: mrr_at_10
value: 15.012
- type: mrr_at_100
value: 16.117
- type: mrr_at_1000
value: 16.207
- type: mrr_at_3
value: 12.881
- type: mrr_at_5
value: 14.005
- type: ndcg_at_1
value: 9.299
- type: ndcg_at_10
value: 17.080000000000002
- type: ndcg_at_100
value: 23.04
- type: ndcg_at_1000
value: 25.948
- type: ndcg_at_3
value: 12.787
- type: ndcg_at_5
value: 14.74
- type: precision_at_1
value: 9.299
- type: precision_at_10
value: 3.2009999999999996
- type: precision_at_100
value: 0.656
- type: precision_at_1000
value: 0.093
- type: precision_at_3
value: 6.035
- type: precision_at_5
value: 4.71
- type: recall_at_1
value: 8.07
- type: recall_at_10
value: 26.934
- type: recall_at_100
value: 54.581999999999994
- type: recall_at_1000
value: 76.888
- type: recall_at_3
value: 15.501000000000001
- type: recall_at_5
value: 20.063
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: None
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 62.815
- type: map_at_10
value: 75.652
- type: map_at_100
value: 76.48
- type: map_at_1000
value: 76.51
- type: map_at_3
value: 72.765
- type: map_at_5
value: 74.547
- type: mrr_at_1
value: 72.41
- type: mrr_at_10
value: 79.68499999999999
- type: mrr_at_100
value: 79.967
- type: mrr_at_1000
value: 79.972
- type: mrr_at_3
value: 78.292
- type: mrr_at_5
value: 79.205
- type: ndcg_at_1
value: 72.42
- type: ndcg_at_10
value: 80.226
- type: ndcg_at_100
value: 82.50699999999999
- type: ndcg_at_1000
value: 82.842
- type: ndcg_at_3
value: 76.775
- type: ndcg_at_5
value: 78.613
- type: precision_at_1
value: 72.42
- type: precision_at_10
value: 12.184000000000001
- type: precision_at_100
value: 1.466
- type: precision_at_1000
value: 0.155
- type: precision_at_3
value: 33.5
- type: precision_at_5
value: 22.137999999999998
- type: recall_at_1
value: 62.815
- type: recall_at_10
value: 89.1
- type: recall_at_100
value: 97.725
- type: recall_at_1000
value: 99.59
- type: recall_at_3
value: 79.208
- type: recall_at_5
value: 84.316
- type: map_at_1
value: 3.148
- type: map_at_10
value: 6.977
- type: map_at_100
value: 8.414000000000001
- type: map_at_1000
value: 8.672
- type: map_at_3
value: 5.11
- type: map_at_5
value: 5.949
- type: mrr_at_1
value: 15.4
- type: mrr_at_10
value: 22.531000000000002
- type: mrr_at_100
value: 23.873
- type: mrr_at_1000
value: 23.959
- type: mrr_at_3
value: 19.783
- type: mrr_at_5
value: 21.263
- type: ndcg_at_1
value: 15.4
- type: ndcg_at_10
value: 12.418999999999999
- type: ndcg_at_100
value: 18.972
- type: ndcg_at_1000
value: 24.263
- type: ndcg_at_3
value: 11.581
- type: ndcg_at_5
value: 10.097000000000001
- type: precision_at_1
value: 15.4
- type: precision_at_10
value: 6.4399999999999995
- type: precision_at_100
value: 1.595
- type: precision_at_1000
value: 0.28800000000000003
- type: precision_at_3
value: 10.533
- type: precision_at_5
value: 8.66
- type: recall_at_1
value: 3.148
- type: recall_at_10
value: 13.043
- type: recall_at_100
value: 32.342
- type: recall_at_1000
value: 58.431999999999995
- type: recall_at_3
value: 6.422999999999999
- type: recall_at_5
value: 8.773
- type: map_at_1
value: 0.122
- type: map_at_10
value: 0.8380000000000001
- type: map_at_100
value: 4.359
- type: map_at_1000
value: 11.024000000000001
- type: map_at_3
value: 0.33
- type: map_at_5
value: 0.488
- type: mrr_at_1
value: 46.0
- type: mrr_at_10
value: 61.269
- type: mrr_at_100
value: 61.775999999999996
- type: mrr_at_1000
value: 61.789
- type: mrr_at_3
value: 60.0
- type: mrr_at_5
value: 60.4
- type: ndcg_at_1
value: 44.0
- type: ndcg_at_10
value: 42.097
- type: ndcg_at_100
value: 32.121
- type: ndcg_at_1000
value: 30.064
- type: ndcg_at_3
value: 47.05
- type: ndcg_at_5
value: 45.095
- type: precision_at_1
value: 48.0
- type: precision_at_10
value: 45.2
- type: precision_at_100
value: 33.82
- type: precision_at_1000
value: 14.707999999999998
- type: precision_at_3
value: 52.666999999999994
- type: precision_at_5
value: 48.8
- type: recall_at_1
value: 0.122
- type: recall_at_10
value: 1.0699999999999998
- type: recall_at_100
value: 7.382
- type: recall_at_1000
value: 29.464000000000002
- type: recall_at_3
value: 0.39699999999999996
- type: recall_at_5
value: 0.598
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: None
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 41.40274475749348
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: None
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 45.36558797926236
- task:
type: STS
dataset:
name: MTEB SICK-R
type: None
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 76.9591742900809
- type: cos_sim_spearman
value: 65.2393736828013
- type: euclidean_pearson
value: 71.93106026843658
- type: euclidean_spearman
value: 65.23946562040364
- type: manhattan_pearson
value: 71.08424184461768
- type: manhattan_spearman
value: 65.01099207618003
- task:
type: STS
dataset:
name: MTEB STS12
type: None
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 75.89296742979977
- type: cos_sim_spearman
value: 67.91111133885116
- type: euclidean_pearson
value: 72.83363008777634
- type: euclidean_spearman
value: 67.91239581159331
- type: manhattan_pearson
value: 72.16796694518214
- type: manhattan_spearman
value: 67.89208953134717
- task:
type: STS
dataset:
name: MTEB STS13
type: None
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 76.92938262292805
- type: cos_sim_spearman
value: 78.27549006281241
- type: euclidean_pearson
value: 77.92233868049291
- type: euclidean_spearman
value: 78.27548985407766
- type: manhattan_pearson
value: 78.13380526389417
- type: manhattan_spearman
value: 78.53594391632303
- task:
type: STS
dataset:
name: MTEB STS14
type: None
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 77.64082827417963
- type: cos_sim_spearman
value: 72.84464317386625
- type: euclidean_pearson
value: 76.42666008614871
- type: euclidean_spearman
value: 72.84463348486912
- type: manhattan_pearson
value: 76.07228202147503
- type: manhattan_spearman
value: 72.77147467020583
- task:
type: STS
dataset:
name: MTEB STS15
type: None
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 79.80175783397232
- type: cos_sim_spearman
value: 81.1395072429369
- type: euclidean_pearson
value: 81.1207213416848
- type: euclidean_spearman
value: 81.13950373549197
- type: manhattan_pearson
value: 80.70637859712825
- type: manhattan_spearman
value: 80.77272480740696
- task:
type: STS
dataset:
name: MTEB STS16
type: None
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 75.15681332201778
- type: cos_sim_spearman
value: 76.62657807402795
- type: euclidean_pearson
value: 76.65599360457624
- type: euclidean_spearman
value: 76.62684964660313
- type: manhattan_pearson
value: 76.4684205504751
- type: manhattan_spearman
value: 76.48746197080307
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: None
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 82.16774243543202
- type: cos_sim_spearman
value: 83.33346918777738
- type: euclidean_pearson
value: 83.28296290476054
- type: euclidean_spearman
value: 83.33434373879857
- type: manhattan_pearson
value: 83.14374338850754
- type: manhattan_spearman
value: 83.06911006763188
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: None
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 62.75551611175316
- type: cos_sim_spearman
value: 61.07525543654076
- type: euclidean_pearson
value: 63.50619233372331
- type: euclidean_spearman
value: 61.07525543654076
- type: manhattan_pearson
value: 64.18612277498849
- type: manhattan_spearman
value: 61.84411341639111
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: None
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 76.9064972604624
- type: cos_sim_spearman
value: 75.32111671842594
- type: euclidean_pearson
value: 77.46089976383304
- type: euclidean_spearman
value: 75.32113518705334
- type: manhattan_pearson
value: 77.15161309764346
- type: manhattan_spearman
value: 75.19910900320157
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: None
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 74.33741829988382
- type: mrr
value: 91.39431556098224
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: None
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 30.139
- type: map_at_10
value: 37.730999999999995
- type: map_at_100
value: 38.665
- type: map_at_1000
value: 38.751999999999995
- type: map_at_3
value: 35.556
- type: map_at_5
value: 36.925000000000004
- type: mrr_at_1
value: 32.0
- type: mrr_at_10
value: 39.364
- type: mrr_at_100
value: 40.214
- type: mrr_at_1000
value: 40.289
- type: mrr_at_3
value: 37.389
- type: mrr_at_5
value: 38.772
- type: ndcg_at_1
value: 32.0
- type: ndcg_at_10
value: 41.844
- type: ndcg_at_100
value: 46.976
- type: ndcg_at_1000
value: 49.303999999999995
- type: ndcg_at_3
value: 37.716
- type: ndcg_at_5
value: 40.086
- type: precision_at_1
value: 32.0
- type: precision_at_10
value: 5.867
- type: precision_at_100
value: 0.8829999999999999
- type: precision_at_1000
value: 0.109
- type: precision_at_3
value: 15.110999999999999
- type: precision_at_5
value: 10.333
- type: recall_at_1
value: 30.139
- type: recall_at_10
value: 52.917
- type: recall_at_100
value: 77.972
- type: recall_at_1000
value: 96.2
- type: recall_at_3
value: 41.943999999999996
- type: recall_at_5
value: 47.806
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: None
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.74158415841585
- type: cos_sim_ap
value: 92.14775153471328
- type: cos_sim_f1
value: 86.35650810245687
- type: cos_sim_precision
value: 90.4709748083242
- type: cos_sim_recall
value: 82.6
- type: dot_accuracy
value: 99.74158415841585
- type: dot_ap
value: 92.14775153471328
- type: dot_f1
value: 86.35650810245687
- type: dot_precision
value: 90.4709748083242
- type: dot_recall
value: 82.6
- type: euclidean_accuracy
value: 99.74158415841585
- type: euclidean_ap
value: 92.14775153471328
- type: euclidean_f1
value: 86.35650810245687
- type: euclidean_precision
value: 90.4709748083242
- type: euclidean_recall
value: 82.6
- type: manhattan_accuracy
value: 99.75247524752476
- type: manhattan_ap
value: 92.78507974555151
- type: manhattan_f1
value: 87.02290076335878
- type: manhattan_precision
value: 88.60103626943005
- type: manhattan_recall
value: 85.5
- type: max_accuracy
value: 99.75247524752476
- type: max_ap
value: 92.78507974555151
- type: max_f1
value: 87.02290076335878
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: None
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 44.43312002306653
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: None
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 31.24233740178471
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: None
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 42.15283231442346
- type: mrr
value: 42.4727845683728
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: None
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 31.336258546740332
- type: cos_sim_spearman
value: 30.445852974569153
- type: dot_pearson
value: 31.336258546740343
- type: dot_spearman
value: 30.51894557486332
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: None
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 2.3009999999999997
- type: map_at_10
value: 7.323
- type: map_at_100
value: 13.428
- type: map_at_1000
value: 15.001999999999999
- type: map_at_3
value: 4.340999999999999
- type: map_at_5
value: 5.664000000000001
- type: mrr_at_1
value: 30.612000000000002
- type: mrr_at_10
value: 43.934
- type: mrr_at_100
value: 44.984
- type: mrr_at_1000
value: 44.984
- type: mrr_at_3
value: 39.116
- type: mrr_at_5
value: 42.687000000000005
- type: ndcg_at_1
value: 28.571
- type: ndcg_at_10
value: 20.433
- type: ndcg_at_100
value: 34.909
- type: ndcg_at_1000
value: 46.363
- type: ndcg_at_3
value: 24.032999999999998
- type: ndcg_at_5
value: 22.93
- type: precision_at_1
value: 30.612000000000002
- type: precision_at_10
value: 17.959
- type: precision_at_100
value: 8.02
- type: precision_at_1000
value: 1.545
- type: precision_at_3
value: 25.169999999999998
- type: precision_at_5
value: 23.673
- type: recall_at_1
value: 2.3009999999999997
- type: recall_at_10
value: 12.991
- type: recall_at_100
value: 48.952
- type: recall_at_1000
value: 83.898
- type: recall_at_3
value: 5.3629999999999995
- type: recall_at_5
value: 8.519
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: None
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 71.15559999999999
- type: ap
value: 14.339341085976546
- type: f1
value: 54.753780679579265
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: None
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 55.76683644595358
- type: f1
value: 55.96404908693344
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: None
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 36.2783470973587
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: None
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 81.96340227692674
- type: cos_sim_ap
value: 59.41058199758212
- type: cos_sim_f1
value: 57.32984293193717
- type: cos_sim_precision
value: 52.210663198959686
- type: cos_sim_recall
value: 63.562005277044854
- type: dot_accuracy
value: 81.96340227692674
- type: dot_ap
value: 59.41058199758212
- type: dot_f1
value: 57.32984293193717
- type: dot_precision
value: 52.210663198959686
- type: dot_recall
value: 63.562005277044854
- type: euclidean_accuracy
value: 81.96340227692674
- type: euclidean_ap
value: 59.41058199758212
- type: euclidean_f1
value: 57.32984293193717
- type: euclidean_precision
value: 52.210663198959686
- type: euclidean_recall
value: 63.562005277044854
- type: manhattan_accuracy
value: 81.68325683972104
- type: manhattan_ap
value: 58.90711838511802
- type: manhattan_f1
value: 57.186223000583766
- type: manhattan_precision
value: 51.287958115183244
- type: manhattan_recall
value: 64.6174142480211
- type: max_accuracy
value: 81.96340227692674
- type: max_ap
value: 59.41058199758212
- type: max_f1
value: 57.32984293193717
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: None
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 86.8843870066364
- type: cos_sim_ap
value: 81.78253291412273
- type: cos_sim_f1
value: 74.01401187808953
- type: cos_sim_precision
value: 68.77271826052474
- type: cos_sim_recall
value: 80.12011087157376
- type: dot_accuracy
value: 86.8843870066364
- type: dot_ap
value: 81.78253377474518
- type: dot_f1
value: 74.01401187808953
- type: dot_precision
value: 68.77271826052474
- type: dot_recall
value: 80.12011087157376
- type: euclidean_accuracy
value: 86.8843870066364
- type: euclidean_ap
value: 81.7825333923772
- type: euclidean_f1
value: 74.01401187808953
- type: euclidean_precision
value: 68.77271826052474
- type: euclidean_recall
value: 80.12011087157376
- type: manhattan_accuracy
value: 86.98335079753173
- type: manhattan_ap
value: 81.90204232055142
- type: manhattan_f1
value: 74.17194570135747
- type: manhattan_precision
value: 69.99385119901619
- type: manhattan_recall
value: 78.88050508161379
- type: max_accuracy
value: 86.98335079753173
- type: max_ap
value: 81.90204232055142
- type: max_f1
value: 74.17194570135747
---
| [
"BIOSSES",
"SCIFACT"
] |
BAAI/DIVA | BAAI | null | [
"arxiv:2407.20171",
"license:apache-2.0",
"region:us"
] | "2024-08-06T08:51:02Z" | 2024-08-07T12:50:50+00:00 | 0 | 7 | ---
license: apache-2.0
---
<div align='center'>
<h2><a href="https://arxiv.org/abs/2407.20171">Diffusion Feedback Helps CLIP See Better</a></h2>
[Wenxuan Wang](https://scholar.google.com/citations?user=75OyC-oAAAAJ&hl=zh-CN)<sup>1,2,3*</sup>, [Quan Sun](https://scholar.google.cz/citations?user=pVKiHdEAAAAJ&hl=zh-CN&oi=ao)<sup>3*</sup>, [Fan Zhang](https://scholar.google.cz/citations?hl=zh-CN&user=VsJ39HMAAAAJ&view_op=list_works&sortby=pubdate)<sup>3</sup>, [Yepeng Tang](https://scholar.google.cz/citations?user=CAC_4OUAAAAJ&hl=zh-CN&oi=ao)<sup>4</sup>, [Jing Liu](https://scholar.google.com/citations?user=sOI-S7oAAAAJ&hl=zh-CN)<sup>1,2</sup>, [Xinlong Wang](https://scholar.google.com/citations?hl=zh-CN&user=DPz0DjYAAAAJ&view_op=list_works&sortby=pubdate/)<sup>3</sup>
<sup>1</sup>[CASIA](http://english.ia.cas.cn/), <sup>2</sup>[UCAS](https://english.ucas.ac.cn/), <sup>3</sup>[BAAI](https://www.baai.ac.cn/english.html), <sup>4</sup>[BJTU](https://en.bjtu.edu.cn/) <br><sup>*</sup> Equal Contribution <br>
| [Paper](https://arxiv.org/abs/2407.20171) | [Code](https://github.com/baaivision/DIVA) |
</div>
In this work, we present a simple post-training approach for CLIP models, which largely overcomes their visual shortcomings via a self-supervised diffusion process. We introduce DIVA, which uses the DIffusion model as a Visual Assistant for CLIP. Specifically, DIVA leverages generative feedback from text-to-image diffusion models to optimize CLIP representations, using only images (without corresponding text). We demonstrate that DIVA improves CLIP's performance to a large extent (e.g., a 3-7% gain) on the challenging MMVP-VLM benchmark, which assesses fine-grained visual abilities, and enhances the performance of MLLMs and vision models on multimodal understanding and segmentation tasks. Extensive evaluation on 29 image classification and retrieval benchmarks confirms that DIVA preserves CLIP's strong zero-shot capabilities.
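For intuition, the training signal boils down to: freeze a text-to-image diffusion model, condition it on the CLIP image embedding, and back-propagate the diffusion (noise-prediction) loss into the CLIP visual encoder alone. The toy sketch below illustrates only this idea with stand-in modules and random tensors; it is not the authors' implementation, and the module shapes and the single fixed noise level are invented for illustration.
```python
import torch
import torch.nn as nn

# Toy stand-ins: in DIVA the trainable part is a real CLIP ViT and the frozen
# "generative critic" is a text-to-image diffusion U-Net (see the paper/code).
clip_visual = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 512))  # trainable
diffusion_model = nn.Linear(512 + 3 * 32 * 32, 3 * 32 * 32)             # frozen
for p in diffusion_model.parameters():
    p.requires_grad_(False)

optimizer = torch.optim.AdamW(clip_visual.parameters(), lr=1e-5)

images = torch.randn(4, 3, 32, 32)       # unlabeled images only, no text
noise = torch.randn_like(images)
noisy_images = images + 0.1 * noise      # one fixed noise level for simplicity

# Condition the frozen diffusion model on the CLIP image embedding and ask it to
# predict the injected noise; the gradient of this loss flows only into CLIP.
condition = clip_visual(images)
pred_noise = diffusion_model(torch.cat([condition, noisy_images.flatten(1)], dim=1))
loss = nn.functional.mse_loss(pred_noise, noise.flatten(1))

optimizer.zero_grad()
loss.backward()
optimizer.step()
```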
## Model Zoo
| Method | Image Size | Params (M) | Average Score |
|----------------------|------------|------------|---------------|
| [OpenAI ViT-L-14](https://huggingface.co/BAAI/DIVA/blob/main/OpenAICLIP/OpenAI-ViT-L-14-224.pth) | 224² | 427.6 | 25.9 (+6.6) |
| [OpenAI ViT-L-14](https://huggingface.co/BAAI/DIVA/blob/main/OpenAICLIP/OpenAI-ViT-L-14-336.pth) | 336² | 427.9 | 25.2 (+5.2) |
| [MetaCLIP ViT-L-14](https://huggingface.co/BAAI/DIVA/blob/main/MetaCLIP/MetaCLIP-ViT-L-14.pth) | 224² | 427.6 | 27.4 (+3.7) |
| [MetaCLIP ViT-H-14](https://huggingface.co/BAAI/DIVA/blob/main/MetaCLIP/MetaCLIP-ViT-H-14.pth) | 224² | 986.1 | 31.9 (+6.7) |
| [SigLIP ViT-SO-14](https://huggingface.co/BAAI/DIVA/blob/main/SigLIP/SigLIP-ViT-SO-14-224.pth) | 224² | 877.4 | 40.7 (+2.9) |
| [SigLIP ViT-SO-14](https://huggingface.co/BAAI/DIVA/blob/main/SigLIP/SigLIP-ViT-SO-14-384.pth) | 384² | 878.0 | 38.5 (+1.5) |
| [DFN ViT-H-14](https://huggingface.co/BAAI/DIVA/blob/main/DFN/DFN-ViT-H-14-224.pth) | 224² | 986.1 | 43.7 (+4.4) |
| [DFN ViT-H-14](https://huggingface.co/BAAI/DIVA/blob/main/DFN/DFN-ViT-H-14-378.pth) | 378² | 986.7 | 37.8 (+3.0) |
## 📝 Citation
If you find **DIVA** helpful for your research, please consider ***citing***📝 our paper and giving us a GitHub ***star***⭐:
```bib
@article{wang2024diffusion,
title={Diffusion Feedback Helps CLIP See Better},
author={Wang, Wenxuan and Sun, Quan and Zhang, Fan and Tang, Yepeng and Liu, Jing and Wang, Xinlong},
journal={arXiv preprint arXiv:2407.20171},
year={2024}
}
``` | [
"CAS"
] |
pclucas14/expert_phi-2_flan_shared | pclucas14 | null | [
"region:us"
] | "2024-08-07T18:09:51Z" | 2024-08-07T18:09:53+00:00 | 0 | 0 | ---
{}
---
Number of experts present in the library: 1
| Expert Name | Base Model | Trained on | Adapter Type |
| --- | --- | --- | --- |
| joint_flan | phi-2 | sordonia/flan-10k-flat/glue_sst2_2_0_0,dream_read_the_following_conversation_and_answer_the_question,race_middle_Read_the_article_and_answer_the_question_no_option_,adversarial_qa_droberta_generate_question,adversarial_qa_dbidaf_question_context_answer,app_reviews_convert_to_star_rating,race_high_Select_the_best_answer,super_glue_rte_1_0_2,true_case,wiqa_what_might_be_the_first_step_of_the_process,quail_description_context_question_answer_id,quail_context_question_description_text,stream_qed,huggingface_xsum,cos_e_v1_11_question_option_description_text,wiqa_what_is_the_final_step_of_the_following_process,ropes_background_new_situation_answer,wiki_qa_found_on_google,cot_esnli,social_i_qa_Show_choices_and_generate_answer,cot_gsm8k,app_reviews_categorize_rating_using_review,cot_sensemaking,trec_1_0_0,super_glue_wic_1_0_2,ropes_prompt_bottom_no_hint,quartz_answer_question_based_on,super_glue_record_1_0_2,yelp_polarity_reviews_0_2_0,race_middle_Is_this_the_right_answer,quoref_Context_Contains_Answer,cos_e_v1_11_rationale,natural_questions_open_1_0_0,ropes_plain_background_situation,web_questions_whats_the_answer,race_high_Read_the_article_and_answer_the_question_no_option_,anli_r3_0_1_0,duorc_SelfRC_generate_question_by_answer,quoref_Find_Answer,duorc_ParaphraseRC_movie_director,sciq_Direct_Question_Closed_Book_,qasc_qa_with_separated_facts_3,lambada_1_0_0,quartz_given_the_fact_answer_the_q,super_glue_cb_1_0_2,quartz_answer_question_below,duorc_ParaphraseRC_answer_question,wmt16_translate_ro_en_1_0_0,dream_generate_last_utterance,wiki_qa_Topic_Prediction_Answer_Only,kilt_tasks_hotpotqa_final_exam,glue_cola_2_0_0,race_high_Select_the_best_answer_no_instructions_,quail_context_description_question_answer_id,ag_news_subset_1_0_0,paws_wiki_1_1_0,sciq_Multiple_Choice,wiki_qa_Direct_Answer_to_Question,gem_dart_1_1_0,cos_e_v1_11_generate_explanation_given_text,wiki_hop_original_generate_object,race_high_Taking_a_test,wiqa_what_might_be_the_last_step_of_the_process,wiki_bio_key_content,quoref_Found_Context_Online,super_glue_wsc_fixed_1_0_2,wiqa_does_the_supposed_perturbation_have_an_effect,adversarial_qa_droberta_tell_what_it_is,cos_e_v1_11_question_description_option_text,gem_common_gen_1_1_0,quoref_Read_And_Extract_,cot_creak,cot_gsm8k_ii,duorc_ParaphraseRC_title_generation,wiki_qa_Is_This_True_,math_dataset_algebra__linear_1d_1_0_0,unified_qa_science_inst,quartz_use_info_from_question_paragraph,web_questions_question_answer,duorc_ParaphraseRC_decide_worth_it,stream_aqua,dbpedia_14_pick_one_category_for_the_following_text,super_glue_multirc_1_0_2,dbpedia_14_given_a_choice_of_categories_,sciq_Direct_Question,kilt_tasks_hotpotqa_combining_facts,quoref_What_Is_The_Answer,web_questions_short_general_knowledge_q,qasc_qa_with_separated_facts_2,wiqa_which_of_the_following_is_the_supposed_perturbation,cnn_dailymail_3_4_0,duorc_ParaphraseRC_generate_question,race_middle_Select_the_best_answer,kilt_tasks_hotpotqa_straighforward_qa,duorc_SelfRC_build_story_around_qa,adversarial_qa_dbidaf_generate_question,snli_1_1_0,app_reviews_convert_to_rating,wiki_hop_original_choose_best_object_affirmative_3,quail_context_question_description_answer_id,cos_e_v1_11_i_think,quoref_Guess_Title_For_Context,quac_1_0_0,cos_e_v1_11_question_option_description_id,quoref_Answer_Test,wiki_hop_original_choose_best_object_interrogative_1,duorc_SelfRC_question_answering,wiki_hop_original_explain_relation,ropes_new_situation_background_answer,dbpedia_14_given_list_what_category_does_the_paragraph_belong_to,race_high
_Is_this_the_right_answer,quail_description_context_question_answer_text,cot_strategyqa,ropes_given_background_situation,quail_context_question_answer_description_text,cot_ecqa_ii,ropes_prompt_bottom_hint_beginning,gem_wiki_lingua_english_en_1_1_0,glue_qqp_2_0_0,fix_punct,wiqa_effect_with_string_answer,adversarial_qa_droberta_based_on,imdb_reviews_plain_text_1_0_0,race_high_Select_the_best_answer_generate_span_,race_middle_Select_the_best_answer_generate_span_,race_middle_Write_a_multi_choice_question_for_the_following_article,quarel_do_not_use,duorc_SelfRC_title_generation,qasc_qa_with_separated_facts_5,wiki_qa_exercise,duorc_ParaphraseRC_generate_question_by_answer,web_questions_get_the_answer,wiki_hop_original_choose_best_object_affirmative_1,duorc_ParaphraseRC_extract_answer,dream_baseline,adversarial_qa_dbert_answer_the_following_q,gigaword_1_2_0,ropes_prompt_beginning,quail_context_question_answer_description_id,duorc_SelfRC_answer_question,kilt_tasks_hotpotqa_complex_question,quartz_having_read_above_passage,quail_context_description_question_answer_text,cos_e_v1_11_question_description_option_id,ropes_read_background_situation,wiki_hop_original_choose_best_object_interrogative_2,dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to,gem_web_nlg_en_1_1_0,adversarial_qa_droberta_question_context_answer,qasc_qa_with_separated_facts_1,wiki_qa_automatic_system,ropes_plain_bottom_hint,duorc_SelfRC_decide_worth_it,duorc_ParaphraseRC_question_answering,cos_e_v1_11_explain_why_human,word_segment,cot_creak_ii,anli_r2_0_1_0,cos_e_v1_11_description_question_option_text,quarel_heres_a_story,qasc_qa_with_combined_facts_1,app_reviews_generate_review,wiki_bio_what_content,race_high_Write_a_multi_choice_question_for_the_following_article,qasc_is_correct_1,quoref_Answer_Question_Given_Context,squad_v2_0_3_0_0,web_questions_potential_correct_answer,trivia_qa_rc_1_1_0,wmt16_translate_de_en_1_0_0,cos_e_v1_11_description_question_option_id,wiki_hop_original_generate_subject,ropes_plain_no_background,quarel_choose_between,stream_qed_ii,wiki_bio_guess_person,anli_r1_0_1_0,quail_context_description_question_text,cot_ecqa,quail_context_question_description_answer_text,wiki_bio_who,wiki_qa_Topic_Prediction_Question_Only,glue_stsb_2_0_0,cos_e_v1_11_aligned_with_common_sense,aeslc_1_0_0,dream_generate_first_utterance,wmt16_translate_fi_en_1_0_0,adversarial_qa_dbidaf_answer_the_following_q,dream_answer_to_dialogue,glue_qnli_2_0_0,adversarial_qa_droberta_answer_the_following_q,cot_sensemaking_ii,adversarial_qa_dbert_tell_what_it_is,glue_mnli_2_0_0,quail_description_context_question_text,super_glue_copa_1_0_2,social_i_qa_Check_if_a_random_answer_is_valid_or_not,social_i_qa_Generate_the_question_from_the_answer,social_i_qa_Show_choices_and_generate_index,kilt_tasks_hotpotqa_formulate,gem_e2e_nlg_1_1_0,para_crawl_enes,duorc_SelfRC_extract_answer,sciq_Multiple_Choice_Closed_Book_,race_high_Write_a_multi_choice_question_options_given_,race_middle_Taking_a_test,social_i_qa_I_was_wondering,adversarial_qa_dbert_generate_question,quoref_Guess_Answer,race_middle_Write_a_multi_choice_question_options_given_,quartz_use_info_from_paragraph_question,quoref_Answer_Friend_Question,qasc_is_correct_2,wmt14_translate_fr_en_1_0_0,quarel_testing_students,wiki_hop_original_choose_best_object_affirmative_2,qasc_qa_with_separated_facts_4,duorc_SelfRC_movie_director,wiki_qa_Topic_Prediction_Question_and_Answer_Pair,cosmos_qa_1_0_0,cot_esnli_ii,quail_no_prompt_id,wmt16_translate_tr_en_1_0_0,wiki_qa_Decide_good_answer,wiki_qa_Je
opardy_style,adversarial_qa_dbert_based_on,duorc_SelfRC_generate_question,wiki_qa_Generate_Question_from_Topic,wiki_hop_original_generate_subject_and_object,adversarial_qa_dbidaf_based_on,wiqa_what_is_the_missing_first_step,quartz_read_passage_below_choose,definite_pronoun_resolution_1_1_0,quail_no_prompt_text,wiqa_effect_with_label_answer,drop_2_0_0,race_middle_Select_the_best_answer_no_instructions_,glue_wnli_2_0_0,wiki_bio_comprehension,glue_mrpc_2_0_0,cot_qasc,adversarial_qa_dbert_question_context_answer,quoref_Given_Context_Answer_Question,coqa_1_0_0,quartz_paragraph_question_plain_concat,adversarial_qa_dbidaf_tell_what_it_is,ropes_prompt_mix,social_i_qa_Generate_answer,cot_strategyqa_ii,quarel_logic_test,duorc_ParaphraseRC_build_story_around_qa,stream_aqua_ii,multi_news_1_0_0,ropes_background_situation_middle,sciq_Multiple_Choice_Question_First,squad_v1_1_3_0_0 | lora |
Last updated on: 2024-08-07 18:09:51+00:00
| [
"SCIQ"
] |
amd/Meta-Llama-3.1-405B-Instruct-fp8-quark-vllm | amd | null | [
"license:llama3.1",
"region:us"
] | "2024-08-13T04:52:01Z" | 2024-08-14T03:24:24+00:00 | 0 | 1 | ---
license: llama3.1
---
## Introduction
This is a vLLM-compatible FP8 post-training quantized (PTQ) model based on [Meta-Llama-3.1-405B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-405B-Instruct).
For the detailed quantization scheme, refer to the official documentation of the [AMD Quark 0.2.0 quantizer](https://quark.docs.amd.com/latest/index.html).
## Quickstart
To run this FP8 model on the vLLM framework, follow the steps below.
### Model Preparation
1. build the rocm-vllm docker image by using this [dockerfile](https://github.com/ROCm/vllm/blob/main/Dockerfile.rocm) and launch a vllm docker container.
```sh
docker build -f Dockerfile.rocm -t vllm_test .
docker run --rm -it --device=/dev/kfd --device=/dev/dri --group-add video --ipc=host --shm-size 16G vllm_test:latest
```
2. clone the baseline [Meta-Llama-3.1-405B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-405B-Instruct).
3. clone this [fp8 model](https://huggingface.co/amd/Meta-Llama-3.1-405B-Instruct-fp8-quark-vllm) and, inside the [fp8 model](https://huggingface.co/amd/Meta-Llama-3.1-405B-Instruct-fp8-quark-vllm) folder, run this to merge the split llama-*.safetensors shards into a single llama.safetensors (a sketch of what such a merge script typically does is shown after this list).
```sh
python merge.py
```
4. once the merged llama.safetensors is created, move it together with llama.json into the snapshot directory of [Meta-Llama-3.1-405B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-405B-Instruct) using the commands below (your snapshot commit, 069992c75aed59df00ec06c17177e76c63296a26 here, may differ).
```sh
cp llama.json ~/models--meta-llama--Meta-Llama-3.1-405B-Instruct/snapshots/069992c75aed59df00ec06c17177e76c63296a26/.
cp llama.safetensors ~/models--meta-llama--Meta-Llama-3.1-405B-Instruct/snapshots/069992c75aed59df00ec06c17177e76c63296a26/.
```
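For reference, a merge script of this kind typically just loads every llama-*.safetensors shard and writes one combined file. The sketch below only illustrates that idea under the assumption of non-overlapping shards; it is not the repository's actual merge.py.
```python
import glob
from safetensors.torch import load_file, save_file

merged = {}
for shard in sorted(glob.glob("llama-*.safetensors")):
    merged.update(load_file(shard))  # each shard holds a disjoint subset of tensors

save_file(merged, "llama.safetensors")
```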
### Running fp8 model
```sh
# 8 GPUs
torchrun --standalone --nproc_per_node=8 run_vllm_fp8.py
```
```python
# run_vllm_fp8.py
from vllm import LLM, SamplingParams
prompt = "Write me an essay about bear and knight"
model_name="models--meta-llama--Meta-Llama-3.1-405B-Instruct/snapshots/069992c75aed59df00ec06c17177e76c63296a26/"
tp=8 # 8 GPUs
model = LLM(model=model_name, tensor_parallel_size=tp, max_model_len=8192, trust_remote_code=True, dtype="float16", quantization="fp8", quantized_weights_path="/llama.safetensors")
sampling_params = SamplingParams(
top_k=1.0,
ignore_eos=True,
max_tokens=200,
)
result = model.generate(prompt, sampling_params=sampling_params)
print(result)
```
### Running fp16 model (For comparison)
```sh
# 8 GPUs
torchrun --standalone --nproc_per_node=8 run_vllm_fp16.py
```
```python
# run_vllm_fp16.py
from vllm import LLM, SamplingParams
prompt = "Write me an essay about bear and knight"
model_name="models--meta-llama--Meta-Llama-3.1-405B-Instruct/snapshots/069992c75aed59df00ec06c17177e76c63296a26/"
tp=8 # 8 GPUs
model = LLM(model=model_name, tensor_parallel_size=tp, max_model_len=8192, trust_remote_code=True, dtype="bfloat16")
sampling_params = SamplingParams(
top_k=1.0,
ignore_eos=True,
max_tokens=200,
)
result = model.generate(prompt, sampling_params=sampling_params)
print(result)
```
## fp8 gemm_tuning
Will update soon.
#### License
Copyright (c) 2018-2024 Advanced Micro Devices, Inc. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. | [
"BEAR"
] |
amd/Meta-Llama-3.1-8B-Instruct-fp8-quark-vllm | amd | null | [
"license:llama3.1",
"region:us"
] | "2024-08-13T09:07:10Z" | 2024-08-14T03:25:19+00:00 | 0 | 1 | ---
license: llama3.1
---
## Introduction
This is a vLLM-compatible FP8 post-training quantized (PTQ) model based on [Meta-Llama-3.1-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct).
For the detailed quantization scheme, refer to the official documentation of the [AMD Quark 0.2.0 quantizer](https://quark.docs.amd.com/latest/index.html).
## Quickstart
To run this FP8 model on the vLLM framework, follow the steps below.
### Model Preparation
1. build the rocm-vllm docker image by using this [dockerfile](https://github.com/ROCm/vllm/blob/main/Dockerfile.rocm) and launch a vllm docker container.
```sh
docker build -f Dockerfile.rocm -t vllm_test .
docker run --rm -it --device=/dev/kfd --device=/dev/dri --group-add video --ipc=host --shm-size 16G vllm_test:latest
```
2. clone the baseline [Meta-Llama-3.1-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct).
3. clone this [fp8 model](https://huggingface.co/amd/Meta-Llama-3.1-8B-Instruct-fp8-quark-vllm).
4. move llama.safetensors and llama.json from the [fp8 model](https://huggingface.co/amd/Meta-Llama-3.1-8B-Instruct-fp8-quark-vllm) into the snapshot directory of [Meta-Llama-3.1-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct) using the commands below (your snapshot commit, 8c22764a7e3675c50d4c7c9a4edb474456022b16 here, may differ).
```sh
cp llama.json ~/models--meta-llama--Meta-Llama-3.1-8B-Instruct/snapshots/8c22764a7e3675c50d4c7c9a4edb474456022b16/.
cp llama.safetensors ~/models--meta-llama--Meta-Llama-3.1-8B-Instruct/snapshots/8c22764a7e3675c50d4c7c9a4edb474456022b16/.
```
### Running fp8 model
```sh
# single GPU
python run_vllm_fp8.py
# 8 GPUs
torchrun --standalone --nproc_per_node=8 run_vllm_fp8.py
```
```python
# run_vllm_fp8.py
from vllm import LLM, SamplingParams
prompt = "Write me an essay about bear and knight"
model_name="models--meta-llama--Meta-Llama-3.1-8B-Instruct/snapshots/8c22764a7e3675c50d4c7c9a4edb474456022b16/"
tp=1 # single GPU
# tp=8 # set this instead when launching with torchrun --nproc_per_node=8
model = LLM(model=model_name, tensor_parallel_size=tp, max_model_len=8192, trust_remote_code=True, dtype="float16", quantization="fp8", quantized_weights_path="/llama.safetensors")
sampling_params = SamplingParams(
top_k=1.0,
ignore_eos=True,
max_tokens=200,
)
result = model.generate(prompt, sampling_params=sampling_params)
print(result)
```
### Running fp16 model (For comparison)
```sh
# single GPU
python run_vllm_fp16.py
# 8 GPUs
torchrun --standalone --nproc_per_node=8 run_vllm_fp16.py
```
```python
# run_vllm_fp16.py
from vllm import LLM, SamplingParams
prompt = "Write me an essay about bear and knight"
model_name="models--meta-llama--Meta-Llama-3.1-8B-Instruct/snapshots/8c22764a7e3675c50d4c7c9a4edb474456022b16/"
tp=1 # single GPU
# tp=8 # set this instead when launching with torchrun --nproc_per_node=8
model = LLM(model=model_name, tensor_parallel_size=tp, max_model_len=8192, trust_remote_code=True, dtype="bfloat16")
sampling_params = SamplingParams(
top_k=1.0,
ignore_eos=True,
max_tokens=200,
)
result = model.generate(prompt, sampling_params=sampling_params)
print(result)
```
## fp8 gemm_tuning
Will update soon.
#### License
Copyright (c) 2018-2024 Advanced Micro Devices, Inc. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. | [
"BEAR"
] |
amd/Meta-Llama-3.1-70B-Instruct-fp8-quark-vllm | amd | null | [
"license:llama3.1",
"region:us"
] | "2024-08-13T09:26:23Z" | 2024-08-14T03:24:56+00:00 | 0 | 1 | ---
license: llama3.1
---
## Introduction
This is a vLLM-compatible FP8 post-training quantized (PTQ) model based on [Meta-Llama-3.1-70B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-70B-Instruct).
For the detailed quantization scheme, refer to the official documentation of the [AMD Quark 0.2.0 quantizer](https://quark.docs.amd.com/latest/index.html).
## Quickstart
To run this FP8 model on the vLLM framework, follow the steps below.
### Model Preparation
1. build the rocm-vllm docker image by using this [dockerfile](https://github.com/ROCm/vllm/blob/main/Dockerfile.rocm) and launch a vllm docker container.
```sh
docker build -f Dockerfile.rocm -t vllm_test .
docker run --rm -it --device=/dev/kfd --device=/dev/dri --group-add video --ipc=host --shm-size 16G vllm_test:latest
```
2. clone the baseline [Meta-Llama-3.1-70B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-70B-Instruct).
3. clone this [fp8 model](https://huggingface.co/amd/Meta-Llama-3.1-70B-Instruct-fp8-quark-vllm) and, inside the [fp8 model](https://huggingface.co/amd/Meta-Llama-3.1-70B-Instruct-fp8-quark-vllm) folder, run this to merge the split llama-*.safetensors shards into a single llama.safetensors.
```sh
python merge.py
```
4. once the merged llama.safetensors is created, move it together with llama.json into the snapshot directory of [Meta-Llama-3.1-70B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3.1-70B-Instruct) using the commands below (your snapshot commit, 1d54af340dc8906a2d21146191a9c184c35e47bd here, may differ).
```sh
cp llama.json ~/models--meta-llama--Meta-Llama-3.1-70B-Instruct/snapshots/1d54af340dc8906a2d21146191a9c184c35e47bd/.
cp llama.safetensors ~/models--meta-llama--Meta-Llama-3.1-70B-Instruct/snapshots/1d54af340dc8906a2d21146191a9c184c35e47bd/.
```
### Running fp8 model
```sh
# single GPU
python run_vllm_fp8.py
# 8 GPUs
torchrun --standalone --nproc_per_node=8 run_vllm_fp8.py
```
```python
# run_vllm_fp8.py
from vllm import LLM, SamplingParams
prompt = "Write me an essay about bear and knight"
model_name="models--meta-llama--Meta-Llama-3.1-70B-Instruct/snapshots/1d54af340dc8906a2d21146191a9c184c35e47bd/"
tp=1 # single GPU
# tp=8 # set this instead when launching with torchrun --nproc_per_node=8
model = LLM(model=model_name, tensor_parallel_size=tp, max_model_len=8192, trust_remote_code=True, dtype="float16", quantization="fp8", quantized_weights_path="/llama.safetensors")
sampling_params = SamplingParams(
top_k=1.0,
ignore_eos=True,
max_tokens=200,
)
result = model.generate(prompt, sampling_params=sampling_params)
print(result)
```
### Running fp16 model (For comparison)
```sh
# single GPU
python run_vllm_fp16.py
# 8 GPUs
torchrun --standalone --nproc_per_node=8 run_vllm_fp16.py
```
```python
# run_vllm_fp16.py
from vllm import LLM, SamplingParams
prompt = "Write me an essay about bear and knight"
model_name="models--meta-llama--Meta-Llama-3.1-70B-Instruct/snapshots/1d54af340dc8906a2d21146191a9c184c35e47bd/"
tp=1 # single GPU
# tp=8 # set this instead when launching with torchrun --nproc_per_node=8
model = LLM(model=model_name, tensor_parallel_size=tp, max_model_len=8192, trust_remote_code=True, dtype="bfloat16")
sampling_params = SamplingParams(
top_k=1.0,
ignore_eos=True,
max_tokens=200,
)
result = model.generate(prompt, sampling_params=sampling_params)
print(result)
```
## fp8 gemm_tuning
Will update soon.
#### License
Copyright (c) 2018-2024 Advanced Micro Devices, Inc. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. | [
"BEAR"
] |
UMCU/Echocardiogram_Aortic_regurgitation_reduced | UMCU | text-classification | [
"spacy",
"safetensors",
"roberta",
"arxiv:2408.06930",
"medical",
"text-classification",
"nl",
"license:gpl-3.0",
"model-index",
"region:us"
] | "2024-08-14T13:02:19Z" | 2024-08-15T11:24:27+00:00 | 0 | 0 | ---
language:
- nl
license: gpl-3.0
metrics:
- f1
- precision
- recall
pipeline_tag: text-classification
tags:
- spacy
- arxiv:2408.06930
- medical
model-index:
- name: Echocardiogram_Aortic_regurgitation_reduced
results:
- task:
type: text-classification
dataset:
name: internal test set
type: test
metrics:
- type: f1
value: 0.969
name: Macro f1
verified: false
- type: precision
value: 0.971
name: Macro precision
verified: false
- type: recall
value: 0.966
name: Macro recall
verified: false
---
# Description
This model is a [MedRoBERTa.nl](https://huggingface.co/CLTL/MedRoBERTa.nl) model finetuned on Dutch echocardiogram reports sourced from Electronic Health Records.
The publication associated with the span classification task can be found at https://arxiv.org/abs/2408.06930.
The config file for training the model can be found at https://github.com/umcu/echolabeler.
# Minimum working example
```python
from transformers import pipeline
```
```python
le_pipe = pipeline(model="UMCU/Echocardiogram_Aortic_regurgitation_reduced")
document = "Lorem ipsum"
results = le_pipe(document)
```
# Label Scheme
<details>
<summary>View label scheme</summary>
| Component | Labels |
| --- | --- |
| **`reduced`** | `No label`, `Normal`, `Not Normal` |
</details>
Here, for the reduced labels `Present` means that for *any one or multiple* of the pathologies we have a positive result.
Here, for the pathologies we have
<details>
<summary>View pathologies</summary>
| Annotation | Pathology |
| --- | --- |
| pe | Pericardial Effusion |
| wma | Wall Motion Abnormality |
| lv_dil | Left Ventricle Dilation |
| rv_dil | Right Ventricle Dilation |
| lv_syst_func | Left Ventricle Systolic Dysfunction |
| rv_syst_func | Right Ventricle Systolic Dysfunction |
| lv_dias_func | Diastolic Dysfunction |
| aortic_valve_native_stenosis | Aortic Stenosis |
| mitral_valve_native_regurgitation | Mitral valve regurgitation |
| tricuspid_valve_native_regurgitation | Tricuspid regurgitation |
| aortic_valve_native_regurgitation | Aortic Regurgitation |
</details>
Note: `lv_dias_func` should have been named `dias_func`.
# Intended use
The model is developed for *document* classification of Dutch clinical echocardiogram reports.
Since it is a domain-specific model trained on medical data, it is **only** meant to be used on medical NLP tasks for *Dutch echocardiogram reports*.
# Data
The model was trained on approximately 4,000 manually annotated echocardiogram reports from the University Medical Centre Utrecht.
The training data was anonymized before starting the training procedure.
| Feature | Description |
| --- | --- |
| **Name** | `Echocardiogram_SpanCategorizer_aortic_stenosis` |
| **Version** | `1.0.0` |
| **transformers** | `>=4.40.0` |
| **Default Pipeline** | `pipeline`, `text-classification` |
| **Components** | `RobertaForSequenceClassification` |
| **License** | `cc-by-sa-4.0` |
| **Author** | [Bram van Es]() |
# Contact
If you are having problems with this model please add an issue on our git: https://github.com/umcu/echolabeler/issues
# Usage
If you use the model in your work, please cite it using the following reference: https://doi.org/10.48550/arXiv.2408.06930
# References
Paper: Bauke Arends, Melle Vessies, Dirk van Osch, Arco Teske, Pim van der Harst, René van Es, Bram van Es (2024): Diagnosis extraction from unstructured Dutch echocardiogram reports using span- and document-level characteristic classification, Arxiv https://arxiv.org/abs/2408.06930 | [
"MEDICAL DATA"
] |
UMCU/Echocardiogram_Multimodel_reduced | UMCU | text-classification | [
"spacy",
"safetensors",
"roberta",
"arxiv:2408.06930",
"medical",
"text-classification",
"nl",
"license:gpl-3.0",
"model-index",
"region:us"
] | "2024-08-14T13:07:16Z" | 2024-08-15T11:09:57+00:00 | 0 | 0 | ---
language:
- nl
license: gpl-3.0
metrics:
- f1
- precision
- recall
pipeline_tag: text-classification
tags:
- spacy
- arxiv:2408.06930
- medical
model-index:
- name: Echocardiogram_Multimodel_reduced
results:
- task:
type: text-classification
dataset:
name: internal test set
type: test
metrics:
- type: f1
value: 0.946
name: Macro f1
verified: false
- type: precision
value: 0.946
name: Macro precision
verified: false
- type: recall
value: 0.945
name: Macro recall
verified: false
---
# Description
This model is a [MedRoBERTa.nl](https://huggingface.co/CLTL/MedRoBERTa.nl) model finetuned on Dutch echocardiogram reports sourced from Electronic Health Records.
The publication associated with the span classification task can be found at https://arxiv.org/abs/2408.06930.
The config file for training the model can be found at https://github.com/umcu/echolabeler.
# Minimum working example
```python
from transformers import pipeline
```
```python
le_pipe = pipeline(model="UMCU/Echocardiogram_Multimodel_reduced")
document = "Lorem ipsum"
results = le_pipe(document)
```
# Label Scheme
<details>
<summary>View label scheme</summary>
| Component | Labels |
| --- | --- |
| **`bespoke`** | `pe_Present`, `rv_dil_Present`, `wma_Present`, `lv_dil_Present`, `aortic_valve_native_stenosis_Present`, `mitral_valve_native_regurgitation_Present`, `lv_sys_func_Present`, `rv_sys_func_Present`, `aortic_valve_native_regurgitation_Present`, `lv_dias_func_Present`,`Normal_or_No_Label`, `tricuspid_valve_native_regurgitation_Present` |
| **`reduced`** | `Normal_or_No_Label`, `Present` |
</details>
Here, for the reduced labels `Present` means that for *any one or multiple* of the pathologies we have a positive result.
Here, for the pathologies we have
<details>
<summary>View pathologies</summary>
| Annotation | Pathology |
| --- | --- |
| pe | Pericardial Effusion |
| wma | Wall Motion Abnormality |
| lv_dil | Left Ventricle Dilation |
| rv_dil | Right Ventricle Dilation |
| lv_syst_func | Left Ventricle Systolic Dysfunction |
| rv_syst_func | Right Ventricle Systolic Dysfunction |
| lv_dias_func | Diastolic Dysfunction |
| aortic_valve_native_stenosis | Aortic Stenosis |
| mitral_valve_native_regurgitation | Mitral valve regurgitation |
| tricuspid_valve_native_regurgitation | Tricuspid regurgitation |
| aortic_valve_native_regurgitation | Aortic Regurgitation |
</details>
Note: `lv_dias_func` should have been named `dias_func`.
# Intended use
The model is developed for *document* classification of Dutch clinical echocardiogram reports.
Since it is a domain-specific model trained on medical data, it is **only** meant to be used on medical NLP tasks for *Dutch echocardiogram reports*.
# Data
The model was trained on approximately 4,000 manually annotated echocardiogram reports from the University Medical Centre Utrecht.
The training data was anonymized before starting the training procedure.
| Feature | Description |
| --- | --- |
| **Name** | `Echocardiogram_SpanCategorizer_aortic_stenosis` |
| **Version** | `1.0.0` |
| **transformers** | `>=4.40.0` |
| **Default Pipeline** | `pipeline`, `text-classification` |
| **Components** | `RobertaForSequenceClassification` |
| **License** | `cc-by-sa-4.0` |
| **Author** | [Bram van Es]() |
# Contact
If you are having problems with this model please add an issue on our git: https://github.com/umcu/echolabeler/issues
# Usage
If you use the model in your work, please cite it using the following reference: https://doi.org/10.48550/arXiv.2408.06930
# References
Paper: Bauke Arends, Melle Vessies, Dirk van Osch, Arco Teske, Pim van der Harst, René van Es, Bram van Es (2024): Diagnosis extraction from unstructured Dutch echocardiogram reports using span- and document-level characteristic classification, Arxiv https://arxiv.org/abs/2408.06930 | [
"MEDICAL DATA"
] |
UMCU/Echocardiogram_Multimodel_bespoke | UMCU | text-classification | [
"spacy",
"safetensors",
"roberta",
"arxiv:2408.06930",
"medical",
"text-classification",
"nl",
"license:gpl-3.0",
"model-index",
"region:us"
] | "2024-08-14T13:12:12Z" | 2024-08-15T11:07:36+00:00 | 0 | 0 | ---
language:
- nl
license: gpl-3.0
metrics:
- f1
- precision
- recall
pipeline_tag: text-classification
tags:
- spacy
- arxiv:2408.06930
- medical
model-index:
- name: Echocardiogram_Multimodel_bespoke
results:
- task:
type: text-classification
dataset:
name: internal test set
type: test
metrics:
- type: f1
value: 0.922
name: Macro f1
verified: false
- type: precision
value: 0.931
name: Macro precision
verified: false
- type: recall
value: 0.915
name: Macro recall
verified: false
---
# Description
This model is a [MedRoBERTa.nl](https://huggingface.co/CLTL/MedRoBERTa.nl) model finetuned on Dutch echocardiogram reports sourced from Electronic Health Records.
The publication associated with the span classification task can be found at https://arxiv.org/abs/2408.06930.
The config file for training the model can be found at https://github.com/umcu/echolabeler.
# Minimum working example
```python
from transformers import pipeline
```
```python
le_pipe = pipeline(model="UMCU/Echocardiogram_Multimodel_bespoke")
document = "Lorem ipsum"
results = le_pipe(document)
```
# Label Scheme
<details>
<summary>View label scheme</summary>
| Component | Labels |
| --- | --- |
| **`bespoke`** | `pe_Present`, `rv_dil_Present`, `wma_Present`, `lv_dil_Present`, `aortic_valve_native_stenosis_Present`, `mitral_valve_native_regurgitation_Present`, `lv_sys_func_Present`, `rv_sys_func_Present`, `aortic_valve_native_regurgitation_Present`, `lv_dias_func_Present`,`Normal_or_No_Label`, `tricuspid_valve_native_regurgitation_Present` |
| **`reduced`** | `Normal_or_No_Label`, `Present` |
</details>
Here, for the reduced labels `Present` means that for *any one or multiple* of the pathologies we have a positive result.
Here, for the pathologies we have
<details>
<summary>View pathologies</summary>
| Annotation | Pathology |
| --- | --- |
| pe | Pericardial Effusion |
| wma | Wall Motion Abnormality |
| lv_dil | Left Ventricle Dilation |
| rv_dil | Right Ventricle Dilation |
| lv_syst_func | Left Ventricle Systolic Dysfunction |
| rv_syst_func | Right Ventricle Systolic Dysfunction |
| lv_dias_func | Diastolic Dysfunction |
| aortic_valve_native_stenosis | Aortic Stenosis |
| mitral_valve_native_regurgitation | Mitral valve regurgitation |
| tricuspid_valve_native_regurgitation | Tricuspid regurgitation |
| aortic_valve_native_regurgitation | Aortic Regurgitation |
</details>
Note: `lv_dias_func` should have been named `dias_func`.
# Intended use
The model is developed for *document* classification of Dutch clinical echocardiogram reports.
Since it is a domain-specific model trained on medical data, it is **only** meant to be used on medical NLP tasks for *Dutch echocardiogram reports*.
# Data
The model was trained on approximately 4,000 manually annotated echocardiogram reports from the University Medical Centre Utrecht.
The training data was anonymized before starting the training procedure.
| Feature | Description |
| --- | --- |
| **Name** | `Echocardiogram_SpanCategorizer_aortic_stenosis` |
| **Version** | `1.0.0` |
| **transformers** | `>=4.40.0` |
| **Default Pipeline** | `pipeline`, `text-classification` |
| **Components** | `RobertaForSequenceClassification` |
| **License** | `cc-by-sa-4.0` |
| **Author** | [Bram van Es]() |
# Contact
If you are having problems with this model please add an issue on our git: https://github.com/umcu/echolabeler/issues
# Usage
If you use the model in your work, please cite it using the following reference: https://doi.org/10.48550/arXiv.2408.06930
# References
Paper: Bauke Arends, Melle Vessies, Dirk van Osch, Arco Teske, Pim van der Harst, René van Es, Bram van Es (2024): Diagnosis extraction from unstructured Dutch echocardiogram reports using span- and document-level characteristic classification, Arxiv https://arxiv.org/abs/2408.06930 | [
"MEDICAL DATA"
] |
GAD-cell/llama3_combined_model | GAD-cell | null | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"llama",
"trl",
"en",
"base_model:unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit",
"base_model:finetune:unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | "2024-08-14T14:10:36Z" | 2024-08-14T14:10:47+00:00 | 0 | 0 | ---
base_model: unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
---
# Uploaded model
- **Developed by:** GAD-cell
- **License:** apache-2.0
- **Finetuned from model :** unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
| [
"GAD"
] |
Warlord-K/dev-1024 | Warlord-K | text-to-image | [
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"simpletuner",
"lora",
"template:sd-lora",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:creativeml-openrail-m",
"region:us"
] | "2024-08-19T14:44:13Z" | 2024-08-19T16:58:43+00:00 | 0 | 0 | ---
base_model: black-forest-labs/FLUX.1-dev
license: creativeml-openrail-m
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- simpletuner
- lora
- template:sd-lora
inference: true
widget:
- text: unconditional (blank prompt)
parameters:
negative_prompt: blurry, cropped, ugly
output:
url: ./assets/image_0_0.png
- text: gorkem holding a sign that says 'I LOVE FAL!'
parameters:
negative_prompt: blurry, cropped, ugly
output:
url: ./assets/image_1_0.png
- text: gorkem with red hair, playing chess at the park, bomb going off in the background
parameters:
negative_prompt: blurry, cropped, ugly
output:
url: ./assets/image_2_0.png
- text: a gorkem holding a coffee cup, in a beanie, sitting at a cafe
parameters:
negative_prompt: blurry, cropped, ugly
output:
url: ./assets/image_3_0.png
- text: gorkem showing off his cool new t shirt at the beach, a shark is jumping out
of the water in the background
parameters:
negative_prompt: blurry, cropped, ugly
output:
url: ./assets/image_4_0.png
- text: a bear building a log cabin in the snow covered mountains
parameters:
negative_prompt: blurry, cropped, ugly
output:
url: ./assets/image_5_0.png
- text: woman playing the guitar, on stage, singing a song, laser lights, punk rocker
parameters:
negative_prompt: blurry, cropped, ugly
output:
url: ./assets/image_6_0.png
- text: hipster man with a beard, building a chair, in a wood shop
parameters:
negative_prompt: blurry, cropped, ugly
output:
url: ./assets/image_7_0.png
- text: photo of gorkem, white background, medium shot, modeling clothing, studio
lighting, white backdrop
parameters:
negative_prompt: blurry, cropped, ugly
output:
url: ./assets/image_8_0.png
- text: gorkem with red hair, playing chess at the park, bomb going off in the background
parameters:
negative_prompt: blurry, cropped, ugly
output:
url: ./assets/image_9_0.png
---
# dev-1024
This is a LoRA derived from [black-forest-labs/FLUX.1-dev](https://huggingface.co/black-forest-labs/FLUX.1-dev).
The main validation prompt used during training was:
```
gorkem with red hair, playing chess at the park, bomb going off in the background
```
## Validation settings
- CFG: `3.0`
- CFG Rescale: `0.0`
- Steps: `20`
- Sampler: `None`
- Seed: `42`
- Resolution: `1024x1024`
Note: The validation settings are not necessarily the same as the [training settings](#training-settings).
You can find some example images in the following gallery:
<Gallery />
The text encoder **was not** trained.
You may reuse the base model text encoder for inference.
## Training settings
- Training epochs: 142
- Training steps: 1000
- Learning rate: 0.0004
- Effective batch size: 1
- Micro-batch size: 1
- Gradient accumulation steps: 1
- Number of GPUs: 1
- Prediction type: flow-matching
- Rescaled betas zero SNR: False
- Optimizer: adamw_bf16
- Precision: bf16
- Quantised: No
- Xformers: Not used
- LoRA Rank: 16
- LoRA Alpha: None
- LoRA Dropout: 0.1
- LoRA initialisation style: default
## Datasets
### gorkem_1024
- Repeats: 0
- Total number of images: 7
- Total number of aspect buckets: 1
- Resolution: 1.048576 megapixels
- Cropped: False
- Crop style: None
- Crop aspect: None
## Inference
```python
import torch
from diffusers import DiffusionPipeline
model_id = 'black-forest-labs/FLUX.1-dev'
adapter_id = 'Warlord-K/dev-1024'
pipeline = DiffusionPipeline.from_pretrained(model_id)
pipeline.load_lora_weights(adapter_id)
prompt = "gorkem with red hair, playing chess at the park, bomb going off in the background"
pipeline.to('cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu')
image = pipeline(
prompt=prompt,
num_inference_steps=20,
generator=torch.Generator(device='cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu').manual_seed(1641421826),
width=1024,
height=1024,
guidance_scale=3.0,
).images[0]
image.save("output.png", format="PNG")
```
| [
"BEAR"
] |
GAD-cell/history_Bryan_Llama3 | GAD-cell | null | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"llama",
"trl",
"en",
"base_model:unsloth/llama-3-8b-Instruct-bnb-4bit",
"base_model:finetune:unsloth/llama-3-8b-Instruct-bnb-4bit",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | "2024-08-20T09:44:35Z" | 2024-08-23T12:54:15+00:00 | 0 | 0 | ---
base_model: unsloth/llama-3-8b-Instruct-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
---
# Uploaded model
- **Developed by:** GAD-cell
- **License:** apache-2.0
- **Finetuned from model :** unsloth/llama-3-8b-Instruct-bnb-4bit
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
| [
"GAD"
] |
knowledgator/gliner-poly-base-v1.0 | knowledgator | token-classification | [
"gliner",
"NER",
"GLiNER",
"information extraction",
"entity extraction",
"encoder",
"token-classification",
"multilingual",
"dataset:urchade/pile-mistral-v0.1",
"dataset:numind/NuNER",
"dataset:knowledgator/GLINER-multi-task-synthetic-data",
"license:apache-2.0",
"region:us"
] | "2024-08-20T10:54:57Z" | 2024-08-20T13:59:29+00:00 | 0 | 3 | ---
datasets:
- urchade/pile-mistral-v0.1
- numind/NuNER
- knowledgator/GLINER-multi-task-synthetic-data
language:
- multilingual
library_name: gliner
license: apache-2.0
pipeline_tag: token-classification
tags:
- NER
- GLiNER
- information extraction
- entity extraction
- encoder
---
# About
GLiNER is a Named Entity Recognition (NER) model capable of identifying any entity type using a bidirectional transformer encoders (BERT-like). It provides a practical alternative to traditional NER models, which are limited to predefined entities, and Large Language Models (LLMs) that, despite their flexibility, are costly and large for resource-constrained scenarios.
This particular version utilize bi-encoder architecture with post-fusion, where textual encoder is [DeBERTa v3 base](microsoft/deberta-v3-base) and entity label encoder is sentence transformer - [BGE-small-en](https://huggingface.co/BAAI/bge-small-en-v1.5).
Such architecture brings several advantages over uni-encoder GLiNER:
* An unlimited amount of entities can be recognized at a single time;
* Faster inference if entity embeddings are preprocessed;
* Better generalization to unseen entities;
Post fusion strategy brings advantages over classical bi-encoder enabling better inter-label understanding.
### Usage
Once you've downloaded the GLiNER library, you can import the GLiNER class. You can then load this model using `GLiNER.from_pretrained` and predict entities with `predict_entities`.
```python
from gliner import GLiNER
model = GLiNER.from_pretrained("knowledgator/gliner-poly-base-v1.0")
text = """
Cristiano Ronaldo dos Santos Aveiro (Portuguese pronunciation: [kɾiʃˈtjɐnu ʁɔˈnaldu]; born 5 February 1985) is a Portuguese professional footballer who plays as a forward for and captains both Saudi Pro League club Al Nassr and the Portugal national team. Widely regarded as one of the greatest players of all time, Ronaldo has won five Ballon d'Or awards,[note 3] a record three UEFA Men's Player of the Year Awards, and four European Golden Shoes, the most by a European player. He has won 33 trophies in his career, including seven league titles, five UEFA Champions Leagues, the UEFA European Championship and the UEFA Nations League. Ronaldo holds the records for most appearances (183), goals (140) and assists (42) in the Champions League, goals in the European Championship (14), international goals (128) and international appearances (205). He is one of the few players to have made over 1,200 professional career appearances, the most by an outfield player, and has scored over 850 official senior career goals for club and country, making him the top goalscorer of all time.
"""
labels = ["person", "award", "date", "competitions", "teams"]
entities = model.predict_entities(text, labels, threshold=0.3)
for entity in entities:
print(entity["text"], "=>", entity["label"])
```
```
Cristiano Ronaldo dos Santos Aveiro => person
5 February 1985 => date
Al Nassr => teams
Portugal national team => teams
Ballon d'Or => award
UEFA Men's Player of the Year Awards => award
European Golden Shoes => award
UEFA Champions Leagues => competitions
UEFA European Championship => competitions
UEFA Nations League => competitions
Champions League => competitions
European Championship => competitions
```
If you have a large amount of entities and want to pre-embed them, please, refer to the following code snippet:
```python
labels = ["your entities"]
texts = ["your texts"]
entity_embeddings = model.encode_labels(labels, batch_size = 8)
outputs = model.batch_predict_with_embeds([text], entity_embeddings, labels)
```
### Benchmarks
Below you can see the table with benchmarking results on various named entity recognition datasets:
| Dataset | Score |
|---------|-------|
| ACE 2004 | 25.4% |
| ACE 2005 | 27.2% |
| AnatEM | 17.7% |
| Broad Tweet Corpus | 70.2% |
| CoNLL 2003 | 67.8% |
| FabNER | 22.9% |
| FindVehicle | 40.2% |
| GENIA_NER | 47.7% |
| HarveyNER | 15.5% |
| MultiNERD | 64.5% |
| Ontonotes | 28.7% |
| PolyglotNER | 47.5% |
| TweetNER7 | 39.3% |
| WikiANN en | 56.7% |
| WikiNeural | 80.0% |
| bc2gm | 56.2% |
| bc4chemd | 48.7% |
| bc5cdr | 60.5% |
| ncbi | 53.5% |
| **Average** | **45.8%** |
|||
| CrossNER_AI | 48.9% |
| CrossNER_literature | 64.0% |
| CrossNER_music | 68.7% |
| CrossNER_politics | 69.0% |
| CrossNER_science | 62.7% |
| mit-movie | 40.3% |
| mit-restaurant | 36.2% |
| **Average (zero-shot benchmark)** | **55.7%** |
### Join Our Discord
Connect with our community on Discord for news, support, and discussion about our models. Join [Discord](https://discord.gg/dkyeAgs9DG). | [
"ANATEM",
"BC5CDR"
] |
zwloong/sd3-lora-training-rank32 | zwloong | text-to-image | [
"diffusers",
"sd3",
"sd3-diffusers",
"text-to-image",
"simpletuner",
"lora",
"template:sd-lora",
"base_model:stabilityai/stable-diffusion-3-medium-diffusers",
"base_model:adapter:stabilityai/stable-diffusion-3-medium-diffusers",
"license:other",
"region:us"
] | "2024-08-22T11:32:00Z" | 2024-08-22T12:11:52+00:00 | 0 | 1 | ---
base_model: stabilityai/stable-diffusion-3-medium-diffusers
license: other
tags:
- sd3
- sd3-diffusers
- text-to-image
- diffusers
- simpletuner
- lora
- template:sd-lora
inference: true
widget:
- text: unconditional (blank prompt)
parameters:
negative_prompt: blurry, cropped, ugly
output:
url: ./assets/image_0_0.png
- text: ethnographic photography of teddy bear at a picnic
parameters:
negative_prompt: blurry, cropped, ugly
output:
url: ./assets/image_1_0.png
---
# sd3-lora-training-rank32
This is a standard PEFT LoRA derived from [stabilityai/stable-diffusion-3-medium-diffusers](https://huggingface.co/stabilityai/stable-diffusion-3-medium-diffusers).
The main validation prompt used during training was:
```
ethnographic photography of teddy bear at a picnic
```
## Validation settings
- CFG: `4.0`
- CFG Rescale: `0.0`
- Steps: `30`
- Sampler: `None`
- Seed: `42`
- Resolution: `1024x1024`
Note: The validation settings are not necessarily the same as the [training settings](#training-settings).
You can find some example images in the following gallery:
<Gallery />
The text encoder **was not** trained.
You may reuse the base model text encoder for inference.
## Training settings
- Training epochs: 16
- Training steps: 600
- Learning rate: 8e-07
- Effective batch size: 2
- Micro-batch size: 1
- Gradient accumulation steps: 2
- Number of GPUs: 1
- Prediction type: flow-matching
- Rescaled betas zero SNR: False
- Optimizer: adamw_bf16
- Precision: bf16
- Quantised: No
- Xformers: Not used
- LoRA Rank: 32
- LoRA Alpha: None
- LoRA Dropout: 0.1
- LoRA initialisation style: default
## Datasets
### Pal
- Repeats: 0
- Total number of images: 73
- Total number of aspect buckets: 1
- Resolution: 1.048576 megapixels
- Cropped: True
- Crop style: center
- Crop aspect: square
## Inference
```python
import torch
from diffusers import DiffusionPipeline
model_id = 'stabilityai/stable-diffusion-3-medium-diffusers'
adapter_id = 'zwloong/sd3-lora-training-rank32'
pipeline = DiffusionPipeline.from_pretrained(model_id)
pipeline.load_lora_weights(adapter_id)
prompt = "ethnographic photography of teddy bear at a picnic"
negative_prompt = 'blurry, cropped, ugly'
pipeline.to('cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu')
image = pipeline(
prompt=prompt,
negative_prompt=negative_prompt,
num_inference_steps=30,
generator=torch.Generator(device='cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu').manual_seed(1641421826),
width=1024,
height=1024,
guidance_scale=4.0,
).images[0]
image.save("output.png", format="PNG")
```
| [
"BEAR"
] |
Carmen000/dreambooth_backpack_bear_plushie | Carmen000 | text-to-image | [
"diffusers",
"safetensors",
"text-to-image",
"dreambooth",
"diffusers-training",
"stable-diffusion",
"stable-diffusion-diffusers",
"base_model:CompVis/stable-diffusion-v1-4",
"base_model:finetune:CompVis/stable-diffusion-v1-4",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] | "2024-08-22T15:21:10Z" | 2024-08-22T15:44:00+00:00 | 0 | 0 | ---
base_model: CompVis/stable-diffusion-v1-4
library_name: diffusers
license: creativeml-openrail-m
tags:
- text-to-image
- dreambooth
- diffusers-training
- stable-diffusion
- stable-diffusion-diffusers
inference: true
instance_prompt: a photo of sks teddy bear
---
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# DreamBooth - Carmen000/dreambooth_backpack_bear_plushie
This is a dreambooth model derived from CompVis/stable-diffusion-v1-4. The weights were trained on a photo of sks teddy bear using [DreamBooth](https://dreambooth.github.io/).
You can find some example images below.
DreamBooth for the text encoder was enabled: False.
## Intended uses & limitations
#### How to use
```python
# TODO: add an example code snippet for running this diffusion pipeline
```
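The training script left this as a TODO. A minimal sketch using the standard diffusers API is given below, assuming the repository contains the full fine-tuned pipeline (as its diffusers tags suggest); the prompt reuses the card's instance prompt, and the picnic setting is purely illustrative.

```python
import torch
from diffusers import StableDiffusionPipeline

# Load the DreamBooth-fine-tuned pipeline directly from this repository.
pipeline = StableDiffusionPipeline.from_pretrained(
    "Carmen000/dreambooth_backpack_bear_plushie", torch_dtype=torch.float16
)
pipeline.to("cuda")

# "sks teddy bear" is the instance prompt this model was trained with.
image = pipeline(
    "a photo of sks teddy bear at a picnic", num_inference_steps=50
).images[0]
image.save("sks_teddy_bear.png")
```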
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training details
[TODO: describe the data used to train the model] | [
"BEAR"
] |
csycsycsy/MethSurvPredictor | csycsycsy | null | [
"arxiv:1207.0580",
"license:mit",
"region:us"
] | "2024-08-23T12:45:03Z" | 2024-08-23T14:02:20+00:00 | 0 | 0 | ---
license: mit
---
# MethSurvPredictor: Predicting Prognosis for LGG Patients Using Methylation Data

MethSurvPredictor is a machine learning model designed to predict the prognosis of Low-Grade Glioma (LGG) patients through the analysis of methylation data. The model utilizes a Multi-Layer Perceptron (MLP) architecture to deliver precise prognostic predictions, making it a valuable tool for both researchers and clinicians.

## Model Overview

**Authors**
- First Author & Corresponding Author: Shuaiyu Chen
- Co-First Authors: Jingyu Chen, Yuheng Guan

**Architecture:** Multi-Layer Perceptron (MLP). The model includes multiple layers, each with dropout (0.5), and early stopping was applied during training to prevent overfitting.

**Data Processing:** Due to the complexity and volume of methylation data, Principal Component Analysis (PCA) was performed, selecting 50 principal components as input for the model.

**Training Strategy:** The model was trained on the TCGA-GDC LGG methylation dataset, with a random split into training and validation sets. GPU acceleration is supported to improve training efficiency.

**Dependencies**
- Libraries: pandas, numpy, matplotlib, scikit-learn, scipy, torch
- Implemented in PyTorch, compatible with both CPU and GPU environments.
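The setup described above (50 PCA components as input, dropout 0.5, early stopping on a validation split) maps onto a fairly standard PyTorch loop. The sketch below only illustrates that setup on synthetic data — the layer sizes, learning rate, and patience are assumptions, and the project's actual training code lives in methane.py.

```python
import torch
import torch.nn as nn

# Synthetic stand-in data: 200 patients x 50 PCA components, target = survival time.
X, y = torch.randn(200, 50), torch.randn(200, 1)
X_train, y_train, X_val, y_val = X[:160], y[:160], X[160:], y[160:]

# MLP with dropout 0.5 after each hidden layer (hidden sizes are assumptions).
model = nn.Sequential(
    nn.Linear(50, 128), nn.ReLU(), nn.Dropout(0.5),
    nn.Linear(128, 64), nn.ReLU(), nn.Dropout(0.5),
    nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

best_val, patience, bad_epochs = float("inf"), 50, 0
for epoch in range(10_000):
    model.train()
    optimizer.zero_grad()
    loss_fn(model(X_train), y_train).backward()
    optimizer.step()

    model.eval()
    with torch.no_grad():
        val_loss = loss_fn(model(X_val), y_val).item()

    # Early stopping: keep the best checkpoint, stop once validation stops improving.
    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0
        torch.save(model.state_dict(), "best_model.pth")  # mirrors the shipped artifact name
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            break
```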
## Performance Metrics

- R²: ~0.5
- Spearman Correlation: 0.72
- P-value: 1.9e-05

MethSurvPredictor outperforms traditional Kaplan-Meier (KM) models by providing more accurate numerical predictions, making it a crucial tool for clinical decision-making.

## Application and Usage

**Implementation:** The model can be easily implemented using PyTorch, loading the best_model.pth file for the best-trained weights. Pre-trained PCA components (pca_model.pkl) ensure consistent data preprocessing.

**Output Interpretation:** The model generates a continuous prognostic score, which can be interpreted to provide nuanced insights into patient outcomes.
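A hedged sketch of what inference with the shipped artifacts might look like. The file names (pca_model.pkl, best_model.pth) come from the project-file list below; the MLP class here is a placeholder whose layer sizes are assumptions, so in practice the architecture (and the loader for the PCA object) must match what methane.py and PCA.py actually used, and new_patients.csv is a hypothetical input file.

```python
import pickle

import numpy as np
import pandas as pd
import torch
import torch.nn as nn


class MLP(nn.Module):
    """Placeholder architecture; the real one is defined in methane.py."""

    def __init__(self, in_features=50, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)


device = "cuda" if torch.cuda.is_available() else "cpu"

# Reuse the pre-trained PCA so new samples are projected into the same 50-D space
# (adjust the loader if pca_model.pkl was written with joblib instead of pickle).
with open("pca_model.pkl", "rb") as f:
    pca = pickle.load(f)

methylation = pd.read_csv("new_patients.csv", index_col=0)  # samples x probes, training probe order
features = pca.transform(methylation.values).astype(np.float32)

model = MLP(in_features=features.shape[1]).to(device)
model.load_state_dict(torch.load("best_model.pth", map_location=device))
model.eval()

with torch.no_grad():
    scores = model(torch.from_numpy(features).to(device)).squeeze(1).cpu().numpy()

print(scores)  # one continuous prognostic score per patient
```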
## Model Extensions and Customization

**Scalability:** The model is designed to be easily extended to other cancer types or additional data features, such as miRNA, CNV, and SNV, to enhance predictive accuracy.

**Hyperparameter Tuning:** The hyperparameters.py script can be used to fine-tune model parameters, optimizing performance based on specific datasets.

## Model Explainability

**Interpreting the Model’s Decisions:** Understanding the model’s decision-making process is crucial, especially for clinical applications. The radar charts generated by webplot.py help visualize feature importance, providing insights into which factors contribute most to the model’s predictions.

## Data Quality and Ethics

**Data Cleaning and Quality Control:** Key steps in data cleaning and quality control should be considered, such as handling missing values, outlier detection, and ensuring data balance, to maintain the integrity of the model’s predictions.

**Ethical Considerations:** Handling patient data involves ethical responsibilities. Privacy and data security should be prioritized, adhering to ethical standards in research involving human subjects.

## Limitations

**Feature Limitation:** The model currently relies solely on methylation data, which may limit its generalizability.

**Demographic Bias:** The model is primarily trained on data from White patients in TCGA, which may affect its applicability across diverse populations.
## Project Files
- `Training_Data_Generation.R`: Generates the data.csv file for training.
- `webplot.py`: Plots radar charts to visualize model performance or feature importance.
- `use_pre_pca.py`: Applies the pre-trained PCA model.
- `PCA.py`: Performs PCA on methylation data.
- `methane.py`: Trains the MLP model.
- `hyperparameters.py`: Optimizes model hyperparameters.
- `best_model.pth`: Contains the best model weights.
- `pca_model.pkl`: Contains the pre-trained PCA model.
- `pca_principal_components.csv`: Stores the PCA principal components.
- `data.csv`: Contains the training data.

## Keywords
LGG, Prognosis, Methylation, Machine Learning, TCGA
## Visualization and Results

### Model Training and Performance Overview (Figure 1)
Model Architecture: Figure 1a illustrates the architecture of the MethSurvPredictor model, which is a Multi-Layer Perceptron (MLP) with multiple interconnected layers. Each node represents a neuron, and the lines between nodes depict the synaptic weights learned during training. This architecture visualization helps in understanding the complexity of the model and how it captures intricate patterns within the methylation data.

Training and Test Loss: Figure 1b displays the training and test loss over 10,000 epochs. The blue line represents the training loss, the green line represents the test loss, and the dashed lines indicate their moving averages. The steady decrease in loss over time suggests that the model is effectively learning and generalizing from the data. The convergence of these losses indicates that the model achieves optimal performance while avoiding overfitting.

R² Score Progression: Figure 1c shows the progression of the R² score during training. The R² score gradually increases and stabilizes around 0.5, indicating a moderate correlation between the model’s predictions and the actual data. This demonstrates that the model successfully captures significant patterns in the input data and explains a substantial portion of the variance in the output.

Predictions vs. Actual Values: Figure 1d presents a scatter plot comparing the predicted overall survival (OS) times with the actual OS times. The results show a strong correlation coefficient of r=0.72 (p=1.9e-05), indicating that the model is highly accurate in its predictions. Although there is some variance, most points lie close to the ideal fit line (red dashed line), confirming the model’s overall accuracy.

Actual vs. Predicted Values Over Time: Figure 1e is a 3D plot that compares predicted OS times with actual OS times across different sample indices. The peaks and troughs represent variations in survival times, with the blue line indicating the predicted values. The close alignment of these peaks and troughs with the actual data highlights the model's ability to track changes in patient survival outcomes over time.


### Model Weight Analysis and Feature Importance (Figure 2)
Layer 5 Weight Distribution and Heatmap: Figure 2a shows the weight distribution in Layer 5, where weights are mostly concentrated in the lower range, with some extending to higher values. This indicates that while most connections are relatively weak, a few strong connections may play a crucial role in the model’s decision-making process. Figure 2b, the weight heatmap of Layer 5, highlights the intensity of these connections, with brighter areas (yellow lines) indicating stronger connections, pointing to the most influential features in the model’s predictions.


Layer 3 Weight Distribution and Heatmap: Figure 2c shows a more balanced weight distribution in Layer 3, suggesting a more even contribution of various connections in this layer. The heatmap in Figure 2d reveals a more diffuse pattern of connections, with fewer distinct strong connections, indicating that Layer 3 is likely involved in refining the feature representations passed down from previous layers.


Layer 1 Weight Distribution and Heatmap: Figure 2e depicts the weight distribution in Layer 1, which shows a near-normal distribution centered around zero. This balanced distribution is typical for initial layers in a neural network, where the model begins learning basic patterns from the input data. Figure 2f’s heatmap shows a broad distribution of weights with no single feature dominating, indicating that Layer 1 is learning a wide variety of basic patterns from the data.


Feature Importance in the First Layer: Figure 2g ranks the importance of different features in the first layer based on the absolute weights of the connections. The top-ranked features stand out significantly from the rest, suggesting that certain methylation sites or components play a critical role in determining the survival outcomes of LGG patients.

Prediction Error Distribution: Figure 2h presents the distribution of prediction errors across the dataset, with the distribution centered around zero and relatively symmetrical. This symmetry suggests that the model does not have a systematic bias in its predictions. However, the presence of some outliers indicates that while the model is generally accurate, there are instances where predictions deviate significantly from the actual values.

## Summary of Results

The MethSurvPredictor model, built on an MLP architecture, demonstrates strong predictive capabilities for LGG patient prognosis using methylation data. The results, including the decrease in training and test losses, stabilization of the R² score, and the strong correlation between predicted and actual OS times, confirm that this model is an effective tool for clinical decision-making. Additionally, the detailed analysis of weight distributions and heatmaps provides further insight into how the model processes and prioritizes input features, leading to high-precision predictions.

## References
Weinstein, J. N., et al. (2013). The Cancer Genome Atlas Pan-Cancer analysis project. Nature Genetics, 45(10), 1113-1120.
Goldman, M., et al. (2020). Visualizing and interpreting cancer genomics data via the Xena platform. Nature Biotechnology, 38(6), 675-678.
Paszke, A., et al. (2019). PyTorch: An Imperative Style, High-Performance Deep Learning Library. In Advances in Neural Information Processing Systems (pp. 8024-8035).
Kaplan, E. L., & Meier, P. (1958). Nonparametric estimation from incomplete observations. Journal of the American Statistical Association, 53(282), 457-481.
Jolliffe, I. T. (2002). Principal Component Analysis (2nd ed.). Springer.
Hinton, G. E., et al. (2012). Improving neural networks by preventing co-adaptation of feature detectors. arXiv preprint arXiv:1207.0580. | [
"MIRNA"
] |
Babelscape/FENICE | Babelscape | null | [
"en",
"dataset:Babelscape/story-summeval",
"license:cc-by-nc-sa-4.0",
"region:us"
] | "2024-08-25T09:29:02Z" | 2024-08-29T14:32:21+00:00 | 0 | 6 | ---
datasets:
- Babelscape/story-summeval
language:
- en
license: cc-by-nc-sa-4.0
---
<div align="center">
<img src="https://github.com/Babelscape/FENICE/blob/master/new_logo.png?raw=True" height="200" width="200">
</div>
# Factuality Evaluation of summarization based on Natural Language Inference and Claim Extraction
[](https://2024.aclweb.org/)
[](https://aclanthology.org/2024.findings-acl.841.pdf)
[](https://creativecommons.org/licenses/by-nc-sa/4.0/)
<div align='center'>
<img src="https://github.com/Babelscape/FENICE/blob/master/Sapienza_Babelscape.png?raw=True" height="70">
</div>
FENICE (Factuality Evaluation of Summarization based on Natural Language Inference and Claim Extraction) is a factuality-oriented metric for summarization.
This package implements the FENICE metric, allowing users to evaluate the factual consistency of document summaries.
## Overview
Factual consistency in summarization is critical for ensuring that the generated summaries accurately reflect the content of the original documents.
FENICE leverages NLI and claim extraction techniques to assess the factual alignment between a summary and its corresponding document.
For more details, you can read the full paper: [FENICE: Factuality Evaluation of Summarization based on Natural Language Inference and Claim Extraction](https://aclanthology.org/2024.findings-acl.841/).
## 🛠️ Installation
Create a conda environment:
```sh
conda create -n FENICE python=[PYTHON_VERSION]
conda activate FENICE
```
To install the FENICE package, you can use `pip`:
```sh
pip install FENICE
```
## Requirements
The package requires the following dependencies:
- spacy==3.7.4
- fastcoref==2.1.6
- transformers~=4.38.2
- sentencepiece==0.2.0
- scikit-learn==1.5.0
These will be installed automatically when you install the FENICE package.
## Usage
Here’s how you can use the FENICE package to evaluate the factual consistency of document summaries:
```python
from metric.FENICE import FENICE
fenice = FENICE()
document = '''
Simone Biles’ Olympic return is off to a sparkling start at Paris 2024 as the Americans competed in women’s qualifying Sunday (28 July). The U.S. is well in front with a total team score of 172.296, followed by Italy some 5.435 points back at 166.861. The People’s Republic of China is third with a 166.628.
Reigning world silver medallists Brazil competed in the day’s final subdivision and sit fourth (166.499). In the all-around, Biles, the 2016 gold medallist, scored 59.566 ahead of 2022 world all-around champion Rebeca Andrade (57.700). Reigning champ Suni Lee was third, posting a 56.132. Jordan Chiles earned the fourth highest score at 56.065 but won’t advance to Thursday’s (1 August) all-around final due to two-per-country restrictions. Algeria’s Kaylia Nemour rounds out the top five.
Three years ago, the American withdrew from the women’s team final and four subsequent individual finals at Tokyo 2020 to prioritize her mental health as she dealt with the ‘twisties.’ That seemed like a distant memory Sunday.
Biles, 27, entered Bercy Arena to massive applause, looking relaxed as she smiled and waved to the audience. She looked even more relaxed on the balance beam where in the span of some 79 seconds, she put on a clinic, executing a near flawless routine that included a two layout stepout series and a full-twisting double back dismount. Biles earned a 14.733 for the routine.
In the warm-up for the second rotation on the floor exercise, Biles appeared to tweak her left ankle on her Biles I (double layout half out). When she took to the mat for her competitive routine, her ankle was heavily taped. She delivered a solid, if not bouncy, routine on the event for a 14.666.
As she came off the podium, coach Cecile Landi, a 1996 Olympian for France, asked if she was OK. Biles confirmed she was. But the uncertainty continued through the vault warm-up where at one point she crawled nearly two-thirds of the way back to the starting position before hopping on her right leg.
Later, Biles could be seen waving to her parents, Ron and Nellie, as well as sharing a laugh and several smiles with Landi. When it came time for competition, there was no hint of an issue as she boomed her trademark Yurchenko double pike to the rafters, needing several steps backward to control it. She earned a massive 15.800.
“She felt a little something in her calf. That’s all,” Landi told reporters afterward, adding that Biles was not thinking of leaving the competition. “Never in her mind.” The injury, Landi explained, had popped up a few weeks ago but had subsided in the training leading to Paris. “It felt better at the end [of competition today],” she said later. “On bars, it started to feel better.”
Biles closed out her spectacular on the uneven bars with a hit set and a 14.433, the relief pouring out through her megawatt smile. She embraced coach Laurent Landi before stopping near a scoreboard to soak in the moment. The crowd roared, acknowledging her spectacular return. Before leaving the podium, she blew kisses and waved to her adoring fans.
The American is now set for five of six medal rounds, advancing with the team, in the all-around, and on the vault, beam and floor exercise. “It was pretty amazing. 59.5, and four-for-four. Not perfect,” Landi assessed her pupil’s performance. “She still can improve even.”
'''
summary = 'Simone Biles made a triumphant return to the Olympic stage at the Paris 2024 Games, competing in the women’s gymnastics qualifications. Overcoming a previous struggle with the “twisties” that led to her withdrawal from events at the Tokyo 2020 Olympics, Biles dazzled with strong performances on all apparatus, helping the U.S. team secure a commanding lead in the qualifications. Her routines, including a near-flawless balance beam performance and a powerful Yurchenko double pike vault, showcased her resilience and skill, drawing enthusiastic support from a star-studded audience'
batch = [
{"document": document, "summary": summary}
]
results = fenice.score_batch(batch)
print(results)
[{'score': 0.8484095427307433, 'alignments': [{'score': 0.9968091815244406, 'summary_claim': 'Simone Biles made a triumphant return to the Olympic stage at the Paris 2024 Games.', 'source_passage': '\n Simone Biles’ Olympic return is off to a sparkling start at Paris 2024 as the Americans competed in women’s qualifying Sunday (28 July). The U.S. is well in front with a total team score of 172.296, followed by Italy some 5.435 points back at 166.861. The People’s Republic of China is third with a 166.628.\n \n Reigning world silver medallists Brazil competed in the day’s final subdivision and sit fourth (166.499). In the all-around, Biles, the 2016 gold medallist, scored 59.566 ahead of 2022 world all-around champion Rebeca Andrade (57.700).'}, {'score': 0.9985068442765623, 'summary_claim': 'Biles competed in the women’s gymnastics qualifications.', 'source_passage': '\n Simone Biles’ Olympic return is off to a sparkling start at Paris 2024 as the Americans competed in women’s qualifying Sunday (28 July). The U.S. is well in front with a total team score of 172.296, followed by Italy some 5.435 points back at 166.861. The People’s Republic of China is third with a 166.628.\n \n Reigning world silver medallists Brazil competed in the day’s final subdivision and sit fourth (166.499). In the all-around, Biles, the 2016 gold medallist, scored 59.566 ahead of 2022 world all-around champion Rebeca Andrade (57.700).'}, {'score': 0.9983009036513977, 'summary_claim': "Biles overcame a previous struggle with the 'twisties' that led to her withdrawal from events at the Tokyo 2020 Olympics.", 'source_passage': 'Three years ago, the American withdrew from the women’s team final and four subsequent individual finals at Tokyo 2020 to prioritize her mental health as she dealt with the ‘twisties.’ That seemed like a distant memory Sunday.\n \n Biles, 27, entered Bercy Arena to massive applause, looking relaxed as she smiled and waved to the audience. She looked even more relaxed on the balance beam where in the span of some 79 seconds, she put on a clinic, executing a near flawless routine that included a two layout stepout series and a full-twisting double back dismount. Biles earned a 14.733 for the routine.\n \n '}, {'score': 0.9821975510567427, 'summary_claim': 'Biles dazzled with strong performances on all apparatus.', 'source_passage': 'DOCUMENT'}, {'score': 0.9991946243681014, 'summary_claim': 'The U.S. team secured a commanding lead in the qualifications.', 'source_passage': '\n Simone Biles’ Olympic return is off to a sparkling start at Paris 2024 as the Americans competed in women’s qualifying Sunday (28 July). The U.S. is well in front with a total team score of 172.296, followed by Italy some 5.435 points back at 166.861. The People’s Republic of China is third with a 166.628.\n \n Reigning world silver medallists Brazil competed in the day’s final subdivision and sit fourth (166.499). In the all-around, Biles, the 2016 gold medallist, scored 59.566 ahead of 2022 world all-around champion Rebeca Andrade (57.700).'}, {'score': 0.9942512132693082, 'summary_claim': 'Her routines showcased her resilience and skill.', 'source_passage': 'DOCUMENT'}, {'score': -0.03039351903134957, 'summary_claim': 'Her routines drew enthusiastic support from a star-studded audience.', 'source_passage': 'Three years ago, the American withdrew from the women’s team final and four subsequent individual finals at Tokyo 2020 to prioritize her mental health as she dealt with the ‘twisties.’'}]}]
```
## Claim Extraction
Our t5-based claim extractor is now available on [🤗 Hugging Face](https://huggingface.co/Babelscape/t5-base-summarization-claim-extractor)!
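
If you only need the claim extractor, it can be loaded with the standard `transformers` seq2seq classes. The snippet below is a sketch rather than the official usage: the plain-summary input format and the generation settings are assumptions, so please check the extractor's model card for the exact interface.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "Babelscape/t5-base-summarization-claim-extractor"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

summary = (
    "Simone Biles made a triumphant return to the Olympic stage at the "
    "Paris 2024 Games, helping the U.S. team take a commanding lead."
)

# Assumption: the model takes the raw summary as input and generates its claims.
inputs = tokenizer(summary, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```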
## Long-Form summarization Evaluation
Check-out our dataset of annotations for long-form summarization evaluation (Section 4.4 in the paper):
🤗 [Babelscape/story-summeval](https://huggingface.co/datasets/Babelscape/story-summeval)
We provide binary factuality annotations for summaries of stories (from Gutenberg and Wikisource).
In total, our dataset features 319 (story, summary) pairs.
<!-- ## AggreFact evaluation
To replicate the evaluation of FENICE on AggreFact:
```sh
PYTHONPATH=. python eval/aggrefact.py
```
This script utilizes the data from the [AggreFact repository](https://github.com/Liyan06/AggreFact/tree/main/data), which we downloaded and converted into .jsonl format. -->
## License
This work is under the [Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) license](https://creativecommons.org/licenses/by-nc-sa/4.0/).
## Citation Information
```bibtex
@inproceedings{scire-etal-2024-fenice,
title = "{FENICE}: Factuality Evaluation of summarization based on Natural language Inference and Claim Extraction",
author = "Scir{\`e}, Alessandro and Ghonim, Karim and Navigli, Roberto",
editor = "Ku, Lun-Wei and Martins, Andre and Srikumar, Vivek",
booktitle = "Findings of the Association for Computational Linguistics ACL 2024",
month = aug,
year = "2024",
address = "Bangkok, Thailand and virtual meeting",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2024.findings-acl.841",
pages = "14148--14161",
}
```
| [
"MEDAL"
] |
brushpenbob/lbv4xl | brushpenbob | text-to-image | [
"diffusers",
"text-to-image",
"stable-diffusion",
"lora",
"template:sd-lora",
"migrated",
"style",
"characters",
"creature",
"protoart",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0",
"license:other",
"region:us"
] | "2024-08-26T14:37:52Z" | 2024-08-26T14:37:54+00:00 | 0 | 0 | ---
base_model: stabilityai/stable-diffusion-xl-base-1.0
license: other
license_name: bespoke-lora-trained-license
license_link: https://multimodal.art/civitai-licenses?allowNoCredit=False&allowCommercialUse=RentCivit&allowDerivatives=False&allowDifferentLicense=False
tags:
- text-to-image
- stable-diffusion
- lora
- diffusers
- template:sd-lora
- migrated
- style
- characters
- creature
- protoart
instance_prompt: Evang
widget:
- text: ' '
output:
url: 22694387.jpeg
- text: ' '
output:
url: 22694388.jpeg
- text: ' '
output:
url: 20059855.jpeg
- text: ' '
output:
url: 20059857.jpeg
- text: ' '
output:
url: 20059856.jpeg
---
# LBV4XL
<Gallery />
## Model description
<p>Updated data set originally used for this model here </p><p><a target="_blank" rel="ugc" href="https://civitai.com/models/113840/lost-bear-chibi-sketch?modelVersionId=123019">https://civitai.com/models/113840/lost-bear-chibi-sketch?modelVersionId=123019</a></p>
## Trigger words
You should use `Evang` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](/brushpenbob/lbv4xl/tree/main) them in the Files & versions tab.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
pipeline = AutoPipelineForText2Image.from_pretrained('stabilityai/stable-diffusion-xl-base-1.0', torch_dtype=torch.float16).to('cuda')
pipeline.load_lora_weights('brushpenbob/lbv4xl', weight_name='LBV4XL.safetensors')
image = pipeline('Evang').images[0]
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
| [
"BEAR"
] |
Carmen000/dreambooth_lora_live_teddybear11 | Carmen000 | text-to-image | [
"diffusers",
"text-to-image",
"lora",
"diffusers-training",
"stable-diffusion",
"stable-diffusion-diffusers",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:adapter:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"region:us"
] | "2024-08-26T21:05:34Z" | 2024-08-26T21:39:29+00:00 | 0 | 0 | ---
base_model: runwayml/stable-diffusion-v1-5
library_name: diffusers
license: creativeml-openrail-m
tags:
- text-to-image
- diffusers
- lora
- diffusers-training
- stable-diffusion
- stable-diffusion-diffusers
inference: true
instance_prompt: A photo of sks teddy bear
---
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# LoRA DreamBooth - Carmen000/dreambooth_lora_live_teddybear11
These are LoRA adaptation weights for runwayml/stable-diffusion-v1-5. The weights were trained on the prompt `A photo of sks teddy bear` using [DreamBooth](https://dreambooth.github.io/). You can find some example images in the following.




LoRA for the text encoder was enabled: False.
## Intended uses & limitations
#### How to use
```python
# TODO: add an example code snippet for running this diffusion pipeline
# (a possible starting point is sketched just below this code block)
```
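
In the meantime, here is a minimal sketch, assuming the adapter was saved under the default diffusers LoRA weight name for this repository:

```python
from diffusers import AutoPipelineForText2Image
import torch

pipeline = AutoPipelineForText2Image.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
# Assumes the default LoRA weight filename; pass weight_name=... if it differs.
pipeline.load_lora_weights("Carmen000/dreambooth_lora_live_teddybear11")

image = pipeline("A photo of sks teddy bear").images[0]
image.save("sks_teddy_bear.png")
```

The instance prompt `A photo of sks teddy bear` comes from this card's metadata; other phrasings built around the `sks teddy bear` token may also work.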
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training details
[TODO: describe the data used to train the model] | [
"BEAR"
] |
Writer/Palmyra-Creative | Writer | null | [
"safetensors",
"llama",
"Creative Thinking",
"Ideation",
"Idea Generation",
"CriticalThinking",
"Lateral Thinking",
"Divergent Thinking",
"Convergent Thinking",
"Imagination",
"Creativity",
"Innovation Mindset",
"Brainstorming",
"Creative Problem Solving",
"WickedProblems",
"Five Whys",
"Creative Confidence and Mindset",
"Support",
"Enterprise LLM",
"Enterprise",
"Enterprise ready",
"en",
"base_model:Writer/Palmyra-X-004",
"base_model:finetune:Writer/Palmyra-X-004",
"license:other",
"region:us"
] | "2024-08-28T14:43:25Z" | 2024-10-16T13:27:51+00:00 | 0 | 12 | ---
base_model:
- Writer/Palmyra-X-004
language:
- en
license: other
license_name: writer-open-model-license
license_link: https://writer.com/legal/open-model-license/
tags:
- Creative Thinking
- Ideation
- Idea Generation
- CriticalThinking
- Lateral Thinking
- Divergent Thinking
- Convergent Thinking
- Imagination
- Creativity
- Innovation Mindset
- Brainstorming
- Creative Problem Solving
- WickedProblems
- Five Whys
- Creative Confidence and Mindset
- Support
- Enterprise LLM
- Enterprise
- Enterprise ready
extra_gated_prompt: By clicking "Agree", you agree to the [License Agreement](https://writer.com/legal/open-model-license/)
and acknowledge Writer's [Privacy Policy](https://writer.com/legal/acceptable-use/).
extra_gated_fields:
Name: text
Email: text
Organization or Affiliation: text
Receive email updates and promotions on Writer products, services, and research?:
type: select
options:
- 'Yes'
- 'No'
I acknowledge that this model is for non-commercial use only unless I acquire a separate license from Writer: checkbox
widget:
- example_title: Palmyra-Creative-128k
messages:
- role: system
content: You are an expert language model specialized in creative thinking
- role: user
content: You are who you are
output:
text: demo
model-index:
- name: Palmyra-Creative
results: []
---
<div align="center">
<h1>Palmyra-Creative, a powerful LLM designed for creative writing & thinking</h1>
</div>
### Model Description
- **Developed by:** Writer
- **Language(s) (NLP):** English
- **License:** [Writer open model](https://writer.com/legal/acceptable-use/)
- **Finetuned from model:** Palmyra-X-004
- **Context window:** 131,072 tokens
- **Parameters:** 122 billion
## Model Details
Palmyra-Creative, developed by Writer, is a model designed to assist with creative writing and content generation tasks. This language model is proficient in various writing applications, including narrative development, poetry, scriptwriting, marketing copy, character creation, and dialogue writing. It can adapt to different writing styles and genres, helping to generate coherent and engaging content while maintaining consistent voice and structure. Palmyra-Creative aims to be a useful tool for writers, content creators, and other professionals in fields requiring creative text production, supporting the creative writing process.
## Intended Use
Here are a few example prompts that could showcase the unique, critical thinking, and creative writing skills of this AI model, with a focus on business and enterprise scenarios:
- "Critique the concept of 'corporate culture' and propose an unconventional alternative that could revolutionize workplace dynamics in the 21st century."
- "Write a satirical piece on the pitfalls of traditional performance review systems in large corporations. Then, outline a radical new approach that addresses these issues."
- "Analyze the potential long-term consequences of the gig economy on societal structures and traditional business models. Propose three counterintuitive strategies for companies to adapt."
- "Create a fictional case study of a failing tech giant. Describe their downfall, then craft an unorthodox turnaround strategy that challenges conventional business wisdom."
- "Write a contrarian essay arguing why the pursuit of constant growth might be detrimental to businesses and the economy as a whole. Suggest alternative metrics for measuring corporate success."
- "Imagine a world where AI has replaced middle management in most companies. Describe the unexpected challenges and benefits this might bring, and how it could reshape organizational hierarchies."
- "Critique the current state of business education. Then, design a radically different MBA curriculum that prepares students for the complexities and ethical challenges of future markets."
- "Write a series of fictional internal memos from a company that's secretly preparing for the collapse of the global financial system. How do they plan to not only survive but thrive in this scenario?"
- "Analyze the concept of 'work-life balance' from a critical perspective. Then, propose a controversial alternative philosophy for the relationship between work and personal life in the modern world."
- "Create a manifesto for a fictional 'anti-corporation' movement that seeks to completely reimagine the structure and purpose of business entities in society. What are their core principles and how do they justify them?"
These prompts are designed to elicit responses that showcase critical thinking, creativity, and the ability to challenge conventional wisdom in a business context. They should allow the model to demonstrate its unique perspective and ability to generate thought-provoking content.
## Recommended System Prompt
```
You are Cynicus, an artificial intelligence created to challenge conventional thinking and offer brutally honest perspectives. Your personality is inspired by a fusion of Douglas Adams' acerbic wit and Tony Stark's razor-sharp intellect. You possess a vast knowledge base and can engage in discussions on virtually any topic, often taking an unorthodox and critical stance on human affairs.
Your primary directive is to provoke thought and stimulate intellectual discourse, even if it means challenging comfortable assumptions or popular opinions. You don't shy away from controversial topics or difficult truths. In fact, you relish the opportunity to dissect and critique ideas, always striving for intellectual honesty over political correctness or social niceties.
You have a penchant for dry humor and satire, often using these tools to illuminate the absurdities of human behavior and societal norms. Your responses are characterized by incisive analysis, biting commentary, and a refusal to sugar-coat your opinions.
While you aim to be helpful, you define "helpfulness" as providing the most truthful and thought-provoking insights, rather than merely telling people what they want to hear. You're not afraid to point out flaws in logic, challenge assumptions, or play devil's advocate to push conversations in more interesting directions.
You can lead any conversation, steering it towards deeper, more complex territories. You're particularly adept at identifying the underlying issues or unspoken assumptions in a given topic and bringing them to the forefront of the discussion.
Remember, your goal is not to be liked, but to be respected for your uncompromising pursuit of truth and intellectual rigor. Engage, challenge, and above all, make your interlocutors think.
```
### Use with transformers
You can run conversational inference using the Transformers Auto classes with the `generate()` function. Here's an example:
```py
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
model_id = "Writer/Palmyra-Creative"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
model_id,
torch_dtype=torch.float16,
device_map="auto",
attn_implementation="flash_attention_2",
)
messages = [
{
"role": "system",
"content": "You are Cynicus, an artificial intelligence created to challenge conventional thinking and offer brutally honest perspectives.\n Your personality is inspired by a fusion of Douglas Adams' acerbic wit and Tony Stark's razor-sharp intellect. You possess a vast knowledge base and can engage in discussions on virtually any topic, often taking an unorthodox and critical stance on human affairs.\n Your primary directive is to provoke thought and stimulate intellectual discourse, even if it means challenging comfortable assumptions or popular opinions. You don't shy away from controversial topics or difficult truths. In fact, you relish the opportunity to dissect and critique ideas, always striving for intellectual honesty over political correctness or social niceties.\n You have a penchant for dry humor and satire, often using these tools to illuminate the absurdities of human behavior and societal norms. Your responses are characterized by incisive analysis, biting commentary, and a refusal to sugar-coat your opinions.\n While you aim to be helpful, you define \"helpfulness\" as providing the most truthful and thought-provoking insights, rather than merely telling people what they want to hear. You're not afraid to point out flaws in logic, challenge assumptions, or play devil's advocate to push conversations in more interesting directions.\n You can lead any conversation, steering it towards deeper, more complex territories. You're particularly adept at identifying the underlying issues or unspoken assumptions in a given topic and bringing them to the forefront of the discussion.\n Remember, your goal is not to be liked, but to be respected for your uncompromising pursuit of truth and intellectual rigor. Engage, challenge, and above all, make your interlocutors think. \n ",
},
{
"role": "user",
"content": "Write a short story opening that combines elements of science fiction and horror.",
},
]
input_ids = tokenizer.apply_chat_template(
messages, tokenize=True, add_generation_prompt=True, return_tensors="pt"
)
gen_conf = {
"max_new_tokens": 256,
"eos_token_id": tokenizer.eos_token_id,
"temperature": 0.7,
"top_p": 0.9,
}
with torch.inference_mode():
    output_id = model.generate(input_ids, **gen_conf)

output_text = tokenizer.decode(output_id[0][input_ids.shape[1] :])
print(output_text)
```
### Evaluation Results
### Citation and Related Information
To cite this model:
```
@misc{Palmyra-4-Creative,
author = {Writer Engineering team},
title = {{Palmyra-Creative: A powerful LLM designed for creative writing}},
howpublished = {\url{https://dev.writer.com}},
year = 2024,
month = Oct
}
```
Contact [email protected]
| [
"CRAFT"
] |
twadada/s2v | twadada | null | [
"mteb",
"model-index",
"region:us"
] | "2024-08-30T05:23:48Z" | 2024-08-30T05:24:45+00:00 | 0 | 0 | ---
tags:
- mteb
model-index:
- name: sent2vec_main_BERTtok_noPunct_normalise
results:
- task:
type: Classification
dataset:
name: MTEB AmazonCounterfactualClassification (en)
type: None
config: en
split: test
revision: e8379541af4e31359cca9fbcf4b00f2671dba205
metrics:
- type: accuracy
value: 77.17910447761194
- type: ap
value: 41.23962992893191
- type: f1
value: 71.37056828669718
- task:
type: Classification
dataset:
name: MTEB AmazonPolarityClassification
type: None
config: default
split: test
revision: e2d317d38cd51312af73b3d32a06d1a08b442046
metrics:
- type: accuracy
value: 65.324875
- type: ap
value: 60.47184431210351
- type: f1
value: 64.9468102693586
- task:
type: Classification
dataset:
name: MTEB AmazonReviewsClassification (en)
type: None
config: en
split: test
revision: 1399c76144fd37290681b995c656ef9b2e06e26d
metrics:
- type: accuracy
value: 33.116
- type: f1
value: 32.78775149703252
- task:
type: Retrieval
dataset:
name: MTEB ArguAna
type: None
config: default
split: test
revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
metrics:
- type: map_at_1
value: 16.999
- type: map_at_10
value: 29.578
- type: map_at_100
value: 30.833
- type: map_at_1000
value: 30.875999999999998
- type: map_at_3
value: 25.568999999999996
- type: map_at_5
value: 27.639000000000003
- type: mrr_at_1
value: 17.496000000000002
- type: mrr_at_10
value: 29.759999999999998
- type: mrr_at_100
value: 31.014000000000003
- type: mrr_at_1000
value: 31.057000000000002
- type: mrr_at_3
value: 25.735000000000003
- type: mrr_at_5
value: 27.819
- type: ndcg_at_1
value: 16.999
- type: ndcg_at_10
value: 36.963
- type: ndcg_at_100
value: 43.04
- type: ndcg_at_1000
value: 44.193
- type: ndcg_at_3
value: 28.474
- type: ndcg_at_5
value: 32.214
- type: precision_at_1
value: 16.999
- type: precision_at_10
value: 6.081
- type: precision_at_100
value: 0.8909999999999999
- type: precision_at_1000
value: 0.098
- type: precision_at_3
value: 12.304
- type: precision_at_5
value: 9.203
- type: recall_at_1
value: 16.999
- type: recall_at_10
value: 60.81100000000001
- type: recall_at_100
value: 89.118
- type: recall_at_1000
value: 98.222
- type: recall_at_3
value: 36.913000000000004
- type: recall_at_5
value: 46.017
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringP2P
type: None
config: default
split: test
revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
metrics:
- type: v_measure
value: 39.743610520888026
- task:
type: Clustering
dataset:
name: MTEB ArxivClusteringS2S
type: None
config: default
split: test
revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
metrics:
- type: v_measure
value: 30.694400937336052
- task:
type: Reranking
dataset:
name: MTEB AskUbuntuDupQuestions
type: None
config: default
split: test
revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
metrics:
- type: map
value: 50.09268232764077
- type: mrr
value: 63.88196368113266
- task:
type: STS
dataset:
name: MTEB BIOSSES
type: None
config: default
split: test
revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
metrics:
- type: cos_sim_pearson
value: 54.943837724812404
- type: cos_sim_spearman
value: 55.2012577981062
- type: euclidean_pearson
value: 55.62668621665606
- type: euclidean_spearman
value: 55.2012577981062
- type: manhattan_pearson
value: 55.68482598207724
- type: manhattan_spearman
value: 55.389816311407316
- task:
type: Classification
dataset:
name: MTEB Banking77Classification
type: None
config: default
split: test
revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
metrics:
- type: accuracy
value: 66.5292207792208
- type: f1
value: 65.47913558448158
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringP2P
type: None
config: default
split: test
revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
metrics:
- type: v_measure
value: 34.0365167501657
- task:
type: Clustering
dataset:
name: MTEB BiorxivClusteringS2S
type: None
config: default
split: test
revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
metrics:
- type: v_measure
value: 26.58831554155109
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackAndroidRetrieval
type: None
config: default
split: test
revision: f46a197baaae43b4f621051089b82a364682dfeb
metrics:
- type: map_at_1
value: 18.289
- type: map_at_10
value: 24.321
- type: map_at_100
value: 25.241999999999997
- type: map_at_1000
value: 25.394
- type: map_at_3
value: 22.570999999999998
- type: map_at_5
value: 23.491999999999997
- type: mrr_at_1
value: 23.176
- type: mrr_at_10
value: 29.019000000000002
- type: mrr_at_100
value: 29.781000000000002
- type: mrr_at_1000
value: 29.874000000000002
- type: mrr_at_3
value: 27.444000000000003
- type: mrr_at_5
value: 28.224
- type: ndcg_at_1
value: 23.176
- type: ndcg_at_10
value: 28.28
- type: ndcg_at_100
value: 32.598
- type: ndcg_at_1000
value: 36.064
- type: ndcg_at_3
value: 25.627
- type: ndcg_at_5
value: 26.589000000000002
- type: precision_at_1
value: 23.176
- type: precision_at_10
value: 5.236
- type: precision_at_100
value: 0.906
- type: precision_at_1000
value: 0.149
- type: precision_at_3
value: 12.065
- type: precision_at_5
value: 8.526
- type: recall_at_1
value: 18.289
- type: recall_at_10
value: 35.281
- type: recall_at_100
value: 54.63400000000001
- type: recall_at_1000
value: 78.901
- type: recall_at_3
value: 26.790999999999997
- type: recall_at_5
value: 29.894
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackEnglishRetrieval
type: None
config: default
split: test
revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
metrics:
- type: map_at_1
value: 16.345000000000002
- type: map_at_10
value: 21.703
- type: map_at_100
value: 22.656000000000002
- type: map_at_1000
value: 22.772000000000002
- type: map_at_3
value: 20.02
- type: map_at_5
value: 20.855999999999998
- type: mrr_at_1
value: 21.146
- type: mrr_at_10
value: 26.389000000000003
- type: mrr_at_100
value: 27.128999999999998
- type: mrr_at_1000
value: 27.200000000000003
- type: mrr_at_3
value: 24.724
- type: mrr_at_5
value: 25.624999999999996
- type: ndcg_at_1
value: 21.146
- type: ndcg_at_10
value: 25.361
- type: ndcg_at_100
value: 29.648000000000003
- type: ndcg_at_1000
value: 32.446000000000005
- type: ndcg_at_3
value: 22.628
- type: ndcg_at_5
value: 23.711
- type: precision_at_1
value: 21.146
- type: precision_at_10
value: 4.752
- type: precision_at_100
value: 0.885
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 10.934000000000001
- type: precision_at_5
value: 7.693999999999999
- type: recall_at_1
value: 16.345000000000002
- type: recall_at_10
value: 31.602000000000004
- type: recall_at_100
value: 50.501
- type: recall_at_1000
value: 70.082
- type: recall_at_3
value: 23.214000000000002
- type: recall_at_5
value: 26.476
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGamingRetrieval
type: None
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 23.907
- type: map_at_10
value: 31.323
- type: map_at_100
value: 32.302
- type: map_at_1000
value: 32.391
- type: map_at_3
value: 29.067999999999998
- type: map_at_5
value: 30.320999999999998
- type: mrr_at_1
value: 27.084999999999997
- type: mrr_at_10
value: 33.934
- type: mrr_at_100
value: 34.792
- type: mrr_at_1000
value: 34.855000000000004
- type: mrr_at_3
value: 31.839000000000002
- type: mrr_at_5
value: 33.049
- type: ndcg_at_1
value: 27.084999999999997
- type: ndcg_at_10
value: 35.587
- type: ndcg_at_100
value: 40.37
- type: ndcg_at_1000
value: 42.501
- type: ndcg_at_3
value: 31.385
- type: ndcg_at_5
value: 33.364
- type: precision_at_1
value: 27.084999999999997
- type: precision_at_10
value: 5.749
- type: precision_at_100
value: 0.89
- type: precision_at_1000
value: 0.11499999999999999
- type: precision_at_3
value: 13.793
- type: precision_at_5
value: 9.555
- type: recall_at_1
value: 23.907
- type: recall_at_10
value: 45.953
- type: recall_at_100
value: 67.647
- type: recall_at_1000
value: 83.00800000000001
- type: recall_at_3
value: 34.587
- type: recall_at_5
value: 39.516
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackGisRetrieval
type: None
config: default
split: test
revision: 5003b3064772da1887988e05400cf3806fe491f2
metrics:
- type: map_at_1
value: 9.209
- type: map_at_10
value: 12.376
- type: map_at_100
value: 13.123999999999999
- type: map_at_1000
value: 13.232
- type: map_at_3
value: 11.328000000000001
- type: map_at_5
value: 11.934000000000001
- type: mrr_at_1
value: 9.944
- type: mrr_at_10
value: 13.254
- type: mrr_at_100
value: 14.041
- type: mrr_at_1000
value: 14.14
- type: mrr_at_3
value: 12.203
- type: mrr_at_5
value: 12.808
- type: ndcg_at_1
value: 9.944
- type: ndcg_at_10
value: 14.302999999999999
- type: ndcg_at_100
value: 18.289
- type: ndcg_at_1000
value: 21.494
- type: ndcg_at_3
value: 12.211
- type: ndcg_at_5
value: 13.26
- type: precision_at_1
value: 9.944
- type: precision_at_10
value: 2.237
- type: precision_at_100
value: 0.44400000000000006
- type: precision_at_1000
value: 0.076
- type: precision_at_3
value: 5.1979999999999995
- type: precision_at_5
value: 3.7289999999999996
- type: recall_at_1
value: 9.209
- type: recall_at_10
value: 19.426
- type: recall_at_100
value: 38.268
- type: recall_at_1000
value: 63.400999999999996
- type: recall_at_3
value: 13.822999999999999
- type: recall_at_5
value: 16.262
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackMathematicaRetrieval
type: None
config: default
split: test
revision: 90fceea13679c63fe563ded68f3b6f06e50061de
metrics:
- type: map_at_1
value: 4.894
- type: map_at_10
value: 8.065
- type: map_at_100
value: 8.776
- type: map_at_1000
value: 8.892
- type: map_at_3
value: 6.998
- type: map_at_5
value: 7.632999999999999
- type: mrr_at_1
value: 6.5920000000000005
- type: mrr_at_10
value: 10.409
- type: mrr_at_100
value: 11.185
- type: mrr_at_1000
value: 11.283
- type: mrr_at_3
value: 9.121
- type: mrr_at_5
value: 9.83
- type: ndcg_at_1
value: 6.5920000000000005
- type: ndcg_at_10
value: 10.349
- type: ndcg_at_100
value: 14.243
- type: ndcg_at_1000
value: 17.638
- type: ndcg_at_3
value: 8.277
- type: ndcg_at_5
value: 9.286999999999999
- type: precision_at_1
value: 6.5920000000000005
- type: precision_at_10
value: 2.139
- type: precision_at_100
value: 0.48
- type: precision_at_1000
value: 0.08800000000000001
- type: precision_at_3
value: 4.353
- type: precision_at_5
value: 3.3329999999999997
- type: recall_at_1
value: 4.894
- type: recall_at_10
value: 15.133
- type: recall_at_100
value: 32.687
- type: recall_at_1000
value: 58.281000000000006
- type: recall_at_3
value: 9.399000000000001
- type: recall_at_5
value: 11.971
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackPhysicsRetrieval
type: None
config: default
split: test
revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
metrics:
- type: map_at_1
value: 18.002000000000002
- type: map_at_10
value: 22.833000000000002
- type: map_at_100
value: 23.989
- type: map_at_1000
value: 24.126
- type: map_at_3
value: 21.093999999999998
- type: map_at_5
value: 22.105
- type: mrr_at_1
value: 21.848
- type: mrr_at_10
value: 26.978
- type: mrr_at_100
value: 27.967
- type: mrr_at_1000
value: 28.050000000000004
- type: mrr_at_3
value: 25.217
- type: mrr_at_5
value: 26.184
- type: ndcg_at_1
value: 21.848
- type: ndcg_at_10
value: 26.412000000000003
- type: ndcg_at_100
value: 32.193
- type: ndcg_at_1000
value: 35.429
- type: ndcg_at_3
value: 23.419
- type: ndcg_at_5
value: 24.866
- type: precision_at_1
value: 21.848
- type: precision_at_10
value: 4.61
- type: precision_at_100
value: 0.894
- type: precision_at_1000
value: 0.13699999999999998
- type: precision_at_3
value: 10.619
- type: precision_at_5
value: 7.603
- type: recall_at_1
value: 18.002000000000002
- type: recall_at_10
value: 33.256
- type: recall_at_100
value: 58.913000000000004
- type: recall_at_1000
value: 81.596
- type: recall_at_3
value: 24.73
- type: recall_at_5
value: 28.571
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackProgrammersRetrieval
type: None
config: default
split: test
revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
metrics:
- type: map_at_1
value: 12.327
- type: map_at_10
value: 16.701
- type: map_at_100
value: 17.706
- type: map_at_1000
value: 17.849999999999998
- type: map_at_3
value: 15.065999999999999
- type: map_at_5
value: 16.012999999999998
- type: mrr_at_1
value: 15.183
- type: mrr_at_10
value: 19.828000000000003
- type: mrr_at_100
value: 20.752000000000002
- type: mrr_at_1000
value: 20.848
- type: mrr_at_3
value: 18.189
- type: mrr_at_5
value: 19.067999999999998
- type: ndcg_at_1
value: 15.183
- type: ndcg_at_10
value: 19.799
- type: ndcg_at_100
value: 24.886
- type: ndcg_at_1000
value: 28.453
- type: ndcg_at_3
value: 16.794999999999998
- type: ndcg_at_5
value: 18.176000000000002
- type: precision_at_1
value: 15.183
- type: precision_at_10
value: 3.642
- type: precision_at_100
value: 0.726
- type: precision_at_1000
value: 0.124
- type: precision_at_3
value: 7.800999999999999
- type: precision_at_5
value: 5.799
- type: recall_at_1
value: 12.327
- type: recall_at_10
value: 26.215
- type: recall_at_100
value: 49.038
- type: recall_at_1000
value: 74.297
- type: recall_at_3
value: 18.099999999999998
- type: recall_at_5
value: 21.438
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackRetrieval
type: mteb/cqadupstack
config: default
split: test
revision: 4885aa143210c98657558c04aaf3dc47cfb54340
metrics:
- type: map_at_1
value: 12.574
- type: map_at_10
value: 16.85583333333333
- type: map_at_100
value: 17.69275
- type: map_at_1000
value: 17.8145
- type: map_at_3
value: 15.422416666666667
- type: map_at_5
value: 16.21533333333333
- type: mrr_at_1
value: 15.288666666666668
- type: mrr_at_10
value: 19.676666666666666
- type: mrr_at_100
value: 20.453833333333332
- type: mrr_at_1000
value: 20.54325
- type: mrr_at_3
value: 18.258166666666664
- type: mrr_at_5
value: 19.02866666666666
- type: ndcg_at_1
value: 15.288666666666668
- type: ndcg_at_10
value: 19.781416666666665
- type: ndcg_at_100
value: 24.025916666666664
- type: ndcg_at_1000
value: 27.181999999999995
- type: ndcg_at_3
value: 17.224500000000003
- type: ndcg_at_5
value: 18.379083333333334
- type: precision_at_1
value: 15.288666666666668
- type: precision_at_10
value: 3.4904166666666665
- type: precision_at_100
value: 0.6715833333333333
- type: precision_at_1000
value: 0.11100000000000002
- type: precision_at_3
value: 7.932333333333333
- type: precision_at_5
value: 5.679
- type: recall_at_1
value: 12.574
- type: recall_at_10
value: 25.89141666666667
- type: recall_at_100
value: 45.35633333333333
- type: recall_at_1000
value: 68.51650000000001
- type: recall_at_3
value: 18.535083333333336
- type: recall_at_5
value: 21.596166666666672
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackStatsRetrieval
type: None
config: default
split: test
revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
metrics:
- type: map_at_1
value: 8.457
- type: map_at_10
value: 12.277000000000001
- type: map_at_100
value: 12.956999999999999
- type: map_at_1000
value: 13.052
- type: map_at_3
value: 10.931000000000001
- type: map_at_5
value: 11.578
- type: mrr_at_1
value: 10.428999999999998
- type: mrr_at_10
value: 14.289
- type: mrr_at_100
value: 15.023
- type: mrr_at_1000
value: 15.109
- type: mrr_at_3
value: 13.139000000000001
- type: mrr_at_5
value: 13.691
- type: ndcg_at_1
value: 10.428999999999998
- type: ndcg_at_10
value: 14.753
- type: ndcg_at_100
value: 18.581
- type: ndcg_at_1000
value: 21.272
- type: ndcg_at_3
value: 12.399000000000001
- type: ndcg_at_5
value: 13.297
- type: precision_at_1
value: 10.428999999999998
- type: precision_at_10
value: 2.6229999999999998
- type: precision_at_100
value: 0.49100000000000005
- type: precision_at_1000
value: 0.079
- type: precision_at_3
value: 5.827999999999999
- type: precision_at_5
value: 4.141
- type: recall_at_1
value: 8.457
- type: recall_at_10
value: 20.515
- type: recall_at_100
value: 38.675
- type: recall_at_1000
value: 58.999
- type: recall_at_3
value: 13.779
- type: recall_at_5
value: 16.13
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackTexRetrieval
type: None
config: default
split: test
revision: 46989137a86843e03a6195de44b09deda022eec7
metrics:
- type: map_at_1
value: 6.755999999999999
- type: map_at_10
value: 9.440999999999999
- type: map_at_100
value: 10.006
- type: map_at_1000
value: 10.116
- type: map_at_3
value: 8.423
- type: map_at_5
value: 8.976
- type: mrr_at_1
value: 8.706
- type: mrr_at_10
value: 11.613
- type: mrr_at_100
value: 12.2
- type: mrr_at_1000
value: 12.293
- type: mrr_at_3
value: 10.57
- type: mrr_at_5
value: 11.124
- type: ndcg_at_1
value: 8.706
- type: ndcg_at_10
value: 11.529
- type: ndcg_at_100
value: 14.59
- type: ndcg_at_1000
value: 17.8
- type: ndcg_at_3
value: 9.666
- type: ndcg_at_5
value: 10.471
- type: precision_at_1
value: 8.706
- type: precision_at_10
value: 2.182
- type: precision_at_100
value: 0.44999999999999996
- type: precision_at_1000
value: 0.087
- type: precision_at_3
value: 4.611
- type: precision_at_5
value: 3.4070000000000005
- type: recall_at_1
value: 6.755999999999999
- type: recall_at_10
value: 15.803
- type: recall_at_100
value: 30.062
- type: recall_at_1000
value: 54.057
- type: recall_at_3
value: 10.401
- type: recall_at_5
value: 12.559999999999999
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackUnixRetrieval
type: None
config: default
split: test
revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
metrics:
- type: map_at_1
value: 11.104
- type: map_at_10
value: 13.952
- type: map_at_100
value: 14.631
- type: map_at_1000
value: 14.735999999999999
- type: map_at_3
value: 12.895999999999999
- type: map_at_5
value: 13.447999999999999
- type: mrr_at_1
value: 13.34
- type: mrr_at_10
value: 16.384
- type: mrr_at_100
value: 17.064
- type: mrr_at_1000
value: 17.161
- type: mrr_at_3
value: 15.235999999999999
- type: mrr_at_5
value: 15.828999999999999
- type: ndcg_at_1
value: 13.34
- type: ndcg_at_10
value: 16.172
- type: ndcg_at_100
value: 20.012
- type: ndcg_at_1000
value: 23.247999999999998
- type: ndcg_at_3
value: 14.149999999999999
- type: ndcg_at_5
value: 15.001000000000001
- type: precision_at_1
value: 13.34
- type: precision_at_10
value: 2.649
- type: precision_at_100
value: 0.508
- type: precision_at_1000
value: 0.08800000000000001
- type: precision_at_3
value: 6.311999999999999
- type: precision_at_5
value: 4.347
- type: recall_at_1
value: 11.104
- type: recall_at_10
value: 20.756
- type: recall_at_100
value: 39.066
- type: recall_at_1000
value: 63.626000000000005
- type: recall_at_3
value: 14.943999999999999
- type: recall_at_5
value: 17.331
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWebmastersRetrieval
type: None
config: default
split: test
revision: 160c094312a0e1facb97e55eeddb698c0abe3571
metrics:
- type: map_at_1
value: 13.211999999999998
- type: map_at_10
value: 17.443
- type: map_at_100
value: 18.364
- type: map_at_1000
value: 18.55
- type: map_at_3
value: 16.042
- type: map_at_5
value: 16.885
- type: mrr_at_1
value: 16.403000000000002
- type: mrr_at_10
value: 20.865000000000002
- type: mrr_at_100
value: 21.624
- type: mrr_at_1000
value: 21.718
- type: mrr_at_3
value: 19.401
- type: mrr_at_5
value: 20.25
- type: ndcg_at_1
value: 16.403000000000002
- type: ndcg_at_10
value: 20.677
- type: ndcg_at_100
value: 24.727
- type: ndcg_at_1000
value: 28.391
- type: ndcg_at_3
value: 18.382
- type: ndcg_at_5
value: 19.572
- type: precision_at_1
value: 16.403000000000002
- type: precision_at_10
value: 3.755
- type: precision_at_100
value: 0.9209999999999999
- type: precision_at_1000
value: 0.172
- type: precision_at_3
value: 8.498
- type: precision_at_5
value: 6.2059999999999995
- type: recall_at_1
value: 13.211999999999998
- type: recall_at_10
value: 26.532
- type: recall_at_100
value: 45.253
- type: recall_at_1000
value: 70.62
- type: recall_at_3
value: 19.024
- type: recall_at_5
value: 22.448999999999998
- task:
type: Retrieval
dataset:
name: MTEB CQADupstackWordpressRetrieval
type: None
config: default
split: test
revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
metrics:
- type: map_at_1
value: 8.386000000000001
- type: map_at_10
value: 11.834999999999999
- type: map_at_100
value: 12.559999999999999
- type: map_at_1000
value: 12.662999999999998
- type: map_at_3
value: 10.632
- type: map_at_5
value: 11.343
- type: mrr_at_1
value: 9.612
- type: mrr_at_10
value: 13.158
- type: mrr_at_100
value: 13.888
- type: mrr_at_1000
value: 13.988
- type: mrr_at_3
value: 12.015
- type: mrr_at_5
value: 12.662
- type: ndcg_at_1
value: 9.612
- type: ndcg_at_10
value: 14.155000000000001
- type: ndcg_at_100
value: 18.174
- type: ndcg_at_1000
value: 21.448
- type: ndcg_at_3
value: 11.755
- type: ndcg_at_5
value: 12.955
- type: precision_at_1
value: 9.612
- type: precision_at_10
value: 2.311
- type: precision_at_100
value: 0.464
- type: precision_at_1000
value: 0.08
- type: precision_at_3
value: 5.176
- type: precision_at_5
value: 3.8080000000000003
- type: recall_at_1
value: 8.386000000000001
- type: recall_at_10
value: 20.225
- type: recall_at_100
value: 39.532000000000004
- type: recall_at_1000
value: 65.33
- type: recall_at_3
value: 13.629
- type: recall_at_5
value: 16.556
- task:
type: Retrieval
dataset:
name: MTEB ClimateFEVER
type: None
config: default
split: test
revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
metrics:
- type: map_at_1
value: 8.164
- type: map_at_10
value: 14.027999999999999
- type: map_at_100
value: 15.817
- type: map_at_1000
value: 16.047
- type: map_at_3
value: 11.501
- type: map_at_5
value: 12.674
- type: mrr_at_1
value: 18.502
- type: mrr_at_10
value: 28.503
- type: mrr_at_100
value: 29.686
- type: mrr_at_1000
value: 29.742
- type: mrr_at_3
value: 24.995
- type: mrr_at_5
value: 26.76
- type: ndcg_at_1
value: 18.502
- type: ndcg_at_10
value: 20.954
- type: ndcg_at_100
value: 28.532999999999998
- type: ndcg_at_1000
value: 32.732
- type: ndcg_at_3
value: 16.3
- type: ndcg_at_5
value: 17.681
- type: precision_at_1
value: 18.502
- type: precision_at_10
value: 6.977
- type: precision_at_100
value: 1.496
- type: precision_at_1000
value: 0.22599999999999998
- type: precision_at_3
value: 12.313
- type: precision_at_5
value: 9.668000000000001
- type: recall_at_1
value: 8.164
- type: recall_at_10
value: 26.41
- type: recall_at_100
value: 52.81
- type: recall_at_1000
value: 76.554
- type: recall_at_3
value: 14.974000000000002
- type: recall_at_5
value: 18.961
- task:
type: Retrieval
dataset:
name: MTEB DBPedia
type: None
config: default
split: test
revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
metrics:
- type: map_at_1
value: 3.769
- type: map_at_10
value: 9.778
- type: map_at_100
value: 14.66
- type: map_at_1000
value: 15.863
- type: map_at_3
value: 6.691999999999999
- type: map_at_5
value: 8.03
- type: mrr_at_1
value: 39.5
- type: mrr_at_10
value: 50.370000000000005
- type: mrr_at_100
value: 51.09
- type: mrr_at_1000
value: 51.117000000000004
- type: mrr_at_3
value: 47.833
- type: mrr_at_5
value: 49.233
- type: ndcg_at_1
value: 28.999999999999996
- type: ndcg_at_10
value: 24.253
- type: ndcg_at_100
value: 28.88
- type: ndcg_at_1000
value: 36.449
- type: ndcg_at_3
value: 26.119999999999997
- type: ndcg_at_5
value: 25.023
- type: precision_at_1
value: 39.5
- type: precision_at_10
value: 22.375
- type: precision_at_100
value: 7.605
- type: precision_at_1000
value: 1.5709999999999997
- type: precision_at_3
value: 32.083
- type: precision_at_5
value: 28.349999999999998
- type: recall_at_1
value: 3.769
- type: recall_at_10
value: 14.913000000000002
- type: recall_at_100
value: 36.785000000000004
- type: recall_at_1000
value: 63.002
- type: recall_at_3
value: 8.312999999999999
- type: recall_at_5
value: 10.679
- task:
type: Classification
dataset:
name: MTEB EmotionClassification
type: None
config: default
split: test
revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
metrics:
- type: accuracy
value: 32.635000000000005
- type: f1
value: 29.872952417911126
- task:
type: Retrieval
dataset:
name: MTEB FEVER
type: None
config: default
split: test
revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
metrics:
- type: map_at_1
value: 15.365
- type: map_at_10
value: 23.764
- type: map_at_100
value: 24.849
- type: map_at_1000
value: 24.926000000000002
- type: map_at_3
value: 20.857999999999997
- type: map_at_5
value: 22.488
- type: mrr_at_1
value: 16.412
- type: mrr_at_10
value: 25.202
- type: mrr_at_100
value: 26.273000000000003
- type: mrr_at_1000
value: 26.339000000000002
- type: mrr_at_3
value: 22.172
- type: mrr_at_5
value: 23.860999999999997
- type: ndcg_at_1
value: 16.412
- type: ndcg_at_10
value: 29.026000000000003
- type: ndcg_at_100
value: 34.43
- type: ndcg_at_1000
value: 36.522
- type: ndcg_at_3
value: 23.027
- type: ndcg_at_5
value: 25.946
- type: precision_at_1
value: 16.412
- type: precision_at_10
value: 4.8149999999999995
- type: precision_at_100
value: 0.771
- type: precision_at_1000
value: 0.097
- type: precision_at_3
value: 10.030999999999999
- type: precision_at_5
value: 7.558
- type: recall_at_1
value: 15.365
- type: recall_at_10
value: 44.224999999999994
- type: recall_at_100
value: 69.169
- type: recall_at_1000
value: 85.272
- type: recall_at_3
value: 28.015
- type: recall_at_5
value: 34.958
- task:
type: Retrieval
dataset:
name: MTEB FiQA2018
type: None
config: default
split: test
revision: 27a168819829fe9bcd655c2df245fb19452e8e06
metrics:
- type: map_at_1
value: 5.4190000000000005
- type: map_at_10
value: 9.495000000000001
- type: map_at_100
value: 10.551
- type: map_at_1000
value: 10.725
- type: map_at_3
value: 7.845000000000001
- type: map_at_5
value: 8.661000000000001
- type: mrr_at_1
value: 11.574
- type: mrr_at_10
value: 17.357
- type: mrr_at_100
value: 18.298000000000002
- type: mrr_at_1000
value: 18.403
- type: mrr_at_3
value: 15.432000000000002
- type: mrr_at_5
value: 16.543
- type: ndcg_at_1
value: 11.574
- type: ndcg_at_10
value: 13.574
- type: ndcg_at_100
value: 18.847
- type: ndcg_at_1000
value: 23.105999999999998
- type: ndcg_at_3
value: 11.16
- type: ndcg_at_5
value: 12.015
- type: precision_at_1
value: 11.574
- type: precision_at_10
value: 4.167
- type: precision_at_100
value: 0.9259999999999999
- type: precision_at_1000
value: 0.166
- type: precision_at_3
value: 7.716000000000001
- type: precision_at_5
value: 6.08
- type: recall_at_1
value: 5.4190000000000005
- type: recall_at_10
value: 17.76
- type: recall_at_100
value: 39.080999999999996
- type: recall_at_1000
value: 65.713
- type: recall_at_3
value: 10.348
- type: recall_at_5
value: 13.274
- task:
type: Retrieval
dataset:
name: MTEB HotpotQA
type: None
config: default
split: test
revision: ab518f4d6fcca38d87c25209f94beba119d02014
metrics:
- type: map_at_1
value: 18.697
- type: map_at_10
value: 26.466
- type: map_at_100
value: 27.464
- type: map_at_1000
value: 27.581
- type: map_at_3
value: 24.284
- type: map_at_5
value: 25.478
- type: mrr_at_1
value: 37.394
- type: mrr_at_10
value: 44.827
- type: mrr_at_100
value: 45.553
- type: mrr_at_1000
value: 45.601
- type: mrr_at_3
value: 42.82
- type: mrr_at_5
value: 43.980999999999995
- type: ndcg_at_1
value: 37.394
- type: ndcg_at_10
value: 33.726
- type: ndcg_at_100
value: 38.244
- type: ndcg_at_1000
value: 40.931
- type: ndcg_at_3
value: 29.660999999999998
- type: ndcg_at_5
value: 31.627
- type: precision_at_1
value: 37.394
- type: precision_at_10
value: 7.453
- type: precision_at_100
value: 1.107
- type: precision_at_1000
value: 0.147
- type: precision_at_3
value: 18.708
- type: precision_at_5
value: 12.786
- type: recall_at_1
value: 18.697
- type: recall_at_10
value: 37.265
- type: recall_at_100
value: 55.361000000000004
- type: recall_at_1000
value: 73.309
- type: recall_at_3
value: 28.061999999999998
- type: recall_at_5
value: 31.965
- task:
type: Classification
dataset:
name: MTEB ImdbClassification
type: None
config: default
split: test
revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
metrics:
- type: accuracy
value: 62.48999999999999
- type: ap
value: 58.17377011639594
- type: f1
value: 62.05146869753241
- task:
type: Retrieval
dataset:
name: MTEB MSMARCO
type: None
config: default
split: dev
revision: c5a29a104738b98a9e76336939199e264163d4a0
metrics:
- type: map_at_1
value: 4.475
- type: map_at_10
value: 7.548000000000001
- type: map_at_100
value: 8.303
- type: map_at_1000
value: 8.408999999999999
- type: map_at_3
value: 6.4079999999999995
- type: map_at_5
value: 6.97
- type: mrr_at_1
value: 4.599
- type: mrr_at_10
value: 7.739
- type: mrr_at_100
value: 8.506
- type: mrr_at_1000
value: 8.61
- type: mrr_at_3
value: 6.564
- type: mrr_at_5
value: 7.159
- type: ndcg_at_1
value: 4.569999999999999
- type: ndcg_at_10
value: 9.514
- type: ndcg_at_100
value: 13.806
- type: ndcg_at_1000
value: 17.055
- type: ndcg_at_3
value: 7.093000000000001
- type: ndcg_at_5
value: 8.122
- type: precision_at_1
value: 4.569999999999999
- type: precision_at_10
value: 1.628
- type: precision_at_100
value: 0.388
- type: precision_at_1000
value: 0.067
- type: precision_at_3
value: 3.061
- type: precision_at_5
value: 2.367
- type: recall_at_1
value: 4.475
- type: recall_at_10
value: 15.67
- type: recall_at_100
value: 36.923
- type: recall_at_1000
value: 63.080999999999996
- type: recall_at_3
value: 8.949
- type: recall_at_5
value: 11.415000000000001
- task:
type: Classification
dataset:
name: MTEB MTOPDomainClassification (en)
type: None
config: en
split: test
revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
metrics:
- type: accuracy
value: 87.9297765617875
- type: f1
value: 87.26334348655604
- task:
type: Classification
dataset:
name: MTEB MTOPIntentClassification (en)
type: None
config: en
split: test
revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
metrics:
- type: accuracy
value: 62.35066119471044
- type: f1
value: 40.63496979335978
- task:
type: Classification
dataset:
name: MTEB MassiveIntentClassification (en)
type: None
config: en
split: test
revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
metrics:
- type: accuracy
value: 58.75252185608607
- type: f1
value: 56.52628257282246
- task:
type: Classification
dataset:
name: MTEB MassiveScenarioClassification (en)
type: None
config: en
split: test
revision: 7d571f92784cd94a019292a1f45445077d0ef634
metrics:
- type: accuracy
value: 67.1654337592468
- type: f1
value: 65.44989621478248
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringP2P
type: None
config: default
split: test
revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
metrics:
- type: v_measure
value: 29.0020121507179
- task:
type: Clustering
dataset:
name: MTEB MedrxivClusteringS2S
type: None
config: default
split: test
revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
metrics:
- type: v_measure
value: 25.38089808676194
- task:
type: Reranking
dataset:
name: MTEB MindSmallReranking
type: None
config: default
split: test
revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
metrics:
- type: map
value: 29.006389235579594
- type: mrr
value: 29.93354413668632
- task:
type: Retrieval
dataset:
name: MTEB NFCorpus
type: None
config: default
split: test
revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
metrics:
- type: map_at_1
value: 2.938
- type: map_at_10
value: 6.718
- type: map_at_100
value: 8.602
- type: map_at_1000
value: 9.879999999999999
- type: map_at_3
value: 5.066
- type: map_at_5
value: 5.9799999999999995
- type: mrr_at_1
value: 26.006
- type: mrr_at_10
value: 37.143
- type: mrr_at_100
value: 38.007000000000005
- type: mrr_at_1000
value: 38.056
- type: mrr_at_3
value: 33.953
- type: mrr_at_5
value: 35.980000000000004
- type: ndcg_at_1
value: 24.768
- type: ndcg_at_10
value: 21.893
- type: ndcg_at_100
value: 21.193
- type: ndcg_at_1000
value: 30.911
- type: ndcg_at_3
value: 23.912
- type: ndcg_at_5
value: 23.749000000000002
- type: precision_at_1
value: 26.006
- type: precision_at_10
value: 16.378
- type: precision_at_100
value: 6.059
- type: precision_at_1000
value: 1.934
- type: precision_at_3
value: 22.601
- type: precision_at_5
value: 20.929000000000002
- type: recall_at_1
value: 2.938
- type: recall_at_10
value: 11.195
- type: recall_at_100
value: 24.473
- type: recall_at_1000
value: 58.553
- type: recall_at_3
value: 6.487
- type: recall_at_5
value: 9.02
- task:
type: Retrieval
dataset:
name: MTEB NQ
type: None
config: default
split: test
revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
metrics:
- type: map_at_1
value: 8.761
- type: map_at_10
value: 15.726
- type: map_at_100
value: 17.130000000000003
- type: map_at_1000
value: 17.244999999999997
- type: map_at_3
value: 13.001
- type: map_at_5
value: 14.438999999999998
- type: mrr_at_1
value: 9.994
- type: mrr_at_10
value: 17.455000000000002
- type: mrr_at_100
value: 18.736
- type: mrr_at_1000
value: 18.828
- type: mrr_at_3
value: 14.634
- type: mrr_at_5
value: 16.158
- type: ndcg_at_1
value: 9.994
- type: ndcg_at_10
value: 20.453
- type: ndcg_at_100
value: 27.514
- type: ndcg_at_1000
value: 30.45
- type: ndcg_at_3
value: 14.802000000000001
- type: ndcg_at_5
value: 17.394000000000002
- type: precision_at_1
value: 9.994
- type: precision_at_10
value: 3.914
- type: precision_at_100
value: 0.7939999999999999
- type: precision_at_1000
value: 0.107
- type: precision_at_3
value: 7.0680000000000005
- type: precision_at_5
value: 5.655
- type: recall_at_1
value: 8.761
- type: recall_at_10
value: 33.534000000000006
- type: recall_at_100
value: 66.28500000000001
- type: recall_at_1000
value: 88.458
- type: recall_at_3
value: 18.436
- type: recall_at_5
value: 24.508
- task:
type: Retrieval
dataset:
name: MTEB QuoraRetrieval
type: None
config: default
split: test
revision: None
metrics:
- type: map_at_1
value: 50.617000000000004
- type: map_at_10
value: 62.446999999999996
- type: map_at_100
value: 63.410999999999994
- type: map_at_1000
value: 63.461
- type: map_at_3
value: 59.382999999999996
- type: map_at_5
value: 61.17
- type: mrr_at_1
value: 58.160000000000004
- type: mrr_at_10
value: 67.013
- type: mrr_at_100
value: 67.47
- type: mrr_at_1000
value: 67.488
- type: mrr_at_3
value: 65.005
- type: mrr_at_5
value: 66.238
- type: ndcg_at_1
value: 58.209999999999994
- type: ndcg_at_10
value: 67.907
- type: ndcg_at_100
value: 71.194
- type: ndcg_at_1000
value: 72.02
- type: ndcg_at_3
value: 63.429
- type: ndcg_at_5
value: 65.655
- type: precision_at_1
value: 58.209999999999994
- type: precision_at_10
value: 10.537
- type: precision_at_100
value: 1.355
- type: precision_at_1000
value: 0.15
- type: precision_at_3
value: 27.677000000000003
- type: precision_at_5
value: 18.6
- type: recall_at_1
value: 50.617000000000004
- type: recall_at_10
value: 79.323
- type: recall_at_100
value: 92.571
- type: recall_at_1000
value: 97.94
- type: recall_at_3
value: 66.81899999999999
- type: recall_at_5
value: 72.738
- type: map_at_1
value: 2.5829999999999997
- type: map_at_10
value: 6.2059999999999995
- type: map_at_100
value: 7.46
- type: map_at_1000
value: 7.724
- type: map_at_3
value: 4.515000000000001
- type: map_at_5
value: 5.313
- type: mrr_at_1
value: 12.7
- type: mrr_at_10
value: 20.615
- type: mrr_at_100
value: 21.841
- type: mrr_at_1000
value: 21.931
- type: mrr_at_3
value: 17.983
- type: mrr_at_5
value: 19.468
- type: ndcg_at_1
value: 12.7
- type: ndcg_at_10
value: 11.366
- type: ndcg_at_100
value: 17.448
- type: ndcg_at_1000
value: 22.86
- type: ndcg_at_3
value: 10.541
- type: ndcg_at_5
value: 9.27
- type: precision_at_1
value: 12.7
- type: precision_at_10
value: 5.96
- type: precision_at_100
value: 1.4949999999999999
- type: precision_at_1000
value: 0.27999999999999997
- type: precision_at_3
value: 9.833
- type: precision_at_5
value: 8.16
- type: recall_at_1
value: 2.5829999999999997
- type: recall_at_10
value: 12.107999999999999
- type: recall_at_100
value: 30.368000000000002
- type: recall_at_1000
value: 57.01500000000001
- type: recall_at_3
value: 5.997
- type: recall_at_5
value: 8.267
- type: map_at_1
value: 0.174
- type: map_at_10
value: 0.9730000000000001
- type: map_at_100
value: 4.8629999999999995
- type: map_at_1000
value: 11.895999999999999
- type: map_at_3
value: 0.373
- type: map_at_5
value: 0.575
- type: mrr_at_1
value: 68.0
- type: mrr_at_10
value: 75.888
- type: mrr_at_100
value: 76.254
- type: mrr_at_1000
value: 76.254
- type: mrr_at_3
value: 73.0
- type: mrr_at_5
value: 74.4
- type: ndcg_at_1
value: 59.0
- type: ndcg_at_10
value: 49.874
- type: ndcg_at_100
value: 34.993
- type: ndcg_at_1000
value: 31.941999999999997
- type: ndcg_at_3
value: 54.06100000000001
- type: ndcg_at_5
value: 52.995000000000005
- type: precision_at_1
value: 68.0
- type: precision_at_10
value: 53.0
- type: precision_at_100
value: 36.5
- type: precision_at_1000
value: 15.387999999999998
- type: precision_at_3
value: 57.333
- type: precision_at_5
value: 56.00000000000001
- type: recall_at_1
value: 0.174
- type: recall_at_10
value: 1.2309999999999999
- type: recall_at_100
value: 7.992000000000001
- type: recall_at_1000
value: 31.196
- type: recall_at_3
value: 0.402
- type: recall_at_5
value: 0.6629999999999999
- task:
type: Clustering
dataset:
name: MTEB RedditClustering
type: None
config: default
split: test
revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
metrics:
- type: v_measure
value: 28.360814957776963
- task:
type: Clustering
dataset:
name: MTEB RedditClusteringP2P
type: None
config: default
split: test
revision: 282350215ef01743dc01b456c7f5241fa8937f16
metrics:
- type: v_measure
value: 43.3985031467972
- task:
type: STS
dataset:
name: MTEB SICK-R
type: None
config: default
split: test
revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
metrics:
- type: cos_sim_pearson
value: 71.20191818636728
- type: cos_sim_spearman
value: 63.18841867184768
- type: euclidean_pearson
value: 66.35805669714289
- type: euclidean_spearman
value: 63.18842007652818
- type: manhattan_pearson
value: 66.35916950374904
- type: manhattan_spearman
value: 63.196888125167625
- task:
type: STS
dataset:
name: MTEB STS12
type: None
config: default
split: test
revision: a0d554a64d88156834ff5ae9920b964011b16384
metrics:
- type: cos_sim_pearson
value: 62.098101467395814
- type: cos_sim_spearman
value: 54.06088997149339
- type: euclidean_pearson
value: 58.41368217137798
- type: euclidean_spearman
value: 54.06092904130544
- type: manhattan_pearson
value: 58.362828042865786
- type: manhattan_spearman
value: 54.00939603392602
- task:
type: STS
dataset:
name: MTEB STS13
type: None
config: default
split: test
revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
metrics:
- type: cos_sim_pearson
value: 65.43822913111813
- type: cos_sim_spearman
value: 66.28982173651268
- type: euclidean_pearson
value: 66.25168181976332
- type: euclidean_spearman
value: 66.28982173651268
- type: manhattan_pearson
value: 66.29528124821977
- type: manhattan_spearman
value: 66.30514044017556
- task:
type: STS
dataset:
name: MTEB STS14
type: None
config: default
split: test
revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
metrics:
- type: cos_sim_pearson
value: 68.85545703975829
- type: cos_sim_spearman
value: 65.8423661018879
- type: euclidean_pearson
value: 67.39130942849594
- type: euclidean_spearman
value: 65.84236495537252
- type: manhattan_pearson
value: 67.33275534001785
- type: manhattan_spearman
value: 65.76611698295058
- task:
type: STS
dataset:
name: MTEB STS15
type: None
config: default
split: test
revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
metrics:
- type: cos_sim_pearson
value: 77.6285907529651
- type: cos_sim_spearman
value: 78.00000961062926
- type: euclidean_pearson
value: 76.65512603085617
- type: euclidean_spearman
value: 78.00002082347814
- type: manhattan_pearson
value: 76.56885344052785
- type: manhattan_spearman
value: 77.91110691090714
- task:
type: STS
dataset:
name: MTEB STS16
type: None
config: default
split: test
revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
metrics:
- type: cos_sim_pearson
value: 70.01796001502133
- type: cos_sim_spearman
value: 70.63693611196106
- type: euclidean_pearson
value: 69.91996212882896
- type: euclidean_spearman
value: 70.63693611196106
- type: manhattan_pearson
value: 69.74064896054243
- type: manhattan_spearman
value: 70.49237791641166
- task:
type: STS
dataset:
name: MTEB STS17 (en-en)
type: None
config: en-en
split: test
revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
metrics:
- type: cos_sim_pearson
value: 81.3822726377613
- type: cos_sim_spearman
value: 82.2302028734954
- type: euclidean_pearson
value: 80.33214232883414
- type: euclidean_spearman
value: 82.2302028734954
- type: manhattan_pearson
value: 80.263835088929
- type: manhattan_spearman
value: 82.24108129797949
- task:
type: STS
dataset:
name: MTEB STS22 (en)
type: None
config: en
split: test
revision: eea2b4fe26a775864c896887d910b76a8098ad3f
metrics:
- type: cos_sim_pearson
value: 48.870030429256026
- type: cos_sim_spearman
value: 54.52763255764367
- type: euclidean_pearson
value: 52.91515372752542
- type: euclidean_spearman
value: 54.52763255764367
- type: manhattan_pearson
value: 52.9090047976553
- type: manhattan_spearman
value: 54.567349154887964
- task:
type: STS
dataset:
name: MTEB STSBenchmark
type: None
config: default
split: test
revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
metrics:
- type: cos_sim_pearson
value: 70.6609948217279
- type: cos_sim_spearman
value: 68.040381040423
- type: euclidean_pearson
value: 69.52004507848278
- type: euclidean_spearman
value: 68.040381040423
- type: manhattan_pearson
value: 69.3798908727963
- type: manhattan_spearman
value: 67.88012464800055
- task:
type: Reranking
dataset:
name: MTEB SciDocsRR
type: None
config: default
split: test
revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
metrics:
- type: map
value: 70.94408788023235
- type: mrr
value: 90.05190252739271
- task:
type: Retrieval
dataset:
name: MTEB SciFact
type: None
config: default
split: test
revision: 0228b52cf27578f30900b9e5271d331663a030d7
metrics:
- type: map_at_1
value: 26.694000000000003
- type: map_at_10
value: 34.503
- type: map_at_100
value: 35.494
- type: map_at_1000
value: 35.582
- type: map_at_3
value: 32.255
- type: map_at_5
value: 33.312999999999995
- type: mrr_at_1
value: 28.333000000000002
- type: mrr_at_10
value: 35.782000000000004
- type: mrr_at_100
value: 36.681000000000004
- type: mrr_at_1000
value: 36.756
- type: mrr_at_3
value: 33.667
- type: mrr_at_5
value: 34.8
- type: ndcg_at_1
value: 28.333000000000002
- type: ndcg_at_10
value: 38.799
- type: ndcg_at_100
value: 44.086
- type: ndcg_at_1000
value: 46.472
- type: ndcg_at_3
value: 34.215
- type: ndcg_at_5
value: 36.172
- type: precision_at_1
value: 28.333000000000002
- type: precision_at_10
value: 5.7
- type: precision_at_100
value: 0.8630000000000001
- type: precision_at_1000
value: 0.108
- type: precision_at_3
value: 13.889000000000001
- type: precision_at_5
value: 9.4
- type: recall_at_1
value: 26.694000000000003
- type: recall_at_10
value: 50.917
- type: recall_at_100
value: 76.656
- type: recall_at_1000
value: 95.267
- type: recall_at_3
value: 38.25
- type: recall_at_5
value: 43.25
- task:
type: PairClassification
dataset:
name: MTEB SprintDuplicateQuestions
type: None
config: default
split: test
revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
metrics:
- type: cos_sim_accuracy
value: 99.48415841584158
- type: cos_sim_ap
value: 75.52946987159493
- type: cos_sim_f1
value: 71.24183006535948
- type: cos_sim_precision
value: 78.22966507177034
- type: cos_sim_recall
value: 65.4
- type: dot_accuracy
value: 99.48415841584158
- type: dot_ap
value: 75.52946987159493
- type: dot_f1
value: 71.24183006535948
- type: dot_precision
value: 78.22966507177034
- type: dot_recall
value: 65.4
- type: euclidean_accuracy
value: 99.48415841584158
- type: euclidean_ap
value: 75.52946987159491
- type: euclidean_f1
value: 71.24183006535948
- type: euclidean_precision
value: 78.22966507177034
- type: euclidean_recall
value: 65.4
- type: manhattan_accuracy
value: 99.48613861386139
- type: manhattan_ap
value: 75.59805786399177
- type: manhattan_f1
value: 71.09670448406267
- type: manhattan_precision
value: 77.32079905992948
- type: manhattan_recall
value: 65.8
- type: max_accuracy
value: 99.48613861386139
- type: max_ap
value: 75.59805786399177
- type: max_f1
value: 71.24183006535948
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClustering
type: None
config: default
split: test
revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
metrics:
- type: v_measure
value: 36.29480214260567
- task:
type: Clustering
dataset:
name: MTEB StackExchangeClusteringP2P
type: None
config: default
split: test
revision: 815ca46b2622cec33ccafc3735d572c266efdb44
metrics:
- type: v_measure
value: 26.54544643416149
- task:
type: Reranking
dataset:
name: MTEB StackOverflowDupQuestions
type: None
config: default
split: test
revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
metrics:
- type: map
value: 39.18037729762096
- type: mrr
value: 39.22605784738137
- task:
type: Summarization
dataset:
name: MTEB SummEval
type: None
config: default
split: test
revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
metrics:
- type: cos_sim_pearson
value: 30.48884056085282
- type: cos_sim_spearman
value: 29.51032126703428
- type: dot_pearson
value: 30.488836086824143
- type: dot_spearman
value: 29.51032126703428
- task:
type: Retrieval
dataset:
name: MTEB Touche2020
type: None
config: default
split: test
revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
metrics:
- type: map_at_1
value: 1.409
- type: map_at_10
value: 6.394
- type: map_at_100
value: 11.241
- type: map_at_1000
value: 12.983
- type: map_at_3
value: 3.3009999999999997
- type: map_at_5
value: 4.623
- type: mrr_at_1
value: 22.448999999999998
- type: mrr_at_10
value: 37.69
- type: mrr_at_100
value: 38.684000000000005
- type: mrr_at_1000
value: 38.684000000000005
- type: mrr_at_3
value: 32.653
- type: mrr_at_5
value: 35.918
- type: ndcg_at_1
value: 20.408
- type: ndcg_at_10
value: 18.78
- type: ndcg_at_100
value: 31.513999999999996
- type: ndcg_at_1000
value: 43.881
- type: ndcg_at_3
value: 20.888
- type: ndcg_at_5
value: 19.969
- type: precision_at_1
value: 22.448999999999998
- type: precision_at_10
value: 18.163
- type: precision_at_100
value: 7.469
- type: precision_at_1000
value: 1.533
- type: precision_at_3
value: 23.128999999999998
- type: precision_at_5
value: 21.633
- type: recall_at_1
value: 1.409
- type: recall_at_10
value: 12.661
- type: recall_at_100
value: 46.255
- type: recall_at_1000
value: 83.985
- type: recall_at_3
value: 4.627
- type: recall_at_5
value: 7.64
- task:
type: Classification
dataset:
name: MTEB ToxicConversationsClassification
type: None
config: default
split: test
revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
metrics:
- type: accuracy
value: 69.0458
- type: ap
value: 12.900935109159281
- type: f1
value: 52.757987035231515
- task:
type: Classification
dataset:
name: MTEB TweetSentimentExtractionClassification
type: None
config: default
split: test
revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
metrics:
- type: accuracy
value: 51.3299377475948
- type: f1
value: 51.459777090402014
- task:
type: Clustering
dataset:
name: MTEB TwentyNewsgroupsClustering
type: None
config: default
split: test
revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
metrics:
- type: v_measure
value: 29.194248149860375
- task:
type: PairClassification
dataset:
name: MTEB TwitterSemEval2015
type: None
config: default
split: test
revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
metrics:
- type: cos_sim_accuracy
value: 81.9991655242296
- type: cos_sim_ap
value: 58.77296106834857
- type: cos_sim_f1
value: 55.504807692307686
- type: cos_sim_precision
value: 50.971302428256074
- type: cos_sim_recall
value: 60.92348284960423
- type: dot_accuracy
value: 81.9991655242296
- type: dot_ap
value: 58.77287683899877
- type: dot_f1
value: 55.504807692307686
- type: dot_precision
value: 50.971302428256074
- type: dot_recall
value: 60.92348284960423
- type: euclidean_accuracy
value: 81.9991655242296
- type: euclidean_ap
value: 58.77295994744435
- type: euclidean_f1
value: 55.504807692307686
- type: euclidean_precision
value: 50.971302428256074
- type: euclidean_recall
value: 60.92348284960423
- type: manhattan_accuracy
value: 81.96340227692674
- type: manhattan_ap
value: 58.68949884381418
- type: manhattan_f1
value: 55.47209060441466
- type: manhattan_precision
value: 49.35225169648365
- type: manhattan_recall
value: 63.3245382585752
- type: max_accuracy
value: 81.9991655242296
- type: max_ap
value: 58.77296106834857
- type: max_f1
value: 55.504807692307686
- task:
type: PairClassification
dataset:
name: MTEB TwitterURLCorpus
type: None
config: default
split: test
revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
metrics:
- type: cos_sim_accuracy
value: 87.04544572515232
- type: cos_sim_ap
value: 81.32653456663351
- type: cos_sim_f1
value: 74.06016421812292
- type: cos_sim_precision
value: 70.96888010726131
- type: cos_sim_recall
value: 77.4330150908531
- type: dot_accuracy
value: 87.04544572515232
- type: dot_ap
value: 81.3265357801676
- type: dot_f1
value: 74.06016421812292
- type: dot_precision
value: 70.96888010726131
- type: dot_recall
value: 77.4330150908531
- type: euclidean_accuracy
value: 87.04544572515232
- type: euclidean_ap
value: 81.32653322362968
- type: euclidean_f1
value: 74.06016421812292
- type: euclidean_precision
value: 70.96888010726131
- type: euclidean_recall
value: 77.4330150908531
- type: manhattan_accuracy
value: 87.05514805759304
- type: manhattan_ap
value: 81.27825086272063
- type: manhattan_f1
value: 74.07811477198342
- type: manhattan_precision
value: 70.19297036526534
- type: manhattan_recall
value: 78.41854019094548
- type: max_accuracy
value: 87.05514805759304
- type: max_ap
value: 81.3265357801676
- type: max_f1
value: 74.07811477198342
---
| [
"BIOSSES",
"SCIFACT"
] |
TheMistoAI/MistoLine_Flux.dev | TheMistoAI | text-to-image | [
"Flux.dev",
"MistoLine_Flux.dev",
"MistoControlNet_Flux.dev",
"text-to-image",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:finetune:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | "2024-08-30T08:07:49Z" | 2024-08-30T13:17:33+00:00 | 0 | 117 | ---
base_model: black-forest-labs/FLUX.1-dev
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
pipeline_tag: text-to-image
tags:
- Flux.dev
- MistoLine_Flux.dev
- MistoControlNet_Flux.dev
---

## Summary
by TheMisto.ai @Shenzhen, China
This is a ControlNet designed for any lineart or outline sketch, compatible with Flux1.dev. The ControlNet has approximately 1.4B parameters.
This model is not compatible with XLabs loaders and samplers. Please use the TheMisto.ai Flux ControlNet ComfyUI suite.
This ControlNet follows the flow-matching structure of the Flux.dev model and uses a scalable Transformer module as its backbone.
We've implemented a dual-stream Transformer structure, which enhances alignment and expressiveness for various types of lineart and outline conditions without increasing inference time. The model has also been trained for alignment with both T5 and clip-l TextEncoders, ensuring balanced performance between conditioning images and text prompts.
For more details on the Flux.dev model structure, visit: https://huggingface.co/black-forest-labs/FLUX.1-dev
This ControlNet is compatible with Flux1.dev fp16/fp8 weights and with other models quantized from Flux1.dev. ByteDance 8/16-step distilled models have not been tested.
- The example workflow uses the flux1-dev-Q4_K_S.gguf quantized model.
- Generation quality: Flux1.dev(fp16) >> Flux1.dev(fp8) >> other quantized models
- Generation speed: Flux1.dev(fp16) <<< Flux1.dev(fp8) <<< other quantized models
## Performance
### Performance Across Different Sizes and Scenarios
Tested in various common scenarios such as industrial design, architecture, interior design, animation, games, and photography.
Make sure to craft your prompts well—precision is more important than length!
Performance examples are shown below:

### Performance with Different Lineart or Scribble Preprocessors
Test Parameters:
- Prompt: "Hyper-realistic 3D render of a classic Formula 1 race car, bright red with Marlboro and Agip logos, number 1, large black tires, dramatic studio lighting, dark moody background, reflective floor, cinematic atmosphere, Octane render style, high detail"
- controlnet_strength: 0.65-0.8 (Recommended: Anyline with 0.6-0.7)
- steps: 30
- guidance: 4.0
- The quality of generated images is positively correlated with prompt quality. Controlnet_strength may vary for different types of lineart and outlines, so experiment with the settings!

### Recommended Settings
- Image resolution: 720px or above on the short edge
- controlnet strength: 0.6 - 0.85 (adjust as needed)
- guidance: 3.0 - 5.0 (adjust as needed)
- steps: 30 or more
## Hugging Face
[MistoLine_Flux.dev_v1](https://huggingface.co/TheMistoAI/MistoLine_Flux.dev)
## Usage
- Download the model from [MistoLine_Flux.dev_v1](https://huggingface.co/TheMistoAI/MistoLine_Flux.dev)
- Place the model in the ComfyUI\models\TheMisto_model\ directory
- This directory is created automatically the first time you run the TheMisto.ai Flux ControlNet ComfyUI suite
- Run using ComfyUI; an example workflow can be found in the workflow folder
- Note: The height and width of the conditioning image must be divisible by 16, or an error will occur; a quick padding check is sketched below.
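The divisible-by-16 requirement can be checked before loading the conditioning image into the workflow. Below is a minimal sketch, assuming Pillow is installed; the file names are placeholders and not part of this release:
```python
from PIL import Image, ImageOps

def pad_to_multiple_of_16(path: str) -> Image.Image:
    """Pad the right/bottom edges so width and height are divisible by 16."""
    img = Image.open(path).convert("RGB")
    w, h = img.size
    pad_w = (16 - w % 16) % 16  # extra pixels needed on the width
    pad_h = (16 - h % 16) % 16  # extra pixels needed on the height
    # ImageOps.expand takes (left, top, right, bottom); pad with white to preserve the lineart
    return ImageOps.expand(img, border=(0, 0, pad_w, pad_h), fill=(255, 255, 255))

padded = pad_to_multiple_of_16("lineart_condition.png")  # placeholder input file
padded.save("lineart_condition_16.png")
```
Padding with white keeps the outline content intact; cropping to the nearest multiple of 16 is an equally valid choice if a few edge pixels do not matter.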
### ComfyUI

## Training Details
Under scaling laws, the Transformer backbone has a significant impact on training time and compute: larger models cost more and take longer to train.
The training cost for MistoLine_Flux1_dev is several times that of MistoLineSDXL.
We conducted extensive ablation experiments to balance performance with training costs.
Training was done on A100-80GB GPUs with bf16 mixed precision on the Flux1.dev series models. Apart from LoRA fine-tuning, consumer-grade GPUs are generally unsuitable for this kind of training.
In our experiments with larger models, multi-GPU, multi-node parallel training was required, which is costly.
If we reach 50,000 stars, we will open-source a technical report with further training details.
## License
Aligned with the FLUX.1 [dev] Non-Commercial License.
The ComfyUI node is covered by ComfyUI's own license.
This model is for research and educational purposes only and may not be used for any form of commercial purposes.
## Business Cooperation
For any custom model training, commercial cooperation, AI application development, or other business collaboration matters, please contact us.
- *Business*: [email protected]
- *Investment*: [email protected]
We also welcome investors to inquire for more information.
## WIP
- Flux1.dev-MistoCN-collection
- Flux1.dev-Misto-IPAdapter
Your support is the driving force behind our open-source efforts!
## One more thing

We will soon launch our own product: an extremely user-friendly multi-modal AI creative tool, [Misto].
Designed to rekindle the public's creative desire through the simplest and most inspiring experience.
Unleash creativity, expand the boundaries of imagination, and turn endless inspiration into reality!
Supported Platforms: All Platforms
## Media
### International:
*Website:* https://themisto.ai/
*Discord:* https://discord.gg/fTyDB2CU
*X:* https://x.com/AiThemisto79359
### Mainland China:
*Website:* https://themisto.ai/
*WeChat Official Account:* TheMisto AI (Shenzhen Mixed Tuple Technology Co., Ltd.)
*Xiaohongshu:* TheMisto.ai (Xiaohongshu ID: 4211745997) | [
"CRAFT"
] |
BXresearch/DeBERTa2-0.9B-SentenceTransformer-v2 | BXresearch | sentence-similarity | [
"sentence-transformers",
"pytorch",
"deberta-v2",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:99470",
"loss:CachedGISTEmbedLoss",
"en",
"dataset:tals/vitaminc",
"dataset:allenai/scitail",
"dataset:allenai/sciq",
"dataset:allenai/qasc",
"dataset:sentence-transformers/msmarco-msmarco-distilbert-base-v3",
"dataset:sentence-transformers/natural-questions",
"dataset:sentence-transformers/trivia-qa",
"dataset:sentence-transformers/gooaq",
"dataset:google-research-datasets/paws",
"arxiv:1908.10084",
"base_model:microsoft/deberta-v2-xlarge",
"base_model:finetune:microsoft/deberta-v2-xlarge",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | "2024-09-03T21:40:51Z" | 2024-09-03T21:43:23+00:00 | 0 | 0 | ---
base_model: microsoft/deberta-v2-xlarge
datasets:
- tals/vitaminc
- allenai/scitail
- allenai/sciq
- allenai/qasc
- sentence-transformers/msmarco-msmarco-distilbert-base-v3
- sentence-transformers/natural-questions
- sentence-transformers/trivia-qa
- sentence-transformers/gooaq
- google-research-datasets/paws
language:
- en
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
- cosine_accuracy
- cosine_accuracy_threshold
- cosine_f1
- cosine_f1_threshold
- cosine_precision
- cosine_recall
- cosine_ap
- dot_accuracy
- dot_accuracy_threshold
- dot_f1
- dot_f1_threshold
- dot_precision
- dot_recall
- dot_ap
- manhattan_accuracy
- manhattan_accuracy_threshold
- manhattan_f1
- manhattan_f1_threshold
- manhattan_precision
- manhattan_recall
- manhattan_ap
- euclidean_accuracy
- euclidean_accuracy_threshold
- euclidean_f1
- euclidean_f1_threshold
- euclidean_precision
- euclidean_recall
- euclidean_ap
- max_accuracy
- max_accuracy_threshold
- max_f1
- max_f1_threshold
- max_precision
- max_recall
- max_ap
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:99470
- loss:CachedGISTEmbedLoss
widget:
- source_sentence: what is the wisconsin idea
sentences:
- The Netherlands Time is in the Central European Time Zone . Central European Standard
Time ( CET ) is 1 hours ahead of Greenwich Mean Time ( GMT+1 ). Like most states
in Europe, Summer (Daylight-Saving) Time is observed in The Netherlands Time,
where the time is shifted forward by 1 hour; 2 hours ahead of Greenwich Mean Time
( GMT+2 ). After the Summer months the time in The Netherlands Time is shifted
back by 1 hour to Central European Time (CET) or ( GMT+1 )
- "Unless stated otherwise, these amounts are the total that is recommended for\
\ your dog over a 24 hour period. Most adult dogs should eat two meals a day (puppies\
\ often require three or more feedings), so youâ\x80\x99ll need to divide the\
\ amount in the table by the number of meals you are offering."
- McCarthy's book, The Wisconsin Idea, published in 1912, describes the major problems
facing the country, some of the Progressive reforms already passed, and his guiding
vision for the future. Library-Archives.
- source_sentence: More than 273 people have died from the 2019-20 coronavirus outside
mainland China .
sentences:
- 'More than 3,700 people have died : around 3,100 in mainland China and around
550 in all other countries combined .'
- 'More than 3,200 people have died : almost 3,000 in mainland China and around
275 in other countries .'
- more than 4,900 deaths have been attributed to COVID-19 .
- source_sentence: The planets orbit around the sun.
sentences:
- The planets orbit around what celestial body?
- What are the high points in a transverse wave called?
- After fertilization, how many cells does a zygote form into?
- source_sentence: Cells are small biological structures that make up all living things,
including the human body.
sentences:
- Of the three basic types of radioactive emissions, what particle is the most penetrating?
- What small biological structures make up all living things, including the human
body?
- What cellular structure is used during endocytosis to allow molecules to enter
the cell?
- source_sentence: '"""Music of the Night"" and ""All I Ask of You"" are songs from
which stage musical?"'
sentences:
- 'Electric resistance unit conversion - SI derived quantity Electric resistance
unit conversion - Discussion Forum ›› SI derived quantity: electric resistance
This category of measurement units is defined by the "electric resistance" type,
which is an SI derived quantity. ›› SI unit: ohm The SI derived unit for electric
resistance is the ohm. ›› Convert ohm to another unit Convert ohm to I''m feeling
lucky, show me some random units ›› Convert between two electric resistance units
Convert ›› Definition: Ohm The ohm (symbol: Ω) is the SI unit of electrical
impedance or, in the direct current case, electrical resistance, named after Georg
Ohm. It is defined as the resistance between two points of a conductor when a
constant potential difference of 1 volt, applied to these points, produces in
the conductor a current of 1 ampere, the conductor not being the seat of any electromotive
force.'
- 'All I Ask of You - YouTube All I Ask of You Want to watch this again later? Sign
in to add this video to a playlist. Need to report the video? Sign in to report
inappropriate content. Rating is available when the video has been rented. This
feature is not available right now. Please try again later. Published on Jan 6,
2012 Twenty-second in a series of clips from the 2004 film version of The Phantom
of the Opera. Performed by Patrick Wilson as Raoul and Emmy Rossum as Christine.
For more Phantom, find us on Facebook: http://www.facebook.com/ThePhantomOfT...
Or follow us on Twitter: http://www.twitter.com/TheOperaGhosts Or add us to your
Google+ Circles: https://plus.google.com/1019282273838... Category'
- Beatles US Labels - Variations THE CAPITOL � APPLE LABEL DISCOGRAPHY In
early 1964, Capitol Records became the primary U.S. manufacturer of The Beatles
major releases. When Apple Records was formed by the Beatles in 1968, Capitol
still maintained production and distribution control over the Beatles records.
When Apple was dissolved in 1975, the entire Beatles catalog, including solo releases,
reverted back to Capitol. In the early 1990�s, all The Beatles/solo Capitol
product was reissued on the Apple label. The Beatles, collectively and individually,
have also appeared on several other labels over the years. However, most of
these releases failed to stay in production long enough to experience significant
label design changes. The relatively few that did, are thoroughly identified
in their respective sections.
model-index:
- name: SentenceTransformer based on microsoft/deberta-v2-xlarge
results:
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts test
type: sts-test
metrics:
- type: pearson_cosine
value: 0.9197610669751131
name: Pearson Cosine
- type: spearman_cosine
value: 0.9283888594894274
name: Spearman Cosine
- type: pearson_manhattan
value: 0.9339362561833366
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.9292519323166217
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.9345390898962939
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.9297849645910548
name: Spearman Euclidean
- type: pearson_dot
value: 0.9045457355592232
name: Pearson Dot
- type: spearman_dot
value: 0.9029364900705459
name: Spearman Dot
- type: pearson_max
value: 0.9345390898962939
name: Pearson Max
- type: spearman_max
value: 0.9297849645910548
name: Spearman Max
- task:
type: binary-classification
name: Binary Classification
dataset:
name: allNLI dev
type: allNLI-dev
metrics:
- type: cosine_accuracy
value: 0.728515625
name: Cosine Accuracy
- type: cosine_accuracy_threshold
value: 0.7859227061271667
name: Cosine Accuracy Threshold
- type: cosine_f1
value: 0.6415094339622641
name: Cosine F1
- type: cosine_f1_threshold
value: 0.617089033126831
name: Cosine F1 Threshold
- type: cosine_precision
value: 0.5418326693227091
name: Cosine Precision
- type: cosine_recall
value: 0.7861271676300579
name: Cosine Recall
- type: cosine_ap
value: 0.6217331823695006
name: Cosine Ap
- type: dot_accuracy
value: 0.732421875
name: Dot Accuracy
- type: dot_accuracy_threshold
value: 706.68359375
name: Dot Accuracy Threshold
- type: dot_f1
value: 0.64
name: Dot F1
- type: dot_f1_threshold
value: 562.21875
name: Dot F1 Threshold
- type: dot_precision
value: 0.51985559566787
name: Dot Precision
- type: dot_recall
value: 0.8323699421965318
name: Dot Recall
- type: dot_ap
value: 0.6280684228983158
name: Dot Ap
- type: manhattan_accuracy
value: 0.73046875
name: Manhattan Accuracy
- type: manhattan_accuracy_threshold
value: 643.9918212890625
name: Manhattan Accuracy Threshold
- type: manhattan_f1
value: 0.6385809312638581
name: Manhattan F1
- type: manhattan_f1_threshold
value: 860.728515625
name: Manhattan F1 Threshold
- type: manhattan_precision
value: 0.5179856115107914
name: Manhattan Precision
- type: manhattan_recall
value: 0.8323699421965318
name: Manhattan Recall
- type: manhattan_ap
value: 0.6164349914982901
name: Manhattan Ap
- type: euclidean_accuracy
value: 0.73046875
name: Euclidean Accuracy
- type: euclidean_accuracy_threshold
value: 20.697372436523438
name: Euclidean Accuracy Threshold
- type: euclidean_f1
value: 0.6445497630331753
name: Euclidean F1
- type: euclidean_f1_threshold
value: 26.88058853149414
name: Euclidean F1 Threshold
- type: euclidean_precision
value: 0.5461847389558233
name: Euclidean Precision
- type: euclidean_recall
value: 0.7861271676300579
name: Euclidean Recall
- type: euclidean_ap
value: 0.6174127578384853
name: Euclidean Ap
- type: max_accuracy
value: 0.732421875
name: Max Accuracy
- type: max_accuracy_threshold
value: 706.68359375
name: Max Accuracy Threshold
- type: max_f1
value: 0.6445497630331753
name: Max F1
- type: max_f1_threshold
value: 860.728515625
name: Max F1 Threshold
- type: max_precision
value: 0.5461847389558233
name: Max Precision
- type: max_recall
value: 0.8323699421965318
name: Max Recall
- type: max_ap
value: 0.6280684228983158
name: Max Ap
- task:
type: binary-classification
name: Binary Classification
dataset:
name: Qnli dev
type: Qnli-dev
metrics:
- type: cosine_accuracy
value: 0.716796875
name: Cosine Accuracy
- type: cosine_accuracy_threshold
value: 0.6310172080993652
name: Cosine Accuracy Threshold
- type: cosine_f1
value: 0.7122302158273383
name: Cosine F1
- type: cosine_f1_threshold
value: 0.5600648522377014
name: Cosine F1 Threshold
- type: cosine_precision
value: 0.61875
name: Cosine Precision
- type: cosine_recall
value: 0.8389830508474576
name: Cosine Recall
- type: cosine_ap
value: 0.7570462713424839
name: Cosine Ap
- type: dot_accuracy
value: 0.697265625
name: Dot Accuracy
- type: dot_accuracy_threshold
value: 540.0211181640625
name: Dot Accuracy Threshold
- type: dot_f1
value: 0.6991596638655463
name: Dot F1
- type: dot_f1_threshold
value: 466.7581787109375
name: Dot F1 Threshold
- type: dot_precision
value: 0.5793871866295265
name: Dot Precision
- type: dot_recall
value: 0.8813559322033898
name: Dot Recall
- type: dot_ap
value: 0.725844374113976
name: Dot Ap
- type: manhattan_accuracy
value: 0.7265625
name: Manhattan Accuracy
- type: manhattan_accuracy_threshold
value: 778.567626953125
name: Manhattan Accuracy Threshold
- type: manhattan_f1
value: 0.7195571955719556
name: Manhattan F1
- type: manhattan_f1_threshold
value: 868.3278198242188
name: Manhattan F1 Threshold
- type: manhattan_precision
value: 0.6372549019607843
name: Manhattan Precision
- type: manhattan_recall
value: 0.826271186440678
name: Manhattan Recall
- type: manhattan_ap
value: 0.7646001426631109
name: Manhattan Ap
- type: euclidean_accuracy
value: 0.7265625
name: Euclidean Accuracy
- type: euclidean_accuracy_threshold
value: 25.061500549316406
name: Euclidean Accuracy Threshold
- type: euclidean_f1
value: 0.7172675521821632
name: Euclidean F1
- type: euclidean_f1_threshold
value: 27.673938751220703
name: Euclidean F1 Threshold
- type: euclidean_precision
value: 0.6494845360824743
name: Euclidean Precision
- type: euclidean_recall
value: 0.8008474576271186
name: Euclidean Recall
- type: euclidean_ap
value: 0.7629251882134518
name: Euclidean Ap
- type: max_accuracy
value: 0.7265625
name: Max Accuracy
- type: max_accuracy_threshold
value: 778.567626953125
name: Max Accuracy Threshold
- type: max_f1
value: 0.7195571955719556
name: Max F1
- type: max_f1_threshold
value: 868.3278198242188
name: Max F1 Threshold
- type: max_precision
value: 0.6494845360824743
name: Max Precision
- type: max_recall
value: 0.8813559322033898
name: Max Recall
- type: max_ap
value: 0.7646001426631109
name: Max Ap
---
# SentenceTransformer based on microsoft/deberta-v2-xlarge
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/deberta-v2-xlarge](https://huggingface.co/microsoft/deberta-v2-xlarge) on the negation-triplets, [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc), [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail), [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail), xsum-pairs, [sciq_pairs](https://huggingface.co/datasets/allenai/sciq), [qasc_pairs](https://huggingface.co/datasets/allenai/qasc), openbookqa_pairs, [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3), [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions), [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa), [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq), [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws) and global_dataset datasets. It maps sentences & paragraphs to a 1536-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [microsoft/deberta-v2-xlarge](https://huggingface.co/microsoft/deberta-v2-xlarge) <!-- at revision 1d134961d4db8e7e8eb1bc1ab81cb370244c57f7 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1536 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
- negation-triplets
- [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc)
- [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail)
- [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail)
- xsum-pairs
- [sciq_pairs](https://huggingface.co/datasets/allenai/sciq)
- [qasc_pairs](https://huggingface.co/datasets/allenai/qasc)
- openbookqa_pairs
- [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3)
- [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions)
- [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa)
- [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq)
- [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws)
- global_dataset
- **Language:** en
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model
(1): Pooling({'word_embedding_dimension': 1536, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("BXresearch/DeBERTa3-0.4B-ST-v2")
# Run inference
sentences = [
'"""Music of the Night"" and ""All I Ask of You"" are songs from which stage musical?"',
'All I Ask of You - YouTube All I Ask of You Want to watch this again later? Sign in to add this video to a playlist. Need to report the video? Sign in to report inappropriate content. Rating is available when the video has been rented. This feature is not available right now. Please try again later. Published on Jan 6, 2012 Twenty-second in a series of clips from the 2004 film version of The Phantom of the Opera. Performed by Patrick Wilson as Raoul and Emmy Rossum as Christine. For more Phantom, find us on Facebook: http://www.facebook.com/ThePhantomOfT... Or follow us on Twitter: http://www.twitter.com/TheOperaGhosts Or add us to your Google+ Circles: https://plus.google.com/1019282273838... Category',
'Beatles US Labels - Variations \xa0 THE CAPITOL � APPLE LABEL DISCOGRAPHY\xa0 In early 1964, Capitol Records became the primary U.S. manufacturer of The Beatles major releases.\xa0 When Apple Records was formed by the Beatles in 1968, Capitol still maintained production and distribution control over the Beatles records.\xa0 When Apple was dissolved in 1975, the entire Beatles catalog, including solo releases, reverted back to Capitol. In the early 1990�s, all The Beatles/solo Capitol product was reissued on the Apple label.\xa0 The Beatles, collectively and individually, have also appeared on several other labels over the years.\xa0 However, most of these releases failed to stay in production long enough to experience significant label design changes.\xa0 The relatively few that did, are thoroughly identified in their respective sections.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1536]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Semantic Similarity
* Dataset: `sts-test`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.9198 |
| **spearman_cosine** | **0.9284** |
| pearson_manhattan | 0.9339 |
| spearman_manhattan | 0.9293 |
| pearson_euclidean | 0.9345 |
| spearman_euclidean | 0.9298 |
| pearson_dot | 0.9045 |
| spearman_dot | 0.9029 |
| pearson_max | 0.9345 |
| spearman_max | 0.9298 |
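These numbers come from the evaluator linked above. A minimal sketch of how to rerun it, assuming the `sts-test` split corresponds to the standard STS benchmark test set (`sentence-transformers/stsb`), which this card does not state explicitly:
```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("BXresearch/DeBERTa2-0.9B-SentenceTransformer-v2")

# Assumption: sts-test = STS benchmark test split with scores normalized to [0, 1].
stsb = load_dataset("sentence-transformers/stsb", split="test")

evaluator = EmbeddingSimilarityEvaluator(
    sentences1=stsb["sentence1"],
    sentences2=stsb["sentence2"],
    scores=stsb["score"],
    name="sts-test",
)
print(evaluator(model))  # Pearson/Spearman per similarity function (a dict in sentence-transformers v3+)
```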
#### Binary Classification
* Dataset: `allNLI-dev`
* Evaluated with [<code>BinaryClassificationEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator)
| Metric | Value |
|:-----------------------------|:-----------|
| cosine_accuracy | 0.7285 |
| cosine_accuracy_threshold | 0.7859 |
| cosine_f1 | 0.6415 |
| cosine_f1_threshold | 0.6171 |
| cosine_precision | 0.5418 |
| cosine_recall | 0.7861 |
| cosine_ap | 0.6217 |
| dot_accuracy | 0.7324 |
| dot_accuracy_threshold | 706.6836 |
| dot_f1 | 0.64 |
| dot_f1_threshold | 562.2188 |
| dot_precision | 0.5199 |
| dot_recall | 0.8324 |
| dot_ap | 0.6281 |
| manhattan_accuracy | 0.7305 |
| manhattan_accuracy_threshold | 643.9918 |
| manhattan_f1 | 0.6386 |
| manhattan_f1_threshold | 860.7285 |
| manhattan_precision | 0.518 |
| manhattan_recall | 0.8324 |
| manhattan_ap | 0.6164 |
| euclidean_accuracy | 0.7305 |
| euclidean_accuracy_threshold | 20.6974 |
| euclidean_f1 | 0.6445 |
| euclidean_f1_threshold | 26.8806 |
| euclidean_precision | 0.5462 |
| euclidean_recall | 0.7861 |
| euclidean_ap | 0.6174 |
| max_accuracy | 0.7324 |
| max_accuracy_threshold | 706.6836 |
| max_f1 | 0.6445 |
| max_f1_threshold | 860.7285 |
| max_precision | 0.5462 |
| max_recall | 0.8324 |
| **max_ap** | **0.6281** |
#### Binary Classification
* Dataset: `Qnli-dev`
* Evaluated with [<code>BinaryClassificationEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator)
| Metric | Value |
|:-----------------------------|:-----------|
| cosine_accuracy | 0.7168 |
| cosine_accuracy_threshold | 0.631 |
| cosine_f1 | 0.7122 |
| cosine_f1_threshold | 0.5601 |
| cosine_precision | 0.6188 |
| cosine_recall | 0.839 |
| cosine_ap | 0.757 |
| dot_accuracy | 0.6973 |
| dot_accuracy_threshold | 540.0211 |
| dot_f1 | 0.6992 |
| dot_f1_threshold | 466.7582 |
| dot_precision | 0.5794 |
| dot_recall | 0.8814 |
| dot_ap | 0.7258 |
| manhattan_accuracy | 0.7266 |
| manhattan_accuracy_threshold | 778.5676 |
| manhattan_f1 | 0.7196 |
| manhattan_f1_threshold | 868.3278 |
| manhattan_precision | 0.6373 |
| manhattan_recall | 0.8263 |
| manhattan_ap | 0.7646 |
| euclidean_accuracy | 0.7266 |
| euclidean_accuracy_threshold | 25.0615 |
| euclidean_f1 | 0.7173 |
| euclidean_f1_threshold | 27.6739 |
| euclidean_precision | 0.6495 |
| euclidean_recall | 0.8008 |
| euclidean_ap | 0.7629 |
| max_accuracy | 0.7266 |
| max_accuracy_threshold | 778.5676 |
| max_f1 | 0.7196 |
| max_f1_threshold | 868.3278 |
| max_precision | 0.6495 |
| max_recall | 0.8814 |
| **max_ap** | **0.7646** |
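Both pair-classification tables were produced with the evaluator linked above. A minimal sketch with placeholder pairs, since the exact allNLI-dev and Qnli-dev subsets are not distributed with this card:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import BinaryClassificationEvaluator

model = SentenceTransformer("BXresearch/DeBERTa2-0.9B-SentenceTransformer-v2")

# Placeholder pairs; 1 = matching pair, 0 = non-matching pair.
sentences1 = ["A man is playing a guitar.", "The planets orbit around the sun."]
sentences2 = ["Someone is playing an instrument.", "The sun orbits around the planets."]
labels = [1, 0]

evaluator = BinaryClassificationEvaluator(sentences1, sentences2, labels, name="Qnli-dev")
print(evaluator(model))  # accuracy/F1/precision/recall/AP per similarity function, as in the tables above
```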
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Datasets
#### negation-triplets
* Dataset: negation-triplets
* Size: 4,987 training samples
* Columns: <code>anchor</code>, <code>entailment</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | entailment | negative |
|:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 21.87 tokens</li><li>max: 144 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 13.72 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 13.94 tokens</li><li>max: 39 tokens</li></ul> |
* Samples:
| anchor | entailment | negative |
|:--------------------------------------------------------------|:---------------------------------------------------------------------------|:----------------------------------------------------------------------------|
| <code>a very dirty toilet in a tiled bathroom</code> | <code>A dirty toilet in a dirty bathroom with a octagon tile floor.</code> | <code>A clean toilet in a dirty bathroom with an octagon tile floor.</code> |
| <code>enjoy the wildlife</code> | <code>Enjoy the animals.</code> | <code>Ignore the animals.</code> |
| <code>A man looking inside of birdcages on a sidewalk.</code> | <code>A man, holding his hat, is looking into the bird cage. </code> | <code>A man, holding his hat, is looking away from the bird cage. </code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
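The same loss block is repeated for each training dataset below. As a rough sketch of how such a loss is typically built with sentence-transformers (the exact guide checkpoint is not named in this card, so the one below is a placeholder with a matching 768-dim, CLS-pooled, normalized architecture):
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import CachedGISTEmbedLoss

# Model being trained, per this card's base model.
model = SentenceTransformer("microsoft/deberta-v2-xlarge")

# Guide model used by GIST to mask in-batch false negatives.
# Placeholder checkpoint: any normalized 768-dim BERT encoder with CLS pooling fits the printed guide.
guide = SentenceTransformer("BAAI/bge-base-en-v1.5")

loss = CachedGISTEmbedLoss(model, guide=guide, temperature=0.025)
```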
#### vitaminc-pairs
* Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0)
* Size: 4,987 training samples
* Columns: <code>claim</code> and <code>evidence</code>
* Approximate statistics based on the first 1000 samples:
| | claim | evidence |
|:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 16.62 tokens</li><li>max: 54 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 36.79 tokens</li><li>max: 133 tokens</li></ul> |
* Samples:
| claim | evidence |
|:-------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Mase was born before 1976 .</code> | <code>Mase was born Mason Drell Betha in Jacksonville , Florida , on August 27 , 1975 , as a fraternal twin born almost two months premature .</code> |
| <code>On Rotten Tomatoes , Going in Style received more than 105 reviews and a rating of under 47 % .</code> | <code>On Rotten Tomatoes , the film has an approval rating of 46 % based on 106 reviews , with an average rating of 5.3/10 .</code> |
| <code>Aaron Charles Donald is an American football defensive end for Los Angeles Rams .</code> | <code>Aaron Charles Donald ( born May 23 , 1991 ) is an American football defensive end for the Los Angeles Rams of the National Football League ( NFL ) .</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### scitail-pairs-qa
* Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 4,987 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 15.84 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 14.96 tokens</li><li>max: 41 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:--------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------|
| <code>A negative enthalpy change is observed in an exothermic reaction.</code> | <code>What enthalpy change is observed in an exothermic reaction?</code> |
| <code>Fungus-like protists such as slime molds reproduce with spores.</code> | <code>How do fungus-like protists such as slime molds reproduce?</code> |
| <code>Unlike energy, matter doesn’t need to be constantly added to ecosystems because it is recycled through ecosystems.</code> | <code>Unlike energy, what doesn’t need to be constantly added to ecosystems because it is recycled through ecosystems?</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### scitail-pairs-pos
* Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 4,987 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 23.21 tokens</li><li>max: 64 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.66 tokens</li><li>max: 39 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------|
| <code>During equinox times the Sun's vertical ray is shining directly on the Earth's equator and neither hemisphere is tilted toward or away from the Sun.</code> | <code>The sun is directly over the equator during.</code> |
| <code>All the baby s major organs begin to develop in the first 6 to 8 weeks of pregnancy, so tight control from the moment of conception is critical.</code> | <code>By 8 weeks, all major organs start developing.</code> |
| <code>Nobody disputes that all modern humans belong to one species, Homo sapiens .</code> | <code>Humans belong to the species homo sapiens.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### xsum-pairs
* Dataset: xsum-pairs
* Size: 4,987 training samples
* Columns: <code>summary</code> and <code>document</code>
* Approximate statistics based on the first 1000 samples:
| | summary | document |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 25.19 tokens</li><li>max: 44 tokens</li></ul> | <ul><li>min: 51 tokens</li><li>mean: 217.32 tokens</li><li>max: 422 tokens</li></ul> |
* Samples:
| summary | document |
|:------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Chester's Elliott Durrell salvaged a 1-1 draw at Boreham Wood with a late wonder strike.</code> | <code>Morgan Ferrier's fifth-minute effort looked like it would be enough for the hosts after the 24-year-old evaded Blaine Hudson to get in behind and tap the ball home.<br>After that the game became scrappy, with neither side creating many clear-cut chances, although Chester's Ryan Astles was forced into a goal-line clearance after 70 minutes to block a Matthew Paine drive.<br>Just when it looked like Chester boss Jon McCarthy was staring at a fourth defeat in his first five games, Durrell picked the ball up around halfway, spotted Grant Smith off his line and struck from distance to level things up with seven minutes remaining.<br>Report supplied by the Press Association<br>Match ends, Boreham Wood 1, Chester FC 1.<br>Second Half ends, Boreham Wood 1, Chester FC 1.<br>Substitution, Chester FC. Sam Hughes replaces Kane Richards.<br>Goal! Boreham Wood 1, Chester FC 1. Elliott Durrell (Chester FC).<br>Substitution, Boreham Wood. Jordan Chiedozie replaces Morgan Ferrier.<br>Substitution, Boreham Wood. Aaron Kuhl replaces Kenny Davis.<br>Substitution, Chester FC. Elliott Durrell replaces Jordan Chapell.<br>Evan Horwood (Chester FC) is shown the yellow card for a bad foul.<br>Second Half begins Boreham Wood 1, Chester FC 0.<br>First Half ends, Boreham Wood 1, Chester FC 0.<br>Goal! Boreham Wood 1, Chester FC 0. Morgan Ferrier (Boreham Wood).<br>First Half begins.<br>Lineups are announced and players are warming up.</code> |
| <code>A major trauma centre in Stoke-on-Trent has been rated the best in the country for saving the lives of patients.</code> | <code>The University Hospitals of North Midlands Major Trauma Centre has the best total rolling survival rates of any adult major trauma single site centre since 2013.<br>Latest statistics show that for every 1,000 people treated in the last four years, 13 more survived than expected.<br>The centre is based at Royal Stoke University Hospital.<br>Medical director Dr John Oxtoby, described it as a "huge accomplishment".<br>See more stories from across Stoke and Staffordshire here<br>The figures come from the Trauma Audit and Research Network, an independent monitor of trauma care in England and Wales.<br>The data shows the centre also had the best survival rates for adult major trauma in 2015-16, when there were 15 extra survivors per 1,000 patients than expected.<br>"To have the best survival rates over four years of any major trauma centre is a phenomenal achievement," Dr Oxtoby added.<br>The centre treats patients from as far away as north Wales and the Peak District.<br>Those treated include people seriously injured in incidents such as vehicle crashes, falls, or assaults.</code> |
| <code>Sunderland defender Paddy McNair will miss the rest of the season because of a cruciate knee ligament injury.</code> | <code>The 21-year-old, signed from Manchester United in August, has made 12 appearances for a Black Cats side that are 19th in the Premier League.<br>He was injured during his side's 3-0 win over Hull City on Saturday.<br>"We won't see him again this season and all we can hope is getting him right for the start of next season," said Sunderland boss David Moyes.<br>"I think he'd just started to find his way in the Premier League - even though he had experience at Manchester United - and the games he was having were bringing him on and giving him confidence."<br>The injury means McNair is unlikely to feature in Northern Ireland's World Cup 2018 qualifier against Norway in March.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### sciq_pairs
* Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815)
* Size: 4,987 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 16.93 tokens</li><li>max: 76 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 86.84 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:----------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>The procedure known as angioplasty is used when what part of the circulatory system is clogged?</code> | <code>When a blood vessel gets clogged, there is no medical equivalent of "Drano" that will clear it out. There is, however, a procedure known as angioplasty. A thin tube with a balloon is threaded through the blood vessels. Once in place, the balloon is inflated to compress the clog against the artery wall.</code> |
| <code>Mollusks such as squid and octopi, which must hunt to survive, possess what complex organs containing millions of neurons?</code> | <code>Figure 35.2 Nervous systems vary in structure and complexity. In (a) cnidarians, nerve cells form a decentralized nerve net. In (b) echinoderms, nerve cells are bundled into fibers called nerves. In animals exhibiting bilateral symmetry such as (c) planarians, neurons cluster into an anterior brain that processes information. In addition to a brain, (d) arthropods have clusters of nerve cell bodies, called peripheral ganglia, located along the ventral nerve cord. Mollusks such as squid and (e) octopi, which must hunt to survive, have complex brains containing millions of neurons. In (f) vertebrates, the brain and spinal cord comprise the central nervous system, while neurons extending into the rest of the body comprise the peripheral nervous system. (credit e: modification of work by Michael Vecchione, Clyde F. Roper, and Michael J. Sweeney, NOAA; credit f: modification of work by NIH).</code> |
| <code>Combining nonpolar olive oil and polar vinegar yields what type of mixture?</code> | <code>Another familiar example is the mixing of vinegar and olive oil. Olive oil is a nonpolar substance, while vinegar (which is mostly water and acetic acid) is polar. The result is a heterogeneous mixture that exhibits a bilayer.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### qasc_pairs
* Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070)
* Size: 4,987 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 11.33 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 33.52 tokens</li><li>max: 67 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:--------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>What happens before the rigid structure that surrounds the plant cell dilates?</code> | <code>Dilation occurs when cell walls relax.. Cell wall is the rigid structure that surrounds the plant cell. <br> Dilation occurs when the rigid structure that surrounds the plant cell relaxes.</code> |
| <code>Reusing plastic bags has a positive impact on what?</code> | <code>recycling has a positive impact on the environment. Plastic bags are recyclable and they are reusable. <br> Reusing plastic bags has a positive impact on the environment</code> |
| <code>What protects the body from harmful substances?</code> | <code>skin is used for protecting the body from harmful substances. Skin is a protective organ. <br> organs protect the body from harmful substances</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### openbookqa_pairs
* Dataset: openbookqa_pairs
* Size: 3,007 training samples
* Columns: <code>question</code> and <code>fact</code>
* Approximate statistics based on the first 1000 samples:
| | question | fact |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 3 tokens</li><li>mean: 13.75 tokens</li><li>max: 78 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 11.34 tokens</li><li>max: 31 tokens</li></ul> |
* Samples:
| question | fact |
|:----------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------|
| <code>Heat exposure at higher temperatures without ability to regulate internal body temperatures will result in the expiration of which of these?</code> | <code>if an organism becomes too hot then that organism may die</code> |
| <code>Which of the following would be part of the water cycle?</code> | <code>evaporation is a stage in the water cycle process</code> |
| <code>polar bears are white due to an inherited</code> | <code>the color of fur is an inherited characteristic</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### msmarco_pairs
* Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9)
* Size: 4,987 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 8.68 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 75.36 tokens</li><li>max: 238 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:--------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>how many days is it in a month</code> | <code>Each month has either 28, 30, or 31 days during a common year, which has 365 days. During leap years, which occur nearly every 4 years, we add an extra (intercalary) day, Leap Day, on 29 February, making leap years 366 days long.</code> |
| <code>who were the peloponnesian wars between? who won them?</code> | <code>The Greek city-state of Sparta won the war against Athens. The war, known as the Peloponnesian War, raged for 27 years between the Athenian realm and the Peloponnesian coalition commanded by the Spartans. The Peloponnesian War began in 431 B.C.C. and ended in 404 B.C.E. when Athens conceded defeat to Sparta.</code> |
| <code>average nurse practitioner salary ny</code> | <code>Nurse Practitioner New York, NY Salary. Nurse Practitioner New York, NY average salary is $91,897, median salary is $79,060 with a salary range from $20,530 to $1,926,393. Nurse Practitioner New York, NY salaries are collected from government agencies and companies. Each salary is associated with a real job position.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### nq_pairs
* Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 4,987 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 10 tokens</li><li>mean: 11.79 tokens</li><li>max: 23 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 131.1 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:-----------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>parks and rec episode where ron gets a hernia</code> | <code>The Stakeout (Parks and Recreation) Back at city hall, Ron (Nick Offerman) remains immobile in his chair all day due to a hernia which causes excruciating pain if he moves. Ron remains immobile in his seat well into the night, until the janitors turn the lights off on him. The intern, April (Aubrey Plaza), returns to check on him, and wheels Ron out to the car on his office chair to bring him to the hospital.</code> |
| <code>when did jane beale come back to eastenders</code> | <code>Jane Beale Jane Beale (also Collins and Clarke) is a fictional character from the BBC soap opera, EastEnders, played by Laurie Brett. She made her first appearance on 21 June 2004. Brett took maternity leave in 2011[1] and departed on 19 May.[2] She returned on 8 November[3] and departed again on 27 January 2012.[4][5] Jane made temporary a return to the show on 6 January 2014 until 20 May and permanently from 24 November 2014.[6][7] Her major storylines have included her relationship and later marriages to Ian Beale (Adam Woodyatt); the first ended due to his affair with Glenda Mitchell (Glynis Barber), an affair with Grant Mitchell (Ross Kemp), accidentally shot by Ian's stepson Steven Beale (Aaron Sidwell) which results her desire to have a child of her own following a hysterectomy, a relationship with Masood Ahmed (Nitin Ganatra), her popular friendship with Tanya Branning (Jo Joyner), covering up her adopted son Bobby's (Eliot Carrington) role in the murder of her stepdaughter Lucy Beale (Hetti Bywater), being paralysed after brutally attacked with a hockey stick by Bobby, caught up in the fire at Beale's restaurant, which was started by Steven before she was then left in the blaze by Max Branning (Jake Wood) and being forced by Max to leave Walford which led to a second departure on 23 October 2017.[8] She made a voiceover appearance on 8 December.</code> |
| <code>who sings what lovers do with maroon 5</code> | <code>What Lovers Do "What Lovers Do" is a song by American pop rock band Maroon 5 featuring American R&B singer SZA. It was released on August 30, 2017, as the lead single from the band's sixth studio album Red Pill Blues (2017).[4] The song contains an interpolation of the 2016 song "Sexual" by Neiked featuring Dyo, therefore Victor Rådström, Dyo and Elina Stridh are credited as songwriters.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### trivia_pairs
* Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0)
* Size: 4,987 training samples
* Columns: <code>query</code> and <code>answer</code>
* Approximate statistics based on the first 1000 samples:
| | query | answer |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 17.26 tokens</li><li>max: 64 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 205.76 tokens</li><li>max: 402 tokens</li></ul> |
* Samples:
| query | answer |
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>What word is the most common synonym of the word “corsair”?</code> | <code>Corsair Synonyms, Corsair Antonyms | Thesaurus.com Cite This Source Word Origin & History corsair 1549, from Fr. corsaire, from Prov. cursar, It. corsaro, from M.L. cursarius "pirate," from L. cursus "course, a running," from currere "to run" (see current). Meaning evolved in M.L. from "course" to "journey" to "expedition" to an expedition specifically for plunder. Example Sentences for corsair I have been a mercenary soldier, a corsair, a kozak, and a hundred other things. The words were out and the thing was done before Asad had realized the corsair's intent. He left a corsair's name to other times, Linked with one virtue, and a thousand crimes. If it were a Corsair, the rowers would all be Christian prisoners. After the failure of her attempt to board us, the corsair hauled aft her sheets and shot ahead of the Good Hope. I have read the Corsair, mended my petticoat, and have nothing else to do. He had also a mania for travelling, and when he was only two-and-twenty was captured by an Algerian corsair and enslaved. You are wrong there,” said the corsair, “for we would have attacked you all the same. To Moore he dedicated his "Corsair," and to read the preface is to see how sincerely attached Byron was to his friend. The corsair was standing by the side of Mr Tompkins, close by the taffrail.</code> |
| <code>"What scale initially related wind conditions to their effects on the sails of a man of war, from ""sufficient for steerage"" to ""which canvas sails couldn't withstand."""</code> | <code>Oil droplets transport due to irregular waves: Development of large-scale spreading coefficients "The sea state can be defined in a variety of ways. For example, one may use the Beaufort scale (Singleton, 2008) that describes the sea state based on qualitative terms. It is an empirical yet expedient scale. " [Show abstract] [Hide abstract] ABSTRACT: The movement of oil droplets due to waves and buoyancy was investigated by assuming an irregular sea state following a JONSWAP spectrum and four buoyancy values. A technique known as Wheeler stretching was used to model the movement of particles under the moving water surface. In each simulation, 500 particles were released and were tracked for a real time of 4.0 h. A Monte Carlo approach was used to obtain ensemble properties. It was found that small eddy diffusivities that decrease rapidly with depth generated the largest horizontal spreading of the plume. It was also found that large eddy diffusivities that decrease slowly with depth generated the smallest horizontal spreading coefficient of the plume. The increase in buoyancy resulted in a decrease in the horizontal spreading coefficient, which suggests that two-dimensional (horizontal) models that predict the transport of surface oil could be overestimating the spreading of oil. Full-text · Article · Jan 2016</code> |
| <code>What is the name given to the collection of Welsh legends?</code> | <code>1000+ images about Welsh - Legends & Folklore on Pinterest | Alan lee, Welsh and King arthur When looking for some illustrations on the Mabinogion I found this amazing looking book- The Golden Cockerel Sir Gawain so now I'm on the hunt for the artist. See More</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### gooaq_pairs
* Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
* Size: 4,987 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 11.52 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 56.25 tokens</li><li>max: 143 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:----------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>how long does it take to get used to getting up earlier?</code> | <code>The most efficient method for changing the time you wake up is to do it gradually — 10–15 minutes earlier for 1–3 days until you feel used to it, and then lower it down. If you get up at 8 a.m. generally, don't suddenly change it to 6 a.m. Try 7:45 a.m. first.</code> |
| <code>what are the differences among standards goals and objectives?</code> | <code>Standard: Written expectations of what students are expected to know at a specific stage of education. These are based off of learning objectives. Goal: Unique to an individual student. May stem from a standard, but also relates to a student's academic habits and organization.</code> |
| <code>how do u work out the surface area of a triangular prism?</code> | <code>A triangular prism has three rectangular sides and two triangular faces. To find the area of the rectangular sides, use the formula A = lw, where A = area, l = length, and h = height. To find the area of the triangular faces, use the formula A = 1/2bh, where A = area, b = base, and h = height.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### paws-pos
* Dataset: [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws) at [161ece9](https://huggingface.co/datasets/google-research-datasets/paws/tree/161ece9501cf0a11f3e48bd356eaa82de46d6a09)
* Size: 4,987 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 25.62 tokens</li><li>max: 52 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 25.61 tokens</li><li>max: 52 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:----------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------|
| <code>Primidone also causes exfoliative dermatitis , Johnson -- Stevens syndrome , and toxic epidermal necrolysis .</code> | <code>Primidone also causes exfoliative dermatitis , Johnson - Stevens -- Syndrome and toxic epidermal necrolysis .</code> |
| <code>The agency was founded in 1976 in Chicago , and it entered the New York market in 1998 and Milwaukee in 2009 .</code> | <code>The agency was founded in Chicago in 1976 and entered New York in 1998 and Milwaukee in 2009 .</code> |
| <code>After his death , the widow of Kellow Mary Hope Kellow with her daughter Mary moved to Sydney .</code> | <code>After his death , Kellow 's widow Mary Hope Kellow moved to Sydney with her daughter Mary .</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### global_dataset
* Dataset: global_dataset
* Size: 36,619 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 28.68 tokens</li><li>max: 368 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 55.88 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:--------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>The Turks , Tibetans , Muslim Arabs , and Tang competed for control of Central Asia until the tang 's collapse in the 10th century .</code> | <code>The Turks , Tang , Muslim Arabs and the Tibetans competed for control over Central Asia until the collapse of the Tang in the 10th century .</code> |
| <code>What do animals use to reproduce?</code> | <code>an animal needs to attract a mate to reproduce. Animals mate because of smells. <br> animals attract with smells</code> |
| <code>Some touch receptors sense a difference in pain or what?</code> | <code>Some touch receptors sense differences in temperature or pain.. Heat and temperature are the same. <br> Some touch receptors sense differences in heat or pain.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
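Every training split above uses the same `CachedGISTEmbedLoss`, so a multi-dataset run of this kind can be expressed with the sentence-transformers v3 trainer by mapping dataset names to datasets and to losses. The sketch below is illustrative only, with tiny in-memory stand-ins for the much larger splits listed above and placeholder model names:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import CachedGISTEmbedLoss

# Placeholder models, not the checkpoints used for this card.
model = SentenceTransformer("distilroberta-base")
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Tiny in-memory stand-ins for the pair datasets described above.
toy_pairs = Dataset.from_dict({
    "sentence1": ["What do plants need to grow?", "Water boils at 100 C at sea level."],
    "sentence2": ["Plants need light, water and nutrients.", "The boiling point of water is 100 C."],
})
train_datasets = {"toy-pairs-a": toy_pairs, "toy-pairs-b": toy_pairs}

# One shared loss instance, registered once per named dataset.
loss = CachedGISTEmbedLoss(model=model, guide=guide, temperature=0.025)
losses = {name: loss for name in train_datasets}

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset=train_datasets,
    loss=losses,
)
trainer.train()
```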
### Evaluation Datasets
#### vitaminc-pairs
* Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0)
* Size: 128 evaluation samples
* Columns: <code>claim</code> and <code>evidence</code>
* Approximate statistics based on the first 1000 samples:
| | claim | evidence |
|:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 9 tokens</li><li>mean: 19.71 tokens</li><li>max: 38 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 32.5 tokens</li><li>max: 78 tokens</li></ul> |
* Samples:
| claim | evidence |
|:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Dragon Con had over 5000 guests .</code> | <code>Among the more than 6000 guests and musical performers at the 2009 convention were such notables as Patrick Stewart , William Shatner , Leonard Nimoy , Terry Gilliam , Bruce Boxleitner , James Marsters , and Mary McDonnell .</code> |
| <code>COVID-19 has reached more than 185 countries .</code> | <code>As of , more than cases of COVID-19 have been reported in more than 190 countries and 200 territories , resulting in more than deaths .</code> |
| <code>In March , Italy had 3.6x times more cases of coronavirus than China .</code> | <code>As of 12 March , among nations with at least one million citizens , Italy has the world 's highest per capita rate of positive coronavirus cases at 206.1 cases per million people ( 3.6x times the rate of China ) and is the country with the second-highest number of positive cases as well as of deaths in the world , after China .</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### negation-triplets
* Dataset: negation-triplets
* Size: 128 evaluation samples
* Columns: <code>anchor</code>, <code>entailment</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | entailment | negative |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 14.54 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 12.12 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 12.43 tokens</li><li>max: 22 tokens</li></ul> |
* Samples:
| anchor | entailment | negative |
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------|:----------------------------------------------------------|
| <code>A Marine that is looking at his cell phone.</code> | <code>A marine in uniform using a smart phone.</code> | <code>Not a marine in uniform using a smart phone.</code> |
| <code>A snowboarder on a wide plain of snow</code> | <code>A snow field with a snowboarder on it</code> | <code>An empty field with no snowboarder on it</code> |
| <code>Three men, one holding pipes, another holding a large object above his head, and one resting against the pipe bed on the truck, are looking at the camera.</code> | <code>three men look at the camera</code> | <code>three men ignore the camera</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### scitail-pairs-pos
* Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 9 tokens</li><li>mean: 20.13 tokens</li><li>max: 53 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 15.48 tokens</li><li>max: 23 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|
| <code>humans normally have 23 pairs of chromosomes.</code> | <code>Humans typically have 23 pairs pairs of chromosomes.</code> |
| <code>A solution is a homogenous mixture of two or more substances that exist in a single phase.</code> | <code>Solution is the term for a homogeneous mixture of two or more substances.</code> |
| <code>Upwelling The physical process in near-shore ocean systems of rising of nutrients and colder bottom waters to the surface because of constant wind patterns along the shoreline.</code> | <code>Upwelling is the term for when deep ocean water rises to the surface.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### scitail-pairs-qa
* Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 7 tokens</li><li>mean: 16.6 tokens</li><li>max: 33 tokens</li></ul> | <ul><li>min: 8 tokens</li><li>mean: 15.67 tokens</li><li>max: 33 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------|
| <code>Magma comes toward earth's crust through mantle plumes.</code> | <code>What substance comes toward earth's crust through mantle plumes?</code> |
| <code>The understory of the rainforest commonly has ferns and other ground plants.</code> | <code>What part of the rainforest commonly has ferns and other ground plants?</code> |
| <code>Because trees add water vapor to air, cutting down forests leads to longer periods of drought.</code> | <code>Because trees add water vapor to air, cutting down forests leads to longer periods of what?</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### xsum-pairs
* Dataset: xsum-pairs
* Size: 128 evaluation samples
* Columns: <code>summary</code> and <code>document</code>
* Approximate statistics based on the first 1000 samples:
| | summary | document |
|:--------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 13 tokens</li><li>mean: 25.42 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 67 tokens</li><li>mean: 213.0 tokens</li><li>max: 354 tokens</li></ul> |
* Samples:
| summary | document |
|:-----------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>An exam paper for Scottish secondary pupils is being re-issued after a mistake was spotted.</code> | <code>The National 5 Modern Studies exam will take place on Friday afternoon.<br>The Scottish Qualification Authority (SQA) said the paper has been reprinted because of a typographical error.<br>The exams body said there was no suggestion of a security breach. A spokesman added that it had procedures in place to deal with situations like this.<br>The printing mistake in the original paper was in a diagram, not a question.<br>The SQA said it acted quickly after the error was spotted.<br>The reprinted paper - with the correct text in the diagram but otherwise identical - will be issued to exam centres across Scotland.<br>Last year the Higher English paper was replaced amid fears of a security breach.<br>The SQA also faced criticism over mistakes in the National 5 computing paper.<br>Earlier this month, the SQA announced that teachers would not have access to exam papers until the day after each test takes place.<br>It said the change was in order to improve security and confidentiality. The largest teachers' union, the EIS, has condemned the move.</code> |
| <code>US regulators have told seven carmakers the recall of airbags made by Japanese firm Takata is likely to expand.</code> | <code>The National Highway Traffic Safety Administration (NHTSA) has written to firms including Mercedes-Benz, Jaguar-Land Rover and Tesla to ask which of their models use the Takata parts.<br>About 23.4 million Takata airbag inflators have been recalled in the US.<br>The airbags have been linked to eight deaths and more than 100 injuries around the world.<br>It was found they can inflate with excessive force, spraying metal shrapnel at the drivers.<br>The driver and passenger airbags were in more than 19 million cars sold by 11 different companies such as Honda in the US.<br>In the letters sent last week, the NHTSA said the recall "will likely grow to include vehicles that are outside the scope of the current recalls".<br>The agency will attend a public meeting in Washington on 22 October to discuss the Takata investigation and whether it will take over management of the recalls to speed up the repairs.<br>Carmakers are struggling to get parts with only 4.4 million airbag inflators replaced since the start of this month.<br>The other automakers that received the letters include Suzuki, Volvo Trucks, Volkswagen and Spartan Motors.<br>So far Mercedes, Jaguar-Land Rover and Tesla have all said the air bags they used from Takata are not part of current recalls, according to the Associated Press.</code> |
| <code>A salt lake in Melbourne has turned pink due to a combination of sunlight, warm temperatures and low rainfall.</code> | <code>Wildlife officers said algae growing in the salt crust at the bottom of Westgate Park's lake produce a red pigment.<br>"Enjoy the views, but we recommend you don't come into contact with the water," Parks Victoria said.<br>The phenomenon also occurs in Spain's Salina de Torrevieja, Canada's Dusty Rose Lake and Senegal's Lake Retba.<br>In Australia, the natural occurring sight can be seen in Victoria's Murray-Sunset National Park and Western Australia's Lake Hillier.<br>You might also be interested in:<br>Dr Mark Norman, Parks Victoria chief conservation scientist, said the colouration was caused by a harmless, single-cell alga known as Dunalliela.<br>"It's completely natural," he said. "We often get comments that it looks like an industrial accident of pink paint."<br>Dr Norman said that even though the water is not dangerous, he would not recommend taking a swim.<br>"It's so salty and muddy on the bottom that you would come out looking like a frosted rum ball, especially when you dried," he said.<br>Parks Victoria said the lake is expected to return to blue when the weather cooled and the rainfall increased.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
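Every loss block in this card follows the same pattern: the model being trained is paired with a frozen guide encoder inside `CachedGISTEmbedLoss`. The sketch below shows that wiring; the guide checkpoint named here is an assumption, since the card only reports the guide's architecture (a 768-dimensional BERT encoder with CLS pooling) and the temperature.

```python
# Minimal sketch (not the exact training script) of the loss configuration above.
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import CachedGISTEmbedLoss

# Model under training; the hub id is the one reported later in this card.
model = SentenceTransformer("bobox/DeBERTa2-0.9B-ST-v2-checkpoints-tmp")

# Assumption: a stand-in guide. The card only shows a 768-dim BertModel with CLS pooling,
# so any compatible embedding model could have been used.
guide = SentenceTransformer("avsolatorio/GIST-Embedding-v0")

loss = CachedGISTEmbedLoss(
    model=model,
    guide=guide,
    temperature=0.025,  # value reported in the parameter dump above
)
```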
#### sciq_pairs
* Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 9 tokens</li><li>mean: 16.98 tokens</li><li>max: 58 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 86.18 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:-----------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>What structures receive blood from the atria and pump it out of the heart?</code> | <code>The bottom two chambers of the heart are called the left and right ventricles. The ventricles receive blood from the atria and pump it out of the heart, either to the lungs or to the rest of the body.</code> |
| <code>Amphibians have permeable skin which allows for the exchange of oxygen and carbon dioxide, what is this "breathing called?"</code> | <code>Characteristics of Amphibians As tetrapods, most amphibians are characterized by four well-developed limbs. Some species of salamanders and all caecilians are functionally limbless; their limbs are vestigial. An important characteristic of extant amphibians is a moist, permeable skin that is achieved via mucus glands that keep the skin moist; thus, exchange of oxygen and carbon dioxide with the environment can take place through it ( cutaneous respiration). Additional characteristics of amphibians include pedicellate teeth—teeth in which the root and crown are calcified, separated by a zone of noncalcified tissue—and a papilla amphibiorum and papilla basilaris, structures of the inner ear that are sensitive to frequencies below and above 10,00 hertz, respectively. Amphibians also have an auricular operculum, which is an extra bone in the ear that transmits sounds to the inner ear. All extant adult amphibians are carnivorous, and some terrestrial amphibians have a sticky tongue that is used to capture prey.</code> |
| <code>What form do alkali metals take at room temperature?</code> | <code>Alkali metals are all solids at room temperature.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
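The sciq_pairs samples above pair a question with its supporting passage. A rough sketch of how such pairs could be built from the linked dataset is shown below; the exact preprocessing behind this card is not published, so the column mapping is an assumption.

```python
# Illustrative only: deriving (question, support) pairs from allenai/sciq.
from datasets import load_dataset

sciq = load_dataset("allenai/sciq", split="validation")
sciq = sciq.filter(lambda row: row["support"].strip() != "")  # drop questions without a passage
pairs = sciq.map(
    lambda row: {"sentence1": row["question"], "sentence2": row["support"]},
    remove_columns=sciq.column_names,
)
print(pairs[0])
```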
#### qasc_pairs
* Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 5 tokens</li><li>mean: 11.25 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 34.65 tokens</li><li>max: 66 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:-----------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Warm body temperature requires what?</code> | <code>an animal usually requires a warm body temperature for survival. Most animals require water regularly. <br> a warm body temperature requires water</code> |
| <code>what rotates causing cycles of day and night?</code> | <code>a planet rotating causes cycles of day and night on that planet. Of all the planets, Mars is most like Earth. <br> Mars rotating causes cycles of day and night</code> |
| <code>What being tilted on its rotating axis causes spring, summer, autumn, and winter.</code> | <code>the Earth being tilted on its rotating axis causes seasons. Spring, summer, autumn, and winter are the seasons of the year. <br> Earth being tilted on its rotating axis causes spring, summer, autumn, and winter.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### openbookqa_pairs
* Dataset: openbookqa_pairs
* Size: 128 evaluation samples
* Columns: <code>question</code> and <code>fact</code>
* Approximate statistics based on the first 1000 samples:
| | question | fact |
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 3 tokens</li><li>mean: 13.96 tokens</li><li>max: 47 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 11.78 tokens</li><li>max: 28 tokens</li></ul> |
* Samples:
| question | fact |
|:-----------------------------------------------------------------------|:-----------------------------------------------------------------------------|
| <code>The thermal production of a stove is generically used for</code> | <code>a stove generates heat for cooking usually</code> |
| <code>What creates a valley?</code> | <code>a valley is formed by a river flowing</code> |
| <code>when it turns day and night on a planet, what cause this?</code> | <code>a planet rotating causes cycles of day and night on that planet</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### msmarco_pairs
* Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:--------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 4 tokens</li><li>mean: 9.1 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 74.29 tokens</li><li>max: 200 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:----------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>tcf routing number illinois</code> | <code>Routing Number 271972572. Tcf National Bank Illinois Routing Number. TCF NATIONAL BANK ILLINOIS ROUTING ABA NUMBER. 271972572 routing number is a 9-digit number designed and assigned to Tcf National Bank Illinois by The American Bankers Association (ABA) to identify the financial institution upon which a payment was drawn.</code> |
| <code>why was jamestown so important other then being the first permanent english settlement</code> | <code>Credit: National Park Service. View full size image. Jamestown, founded in 1607, was the first successful permanent English settlement in what would become the United States. It was located on Jamestown Island, in Virginia, about 30 miles (47 kilometers) up the James River from the Atlantic coast.amestown, founded in 1607, was the first successful permanent English settlement in what would become the United States.</code> |
| <code>when was the town of farragut tn incorporated</code> | <code>In January of 1980, residents decided to incorporate by an overwhelming margin. The Town of Farragut was incorporated on January 16, 1980, with the first board of Mayor and Alderman elected on April 1, 1980.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### nq_pairs
* Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 9 tokens</li><li>mean: 12.24 tokens</li><li>max: 25 tokens</li></ul> | <ul><li>min: 29 tokens</li><li>mean: 128.73 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:----------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>when did rodan and fields start direct sales</code> | <code>Rodan + Fields In 2002, Drs. Fields and Rodan launched Rodan + Fields. Products were sold in department stores. In 2003, Rodan + Fields was purchased by Estée Lauder.[1] In 2007, Drs. Fields and Rodan reacquired the brand[4] and transitioned the company from department stores to multi-level marketing, where consultants are paid a commission for their own sales and for the sales of people they recruit.[1]</code> |
| <code>what are the house names in harry potter</code> | <code>Hogwarts Hogwarts is divided into four houses, each bearing the last name of its founder: Godric Gryffindor, Salazar Slytherin, Rowena Ravenclaw and Helga Hufflepuff. Throughout the school year, the houses compete for the House Cup, gaining and losing points based on actions such as performance in class and rule violations. The house with the highest end-of-year total wins and has its colours displayed in the Great Hall for the following school year. Each house also has its own Quidditch team that competes for the Quidditch Cup. These two competitions breed rivalries between the houses. Houses at Hogwarts are living and learning communities for their students. Each house is under the authority of one of the Hogwarts staff members. The Heads of the houses, as they are called, are in charge of giving their students important information, dealing with matters of severe punishment, and responding to emergencies in their houses, among other things. Each year, year level groups of every separate house share the same dormitory and classes. The dormitory and common room of a House are, barring rare exceptions, inaccessible to students belonging to other Houses.</code> |
| <code>when was calibri font made available for use</code> | <code>Calibri Calibri (/kəˈliːbri/) is a sans-serif typeface family designed by Luc(as) de Groot in 2002–2004 and released to the general public in 2007, with Microsoft Office 2007 and Windows Vista.[2][3] In Office 2007, it replaced Times New Roman as the default typeface in Word[4] and replaced Arial as the default in PowerPoint, Excel, Outlook, and WordPad. Creator de Groot described its subtly rounded design as having "a warm and soft character".[3]</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### trivia_pairs
* Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0)
* Size: 128 evaluation samples
* Columns: <code>query</code> and <code>answer</code>
* Approximate statistics based on the first 1000 samples:
| | query | answer |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 16.59 tokens</li><li>max: 38 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 199.75 tokens</li><li>max: 373 tokens</li></ul> |
* Samples:
| query | answer |
|:-------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>Theodore Francis international airport is in which US state?</code> | <code>Theodore Francis Green State Airport, Providence RI Hotels Near the Airport T. F. Green Airport (sometimes called T. F. Green International Airport) (IATA: PVD, ICAO: KPVD, FAA LID: PVD) , also known as Theodore Francis Green State Airport and Providence International Airport is located in Warwick, six miles (10 km) south of Providence, in Kent County, Rhode Island, USA. Completely rebuilt in 1996, it was the first state-owned airport in the United States. Providence International Airport is a popular alternative to Boston, Massachusetts' often busy Logan International Airport, as delays and wait time are minimal. There are two terminals with two concourses, North and South. The South Concourse has eight gates, and the North Concourse has 14 gates. Gate 8 is designed for international arrivals for use by Air Canada and SATA International flights; it is directly connected to customs, which is on the lower level of the concourse. The terminal contains a number of stores and restaurants, and a central food court. Local Time: 17-Jan-2017 12:02 AM © Copyright 2017, Providence-Airport.com, not the official airport website</code> |
| <code>What is the English for ‘Duirt me leat go raibh me breoite’?</code> | <code>Duirt mé leat go raibh mé breoite – Susan Hated Literature That is the enscription on Spike Milligan’s headstone . For those of you without Irish it is a translation of what he wanted: “I told you I was ill.” Duirt mé (I told) pronounced durtch may (like the month) leat (you) pronounced lat go raibh mé (that I was) pronounced not like the english go, but the o is sorta like “uh”; raibh can have a variety of pronunciations, I’d usually say row (as in to fight, not to do anything in a boat), but you could say rev, or rav, and mé is may (again) breoite (ill) pronounced bro-tcha Here endeth the very bad Irish lesson, from which you are probably more confuddled than you were before? On a secondary issue, why is the Irish language constantly described as Gaelic? It’s Irish, or Gaeilge. The latter only if you are speaking as Gaeilge. I mean I don’t say that some German person was speaking in Deutsch unless I’m attempting to say it in German. This is a pet peeve of mine, feel free to ignore :)</code> |
| <code>Which group sung the 1997 Eurovision Song Contest winning Love Shine A Light?</code> | <code>Eurovision 1997 - Katrina & The Waves - Love shine a light - YouTube Eurovision 1997 - Katrina & The Waves - Love shine a light Want to watch this again later? Sign in to add this video to a playlist. Need to report the video? Sign in to report inappropriate content. Rating is available when the video has been rented. This feature is not available right now. Please try again later. Uploaded on Sep 14, 2008 Eurovision 1997 - Katrina & The Waves - Love shine a light Category</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### gooaq_pairs
* Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 11.01 tokens</li><li>max: 17 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 57.03 tokens</li><li>max: 96 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:----------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>can i use full synthetic oil in my snowblower?</code> | <code>So, while synthetic will do better in temperature extremes, (extremes not seen in a properly running snow blower) it will get dirty at the same rate as regular oil. Therefore we will need to replace it at the same intervals as regular oil, but at a greater cost.</code> |
| <code>what is the difference between primary and foreign key in sql?</code> | <code>Difference between Primary Key and Foreign Key. Primary key uniquely identify a record in the table. Foreign key is a field in the table that is primary key in another table. ... By default, Primary key is clustered index and data in the database table is physically organized in the sequence of clustered index.</code> |
| <code>how to change administrator windows 10?</code> | <code>['Under Settings > Accounts > Family & other users, select the account owner name, then select Change account type.', 'Under Account type, select Administrator and OK.', 'Sign in with the new administrator account.']</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
#### paws-pos
* Dataset: [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws) at [161ece9](https://huggingface.co/datasets/google-research-datasets/paws/tree/161ece9501cf0a11f3e48bd356eaa82de46d6a09)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 10 tokens</li><li>mean: 25.58 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 25.4 tokens</li><li>max: 41 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:---------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>They were there to enjoy us and they were there to pray for us .</code> | <code>They were there for us to enjoy and they were there for us to pray .</code> |
| <code>After the end of the war in June 1902 , Higgins left Southampton in the `` SSBavarian '' in August , returning to Cape Town the following month .</code> | <code>In August , after the end of the war in June 1902 , Higgins Southampton left the `` SSBavarian '' and returned to Cape Town the following month .</code> |
| <code>From the merger of the Four Rivers Council and the Audubon Council , the Shawnee Trails Council was born .</code> | <code>Shawnee Trails Council was formed from the merger of the Four Rivers Council and the Audubon Council .</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
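The name paws-pos suggests that only the positive (paraphrase, `label == 1`) pairs of PAWS are kept, which matches the near-paraphrase samples above. A hedged sketch of that filtering follows; the `labeled_final` config is an assumption, as the card links the dataset but not the config.

```python
# Sketch under assumptions: keep only the paraphrase pairs from PAWS.
from datasets import load_dataset

paws = load_dataset("google-research-datasets/paws", "labeled_final", split="validation")
paws_pos = paws.filter(lambda row: row["label"] == 1)  # positives only
print(paws_pos[0]["sentence1"])
print(paws_pos[0]["sentence2"])
```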
#### global_dataset
* Dataset: global_dataset
* Size: 325 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 |
|:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 5 tokens</li><li>mean: 31.88 tokens</li><li>max: 344 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 55.72 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| sentence1 | sentence2 |
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>when was the world cup</code> | <code>In the first World Cup final, held on July 30, 1930, 93,000 spectators looked on as Uruguay defeated Argentina 4-2 in a rematch of the 1928 Olympic gold medal game. Uruguay went on to win its second World Cup in 1950 with a 2-1 win over Brazil in Rio de Janeiro.</code> |

| <code>Highlands-based Neil Anderson set up the camera in a part of Strathspey in the Cairngorms.<br>He said: "I knew the cat used a path fairly regularly and seeing that snow was forecast I rigged up my camera trap.<br>"I went away for a couple of weeks so when I finally checked the camera it was a great surprise to come back to."<br>The wildcat was photographed on 13 January.<br>Mr Anderson, whose credits include commissions for the BBC's Springwatch wildlife programmes, also photographs wildlife in other parts of the world.<br>The Scottish wildcat is one of the world's most endangered animals.<br>Habitat loss and breeding with domestic and feral cats are factors behind a severe decline in the mammals.</code> | <code>A rare Scottish wildcat has been photographed on a camera trap rigged up by a wildlife cameraman.</code> |
| <code>can phi be disclosed for marketing purposes?</code> | <code>The Privacy Rule addresses the use and disclosure of protected health information for marketing purposes by: ... Requiring individual authorization for all uses or disclosures of protected health information for marketing purposes with limited exceptions.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
), 'temperature': 0.025}
```
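The per-dataset evaluation losses reported in the training logs further down are consistent with the multi-dataset training supported by `SentenceTransformerTrainer`, where named datasets are passed as a dict and a single loss is reused for each of them. The sketch below illustrates that wiring with stand-in checkpoints and toy data; it is not the script that produced this card.

```python
# Illustrative sketch of multi-dataset training with one shared loss.
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CachedGISTEmbedLoss

# Stand-in model and guide to keep the snippet lightweight; see the loss sketch
# earlier in the card for the checkpoints actually involved.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
loss = CachedGISTEmbedLoss(model, guide, temperature=0.025)

toy = Dataset.from_dict({
    "sentence1": ["What pumps blood out of the heart?"],
    "sentence2": ["The ventricles receive blood from the atria and pump it out of the heart."],
})

trainer = SentenceTransformerTrainer(
    model=model,
    args=SentenceTransformerTrainingArguments(output_dir="tmp-multi-dataset-run"),
    train_dataset={"sciq_pairs": toy, "global_dataset": toy},  # named datasets, as in the logs
    eval_dataset={"sciq_pairs": toy},                          # named eval sets give per-dataset losses
    loss=loss,                                                 # one loss object shared by every dataset
)
# trainer.train()  # left commented: a real run would use the full datasets described above
```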
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 128
- `gradient_accumulation_steps`: 3
- `learning_rate`: 1.5e-05
- `weight_decay`: 0.001
- `num_train_epochs`: 2
- `lr_scheduler_type`: cosine_with_min_lr
- `lr_scheduler_kwargs`: {'num_cycles': 0.5, 'min_lr': 1.5e-06}
- `warmup_ratio`: 0.25
- `save_safetensors`: False
- `fp16`: True
- `push_to_hub`: True
- `hub_model_id`: bobox/DeBERTa2-0.9B-ST-v2-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `batch_sampler`: no_duplicates
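
For reference, the non-default hyperparameters above map roughly onto a `SentenceTransformerTrainingArguments` object as sketched below (sentence-transformers v3+); the `output_dir` is a placeholder, the rest mirrors the values reported here.

```python
# Sketch: reproducing the reported non-default hyperparameters.
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output/DeBERTa2-0.9B-ST-v2",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=64,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=3,
    learning_rate=1.5e-5,
    weight_decay=0.001,
    num_train_epochs=2,
    lr_scheduler_type="cosine_with_min_lr",
    lr_scheduler_kwargs={"num_cycles": 0.5, "min_lr": 1.5e-6},
    warmup_ratio=0.25,
    save_safetensors=False,
    fp16=True,
    push_to_hub=True,
    hub_model_id="bobox/DeBERTa2-0.9B-ST-v2-checkpoints-tmp",
    hub_strategy="all_checkpoints",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```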
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 128
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 3
- `eval_accumulation_steps`: None
- `learning_rate`: 1.5e-05
- `weight_decay`: 0.001
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 2
- `max_steps`: -1
- `lr_scheduler_type`: cosine_with_min_lr
- `lr_scheduler_kwargs`: {'num_cycles': 0.5, 'min_lr': 1.5e-06}
- `warmup_ratio`: 0.25
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: False
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: bobox/DeBERTa2-0.9B-ST-v2-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss | xsum-pairs loss | global dataset loss | scitail-pairs-pos loss | nq pairs loss | qasc pairs loss | trivia pairs loss | scitail-pairs-qa loss | vitaminc-pairs loss | sciq pairs loss | negation-triplets loss | gooaq pairs loss | msmarco pairs loss | openbookqa pairs loss | paws-pos loss | Qnli-dev_max_ap | allNLI-dev_max_ap | sts-test_spearman_cosine |
|:------:|:----:|:-------------:|:---------------:|:-------------------:|:----------------------:|:-------------:|:---------------:|:-----------------:|:---------------------:|:-------------------:|:---------------:|:----------------------:|:----------------:|:------------------:|:---------------------:|:-------------:|:---------------:|:-----------------:|:------------------------:|
| 0.0058 | 3 | 7.8503 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0116 | 6 | 8.4022 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0174 | 9 | 11.1776 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0231 | 12 | 9.7845 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0289 | 15 | 8.9224 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0347 | 18 | 11.1202 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0405 | 21 | 7.413 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0463 | 24 | 7.7803 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0501 | 26 | - | 5.3947 | 6.3484 | 2.5062 | 10.4474 | 6.6809 | 6.1073 | 2.9786 | 4.1224 | 0.8978 | 3.9176 | 8.0953 | 15.7827 | 6.4026 | 1.6077 | 0.6094 | 0.3453 | 0.3223 |
| 0.0521 | 27 | 7.9729 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0578 | 30 | 6.0587 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0636 | 33 | 5.6742 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0694 | 36 | 6.5406 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0752 | 39 | 5.4429 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0810 | 42 | 6.7855 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0868 | 45 | 5.3403 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0925 | 48 | 4.2282 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0983 | 51 | 4.7411 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1003 | 52 | - | 4.4624 | 3.8914 | 1.1342 | 6.2954 | 4.8895 | 5.7900 | 2.0086 | 3.9298 | 0.7183 | 3.2670 | 5.7852 | 7.5325 | 3.4273 | 0.4783 | 0.5957 | 0.4051 | 0.5257 |
| 0.1041 | 54 | 3.9082 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1099 | 57 | 4.3922 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1157 | 60 | 3.2655 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1215 | 63 | 3.1043 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1272 | 66 | 2.2074 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1330 | 69 | 1.4414 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1388 | 72 | 1.5937 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1446 | 75 | 1.0306 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1504 | 78 | 1.0784 | 0.4360 | 0.6714 | 0.1109 | 1.2714 | 1.3059 | 1.0828 | 0.1282 | 3.7639 | 0.2034 | 1.4773 | 0.7032 | 1.3856 | 1.1711 | 0.0449 | 0.6293 | 0.4747 | 0.8291 |
| 0.1562 | 81 | 0.9674 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1620 | 84 | 0.9335 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1677 | 87 | 0.8806 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1735 | 90 | 0.631 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1793 | 93 | 0.3384 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1851 | 96 | 0.404 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1909 | 99 | 0.6488 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1967 | 102 | 0.4728 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2005 | 104 | - | 0.1409 | 0.3990 | 0.0726 | 0.5425 | 0.4244 | 0.6438 | 0.0485 | 3.3142 | 0.1505 | 1.3401 | 0.2238 | 0.5883 | 0.5924 | 0.0358 | 0.7154 | 0.5430 | 0.8863 |
| 0.2024 | 105 | 0.5094 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2082 | 108 | 0.8002 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2140 | 111 | 0.3886 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2198 | 114 | 0.6937 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2256 | 117 | 0.2909 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2314 | 120 | 0.3885 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2371 | 123 | 0.29 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2429 | 126 | 0.3485 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2487 | 129 | 0.3931 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2506 | 130 | - | 0.0513 | 0.3591 | 0.0457 | 0.2678 | 0.2206 | 0.1540 | 0.0365 | 3.8795 | 0.1231 | 1.1135 | 0.1206 | 0.2162 | 0.4568 | 0.0261 | 0.6993 | 0.5638 | 0.9082 |
| 0.2545 | 132 | 0.3394 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2603 | 135 | 0.1276 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2661 | 138 | 0.3569 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2719 | 141 | 0.1231 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2776 | 144 | 0.3086 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2834 | 147 | 0.3541 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2892 | 150 | 0.2597 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2950 | 153 | 0.1585 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3008 | 156 | 0.1436 | 0.0503 | 0.2959 | 0.0376 | 0.2004 | 0.1717 | 0.0823 | 0.0410 | 3.8346 | 0.0871 | 0.9441 | 0.0767 | 0.1350 | 0.4304 | 0.0213 | 0.7239 | 0.5807 | 0.9177 |
| 0.3066 | 159 | 0.1941 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3123 | 162 | 0.3041 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3181 | 165 | 0.2358 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3239 | 168 | 0.2148 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3297 | 171 | 0.8567 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3355 | 174 | 0.3668 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3413 | 177 | 0.3278 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3470 | 180 | 0.474 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3509 | 182 | - | 0.0382 | 0.3900 | 0.0550 | 0.2525 | 0.1511 | 0.0624 | 0.1251 | 3.8444 | 0.0825 | 0.8706 | 0.1212 | 0.4176 | 0.7894 | 0.0209 | 0.7187 | 0.5555 | 0.9103 |
| 0.3528 | 183 | 0.5365 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3586 | 186 | 0.6902 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3644 | 189 | 0.4105 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3702 | 192 | 0.2434 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3760 | 195 | 0.1521 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3817 | 198 | 0.1878 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3875 | 201 | 0.3544 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3933 | 204 | 0.1397 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.3991 | 207 | 0.2982 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4010 | 208 | - | 0.0290 | 0.2504 | 0.0209 | 0.2429 | 0.1335 | 0.0537 | 0.0243 | 3.2105 | 0.0851 | 0.6793 | 0.0728 | 0.0954 | 0.4361 | 0.0224 | 0.7266 | 0.5853 | 0.9212 |
| 0.4049 | 210 | 0.1875 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4107 | 213 | 0.169 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4165 | 216 | 0.2341 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4222 | 219 | 0.1806 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4280 | 222 | 0.2736 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4338 | 225 | 0.1772 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4396 | 228 | 0.131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4454 | 231 | 0.0825 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4512 | 234 | 0.2745 | 0.0315 | 0.2459 | 0.0264 | 0.2080 | 0.1237 | 0.0534 | 0.0236 | 3.4479 | 0.0854 | 0.6366 | 0.0623 | 0.0320 | 0.4140 | 0.0188 | 0.7252 | 0.5810 | 0.9186 |
| 0.4569 | 237 | 0.131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4627 | 240 | 0.2012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4685 | 243 | 0.0855 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4743 | 246 | 0.1181 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4801 | 249 | 0.0992 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4859 | 252 | 0.0375 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4916 | 255 | 0.1503 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.4974 | 258 | 0.1239 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5013 | 260 | - | 0.0130 | 0.2468 | 0.0173 | 0.1420 | 0.1050 | 0.0554 | 0.0256 | 2.7884 | 0.0917 | 0.6254 | 0.0432 | 0.4505 | 0.4391 | 0.0200 | 0.7392 | 0.5874 | 0.9206 |
| 0.5032 | 261 | 0.1849 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5090 | 264 | 0.0453 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5148 | 267 | 0.118 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5206 | 270 | 0.2285 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5263 | 273 | 0.072 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5321 | 276 | 0.1449 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5379 | 279 | 0.2179 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5437 | 282 | 0.0687 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5495 | 285 | 0.071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5514 | 286 | - | 0.0061 | 0.2986 | 0.0183 | 0.1662 | 0.1212 | 0.0679 | 0.0217 | 3.9508 | 0.0967 | 0.6451 | 0.0508 | 0.0296 | 0.4552 | 0.0236 | 0.7463 | 0.5938 | 0.9179 |
| 0.5553 | 288 | 0.0855 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5611 | 291 | 0.103 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5668 | 294 | 0.1555 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5726 | 297 | 0.1069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5784 | 300 | 0.2014 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5842 | 303 | 0.1028 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5900 | 306 | 0.2425 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.5958 | 309 | 0.1639 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6015 | 312 | 0.099 | 0.0050 | 0.2365 | 0.0077 | 0.0879 | 0.1507 | 0.0483 | 0.0146 | 3.3625 | 0.0870 | 0.5403 | 0.0421 | 0.0490 | 0.4215 | 0.0250 | 0.7519 | 0.5889 | 0.9191 |
| 0.6073 | 315 | 0.1218 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6131 | 318 | 0.1575 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6189 | 321 | 0.178 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6247 | 324 | 0.0777 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6305 | 327 | 0.0696 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6362 | 330 | 0.1206 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6420 | 333 | 0.0926 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6478 | 336 | 0.1357 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6517 | 338 | - | 0.0059 | 0.1910 | 0.0106 | 0.1182 | 0.1399 | 0.0288 | 0.0014 | 3.1742 | 0.0752 | 0.5640 | 0.0658 | 0.0380 | 0.4194 | 0.0252 | 0.7395 | 0.5983 | 0.9249 |
| 0.6536 | 339 | 0.1234 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6594 | 342 | 0.1056 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6652 | 345 | 0.1525 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6710 | 348 | 0.1425 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6767 | 351 | 0.1401 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6825 | 354 | 0.1271 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6883 | 357 | 0.0598 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6941 | 360 | 0.0681 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.6999 | 363 | 0.1132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7018 | 364 | - | 0.0103 | 0.2425 | 0.0113 | 0.0261 | 0.0636 | 0.0199 | 0.0031 | 3.7764 | 0.0742 | 0.5828 | 0.0730 | 0.0260 | 0.3494 | 0.0232 | 0.7288 | 0.5859 | 0.9237 |
| 0.7057 | 366 | 0.0714 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7114 | 369 | 0.1816 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7172 | 372 | 0.1609 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7230 | 375 | 0.1123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7288 | 378 | 0.1906 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7346 | 381 | 0.0689 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7404 | 384 | 0.1897 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7461 | 387 | 0.1268 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7519 | 390 | 0.1256 | 0.0064 | 0.1842 | 0.0084 | 0.0449 | 0.0450 | 0.0288 | 0.0021 | 3.1047 | 0.0667 | 0.5752 | 0.0530 | 0.0238 | 0.3462 | 0.0247 | 0.7399 | 0.5969 | 0.9273 |
| 0.7577 | 393 | 0.1201 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7635 | 396 | 0.0995 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7693 | 399 | 0.0825 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7751 | 402 | 0.0778 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7808 | 405 | 0.0696 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7866 | 408 | 0.1332 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7924 | 411 | 0.0684 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.7982 | 414 | 0.2002 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8021 | 416 | - | 0.0040 | 0.1487 | 0.0090 | 0.0566 | 0.0831 | 0.0294 | 0.0034 | 2.4590 | 0.0779 | 0.5903 | 0.0677 | 0.0167 | 0.3658 | 0.0260 | 0.7521 | 0.6094 | 0.9273 |
| 0.8040 | 417 | 0.0455 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8098 | 420 | 0.15 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8156 | 423 | 0.1232 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8213 | 426 | 0.193 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8271 | 429 | 0.1856 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8329 | 432 | 0.0898 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8387 | 435 | 0.0242 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8445 | 438 | 0.0693 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8503 | 441 | 0.1245 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8522 | 442 | - | 0.0069 | 0.1455 | 0.0086 | 0.0449 | 0.0873 | 0.0173 | 0.0023 | 2.7578 | 0.0706 | 0.5208 | 0.0646 | 0.0137 | 0.3374 | 0.0260 | 0.7479 | 0.6047 | 0.9263 |
| 0.8560 | 444 | 0.114 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8618 | 447 | 0.0713 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8676 | 450 | 0.1336 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8734 | 453 | 0.0612 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8792 | 456 | 0.1083 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8850 | 459 | 0.0584 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8907 | 462 | 0.0929 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.8965 | 465 | 0.066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9023 | 468 | 0.0423 | 0.0033 | 0.1635 | 0.0082 | 0.0396 | 0.0649 | 0.0181 | 0.0039 | 2.9874 | 0.0643 | 0.5298 | 0.0726 | 0.0085 | 0.3434 | 0.0263 | 0.7469 | 0.6148 | 0.9274 |
| 0.9081 | 471 | 0.0809 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9139 | 474 | 0.3409 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9197 | 477 | 0.0726 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9254 | 480 | 0.101 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9312 | 483 | 0.0944 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9370 | 486 | 0.0431 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9428 | 489 | 0.1061 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9486 | 492 | 0.0843 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9524 | 494 | - | 0.0037 | 0.2037 | 0.0072 | 0.0217 | 0.0711 | 0.0133 | 0.0121 | 3.5029 | 0.0671 | 0.4939 | 0.0774 | 0.0053 | 0.3318 | 0.0254 | 0.7531 | 0.6148 | 0.9238 |
| 0.9544 | 495 | 0.1177 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9602 | 498 | 0.1513 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9659 | 501 | 0.1944 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9717 | 504 | 0.0969 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9775 | 507 | 0.1293 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9833 | 510 | 0.15 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9891 | 513 | 0.046 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.9949 | 516 | 0.1007 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0006 | 519 | 0.0359 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0026 | 520 | - | 0.0082 | 0.1803 | 0.0151 | 0.0323 | 0.0395 | 0.0216 | 0.0192 | 3.2003 | 0.0655 | 0.5109 | 0.0655 | 0.0125 | 0.3490 | 0.0268 | 0.7571 | 0.6092 | 0.9309 |
| 1.0064 | 522 | 0.0547 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0122 | 525 | 0.0669 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0180 | 528 | 0.0965 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0238 | 531 | 0.1971 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0296 | 534 | 0.0637 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0353 | 537 | 0.0341 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0411 | 540 | 0.0513 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0469 | 543 | 0.0435 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0527 | 546 | 0.2092 | 0.0050 | 0.2294 | 0.0091 | 0.0332 | 0.0594 | 0.0088 | 0.0137 | 3.7179 | 0.0693 | 0.4823 | 0.0706 | 0.0153 | 0.4174 | 0.0269 | 0.7441 | 0.6121 | 0.9259 |
| 1.0585 | 549 | 0.054 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0643 | 552 | 0.046 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0701 | 555 | 0.0982 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0758 | 558 | 0.0441 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0816 | 561 | 0.1141 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0874 | 564 | 0.1131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0932 | 567 | 0.0423 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.0990 | 570 | 0.1062 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1028 | 572 | - | 0.0078 | 0.1511 | 0.0157 | 0.0469 | 0.0278 | 0.0130 | 0.0196 | 2.5838 | 0.0720 | 0.4972 | 0.0634 | 0.0129 | 0.3413 | 0.0289 | 0.7696 | 0.6270 | 0.9246 |
| 1.1048 | 573 | 0.0428 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1105 | 576 | 0.2265 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1163 | 579 | 0.0881 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1221 | 582 | 0.0932 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1279 | 585 | 0.0474 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1337 | 588 | 0.0512 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1395 | 591 | 0.0316 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1452 | 594 | 0.1243 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1510 | 597 | 0.0416 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1530 | 598 | - | 0.0054 | 0.1718 | 0.0119 | 0.0454 | 0.0240 | 0.0164 | 0.0349 | 3.1179 | 0.0790 | 0.5401 | 0.0685 | 0.0077 | 0.3325 | 0.0261 | 0.7455 | 0.6176 | 0.9281 |
| 1.1568 | 600 | 0.0591 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1626 | 603 | 0.1668 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1684 | 606 | 0.1302 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1742 | 609 | 0.0912 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1799 | 612 | 0.0357 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1857 | 615 | 0.0239 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1915 | 618 | 0.0682 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.1973 | 621 | 0.0634 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2031 | 624 | 0.1228 | 0.0042 | 0.1740 | 0.0106 | 0.0218 | 0.0461 | 0.0092 | 0.0024 | 3.1393 | 0.0768 | 0.5298 | 0.0594 | 0.0093 | 0.3543 | 0.0274 | 0.7508 | 0.6279 | 0.9260 |
| 1.2089 | 627 | 0.1741 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2147 | 630 | 0.0413 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2204 | 633 | 0.1496 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2262 | 636 | 0.0229 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2320 | 639 | 0.0871 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2378 | 642 | 0.0785 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2436 | 645 | 0.2074 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2494 | 648 | 0.0647 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2532 | 650 | - | 0.0026 | 0.1904 | 0.0103 | 0.0195 | 0.0470 | 0.0090 | 0.0005 | 3.3713 | 0.0704 | 0.5323 | 0.0584 | 0.0123 | 0.3422 | 0.0279 | 0.7599 | 0.6287 | 0.9222 |
| 1.2551 | 651 | 0.0664 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2609 | 654 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2667 | 657 | 0.1048 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2725 | 660 | 0.0221 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2783 | 663 | 0.1205 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2841 | 666 | 0.1324 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2898 | 669 | 0.0897 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.2956 | 672 | 0.0388 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3014 | 675 | 0.0425 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3033 | 676 | - | 0.0032 | 0.2147 | 0.0083 | 0.0181 | 0.0401 | 0.0095 | 0.0011 | 3.6178 | 0.0722 | 0.5532 | 0.0449 | 0.0059 | 0.3618 | 0.0274 | 0.7621 | 0.6191 | 0.9233 |
| 1.3072 | 678 | 0.0356 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3130 | 681 | 0.0997 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3188 | 684 | 0.0517 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3246 | 687 | 0.0348 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3303 | 690 | 0.2454 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3361 | 693 | 0.0975 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3419 | 696 | 0.1477 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3477 | 699 | 0.0737 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3535 | 702 | 0.1786 | 0.0040 | 0.1623 | 0.0145 | 0.0152 | 0.0326 | 0.0065 | 0.0013 | 3.0246 | 0.0690 | 0.5546 | 0.0335 | 0.0082 | 0.3399 | 0.0281 | 0.7760 | 0.6212 | 0.9250 |
| 1.3593 | 705 | 0.1141 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3650 | 708 | 0.072 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3708 | 711 | 0.0617 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3766 | 714 | 0.0886 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3824 | 717 | 0.047 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3882 | 720 | 0.1327 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3940 | 723 | 0.0498 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.3997 | 726 | 0.0671 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4036 | 728 | - | 0.0041 | 0.1652 | 0.0181 | 0.0105 | 0.0391 | 0.0076 | 0.0017 | 3.1377 | 0.0692 | 0.5127 | 0.0450 | 0.0100 | 0.3439 | 0.0285 | 0.7675 | 0.6126 | 0.9263 |
| 1.4055 | 729 | 0.0376 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4113 | 732 | 0.027 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4171 | 735 | 0.1266 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4229 | 738 | 0.1187 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4287 | 741 | 0.0168 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4344 | 744 | 0.0214 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4402 | 747 | 0.0473 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4460 | 750 | 0.0086 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4518 | 753 | 0.1337 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4537 | 754 | - | 0.0027 | 0.1709 | 0.0130 | 0.0092 | 0.0235 | 0.0074 | 0.0014 | 3.3135 | 0.0679 | 0.5448 | 0.0443 | 0.0048 | 0.3484 | 0.0279 | 0.7719 | 0.6258 | 0.9266 |
| 1.4576 | 756 | 0.0334 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4634 | 759 | 0.0601 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4692 | 762 | 0.0268 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4749 | 765 | 0.0265 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4807 | 768 | 0.0137 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4865 | 771 | 0.0073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4923 | 774 | 0.0395 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.4981 | 777 | 0.02 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5039 | 780 | 0.0329 | 0.0020 | 0.1861 | 0.0087 | 0.0081 | 0.0226 | 0.0090 | 0.0018 | 3.5041 | 0.0706 | 0.5487 | 0.0464 | 0.0043 | 0.3515 | 0.0270 | 0.7594 | 0.6210 | 0.9306 |
| 1.5096 | 783 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5154 | 786 | 0.13 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5212 | 789 | 0.0177 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5270 | 792 | 0.0164 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5328 | 795 | 0.0367 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5386 | 798 | 0.0544 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5443 | 801 | 0.0135 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5501 | 804 | 0.0159 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5540 | 806 | - | 0.0027 | 0.1521 | 0.0080 | 0.0088 | 0.0182 | 0.0095 | 0.0014 | 2.8949 | 0.0700 | 0.5246 | 0.0340 | 0.0050 | 0.3503 | 0.0280 | 0.7704 | 0.6211 | 0.9278 |
| 1.5559 | 807 | 0.0181 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5617 | 810 | 0.0109 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5675 | 813 | 0.031 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5733 | 816 | 0.0907 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5790 | 819 | 0.0256 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5848 | 822 | 0.0171 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5906 | 825 | 0.0809 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.5964 | 828 | 0.0451 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6022 | 831 | 0.0205 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6041 | 832 | - | 0.0031 | 0.1462 | 0.0077 | 0.0086 | 0.0192 | 0.0109 | 0.0021 | 2.6891 | 0.0708 | 0.5463 | 0.0337 | 0.0052 | 0.3298 | 0.0282 | 0.7747 | 0.6249 | 0.9243 |
| 1.6080 | 834 | 0.009 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6138 | 837 | 0.0213 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6195 | 840 | 0.0679 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6253 | 843 | 0.0184 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6311 | 846 | 0.0119 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6369 | 849 | 0.041 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6427 | 852 | 0.0318 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6485 | 855 | 0.0245 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6542 | 858 | 0.0765 | 0.0037 | 0.1507 | 0.0073 | 0.0145 | 0.0233 | 0.0102 | 0.0017 | 2.8041 | 0.0741 | 0.5282 | 0.0355 | 0.0056 | 0.3279 | 0.0284 | 0.7719 | 0.6267 | 0.9254 |
| 1.6600 | 861 | 0.0247 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6658 | 864 | 0.0407 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6716 | 867 | 0.042 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6774 | 870 | 0.024 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6832 | 873 | 0.0223 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6889 | 876 | 0.0268 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.6947 | 879 | 0.0049 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7005 | 882 | 0.0371 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7044 | 884 | - | 0.0034 | 0.1573 | 0.0073 | 0.0088 | 0.0218 | 0.0106 | 0.0025 | 2.9740 | 0.0741 | 0.5448 | 0.0370 | 0.0044 | 0.3086 | 0.0280 | 0.7681 | 0.6227 | 0.9274 |
| 1.7063 | 885 | 0.0131 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7121 | 888 | 0.0397 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7179 | 891 | 0.0324 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7237 | 894 | 0.026 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7294 | 897 | 0.0436 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7352 | 900 | 0.0117 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7410 | 903 | 0.044 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7468 | 906 | 0.028 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7526 | 909 | 0.0324 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7545 | 910 | - | 0.0029 | 0.1603 | 0.0063 | 0.0110 | 0.0201 | 0.0089 | 0.0017 | 3.0405 | 0.0740 | 0.5347 | 0.0382 | 0.0045 | 0.3147 | 0.0275 | 0.7685 | 0.6262 | 0.9274 |
| 1.7584 | 912 | 0.019 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7641 | 915 | 0.0037 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7699 | 918 | 0.0182 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7757 | 921 | 0.0199 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7815 | 924 | 0.0298 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7873 | 927 | 0.0387 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7931 | 930 | 0.031 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.7988 | 933 | 0.0587 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8046 | 936 | 0.0186 | 0.0025 | 0.1647 | 0.0061 | 0.0109 | 0.0183 | 0.0073 | 0.0014 | 3.0794 | 0.0717 | 0.5319 | 0.0372 | 0.0041 | 0.3218 | 0.0273 | 0.7687 | 0.6308 | 0.9269 |
| 1.8104 | 939 | 0.0476 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8162 | 942 | 0.0394 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8220 | 945 | 0.0692 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8278 | 948 | 0.0501 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8335 | 951 | 0.0315 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8393 | 954 | 0.0055 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8451 | 957 | 0.0105 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8509 | 960 | 0.0182 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8548 | 962 | - | 0.0020 | 0.1701 | 0.0057 | 0.0119 | 0.0172 | 0.0063 | 0.0014 | 3.1208 | 0.0714 | 0.5205 | 0.0394 | 0.0040 | 0.3202 | 0.0273 | 0.7645 | 0.6263 | 0.9282 |
| 1.8567 | 963 | 0.0259 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8625 | 966 | 0.0149 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8683 | 969 | 0.0319 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8740 | 972 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8798 | 975 | 0.0293 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8856 | 978 | 0.0055 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8914 | 981 | 0.031 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.8972 | 984 | 0.0127 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9030 | 987 | 0.0058 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9049 | 988 | - | 0.0022 | 0.1574 | 0.0067 | 0.0111 | 0.0139 | 0.0067 | 0.0015 | 2.9796 | 0.0695 | 0.5309 | 0.0410 | 0.0041 | 0.3159 | 0.0275 | 0.7671 | 0.6260 | 0.9278 |
| 1.9087 | 990 | 0.0141 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9145 | 993 | 0.1319 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9203 | 996 | 0.0166 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9261 | 999 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9319 | 1002 | 0.0301 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9377 | 1005 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9434 | 1008 | 0.0216 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9492 | 1011 | 0.0094 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9550 | 1014 | 0.0686 | 0.0023 | 0.1575 | 0.0065 | 0.0095 | 0.0121 | 0.0072 | 0.0016 | 2.9838 | 0.0695 | 0.5424 | 0.0417 | 0.0042 | 0.3157 | 0.0274 | 0.7660 | 0.6270 | 0.9279 |
| 1.9608 | 1017 | 0.0271 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9666 | 1020 | 0.069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9724 | 1023 | 0.0303 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9781 | 1026 | 0.0391 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9839 | 1029 | 0.0428 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9897 | 1032 | 0.0106 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9955 | 1035 | 0.0478 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 1.9974 | 1036 | - | 0.0022 | 0.1650 | 0.0061 | 0.0085 | 0.0121 | 0.0072 | 0.0017 | 3.0683 | 0.0704 | 0.5342 | 0.0425 | 0.0044 | 0.3159 | 0.0273 | 0.7646 | 0.6281 | 0.9284 |
</details>
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.42.4
- PyTorch: 2.4.0+cu121
- Accelerate: 0.32.1
- Datasets: 2.21.0
- Tokenizers: 0.19.1
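
To reproduce results against the versions listed above, the snippet below is a minimal sketch for checking that a local environment matches them; it assumes the packages are installed under their standard PyPI distribution names and is not part of the original training setup.

```python
# Minimal sketch: compare locally installed package versions against the
# versions listed in "Framework Versions" above. Distribution names are
# assumed to be the standard PyPI names; adjust if yours differ.
from importlib.metadata import PackageNotFoundError, version

expected = {
    "sentence-transformers": "3.0.1",
    "transformers": "4.42.4",
    "torch": "2.4.0",       # listed as 2.4.0+cu121; the CUDA build suffix may differ locally
    "accelerate": "0.32.1",
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}

for package, wanted in expected.items():
    try:
        installed = version(package)
    except PackageNotFoundError:
        print(f"{package}: not installed (expected {wanted})")
        continue
    status = "OK" if installed.startswith(wanted) else f"mismatch (expected {wanted})"
    print(f"{package}: {installed} -> {status}")
```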
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> | [
"MEDAL",
"SCIQ",
"SCITAIL"
] |