Number of experts present in the library: 10
Expert Name | Base Model | Trained on | Adapter Type |
---|---|---|---|
c6o10 | mistralai/Mistral-7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_text,cos_e_v1_11_question_description_option_text,cos_e_v1_11_question_option_description_id,qasc_qa_with_combined_facts_1,qasc_qa_with_separated_facts_1,qasc_qa_with_separated_facts_2,qasc_qa_with_separated_facts_3,qasc_qa_with_separated_facts_5,quarel_choose_between,quarel_heres_a_story,quarel_logic_test,quarel_testing_students,quartz_answer_question_based_on,quartz_answer_question_below,quartz_given_the_fact_answer_the_q,quartz_having_read_above_passage,quartz_paragraph_question_plain_concat,quartz_read_passage_below_choose,quartz_use_info_from_paragraph_question,quartz_use_info_from_question_paragraph,quoref_Answer_Question_Given_Context,ropes_background_new_situation_answer,ropes_background_situation_middle,ropes_given_background_situation,ropes_new_situation_background_answer,ropes_plain_background_situation,ropes_plain_bottom_hint,ropes_plain_no_background,ropes_prompt_beginning,ropes_prompt_bottom_hint_beginning,ropes_prompt_bottom_no_hint,ropes_prompt_mix,ropes_read_background_situation,sciq_Direct_Question_Closed_Book_,social_i_qa_Show_choices_and_generate_answer,wiqa_does_the_supposed_perturbation_have_an_effect,wiqa_effect_with_label_answer,wiqa_effect_with_string_answer,wiqa_which_of_the_following_is_the_supposed_perturbation | lora |
c0o10 | mistralai/Mistral-7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_dbert_generate_question,adversarial_qa_dbidaf_generate_question,adversarial_qa_droberta_generate_question,app_reviews_generate_review,cot_creak,cot_esnli,cot_esnli_ii,dream_generate_first_utterance,dream_generate_last_utterance,duorc_ParaphraseRC_title_generation,duorc_SelfRC_title_generation,fix_punct,gem_common_gen_1_1_0,gem_dart_1_1_0,gigaword_1_2_0,huggingface_xsum,lambada_1_0_0,race_high_Write_a_multi_choice_question_for_the_following_article,race_high_Write_a_multi_choice_question_options_given_,race_middle_Write_a_multi_choice_question_for_the_following_article,race_middle_Write_a_multi_choice_question_options_given_,stream_aqua,stream_qed,wiqa_what_is_the_missing_first_step,wmt16_translate_fi_en_1_0_0,wmt16_translate_ro_en_1_0_0,yelp_polarity_reviews_0_2_0 | lora |
c8o10 | mistralai/Mistral-7B-v0.1 | sordonia/flan-10k-flat/anli_r1_0_1_0,anli_r2_0_1_0,anli_r3_0_1_0,cosmos_qa_1_0_0,cot_ecqa,cot_sensemaking,glue_cola_2_0_0,glue_mrpc_2_0_0,glue_sst2_2_0_0,imdb_reviews_plain_text_1_0_0,para_crawl_enes,super_glue_record_1_0_2,true_case,wmt14_translate_fr_en_1_0_0,wmt16_translate_de_en_1_0_0,word_segment | lora |
c9o10 | mistralai/Mistral-7B-v0.1 | sordonia/flan-10k-flat/wiki_hop_original_choose_best_object_affirmative_1,wiki_hop_original_choose_best_object_affirmative_2,wiki_hop_original_choose_best_object_affirmative_3,wiki_hop_original_choose_best_object_interrogative_1,wiki_hop_original_choose_best_object_interrogative_2,wiki_hop_original_explain_relation,wiki_hop_original_generate_object,wiki_hop_original_generate_subject,wiki_hop_original_generate_subject_and_object | lora |
c2o10 | mistralai/Mistral-7B-v0.1 | sordonia/flan-10k-flat/app_reviews_categorize_rating_using_review,cos_e_v1_11_question_option_description_text,cot_qasc,cot_strategyqa_ii,dbpedia_14_pick_one_category_for_the_following_text,definite_pronoun_resolution_1_1_0,kilt_tasks_hotpotqa_final_exam,math_dataset_algebra__linear_1d_1_0_0,qasc_qa_with_separated_facts_4,quarel_do_not_use,quoref_Context_Contains_Answer,race_high_Is_this_the_right_answer,race_middle_Is_this_the_right_answer,sciq_Direct_Question,sciq_Multiple_Choice,sciq_Multiple_Choice_Closed_Book_,sciq_Multiple_Choice_Question_First,social_i_qa_Show_choices_and_generate_index,stream_aqua_ii,super_glue_cb_1_0_2,super_glue_copa_1_0_2,unified_qa_science_inst,wiki_qa_Decide_good_answer,wiki_qa_Direct_Answer_to_Question,wiki_qa_Generate_Question_from_Topic,wiki_qa_Jeopardy_style,wiki_qa_Topic_Prediction_Answer_Only,wiki_qa_Topic_Prediction_Question_Only,wiki_qa_Topic_Prediction_Question_and_Answer_Pair,wiki_qa_automatic_system,wiki_qa_exercise,wiki_qa_found_on_google | lora |
c7o10 | mistralai/Mistral-7B-v0.1 | sordonia/flan-10k-flat/aeslc_1_0_0,cnn_dailymail_3_4_0,coqa_1_0_0,cot_gsm8k,dream_answer_to_dialogue,duorc_ParaphraseRC_build_story_around_qa,duorc_SelfRC_build_story_around_qa,gem_e2e_nlg_1_1_0,gem_web_nlg_en_1_1_0,gem_wiki_lingua_english_en_1_1_0,multi_news_1_0_0,wiki_bio_comprehension,wiki_bio_key_content,wiki_bio_what_content,wiki_bio_who,wiqa_what_is_the_final_step_of_the_following_process,wiqa_what_might_be_the_first_step_of_the_process,wiqa_what_might_be_the_last_step_of_the_process,wmt16_translate_tr_en_1_0_0 | lora |
c4o10 | mistralai/Mistral-7B-v0.1 | sordonia/flan-10k-flat/duorc_ParaphraseRC_answer_question,duorc_ParaphraseRC_decide_worth_it,duorc_ParaphraseRC_extract_answer,duorc_ParaphraseRC_generate_question,duorc_ParaphraseRC_movie_director,duorc_ParaphraseRC_question_answering,duorc_SelfRC_answer_question,duorc_SelfRC_decide_worth_it,duorc_SelfRC_extract_answer,duorc_SelfRC_generate_question,duorc_SelfRC_movie_director,duorc_SelfRC_question_answering,quac_1_0_0,quoref_Answer_Friend_Question,quoref_Answer_Test,quoref_Find_Answer,quoref_Found_Context_Online,quoref_Given_Context_Answer_Question,quoref_Guess_Answer,quoref_Guess_Title_For_Context,quoref_Read_And_Extract_,quoref_What_Is_The_Answer | lora |
c3o10 | mistralai/Mistral-7B-v0.1 | sordonia/flan-10k-flat/adversarial_qa_dbert_answer_the_following_q,adversarial_qa_dbert_based_on,adversarial_qa_dbert_question_context_answer,adversarial_qa_dbert_tell_what_it_is,adversarial_qa_dbidaf_answer_the_following_q,adversarial_qa_dbidaf_based_on,adversarial_qa_dbidaf_question_context_answer,adversarial_qa_dbidaf_tell_what_it_is,adversarial_qa_droberta_answer_the_following_q,adversarial_qa_droberta_based_on,adversarial_qa_droberta_question_context_answer,adversarial_qa_droberta_tell_what_it_is,cos_e_v1_11_aligned_with_common_sense,cos_e_v1_11_explain_why_human,cos_e_v1_11_generate_explanation_given_text,cos_e_v1_11_i_think,cos_e_v1_11_rationale,drop_2_0_0,duorc_ParaphraseRC_generate_question_by_answer,duorc_SelfRC_generate_question_by_answer,kilt_tasks_hotpotqa_combining_facts,kilt_tasks_hotpotqa_formulate,kilt_tasks_hotpotqa_straighforward_qa,natural_questions_open_1_0_0,trivia_qa_rc_1_1_0,web_questions_get_the_answer,web_questions_potential_correct_answer,web_questions_question_answer,web_questions_short_general_knowledge_q,web_questions_whats_the_answer | lora |
c1o10 | mistralai/Mistral-7B-v0.1 | sordonia/flan-10k-flat/ag_news_subset_1_0_0,app_reviews_convert_to_rating,app_reviews_convert_to_star_rating,cot_creak_ii,cot_ecqa_ii,cot_gsm8k_ii,cot_sensemaking_ii,cot_strategyqa,dbpedia_14_given_a_choice_of_categories_,dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to,dbpedia_14_given_list_what_category_does_the_paragraph_belong_to,glue_mnli_2_0_0,glue_qnli_2_0_0,glue_qqp_2_0_0,glue_stsb_2_0_0,glue_wnli_2_0_0,kilt_tasks_hotpotqa_complex_question,paws_wiki_1_1_0,qasc_is_correct_1,qasc_is_correct_2,snli_1_1_0,social_i_qa_Check_if_a_random_answer_is_valid_or_not,social_i_qa_Generate_answer,social_i_qa_Generate_the_question_from_the_answer,social_i_qa_I_was_wondering,squad_v1_1_3_0_0,squad_v2_0_3_0_0,stream_qed_ii,super_glue_multirc_1_0_2,super_glue_rte_1_0_2,super_glue_wic_1_0_2,super_glue_wsc_fixed_1_0_2,trec_1_0_0,wiki_bio_guess_person,wiki_qa_Is_This_True_ | lora |
c5o10 | mistralai/Mistral-7B-v0.1 | sordonia/flan-10k-flat/cos_e_v1_11_description_question_option_id,cos_e_v1_11_question_description_option_id,dream_baseline,dream_read_the_following_conversation_and_answer_the_question,quail_context_description_question_answer_id,quail_context_description_question_answer_text,quail_context_description_question_text,quail_context_question_answer_description_id,quail_context_question_answer_description_text,quail_context_question_description_answer_id,quail_context_question_description_answer_text,quail_context_question_description_text,quail_description_context_question_answer_id,quail_description_context_question_answer_text,quail_description_context_question_text,quail_no_prompt_id,quail_no_prompt_text,race_high_Read_the_article_and_answer_the_question_no_option_,race_high_Select_the_best_answer,race_high_Select_the_best_answer_generate_span_,race_high_Select_the_best_answer_no_instructions_,race_high_Taking_a_test,race_middle_Read_the_article_and_answer_the_question_no_option_,race_middle_Select_the_best_answer,race_middle_Select_the_best_answer_generate_span_,race_middle_Select_the_best_answer_no_instructions_,race_middle_Taking_a_test | lora |
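Each expert is a LoRA adapter for `mistralai/Mistral-7B-v0.1`. If the experts are exported as standard PEFT LoRA checkpoints, one adapter can be applied to the base model with the `peft` library. The sketch below is a minimal example under that assumption; the repository id (`<this-repo-id>`) and the subfolder-per-expert layout are placeholders, not a confirmed API of this library.

```python
# Minimal sketch: load the base model and apply one expert adapter.
# Assumptions (not confirmed by this card): each expert is a standard
# PEFT LoRA checkpoint stored in a subfolder named after the expert,
# and "<this-repo-id>" stands in for this repository's id on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")

# `subfolder` selects one expert (e.g. the QA-focused "c3o10") inside the repo.
model = PeftModel.from_pretrained(model, "<this-repo-id>", subfolder="c3o10")

prompt = "Question: Which element has atomic number 6?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since every expert shares the same base model, adapters can be swapped without reloading the 7B weights; pick the expert whose training tasks in the table above best match the target task.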
Last updated on: 2024-04-06 04:06:45+00:00