Lora Land - 27 High-Quality LoRA Adapters
27 Fine-tuned LoRA Adapters using Mistral-7B. Try them here: https://predibase.com/lora-land
Description: Does the premise entail the hypothesis?
Original dataset: https://huggingface.co/datasets/glue/viewer/mnli
---
Try querying this adapter for free in Lora Land at https://predibase.com/lora-land!
Adapter category: Academic Benchmarks
Adapter name: Natural Language Inference (MNLI)
---
Sample input: You are given a premise and a hypothesis below. If the premise entails the hypothesis, return 0. If the premise contradicts the hypothesis, return 2. Otherwise, if the premise does neither, return 1.\n\n### Premise: You and your friends are not welcome here, said Severn.\n\n### Hypothesis: Severn said the people were not welcome there.\n\n### Label:
---
Sample output: 0
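To make the prompt format concrete, here is a minimal sketch of how a prompt like the sample above could be assembled in Python; the build_mnli_prompt helper name and this way of constructing the string are illustrative assumptions, not part of the adapter's release.

# Hypothetical helper that reproduces the prompt template shown in the sample input
def build_mnli_prompt(premise: str, hypothesis: str) -> str:
    return (
        "You are given a premise and a hypothesis below. "
        "If the premise entails the hypothesis, return 0. "
        "If the premise contradicts the hypothesis, return 2. "
        "Otherwise, if the premise does neither, return 1.\n\n"
        f"### Premise: {premise}\n\n"
        f"### Hypothesis: {hypothesis}\n\n"
        "### Label:"
    )

prompt = build_mnli_prompt(
    "You and your friends are not welcome here, said Severn.",
    "Severn said the people were not welcome there.",
)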
---
Try using this adapter yourself!
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"
peft_model_id = "predibase/glue_mnli"

# Load the base model and tokenizer, then attach the LoRA adapter (requires the peft package)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.load_adapter(peft_model_id)
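As a sketch of end-to-end usage with the prompt built above (the greedy decoding and max_new_tokens value are illustrative assumptions, not settings documented for this adapter):

inputs = tokenizer(prompt, return_tensors="pt")
# The label is a single token (0, 1, or 2), so one new token is enough
outputs = model.generate(**inputs, max_new_tokens=1)
label = tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
print(label)  # expected "0" (entailment) for the sample premise/hypothesis pair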
Base model: mistralai/Mistral-7B-v0.1