Instructions to use amazeble/crewai-lora-adapter with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- PEFT
How to use amazeble/crewai-lora-adapter with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model (mistralai/Mistral-7B-Instruct-v0.2);
# the original snippet used a local snapshot path under /content/models
base_model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.2")

# Attach the LoRA adapter on top of the base model
model = PeftModel.from_pretrained(base_model, "amazeble/crewai-lora-adapter")
```

- Notebooks
- Google Colab
- Kaggle
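Once the adapter is loaded, prompts should follow the Mistral-Instruct chat format the base model was trained on. A minimal sketch of the `[INST]` wrapping (the helper name is illustrative; in practice `tokenizer.apply_chat_template` builds this for you):

```python
def build_mistral_prompt(user_message: str) -> str:
    # Mistral-7B-Instruct-v0.2 expects instructions wrapped as:
    # <s>[INST] {instruction} [/INST]
    return f"<s>[INST] {user_message.strip()} [/INST]"

prompt = build_mistral_prompt("Summarize the CrewAI framework in one sentence.")
print(prompt)  # <s>[INST] Summarize the CrewAI framework in one sentence. [/INST]
```

The model's completion follows the closing `[/INST]` tag.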
This adapter is a low-rank decomposition of Superoisesuki/Mistral_7B_CrewAI, using mistralai/Mistral-7B-Instruct-v0.2 as the base model. Created with LoRD.
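LoRD extracts an adapter by approximating the weight difference between the fine-tuned model and the base as a low-rank product ΔW ≈ B·A. A rough sketch of why that is compact (Mistral-7B's hidden size is 4096; the rank of 16 is an illustrative assumption, not the adapter's actual rank):

```python
d = 4096  # hidden size of Mistral-7B
r = 16    # assumed LoRA rank (illustrative)

# Storing the full weight delta for one d x d matrix:
full_delta_params = d * d
# Storing the low-rank factors A (r x d) and B (d x r) instead:
lora_params = d * r + r * d

print(full_delta_params)                 # 16777216
print(lora_params)                       # 131072
print(full_delta_params // lora_params)  # 128x fewer parameters per matrix
```

This is why the adapter download is a small fraction of the full fine-tuned checkpoint.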
Model tree for amazeble/crewai-lora-adapter
- Base model: mistralai/Mistral-7B-Instruct-v0.2