---
library_name: peft
base_model: mistralai/Mistral-7B-v0.1
---
# Low-rank decomposition of [yam-peleg/Experiment26-7B](https://huggingface.co/yam-peleg/Experiment26-7B) using [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) as base
Created using [LoRD](https://github.com/thomasgauthier/LoRD)
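
The core idea behind extracting a LoRA like this is to take the weight difference between the fine-tuned model and the base model and approximate it with a truncated SVD, yielding the low-rank `A`/`B` factors PEFT expects. The sketch below illustrates that idea on a toy matrix; it is a minimal, hypothetical illustration, not LoRD's actual code, and `lora_from_delta` is a made-up helper name.

```python
import numpy as np

def lora_from_delta(delta, rank):
    """Approximate a weight delta (W_finetuned - W_base) with rank-r factors B @ A."""
    U, S, Vt = np.linalg.svd(delta, full_matrices=False)
    B = U[:, :rank] * S[:rank]  # (out_dim, rank), singular values folded in
    A = Vt[:rank, :]            # (rank, in_dim)
    return B, A

rng = np.random.default_rng(0)
# Toy example: a delta that is exactly rank 4, so rank-4 factors recover it.
delta = rng.standard_normal((64, 4)) @ rng.standard_normal((4, 64))
B, A = lora_from_delta(delta, rank=4)
print(np.allclose(B @ A, delta, atol=1e-8))
```

For real model weights the delta is generally full rank, so the chosen rank trades adapter size against reconstruction fidelity.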
Join my AI Discord: [rwitz](https://discord.gg/qbqjBEfkGw)