---
base_model:
- meta-llama/Llama-3.1-8B-Instruct
- leafspark/Llama-3.1-8B-MultiReflection-Instruct
library_name: transformers
tags:
- mergekit
- peft
---
# Untitled LoRA Model (1)
This is a LoRA adapter extracted from a language model using [mergekit](https://github.com/arcee-ai/mergekit).
## LoRA Details
This LoRA adapter was extracted from [leafspark/Llama-3.1-8B-MultiReflection-Instruct](https://huggingface.co/leafspark/Llama-3.1-8B-MultiReflection-Instruct) and uses [meta-llama/Llama-3.1-8B-Instruct](https://huggingface.co/meta-llama/Llama-3.1-8B-Instruct) as a base.
### Parameters
The following command was used to extract this LoRA adapter:
```sh
mergekit-extract-lora leafspark/Llama-3.1-8B-MultiReflection-Instruct meta-llama/Llama-3.1-8B-Instruct OUTPUT_PATH --rank=64 --device=cuda
```
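### Usage

A minimal sketch of loading the base model and attaching this adapter with PEFT. The adapter path below is a hypothetical placeholder for wherever this adapter is stored (local directory or Hub repo ID):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-3.1-8B-Instruct"

# Load the base model and tokenizer.
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Attach the extracted LoRA weights on top of the base model.
# "path/to/this-adapter" is a placeholder, not the actual repo ID.
model = PeftModel.from_pretrained(base, "path/to/this-adapter")

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```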