wwe180 committed on
Commit c2914f9 · verified · 1 Parent(s): bddb1ac

Update README.md

Files changed (1):
  README.md (+1, -25)
README.md CHANGED
@@ -1,25 +1 @@
- ---
- base_model:
- - NousResearch/Meta-Llama-3-8B-Instruct
- - Sao10K/L3-8B-Stheno-v3.1
- library_name: transformers
- tags:
- - mergekit
- - peft
-
- ---
- # Untitled LoRA Model (1)
-
- This is a LoRA extracted from a language model. It was extracted using [mergekit](https://github.com/arcee-ai/mergekit).
-
- ## LoRA Details
-
- This LoRA adapter was extracted from [Sao10K/L3-8B-Stheno-v3.1](https://huggingface.co/Sao10K/L3-8B-Stheno-v3.1) and uses [NousResearch/Meta-Llama-3-8B-Instruct](https://huggingface.co/NousResearch/Meta-Llama-3-8B-Instruct) as a base.
-
- ### Parameters
-
- The following command was used to extract this LoRA adapter:
-
- ```sh
- mergekit-extract-lora NousResearch/Meta-Llama-3-8B-Instruct Sao10K/L3-8B-Stheno-v3.1 OUTPUT_PATH --rank=32
- ```
+ This LoRA is for testing purposes!