jeiku committed on
Commit
4731f43
1 Parent(s): d63dafa

Update README.md

Files changed (1)
  1. README.md +11 -32
README.md CHANGED
@@ -8,42 +8,21 @@ library_name: transformers
  tags:
  - mergekit
  - merge
-
  ---
  # Ashera
 
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the SLERP merge method.
-
- ### Models Merged
-
- The following models were included in the merge:
- * [jeiku/parttwo](https://huggingface.co/jeiku/parttwo) + [jeiku/Gnosis_Reformatted_Mistral](https://huggingface.co/jeiku/Gnosis_Reformatted_Mistral)
- * [jeiku/partone](https://huggingface.co/jeiku/partone) + [jeiku/Synthetic_Soul_1k_Mistral_128](https://huggingface.co/jeiku/Synthetic_Soul_1k_Mistral_128)
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- slices:
-   - sources:
-       - model: jeiku/partone+jeiku/Synthetic_Soul_1k_Mistral_128
-         layer_range: [0, 32]
-       - model: jeiku/parttwo+jeiku/Gnosis_Reformatted_Mistral
-         layer_range: [0, 32]
- merge_method: slerp
- base_model: jeiku/partone+jeiku/Synthetic_Soul_1k_Mistral_128
- parameters:
-   t:
-     - filter: self_attn
-       value: [0, 0.5, 0.3, 0.7, 1]
-     - filter: mlp
-       value: [1, 0.5, 0.7, 0.3, 0]
-     - value: 0.5
- dtype: bfloat16
- ```
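For context, SLERP interpolates each pair of weight tensors along the arc between them rather than along a straight line, and the `t` lists in the config above pick the interpolation factor per layer group (`t = 0` keeps the base model's weights, `t = 1` takes the other model's). A minimal sketch of the underlying formula in plain Python — this is illustrative only, not mergekit's actual implementation, and the function name is my own:

```python
import math

def slerp(v0, v1, t, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t = 0 returns v0, t = 1 returns v1; intermediate t moves along
    the great-circle arc between the directions of the two vectors.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Angle between the two vectors, clamped for numerical safety
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    omega = math.acos(max(-1.0, min(1.0, dot)))
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

Mergekit applies this tensor-by-tensor across the models listed in `slices`, which is why the `t` schedule can differ between `self_attn` and `mlp` weights.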
 
  tags:
  - mergekit
  - merge
+ license: apache-2.0
+ datasets:
+ - ResplendentAI/Synthetic_Soul_1k
+ - Epiculous/Gnosis
+ language:
+ - en
  ---
  # Ashera
 
+ ![image/png](https://cdn-uploads.huggingface.co/production/uploads/626dfb8786671a29c715f8a9/PwebDlwW-mPHC8yQwV2mF.png)

+ Asherah, goddess of all creation according to ancient myth, was a huge inspiration for this model. The model started as a merge of four of Sanji Watsuki's models, combined using various methods. That merge was then finetuned on Gnosis and Synthetic Soul, two datasets penned by myself, and finally merged with Excalibur to impart multimodality. I consider this a great achievement.

+ You can use this as your mmproj: https://huggingface.co/cjpais/llava-1.6-mistral-7b-gguf/blob/main/mmproj-model-f16.gguf

+ I have also included a folder in this repo containing that file; multimodal GGUF users will need it. I recommend Koboldcpp.

+ Multimodal functionality is limited to GGUF users at this time, but you can still use this model as a standard LLM.
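
As a sketch of the Koboldcpp setup described above: load a GGUF quant of the model together with the mmproj file. The filenames below are placeholders for whatever quant you actually download, and you should check `--help` for your Koboldcpp version's exact options:

```shell
# Launch Koboldcpp with the language model and the vision projector.
# Both filenames here are hypothetical examples.
python koboldcpp.py --model Ashera-Q4_K_M.gguf --mmproj mmproj-model-f16.gguf
```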