morriszms committed
Commit e4186c4 · verified · 1 Parent(s): 726c2fb

Upload folder using huggingface_hub
.gitattributes CHANGED
@@ -33,3 +33,15 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+Sailor2-20B-Chat-Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+Sailor2-20B-Chat-Q3_K_L.gguf filter=lfs diff=lfs merge=lfs -text
+Sailor2-20B-Chat-Q3_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Sailor2-20B-Chat-Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Sailor2-20B-Chat-Q4_0.gguf filter=lfs diff=lfs merge=lfs -text
+Sailor2-20B-Chat-Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Sailor2-20B-Chat-Q4_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Sailor2-20B-Chat-Q5_0.gguf filter=lfs diff=lfs merge=lfs -text
+Sailor2-20B-Chat-Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+Sailor2-20B-Chat-Q5_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+Sailor2-20B-Chat-Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
+Sailor2-20B-Chat-Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
README.md ADDED
---
language:
- en
- zh
- id
- th
- vi
- ms
- lo
- my
- jv
- km
- su
- tl
tags:
- multilingual
- sea
- sailor
- sft
- chat
- instruction
- TensorBlock
- GGUF
widget:
- text: 如何制作烤鱼?
  example_title: Chinese
- text: How to bake fish?
  example_title: English
- text: Bagaimana cara memanggang ikan?
  example_title: Malay
- text: วิธีย่างปลา?
  example_title: Thai
- text: Bagaimana membuat bakaran ikan?
  example_title: Indonesian
- text: Làm thế nào để nướng cá?
  example_title: Vietnamese
license: apache-2.0
base_model: sail/Sailor2-20B-Chat
library_name: transformers
pipeline_tag: text-generation
---

<div style="width: auto; margin-left: auto; margin-right: auto">
<img src="https://i.imgur.com/jC7kdl8.jpeg" alt="TensorBlock" style="width: 100%; min-width: 400px; display: block; margin: auto;">
</div>
<div style="display: flex; justify-content: space-between; width: 100%;">
  <div style="display: flex; flex-direction: column; align-items: flex-start;">
    <p style="margin-top: 0.5em; margin-bottom: 0em;">
      Feedback and support: TensorBlock's <a href="https://x.com/tensorblock_aoi">Twitter/X</a>, <a href="https://t.me/TensorBlock">Telegram Group</a> and <a href="https://x.com/tensorblock_aoi">Discord server</a>
    </p>
  </div>
</div>

## sail/Sailor2-20B-Chat - GGUF

This repo contains GGUF format model files for [sail/Sailor2-20B-Chat](https://huggingface.co/sail/Sailor2-20B-Chat).

The files were quantized using machines provided by [TensorBlock](https://tensorblock.co/), and they are compatible with llama.cpp as of [commit b4823](https://github.com/ggml-org/llama.cpp/commit/5bbe6a9fe9a8796a9389c85accec89dbc4d91e39).

<div style="text-align: left; margin: 20px 0;">
  <a href="https://tensorblock.co/waitlist/client" style="display: inline-block; padding: 10px 20px; background-color: #007bff; color: white; text-decoration: none; border-radius: 5px; font-weight: bold;">
    Run them on the TensorBlock client using your local machine ↗
  </a>
</div>

## Prompt template

```
<|im_start|>system
{system_prompt}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
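The placeholders in the template can be filled programmatically before handing the string to llama.cpp or any other GGUF runtime; a minimal sketch (the `build_prompt` helper is illustrative, not part of this repo):

```python
def build_prompt(system_prompt: str, prompt: str) -> str:
    """Fill the ChatML-style template used by Sailor2-20B-Chat."""
    return (
        f"<|im_start|>system\n{system_prompt}<|im_end|>\n"
        f"<|im_start|>user\n{prompt}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

# The model generates the assistant turn as a continuation of this prefix.
print(build_prompt("You are a helpful assistant.", "How to bake fish?"))
```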

## Model file specification

| Filename | Quant type | File Size | Description |
| -------- | ---------- | --------- | ----------- |
| [Sailor2-20B-Chat-Q2_K.gguf](https://huggingface.co/tensorblock/Sailor2-20B-Chat-GGUF/blob/main/Sailor2-20B-Chat-Q2_K.gguf) | Q2_K | 7.393 GB | smallest, significant quality loss - not recommended for most purposes |
| [Sailor2-20B-Chat-Q3_K_S.gguf](https://huggingface.co/tensorblock/Sailor2-20B-Chat-GGUF/blob/main/Sailor2-20B-Chat-Q3_K_S.gguf) | Q3_K_S | 8.552 GB | very small, high quality loss |
| [Sailor2-20B-Chat-Q3_K_M.gguf](https://huggingface.co/tensorblock/Sailor2-20B-Chat-GGUF/blob/main/Sailor2-20B-Chat-Q3_K_M.gguf) | Q3_K_M | 9.458 GB | very small, high quality loss |
| [Sailor2-20B-Chat-Q3_K_L.gguf](https://huggingface.co/tensorblock/Sailor2-20B-Chat-GGUF/blob/main/Sailor2-20B-Chat-Q3_K_L.gguf) | Q3_K_L | 10.239 GB | small, substantial quality loss |
| [Sailor2-20B-Chat-Q4_0.gguf](https://huggingface.co/tensorblock/Sailor2-20B-Chat-GGUF/blob/main/Sailor2-20B-Chat-Q4_0.gguf) | Q4_0 | 10.995 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| [Sailor2-20B-Chat-Q4_K_S.gguf](https://huggingface.co/tensorblock/Sailor2-20B-Chat-GGUF/blob/main/Sailor2-20B-Chat-Q4_K_S.gguf) | Q4_K_S | 11.069 GB | small, greater quality loss |
| [Sailor2-20B-Chat-Q4_K_M.gguf](https://huggingface.co/tensorblock/Sailor2-20B-Chat-GGUF/blob/main/Sailor2-20B-Chat-Q4_K_M.gguf) | Q4_K_M | 11.622 GB | medium, balanced quality - recommended |
| [Sailor2-20B-Chat-Q5_0.gguf](https://huggingface.co/tensorblock/Sailor2-20B-Chat-GGUF/blob/main/Sailor2-20B-Chat-Q5_0.gguf) | Q5_0 | 13.294 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| [Sailor2-20B-Chat-Q5_K_S.gguf](https://huggingface.co/tensorblock/Sailor2-20B-Chat-GGUF/blob/main/Sailor2-20B-Chat-Q5_K_S.gguf) | Q5_K_S | 13.294 GB | large, low quality loss - recommended |
| [Sailor2-20B-Chat-Q5_K_M.gguf](https://huggingface.co/tensorblock/Sailor2-20B-Chat-GGUF/blob/main/Sailor2-20B-Chat-Q5_K_M.gguf) | Q5_K_M | 13.618 GB | large, very low quality loss - recommended |
| [Sailor2-20B-Chat-Q6_K.gguf](https://huggingface.co/tensorblock/Sailor2-20B-Chat-GGUF/blob/main/Sailor2-20B-Chat-Q6_K.gguf) | Q6_K | 15.737 GB | very large, extremely low quality loss |
| [Sailor2-20B-Chat-Q8_0.gguf](https://huggingface.co/tensorblock/Sailor2-20B-Chat-GGUF/blob/main/Sailor2-20B-Chat-Q8_0.gguf) | Q8_0 | 20.381 GB | very large, extremely low quality loss - not recommended |


## Downloading instructions

### Command line

First, install the Hugging Face Hub CLI:

```shell
pip install -U "huggingface_hub[cli]"
```

Then download an individual model file to a local directory:

```shell
huggingface-cli download tensorblock/Sailor2-20B-Chat-GGUF --include "Sailor2-20B-Chat-Q2_K.gguf" --local-dir MY_LOCAL_DIR
```

If you want to download multiple model files matching a pattern (e.g., `*Q4_K*gguf`), you can try:

```shell
huggingface-cli download tensorblock/Sailor2-20B-Chat-GGUF --local-dir MY_LOCAL_DIR --local-dir-use-symlinks False --include='*Q4_K*gguf'
```
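A downloaded file can be sanity-checked before use: every valid GGUF file begins with the 4-byte magic `GGUF`, so a quick header check catches truncated downloads or accidentally saved Git LFS pointer files. A minimal sketch (the `is_gguf` helper and the example path are illustrative, not part of this repo):

```python
def is_gguf(path: str) -> bool:
    """Return True if the file starts with the 4-byte GGUF magic."""
    with open(path, "rb") as f:
        return f.read(4) == b"GGUF"

# Example (hypothetical path):
# is_gguf("MY_LOCAL_DIR/Sailor2-20B-Chat-Q2_K.gguf")
```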
Sailor2-20B-Chat-Q2_K.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d0b2a9c92dd3b0202fe5f70041877f7a1551b1fc79fcf80287a6be3f1e2a694e
+size 7393209280
Sailor2-20B-Chat-Q3_K_L.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3dee672e99acc2e715e1359f0c19b79a8d67016064775ed42806154b3df36526
+size 10239145920
Sailor2-20B-Chat-Q3_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:70eda9e3a23f7359c410712abd11000bc608884fe900aa957e25bc41f4b680b7
+size 9457956800
Sailor2-20B-Chat-Q3_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:fc6563c04f1c746bc9069374b07d72b9823ed9a52e4a900bf31c365989cd7af4
+size 8552249280
Sailor2-20B-Chat-Q4_0.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3d7381784db4f5688c32ab297365b597e141aa174ba3035a3ffd663e99fea139
+size 10995200960
Sailor2-20B-Chat-Q4_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:6965c656f96aee722c79181b88c7ea29d7926785852272cd0311b07b3b6cea4a
+size 11622380480
Sailor2-20B-Chat-Q4_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b455696999733183ce3a3a9124f42056397294f09c96313bc4d0cc73b0455bc1
+size 11068601280
Sailor2-20B-Chat-Q5_0.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c3d9211c56ab3074d39a3ad805bf3e133e4b7d2b9ca911032f3190eec473dd61
+size 13294449600
Sailor2-20B-Chat-Q5_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:49ab0470976e569645516f9200b0398949ee3540b03ff912daeeef1d8027fa38
+size 13617542080
Sailor2-20B-Chat-Q5_K_S.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:71dccb27353fa3710db59d1122a9cbfb0054d7508765c94879eaf04bebe6191b
+size 13294449600
Sailor2-20B-Chat-Q6_K.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:8ddbdb821de8121f0104931ee2a52dc8f31c04620364d894f8c865c41263604d
+size 15737401280
Sailor2-20B-Chat-Q8_0.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:15ea59fcc360c3ca8cbe83e9b44f53bb933d255d5e929a8db5cd04d382ad16ff
+size 20380596160