DavidLee12356 committed · Commit 8490af3 (verified) · 1 parent: 1e59883

Update README.md

Files changed (1): README.md (+32 -0)
README.md CHANGED
@@ -7,4 +7,36 @@ language:
 base_model:
 - THUDM/GLM-4-32B-0414
 library_name: mlx
+pipeline_tag: text-generation
+tags:
+- mlx
 ---
+
+
+# mlx-community/GLM-4-32B-0414-4bit
+
+This model [mlx-community/GLM-4-32B-0414-4bit](https://huggingface.co/mlx-community/GLM-4-32B-0414-4bit) was
+converted to MLX format from [THUDM/GLM-4-32B-0414](https://huggingface.co/THUDM/GLM-4-32B-0414)
+using mlx-lm version **0.23.1**.
+
+## Use with mlx
+
+```bash
+pip install mlx-lm
+```
+
+```python
+from mlx_lm import load, generate
+
+model, tokenizer = load("mlx-community/GLM-4-32B-0414-4bit")
+
+prompt = "hello"
+
+if tokenizer.chat_template is not None:
+    messages = [{"role": "user", "content": prompt}]
+    prompt = tokenizer.apply_chat_template(
+        messages, add_generation_prompt=True
+    )
+
+response = generate(model, tokenizer, prompt=prompt, verbose=True)
+```
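
The added README states that the model was converted with mlx-lm 0.23.1 but does not record the command used. Below is a minimal sketch of how such a 4-bit conversion is typically produced with the `mlx_lm.convert` CLI; the output path and quantization flags are assumptions for illustration, not the exact invocation behind this commit.

```bash
# Hypothetical reproduction of the conversion described in the README
# (assumed flags; not taken from the commit itself).
pip install mlx-lm==0.23.1

# Download THUDM/GLM-4-32B-0414, quantize the weights to 4 bits,
# and write the resulting MLX checkpoint to a local directory.
mlx_lm.convert \
    --hf-path THUDM/GLM-4-32B-0414 \
    --mlx-path GLM-4-32B-0414-4bit \
    -q --q-bits 4
```

Publishing the result under `mlx-community/GLM-4-32B-0414-4bit` would be a separate upload step (for example via `--upload-repo`, where supported by the installed mlx-lm version).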