---
license: mit
language:
- zh
- en
pipeline_tag: text-generation
library_name: transformers
---
# GLM-4-32B-0414
## Introduction
Based on our latest technological advancements, we have trained the `GLM-4-0414` series of models. During pretraining, we incorporated more code-related and reasoning-related data, and in the alignment phase we optimized the model specifically for agent capabilities. As a result, the model's performance on agent tasks such as tool use, web search, and coding has improved significantly.
## Inference Code
Make sure you are using `transformers>=4.51.3`.
This is a base model. If you need a chat model, please use [GLM-4-32B-Chat-0414](https://huggingface.co/THUDM/GLM-4-32B-Chat-0414).
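Below is a minimal sketch of plain text completion with the base model using `transformers`. The repository id `THUDM/GLM-4-32B-0414` is assumed from the card title; adjust the path, dtype, and device placement to your setup.

```python
# Minimal sketch: raw text completion with the base model (no chat template).
# Requires transformers >= 4.51.3; device_map="auto" also needs accelerate.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "THUDM/GLM-4-32B-0414"  # assumed repo id; replace with your local path if needed

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # spread the 32B weights across available GPUs
)

# Base model: feed raw text and let it continue the sequence.
inputs = tokenizer("The key advantage of mixture-of-experts models is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```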