---
base_model:
- GSAI-ML/LLaDA-8B-Instruct
language:
- en
library_name: transformers
---
# Large Language Diffusion with Ordered Unmasking (LLaDOU)
We introduce the **L**arge **La**nguage **D**iffusion with **O**rdered **U**nmasking (**LLaDOU**), which is trained by reinforcing a new reasoning paradigm, the **D**iffusion **C**hain **o**f **L**ateral **T**hought (**DCoLT**), for diffusion language models.
Compared to standard CoT, DCoLT is distinguished by several notable features (see the conceptual sketch after this list):
- **Bidirectional Reasoning**: Allowing global refinement throughout the generation process via bidirectional self-attention masks.
- **Format-Free Reasoning**: Imposing no strict rules on grammatical correctness in its intermediate steps of thought.
- **Nonlinear Generation**: Generating tokens at various positions in different steps.
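
The ordered-unmasking generation can be pictured as an iterative loop over masked positions: the model scores every still-masked position and reveals a few of them per step, in an order it chooses. The sketch below is a conceptual illustration, not the official implementation; the function name, `mask_id`, `tokens_per_step`, and the confidence-based position selection are assumptions standing in for LLaDOU's learned (RL-trained) unmasking policy.

```python
import torch

@torch.no_grad()
def ordered_unmask_generate(model, prompt_ids, gen_len=128, steps=64,
                            mask_id=126336, tokens_per_step=2):
    """Conceptual ordered-unmasking loop for a masked diffusion LM (illustrative only)."""
    # Start from the prompt followed by a fully masked response.
    masks = torch.full((1, gen_len), mask_id, dtype=torch.long, device=prompt_ids.device)
    x = torch.cat([prompt_ids, masks], dim=1)
    for _ in range(steps):
        still_masked = (x == mask_id)
        if not still_masked.any():
            break
        logits = model(x).logits                 # bidirectional attention over the full sequence
        conf, pred = logits.softmax(-1).max(-1)  # per-position confidence and predicted token
        conf = conf.masked_fill(~still_masked, -1.0)
        # Reveal a few masked positions this step; this confidence heuristic is a
        # stand-in for LLaDOU's learned unmasking order.
        k = min(tokens_per_step, int(still_masked.sum()))
        idx = conf.topk(k, dim=-1).indices
        x = x.scatter(1, idx, pred.gather(1, idx))
    return x
```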

## Instructions
**LLaDOU-v0-Math** is a math-specific model trained on GSM8K and MATH.
For inference code and detailed instructions, please refer to our GitHub page: [maple-research-lab/LLaDOU](https://github.com/maple-research-lab/LLaDOU).
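
For orientation before visiting the repository, the snippet below shows how the checkpoint could be loaded with `transformers`. The repository id and the `trust_remote_code` flag are assumptions for illustration; the actual DCoLT sampling pipeline is provided in the GitHub code.

```python
# Minimal loading sketch; the Hub id and trust_remote_code usage are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "maple-research-lab/LLaDOU-v0-Math"  # hypothetical Hub id for this checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True,
                                  torch_dtype=torch.bfloat16).eval()
```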