---
license: mit
library_name: transformers
pipeline_tag: any-to-any
---

# Model Card for MMaDA-8B-Pretrain

This repository hosts the pretrained MMaDA weights obtained after Stage 2 of training; see https://github.com/Gen-Verse/MMaDA for details.

[Paper](https://arxiv.org/abs/2505.15809) | [Code](https://github.com/Gen-Verse/MMaDA) | Demo

## Citation

```bibtex
@article{yang2025mmada,
  title={MMaDA: Multimodal Large Diffusion Language Models},
  author={Yang, Ling and Tian, Ye and Li, Bowen and Zhang, Xinchen and Shen, Ke and Tong, Yunhai and Wang, Mengdi},
  journal={arXiv preprint arXiv:2505.15809},
  year={2025}
}
```