---
license: apache-2.0
language:
  - en
tags:
  - moe
  - olmo
  - olmoe
  - molmo
  - molmoe
co2_eq_emissions: 1
datasets:
  - allenai/OLMoE-mix-0924
library_name: transformers
---
*Molmo logo*

# Model Summary

MolmoE-1B is a multimodal Mixture-of-Experts LLM with 1.5B active and 7.2B total parameters, released in September 2024 (0924) and based on OLMoE-1B-7B-0924. It achieves state-of-the-art performance among multimodal models of similar size while being fully open-source.

- Paper: WIP
- Code: WIP

# Use

WIP
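
Below is a minimal sketch of how inference might look with Hugging Face `transformers`. The repository id `allenai/MolmoE-1B-0924`, the example image URL, and the `processor.process` / `model.generate_from_batch` helpers (expected to come from the model's remote code) are assumptions of this sketch, not details confirmed by this card.

```python
# Minimal sketch, assuming the checkpoint loads via transformers with
# trust_remote_code=True; processor.process and model.generate_from_batch
# are assumed helpers from the model's remote code.
import requests
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor, GenerationConfig

repo_id = "allenai/MolmoE-1B-0924"  # assumed repository id

processor = AutoProcessor.from_pretrained(
    repo_id, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)

# Fetch an example image (any PIL image works).
image = Image.open(requests.get("https://picsum.photos/id/237/536/354", stream=True).raw)

# Preprocess image + prompt, then move tensors to the model device as a batch of 1.
inputs = processor.process(images=[image], text="Describe this image.")
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}

# Generate up to 200 new tokens, stopping at the end-of-text marker.
output = model.generate_from_batch(
    inputs,
    GenerationConfig(max_new_tokens=200, stop_strings="<|endoftext|>"),
    tokenizer=processor.tokenizer,
)

# Decode only the newly generated tokens.
generated_tokens = output[0, inputs["input_ids"].size(1):]
print(processor.tokenizer.decode(generated_tokens, skip_special_tokens=True))
```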

# Evaluation Snapshot

WIP

# Citation

WIP