Muennighoff committed on
Commit
5cd5354
1 Parent(s): cf022eb

Update README.md

Files changed (1)
  1. README.md +4 -1
README.md CHANGED
@@ -18,7 +18,10 @@ library_name: transformers
 
 # Model Summary
 
-WIP
+> MolmoE-1B is a multimodal Mixture-of-Experts LLM with 1.5B active and 7.2B total parameters, released in September 2024 (0924) and based on [OLMoE-1B-7B-0924](https://huggingface.co/allenai/OLMoE-1B-7B-0924). It yields state-of-the-art performance among multimodal models of a similar size while being fully open-source.
+
+- **Paper:** WIP
+- **Code:** WIP
 
 # Use
 
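
As of this commit the `# Use` section touched by the hunk is still empty. A minimal usage sketch under the standard transformers remote-code pattern might look like the following; the repo id `allenai/MolmoE-1B-0924`, the `processor.process(...)` call, and the `generate_from_batch` helper are assumptions drawn from how released Molmo checkpoints are typically loaded, not anything this commit confirms.

```python
import requests
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor, GenerationConfig

MODEL_ID = "allenai/MolmoE-1B-0924"  # assumed repo id, not stated in this commit

# Molmo ships custom modeling code, so trust_remote_code=True is required.
processor = AutoProcessor.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)

# Build a batch from one image plus a text prompt.
image = Image.open(requests.get("https://picsum.photos/id/237/536/354", stream=True).raw)
inputs = processor.process(images=[image], text="Describe this image.")
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}

# Generate, then decode only the newly produced tokens.
output = model.generate_from_batch(
    inputs,
    GenerationConfig(max_new_tokens=200, stop_strings="<|endoftext|>"),
    tokenizer=processor.tokenizer,
)
generated = output[0, inputs["input_ids"].size(1):]
print(processor.tokenizer.decode(generated, skip_special_tokens=True))
```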