Phando committed — Commit 582ee1e (1 parent: 4f8e18d)

Create README.md

Files changed (1): README.md (+6, −0)
README.md ADDED
@@ -0,0 +1,6 @@
1 + ---
2 + language: en
3 + ---
4 + This is a Hugging Face transformers-style conversion of the original __SMoE 15B-parameter__ model in __BFLOAT16__, from the paper "[Efficient Large Scale Language Modeling with Mixtures of Experts](https://arxiv.org/abs/2112.10684)" by Artetxe et al. The original model card can be found at https://github.com/facebookresearch/fairseq/blob/main/examples/moe_lm/model_card.md.
5 +
6 + The usage example and modeling code can be found at https://github.com/pingzhili/light-fairseq.