pan-li committed
Commit 06b10d9 · verified · 1 Parent(s): 45ae458

Upload folder using huggingface_hub
Files changed (3)
  1. .gitattributes +1 -0
  2. README.md +2 -2
  3. proteinmoe_architecture.png +3 -0
.gitattributes CHANGED
@@ -34,3 +34,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
 inverse_folding.png filter=lfs diff=lfs merge=lfs -text
+proteinmoe_architecture.png filter=lfs diff=lfs merge=lfs -text
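The added attribute line is the rule that `git lfs track` writes for a file pattern. A minimal Python sketch of that behavior (a hypothetical standalone script, not part of this repo — the real command also stages `.gitattributes`):

```python
# Sketch of what `git lfs track "proteinmoe_architecture.png"` does to
# .gitattributes: append a matching LFS filter rule if it is not already there.
from pathlib import Path

attrs = Path(".gitattributes")
rule = "proteinmoe_architecture.png filter=lfs diff=lfs merge=lfs -text"
existing = attrs.read_text() if attrs.exists() else ""
if rule not in existing:
    attrs.write_text(existing + rule + "\n")
```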
README.md CHANGED
@@ -66,8 +66,8 @@ We fine-tuned a pretrained masked language model using MSA data by concatenating
 | Sequence length | 2048 | 12800 | 12800 |
 | Per Device Micro Batch Size | 1 | 1 | 1 |
 | Precision | Mixed FP32-FP16 | Mixed FP32-FP16 | Mixed FP32-FP16 |
-| 1st Stage LR | [5e-6,5e-5] | [1e-6, 1e-5] | 1e-5 |
-| 1st Stage Num Tokens | 10 billion | 100 billion | 80 billion |
+| LR | [5e-6,5e-5] | [1e-6, 1e-5] | 1e-5 |
+| Num Tokens | 10 billion | 100 billion | 80 billion |

 ### Tokenization

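The unchanged rows of the table above (sequence length, per-device micro batch size) fix the token budget per optimizer step. A rough, illustrative step-count calculation — the device count does not appear in this diff, so 8 below is an assumed placeholder, not a documented value:

```python
# Back-of-the-envelope training step counts from the README table.
# tokens_per_step = seq_len * micro_batch * devices; `devices` is an
# assumption for illustration only (the diff does not state it).
def steps(num_tokens, seq_len, micro_batch=1, devices=8):
    tokens_per_step = seq_len * micro_batch * devices
    return num_tokens // tokens_per_step

print(steps(10_000_000_000, 2048))    # stage with 10B tokens at length 2048  -> 610351
print(steps(100_000_000_000, 12800))  # stage with 100B tokens at length 12800 -> 976562
print(steps(80_000_000_000, 12800))   # stage with 80B tokens at length 12800  -> 781250
```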
proteinmoe_architecture.png ADDED

Git LFS Details

  • SHA256: 670762ddcb58e41cea704d9fddb6acf9bb109216baff6eaadea401699b56552f
  • Pointer size: 131 Bytes
  • Size of remote file: 450 kB