---
license: mit
language:
- nl
---

# hmByT5 - Preliminary Language Models

Preliminary Historic Multilingual and Monolingual ByT5 Models. The following languages are currently covered:

* Dutch (Delpher Corpus)

More details can be found in [our GitHub repository](https://github.com/stefan-it/hmByT5).

# Pretraining

We use the official JAX/FLAX example in Hugging Face Transformers to pretrain a ByT5 model on a single v3-8 TPU.
Details about the training can be found [here](https://github.com/stefan-it/hmByT5/tree/main/hmbyt5-flax).

This model was trained with `mean_noise_span_length=20`.
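
The resulting checkpoint can be loaded like any other ByT5 model in Hugging Face Transformers. A minimal usage sketch (the Hub id below is a placeholder; substitute this repository's actual model id):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Placeholder Hub id -- replace with this repository's actual model id.
model_id = "hmbyt5-preliminary/byt5-small-historic-dutch-span20"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # resolves to the byte-level ByT5 tokenizer
model = T5ForConditionalGeneration.from_pretrained(model_id)

# ByT5 works directly on UTF-8 bytes, so there is no subword vocabulary:
# the sequence length is roughly the number of bytes in the input (+ EOS).
inputs = tokenizer("Amsterdam is de hoofdstad van Nederland.", return_tensors="pt")
print(inputs.input_ids.shape)
```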

# Evaluation on Downstream Tasks (NER)

We evaluated the hmByT5 model on the ICDAR Europeana dataset. The configuration names encode the fine-tuning hyperparameters (`bs` = batch size, `e` = epochs, `lr` = learning rate, `pooling` = subtoken pooling strategy):

| Configuration                            | Run 1 | Run 2 | Run 3 | Run 4 | Run 5 | Avg.         |
|------------------------------------------|-------|-------|-------|-------|-------|--------------|
| `wsFalse-bs4-e10-lr0.00015-poolingfirst` | 86.61 | 85.88 | 87.65 | 87.93 | 88.01 | 87.22 ± 0.83 |
| `wsFalse-bs8-e10-lr0.00015-poolingfirst` | 87.88 | 87.56 | 85.62 | 86.52 | 87.03 | 86.92 ± 0.80 |
| `wsFalse-bs4-e10-lr0.00016-poolingfirst` | 86.17 | 85.87 | 87.77 | 86.58 | 87.96 | 86.87 ± 0.85 |
| `wsFalse-bs8-e10-lr0.00016-poolingfirst` | 87.67 | 86.02 | 85.66 | 87.00 | 85.99 | 86.47 ± 0.75 |

The results show no performance improvement over the [model](https://huggingface.co/hmbyt5/byt5-small-historic-dutch)
trained with `mean_noise_span_length=3`, which achieved 87.90 ± 0.71.
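
For reference, the `Avg.` column is consistent with the mean and population standard deviation (`ddof=0`) of the five runs. A minimal sketch for the first row:

```python
import numpy as np

# Scores of the five runs for `wsFalse-bs4-e10-lr0.00015-poolingfirst`
runs = np.array([86.61, 85.88, 87.65, 87.93, 88.01])

# np.std uses ddof=0 (population standard deviation) by default,
# which matches the reported 87.22 ± 0.83.
print(f"{runs.mean():.2f} ± {runs.std():.2f}")
```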

# Acknowledgements

Research supported with Cloud TPUs from Google's [TPU Research Cloud](https://sites.research.google/trc/about/) (TRC).
Many thanks for providing access to the TPUs ❤️