Update README.md
README.md
CHANGED
```diff
@@ -296,7 +296,7 @@ Click the expand button below to see the full list of language pairs and the dat
 |---------------|--------------------:|------------------|
 | en → ar | 74,471,769 | [WMT 2025 Translation Task Training Data](https://www2.statmt.org/wmt25/mtdata/), [News-Commentary](https://opus.nlpl.eu/News-Commentary/corpus/version/News-Commentary) |
 | en → zh | 66,544,557 | [WMT 2025 Translation Task Training Data](https://www2.statmt.org/wmt25/mtdata/) |
-| cs → de | 50,552,593 | [WMT 2025 Translation Task Training Data](https://www2.statmt.org/wmt25/mtdata/)
+| cs → de | 50,552,593 | [WMT 2025 Translation Task Training Data](https://www2.statmt.org/wmt25/mtdata/) |
 | en → ko | 27,378,827 | [WMT 2025 Translation Task Training Data](https://www2.statmt.org/wmt25/mtdata/) |
 | en → hi | 27,273,478 | [CCMatrix](https://opus.nlpl.eu/CCMatrix), [MultiHPLT](https://opus.nlpl.eu/MultiHPLT), [NLLB](https://opus.nlpl.eu/NLLB), [Samanantar](https://opus.nlpl.eu/Samanantar) |
 | en → cs | 20,000,000 | Sampled from [SalamandraTA-7B-Instruct](https://huggingface.co/BSC-LT/salamandraTA-7b-instruct) pretraining data |
```