Update README.md
README.md CHANGED

@@ -3,6 +3,11 @@ license: mit
 language:
 - ru
 - en
+metrics:
+- bleu
+Average BLEU score (EN->RU): 0.2703
+Average BLEU score (RU->EN): 0.3445
+pipeline_tag: translation
 ---
 This Seq2Seq transformer model is designed to translate text, compose headings, generate questions and summarizations. Its size is only 80 million parameters, which allows you to work with good speed on equipment with low computing power.
 
@@ -49,4 +54,4 @@ rtext=requestor0(text,"Summarize")
 print(rtext+"\n")
 ```
 The prompts on which the model was trained:
-"Translate to english"; "Translate to russian"; "Generate question", "Make a title" "What is the topic of this text?", "Summarize", "Summarize briefly", "Make a step-by-step plan for this text" and russian analogs for this commnads. However, some combinations that were not present in the training sample also work. For example, "Translate briefly" can give a translation about twice as compact as the original text.
+"Translate to english"; "Translate to russian"; "Generate question", "Make a title" "What is the topic of this text?", "Summarize", "Summarize briefly", "Make a step-by-step plan for this text" and russian analogs for this commnads. However, some combinations that were not present in the training sample also work. For example, "Translate briefly" can give a translation about twice as compact as the original text.
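The diff only shows the tail of the README's usage example (`rtext=requestor0(text,"Summarize")` and `print(rtext+"\n")`), so the full helper is not visible here. As a rough illustration of the prompt-driven usage the model card describes, the sketch below builds a `requestor0`-style wrapper on the Hugging Face `transformers` Seq2Seq API. The model identifier is a placeholder, and how the prompt is joined to the input text is an assumption, not the repository's actual code.

```python
# Minimal sketch, not the repository's implementation.
# Assumptions: MODEL_ID is a placeholder, and the prompt is simply
# prepended to the input text with ": " as a separator.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_ID = "your-org/seq2seq-80m"  # hypothetical identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

def requestor0(text: str, prompt: str) -> str:
    # Prepend the task prompt (e.g. "Summarize", "Translate to russian")
    # to the input text and generate with the seq2seq model.
    inputs = tokenizer(prompt + ": " + text, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

text = "The quick brown fox jumps over the lazy dog."
rtext = requestor0(text, "Summarize")
print(rtext + "\n")
```

Any of the prompts listed in the README ("Translate to english", "Make a title", "Summarize briefly", and so on, or their Russian equivalents) would be passed as the second argument in the same way.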