Are you frequently asked google-able Trivia questions and annoyed by it? Well, t
This is by far the largest model trained and should be _more_ credible in its answers or at least able to handle more kinds of questions.

```
what is the temperature of dry ice in kelvin

person beta:
194.65 K
```
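The exchange above implies a simple prompt layout: the question, a blank line, then the `person beta:` turn marker that the model's answer follows. A tiny helper (the function name is ours for illustration, not part of this repo) makes the format explicit:

```python
def make_prompt(question: str) -> str:
    """Format a trivia question in the repo's sample layout:
    question, blank line, then the 'person beta:' answer marker."""
    return f"{question}\n\nperson beta:"

print(make_prompt("what is the temperature of dry ice in kelvin"))
# what is the temperature of dry ice in kelvin
#
# person beta:
```

Feed the returned string to the model as the generation prompt; the text it produces after `person beta:` is its answer.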
## Training
This text generation model is a GPT-2 XL model (~1.5 B parameters). It was first fine-tuned on [Wizard of Wikipedia](https://parl.ai/projects/wizard_of_wikipedia/) for 40k steps (**33** of 36 layers frozen), then fine-tuned for another 40k steps on a parsed variant of [Natural Questions](https://ai.google.com/research/NaturalQuestions) (**34** of 36 layers frozen), which, somewhat by accident, produced this model.
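The layer-freezing recipe described above can be sketched with 🤗 Transformers and PyTorch. This is an illustrative reconstruction, not the actual training code, and it uses a tiny randomly initialized GPT-2 (4 layers) in place of the 36-layer GPT-2 XL so it runs instantly without downloading weights:

```python
from transformers import GPT2Config, GPT2LMHeadModel

# Stand-in for GPT-2 XL: a tiny random GPT-2 so the sketch runs quickly.
# For the real recipe, load the 36-layer checkpoint with
# GPT2LMHeadModel.from_pretrained("gpt2-xl") instead.
config = GPT2Config(n_layer=4, n_head=2, n_embd=64)
model = GPT2LMHeadModel(config)

# Freeze all but the last transformer block before fine-tuning,
# mirroring the "33/36 frozen" (or "34/36" for the second stage) setup.
frozen = config.n_layer - 1
for block in model.transformer.h[:frozen]:
    for p in block.parameters():
        p.requires_grad = False

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable: {trainable}/{total} parameters")
```

Only the unfrozen block (plus the embeddings and LM head, which share weights) receives gradient updates, which keeps fine-tuning a 1.5 B-parameter model tractable.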