Dataset?
What dataset was it trained on? Is it openly available?
Apparently (not entirely sure, but going by the paper) they fine-tuned based on Ada-Instruct; there is a dataset in that repo.
Check this
https://github.com/wangitu/Ada-Instruct/tree/main/data/ada-instruct
We did not use the Ada-instruct dataset
I hand-wrote 21 examples, and also used 500 examples from ROPES augmented with reasoning from GPT-4
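For readers curious what that mix could look like in practice, here is a minimal sketch: a handful of hand-written examples combined with 500 ROPES items, each augmented with a GPT-4 rationale. The prompt format, field handling, and the augmentation prompt are my assumptions for illustration, not the author's actual pipeline.

```python
# Minimal sketch, assuming a simple prompt/completion SFT format.
from datasets import load_dataset
from openai import OpenAI

client = OpenAI()  # requires OPENAI_API_KEY in the environment

handwritten = [
    {"prompt": "...", "completion": "..."},  # ~21 of these, written by hand
]

# 500 examples sampled from the ROPES training split
ropes = load_dataset("ropes", split="train").shuffle(seed=0).select(range(500))

def augment(ex):
    """Ask GPT-4 for a step-by-step rationale that ends in the gold answer."""
    question = f"{ex['background']}\n{ex['situation']}\n{ex['question']}"
    answer = ex["answers"]["text"][0]
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[{
            "role": "user",
            "content": f"{question}\n\nExplain step by step why the answer is "
                       f"'{answer}', then state the answer.",
        }],
    )
    return {"prompt": question, "completion": resp.choices[0].message.content}

train_set = handwritten + [augment(ex) for ex in ropes]
```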
Looking forward to it being shared.
Was the dataset really that small, @euclaise? Kind of surprising considering the size of some SFT datasets these days.
Yes. The original Ada-Instruct even used only 10 examples.
I trained for dozens of epochs, though.
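For context, here is a rough sketch of what "dozens of epochs" over ~500 examples might look like as Hugging Face `TrainingArguments`; the hyperparameter values are illustrative guesses, not the actual training config.

```python
# Illustrative only: with ~520 examples, one epoch is very few optimizer
# steps, so many epochs are needed for a reasonable number of updates.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="tiny-sft",
    num_train_epochs=40,            # "dozens of epochs" over the small set
    per_device_train_batch_size=4,  # ~130 optimizer steps per epoch at 521 examples
    learning_rate=1e-5,
    lr_scheduler_type="cosine",
    save_strategy="epoch",
)
```

At this scale the epoch count is doing the work that dataset size usually does: repeating a few hundred examples many times yields a usable number of gradient updates.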