How to use pszemraj/t5-base-askscience-lfqa with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("pszemraj/t5-base-askscience-lfqa")
model = AutoModelForSeq2SeqLM.from_pretrained("pszemraj/t5-base-askscience-lfqa")
```

This model was fine-tuned on the vblagoje/lfqa dataset for 2 epochs, for a (somewhat) apples-to-apples comparison with t5-base trained on the standard eli5 dataset.

NOTE: the inference API is limited to generating approximately 64 characters for runtime reasons; for longer outputs, try using the model in Python as a transformers pipeline object.
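A minimal sketch of the pipeline usage mentioned above, which sidesteps the inference API's short output limit. The question text and the `max_length` value are illustrative assumptions, not taken from the model card:

```python
from transformers import pipeline

# Load the model as a text2text-generation pipeline so output length
# is controlled locally rather than by the inference API limit.
pipe = pipeline(
    "text2text-generation",
    model="pszemraj/t5-base-askscience-lfqa",
)

# Example query (hypothetical); max_length allows a longer answer
# than the ~64-character inference API cap.
result = pipe("question: Why is the sky blue?", max_length=256)
print(result[0]["generated_text"])
```

The pipeline returns a list of dicts, one per input, each with a `generated_text` key.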
Training data was limited to the askscience subreddit in an attempt to focus on academic/technical queries.

The following hyperparameters were used during training:
Base model: google/t5-v1_1-base