Limitation - context length of the BERT classifier?

#17
by liyucheng - opened

The classifier only takes the first 1500 chars of each document as input, due to the context limit of BERT-series models.

This does not make full use of Llama's capability and can fail to identify lengthy educational content.
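
For illustration, a minimal sketch of the truncation behaviour, assuming a generic `bert-base-uncased` tokenizer as a stand-in for the classifier's actual backbone: anything beyond the 512-token window is simply dropped.

```python
from transformers import AutoTokenizer

# Stand-in backbone for illustration; the actual classifier may use a different
# BERT-style checkpoint, but the ~512-token window is the same family limit.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

doc = "educational content " * 2000  # a long document, far past ~1500 chars
enc = tokenizer(doc, truncation=True, max_length=512)

print(len(enc["input_ids"]))  # 512 -- tokens beyond this window never reach the model
```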

Do we have any long-context BERT-like models on huggingface?
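
One option (a sketch, not the dataset's actual pipeline) would be a long-context encoder such as `allenai/longformer-base-4096`, which accepts up to 4096 tokens and could be fine-tuned with a regression-style head on the same annotations:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Longformer accepts up to 4096 tokens instead of BERT's 512.
# num_labels=1 assumes a regression-style educational-quality score.
model_name = "allenai/longformer-base-4096"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=1)

text = "a long web document ... " * 500
inputs = tokenizer(text, truncation=True, max_length=4096, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)  # untrained head: the value is meaningless until fine-tuned
```

BigBird (`google/bigbird-roberta-base`) or simply chunking the document and averaging chunk scores would be other possibilities.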

HuggingFaceFW org

@loubnabnl many thanks, I will try it and share how it works here.
