visualheist-large / tokenizer_config.json
shixuanleong · copied model files from tf-id-large (commit ed0d679)
{
"model_max_length": 1024
}
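
A minimal usage sketch, not part of this repo: it shows how the transformers library picks up model_max_length from this tokenizer_config.json when loading the tokenizer. The repo id "shixuanleong/visualheist-large" is assumed from the page path, and trust_remote_code is included as a guess in case the copied tf-id-large files rely on custom processor code.

# Assumed repo id; adjust if the tokenizer lives elsewhere.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "shixuanleong/visualheist-large",
    trust_remote_code=True,  # may be unnecessary; hedged assumption
)

# model_max_length is read from tokenizer_config.json.
print(tokenizer.model_max_length)  # expected: 1024

# Passing truncation=True caps encoded sequences at model_max_length.
encoded = tokenizer("some long document text", truncation=True)
print(len(encoded["input_ids"]))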