deep-metal / tokenizer_config.json
Latest commit: "fix: fix tokenizer files" (2c4cde7)
{"model_max_length": 1024, "special_tokens_map_file": "./special_tokens_map.json", "full_tokenizer_file": null}