turkish-deepseek / tokenizer_config.json
{
  "tokenizer_class": "LlamaTokenizer",
  "model_max_length": 256,
  "pad_token": "<pad>",
  "bos_token": "<s>",
  "eos_token": "</s>",
  "unk_token": "<unk>",
  "clean_up_tokenization_spaces": false,
  "auto_map": {
    "AutoTokenizer": [
      "sentencepiece",
      "LlamaTokenizer"
    ]
  }
}
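As a minimal offline sketch, the fields in this config can be inspected with the standard library alone; in practice the file is consumed automatically by `transformers.AutoTokenizer.from_pretrained(...)` with the repo id (here assumed to be `alibayram/turkish-deepseek`, taken from the page header), which requires downloading the model files.

```python
import json

# The tokenizer_config.json shown above, embedded verbatim for an offline check.
CONFIG_JSON = """
{
  "tokenizer_class": "LlamaTokenizer",
  "model_max_length": 256,
  "pad_token": "<pad>",
  "bos_token": "<s>",
  "eos_token": "</s>",
  "unk_token": "<unk>",
  "clean_up_tokenization_spaces": false,
  "auto_map": {
    "AutoTokenizer": [
      "sentencepiece",
      "LlamaTokenizer"
    ]
  }
}
"""

cfg = json.loads(CONFIG_JSON)

# Collect the special tokens the config declares for the LlamaTokenizer.
special_tokens = {
    key: cfg[key]
    for key in ("pad_token", "bos_token", "eos_token", "unk_token")
}

print(cfg["tokenizer_class"])   # LlamaTokenizer
print(cfg["model_max_length"])  # 256
print(special_tokens)
```

Note that `model_max_length` of 256 is unusually short for a causal LM tokenizer; inputs longer than this will be truncated when `truncation=True` is passed at encode time.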