comet-v2-gpt2-small-japanese / special_tokens_map.json
Eiki · add tokenizer · ec23d3f
136 Bytes
{"eos_token": "</s>", "unk_token": "<unk>", "pad_token": "</s>", "additional_special_tokens": ["xNeed", "xEffect", "xIntent", "xReact"]}
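For reference, this mapping is what a tokenizer loader reads to set its special-token attributes. A minimal sketch of parsing the file's contents with Python's standard `json` module (the token semantics noted in comments are an assumption based on the COMET/ATOMIC relation names in the repo title, not something stated in this file):

```python
import json

# The contents of special_tokens_map.json, exactly as committed above.
raw = (
    '{"eos_token": "</s>", "unk_token": "<unk>", "pad_token": "</s>", '
    '"additional_special_tokens": ["xNeed", "xEffect", "xIntent", "xReact"]}'
)

special_tokens = json.loads(raw)

# The pad token reuses the eos token, a common choice for GPT-2-style
# models that have no dedicated padding token.
assert special_tokens["pad_token"] == special_tokens["eos_token"] == "</s>"

# The additional special tokens look like COMET relation markers
# (presumably: xNeed = prerequisite, xEffect = effect on the subject,
# xIntent = intent, xReact = the subject's reaction).
for relation in special_tokens["additional_special_tokens"]:
    print(relation)
```

Loading the tokenizer through a library such as `transformers` would surface the same values as attributes (e.g. `tokenizer.eos_token`), since this file is part of the standard tokenizer serialization format.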