inappropriate tokenizer #2
by TinyLamda - opened
@TinyLamda
Hi, thanks for raising the issue. We have added the file special_tokens_map.json. If there are any other tokenizer-related issues, could you elaborate on them?
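To double-check on your side, you can reload the tokenizer and inspect its special-token mapping. A minimal sketch, assuming `transformers` is installed (the repo id below is a placeholder for this model's actual id):

```python
from transformers import AutoTokenizer

# Placeholder repo id; substitute the actual model repository.
tok = AutoTokenizer.from_pretrained("org/model")

# With special_tokens_map.json in place, the special tokens
# (e.g. bos/eos/unk/pad) should be populated here.
print(tok.special_tokens_map)
```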