thu-coai / roberta-zh-specific
Conversational AI (CoAI) group from Tsinghua University
Tags: Text Classification · Transformers · PyTorch · Safetensors · Chinese · bert · Conversational · Inference Endpoints
roberta-zh-specific / special_tokens_map.json (branch: main)
Commit 1c9b36b by everk, "Upload tokenizer", about 2 years ago
125 Bytes
{
  "cls_token": "[CLS]",
  "mask_token": "[MASK]",
  "pad_token": "[PAD]",
  "sep_token": "[SEP]",
  "unk_token": "[UNK]"
}
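
For context, this file maps the tokenizer's special-token roles to the literal BERT-style token strings above. A minimal sketch of how the map is consumed, assuming the thu-coai/roberta-zh-specific repo is publicly reachable and loadable with the transformers library's AutoTokenizer (not shown on this page):

# Minimal sketch: load the tokenizer and inspect the special tokens
# defined in special_tokens_map.json. Assumes the repo id below is
# accessible from the Hugging Face Hub.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("thu-coai/roberta-zh-specific")

# Each entry in the JSON is exposed as an attribute on the tokenizer.
print(tokenizer.cls_token)          # "[CLS]"
print(tokenizer.sep_token)          # "[SEP]"
print(tokenizer.special_tokens_map) # mirrors the JSON shown above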