
Labelling Scheme

#12
by swtb - opened

I notice that this model always gives "I-PER", even though the original conll2003 dataset uses both "B-PER" and "I-PER".

Why is this the case for this model?
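For reference, a minimal sketch of how the behaviour can be reproduced with the Transformers pipeline; the model id below is a placeholder for whichever checkpoint this discussion refers to, and the example sentence is arbitrary:

```python
from transformers import pipeline

# Placeholder model id; substitute the checkpoint under discussion.
ner = pipeline("token-classification", model="xlm-roberta-base-finetuned-ner")

for pred in ner("My name is Wolfgang and I live in Berlin"):
    # Every token of the person span is reported as I-PER, with no B-PER
    # marking the first token of the entity.
    print(pred["word"], pred["entity"])
```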
