Japanese transformer pipeline (bert-base). Components: transformer, parser, ner.
Feature | Description
---|---
Name | ja_gsd_bert_wwm_unidic_lite
Version | 3.1.1
spaCy | >=3.1.0,<3.2.0
Default Pipeline | transformer, parser, ner
Components | transformer, parser, ner
Vectors | 0 keys, 0 unique vectors (0 dimensions)
Sources | UD_Japanese-GSD, UD_Japanese-GSD r2.8+NE, SudachiDict_core, cl-tohoku/bert-base-japanese-whole-word-masking, unidic_lite
License | CC BY-SA 4.0
Author | Megagon Labs Tokyo
Label Scheme
45 labels for 2 components.

Component | Labels
---|---
parser | ROOT, acl, advcl, advmod, amod, aux, case, cc, ccomp, compound, cop, csubj, dep, det, dislocated, fixed, mark, nmod, nsubj, nummod, obj, obl, punct
ner | CARDINAL, DATE, EVENT, FAC, GPE, LANGUAGE, LAW, LOC, MONEY, MOVEMENT, NORP, ORDINAL, ORG, PERCENT, PERSON, PET_NAME, PHONE, PRODUCT, QUANTITY, TIME, TITLE_AFFIX, WORK_OF_ART
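The 22-label NER inventory above can be written out as a Python set and used to validate or filter entity output; a sketch (the sample entity tuples are illustrative, not model output):

```python
# NER label inventory copied from the label scheme table (22 labels).
NER_LABELS = {
    "CARDINAL", "DATE", "EVENT", "FAC", "GPE", "LANGUAGE", "LAW", "LOC",
    "MONEY", "MOVEMENT", "NORP", "ORDINAL", "ORG", "PERCENT", "PERSON",
    "PET_NAME", "PHONE", "PRODUCT", "QUANTITY", "TIME", "TITLE_AFFIX",
    "WORK_OF_ART",
}

# Hypothetical (text, label) pairs of the shape the ner component emits.
entities = [("東京", "GPE"), ("田中", "PERSON"), ("三つ", "CARDINAL")]

# Keep only the labels of interest, validating them against the inventory.
target = {"PERSON", "GPE"}
assert target <= NER_LABELS
filtered = [e for e in entities if e[1] in target]
print(filtered)  # [('東京', 'GPE'), ('田中', 'PERSON')]
```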
Accuracy
Type | Score
---|---
DEP_UAS | 93.68
DEP_LAS | 92.61
SENTS_P | 92.02
SENTS_R | 95.46
SENTS_F | 93.71
ENTS_F | 84.04
ENTS_P | 84.96
ENTS_R | 83.14
TAG_ACC | 0.00
TRANSFORMER_LOSS | 28861.67
PARSER_LOSS | 1306248.63
NER_LOSS | 13993.36
Evaluation results (self-reported)
- NER Precision: 0.850
- NER Recall: 0.831
- NER F Score: 0.840
- POS Accuracy: 0.000
- SENTER Precision: 0.920
- SENTER Recall: 0.955
- SENTER F Score: 0.937
- Unlabeled Dependencies Accuracy: 0.937
- Labeled Dependencies Accuracy: 0.937