---
tags:
- bert
- adapter-transformers
- adapterhub:wikiann/zh
- token-classification
license: "apache-2.0"
---
# Adapter `bert-base-multilingual-cased_wikiann_ner_zh_pfeiffer` for bert-base-multilingual-cased
This is a task adapter stacked on top of a language adapter, in the MAD-X 2.0 style: the language adapters in the last transformer layer (layer 11) are dropped.
**This adapter was created for usage with the [Adapters](https://github.com/Adapter-Hub/adapters) library.**
## Usage
First, install `adapters`:
```bash
pip install -U adapters
```
Now, the adapter can be loaded and activated like this:
```python
from adapters import AutoAdapterModel

# Load the base model, then download the adapter from the Hub and activate it
model = AutoAdapterModel.from_pretrained("bert-base-multilingual-cased")
adapter_name = model.load_adapter("AdapterHub/bert-base-multilingual-cased_wikiann_ner_zh_pfeiffer")
model.set_active_adapters(adapter_name)
```
## Architecture & Training
- Adapter architecture: pfeiffer
- Prediction head: tagging
- Dataset: [Chinese](https://adapterhub.ml/explore/wikiann/zh/)
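The `pfeiffer` architecture is a bottleneck adapter inserted once per transformer layer, after the feed-forward sublayer: a down-projection, a nonlinearity, an up-projection, and a residual connection. A minimal standalone sketch (the hidden size of 768 matches mBERT; the reduction factor of 16 is a common default, not read from this adapter's config):

```python
import torch
import torch.nn as nn


class PfeifferAdapter(nn.Module):
    """Bottleneck adapter block: down-project, nonlinearity,
    up-project, residual. In the Pfeiffer configuration one such
    block sits after the feed-forward sublayer of each layer."""

    def __init__(self, hidden_size: int = 768, reduction_factor: int = 16):
        super().__init__()
        bottleneck = hidden_size // reduction_factor  # 768 // 16 = 48
        self.down = nn.Linear(hidden_size, bottleneck)
        self.act = nn.ReLU()
        self.up = nn.Linear(bottleneck, hidden_size)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection keeps the original representation intact
        return hidden_states + self.up(self.act(self.down(hidden_states)))


# The adapter preserves the hidden shape: (batch, seq_len, hidden_size)
x = torch.randn(2, 5, 768)
out = PfeifferAdapter()(x)
print(out.shape)  # torch.Size([2, 5, 768])
```

Only these small bottleneck modules (plus the prediction head) are trained; the base model's weights stay frozen, which is what keeps adapter files small.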
## Author Information
- Author name(s): Jonas Pfeiffer
- Author email: [email protected]
- Author links: [Website](https://pfeiffer.ai), [GitHub](https://github.com/JoPfeiff), [Twitter](https://twitter.com/PfeiffJo)
## Versions
- `1` **(main)**
- `2`
- `3`
- `4`
- `5`
## Citation
```bibtex
@article{Pfeiffer21UNKs,
  author  = {Jonas Pfeiffer and
             Ivan Vuli\'{c} and
             Iryna Gurevych and
             Sebastian Ruder},
  title   = {{UNKs Everywhere: Adapting Multilingual Language Models to New Scripts}},
  journal = {arXiv preprint},
  year    = {2021},
  url     = {https://arxiv.org/abs/2012.15562}
}
```
*This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/bert-base-multilingual-cased_wikiann_ner_zh_pfeiffer.yaml*.