This is an 8-bit quantized version of distilcamembert-base-ner, obtained with Intel® Neural Compressor on the wikiner_fr dataset.
## Get Started
First, install the required libraries:

```bash
pip install --upgrade-strategy eager "optimum[neural-compressor]"
```
Second, use `INCModelForTokenClassification` from `optimum.intel`. It can be used in the same way as an ordinary `DistilBertForTokenClassification`:
```python
from transformers import AutoTokenizer
from optimum.intel import INCModelForTokenClassification

# Load the quantized model and its tokenizer from the Hub
model = INCModelForTokenClassification.from_pretrained('konverner/8bit-distilcamembert-base-ner')
tokenizer = AutoTokenizer.from_pretrained('konverner/8bit-distilcamembert-base-ner')

text = "Meta Platforms ou Meta, anciennement connue sous le nom de Facebook, est une multinationale américaine fondée en 2004 par Mark Zuckerberg."
model_input = tokenizer(text, return_tensors='pt')
model_output = model(**model_input)

# Predicted class id for each token
print(model_output.logits.argmax(2))
# tensor([[0, 4, 4, 4, 4, 4, 0, 4, 4, 0, 0, 0, 0, 0, 0, 0, 0, 4, 0, 0, 0, 0, 0, 0,
#          0, 0, 0, 2, 2, 2, 2, 2, 0, 0]])
```
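The raw output above is a tensor of class ids, one per token. A minimal sketch of turning those ids into readable entity labels — note that the `id2label` mapping below is an assumption for illustration; in practice, read the actual mapping from `model.config.id2label`:

```python
# NOTE: this id2label mapping is an assumption; the real one should be
# taken from model.config.id2label for this checkpoint.
id2label = {0: "O", 1: "LOC", 2: "PER", 3: "MISC", 4: "ORG"}

# A shortened run of ids, e.g. model_output.logits.argmax(2)[0].tolist()
predicted_ids = [0, 4, 4, 4, 0, 2, 2, 2, 0]

labels = [id2label[i] for i in predicted_ids]
print(labels)
# ['O', 'ORG', 'ORG', 'ORG', 'O', 'PER', 'PER', 'PER', 'O']
```

Each label then needs to be aligned with its token (via `tokenizer.convert_ids_to_tokens`) to recover the entity spans.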