This model is an 8-bit quantized version of BAAI/bge-large-en-v1.5, converted to the OpenVINO format. It was obtained via the nncf-quantization Space using optimum-intel.
First make sure you have optimum-intel installed:

```
pip install optimum[openvino]
```
To load the model, you can do the following:

```python
from optimum.intel import OVModelForFeatureExtraction

model_id = "nskeatts/bge-large-en-v1.5-openvino-8bit"
model = OVModelForFeatureExtraction.from_pretrained(model_id)
Model tree for nskeatts/bge-large-en-v1.5-openvino-8bit
Base model: BAAI/bge-large-en-v1.5

Evaluation results
- accuracy on MTEB AmazonCounterfactualClassification (en) test set: 75.851 (self-reported)
- ap on MTEB AmazonCounterfactualClassification (en) test set: 38.566 (self-reported)
- f1 on MTEB AmazonCounterfactualClassification (en) test set: 69.694 (self-reported)
- accuracy on MTEB AmazonPolarityClassification test set: 92.417 (self-reported)
- ap on MTEB AmazonPolarityClassification test set: 89.193 (self-reported)
- f1 on MTEB AmazonPolarityClassification test set: 92.395 (self-reported)
- accuracy on MTEB AmazonReviewsClassification (en) test set: 48.176 (self-reported)
- f1 on MTEB AmazonReviewsClassification (en) test set: 47.807 (self-reported)
- map_at_1 on MTEB ArguAna test set: 40.185 (self-reported)
- map_at_10 on MTEB ArguAna test set: 55.654 (self-reported)