
Fixed parameters:

  • model_name_or_path: Bhumika/roberta-base-finetuned-sst2
  • dataset:
    • path: glue
    • name: sst2
    • calibration_split: None
    • eval_split: validation
    • data_keys: ['sentence']
    • label_keys: ['label']
  • quantization_approach: dynamic
  • node_exclusion: []
  • per_channel: False
  • calibration: None
  • framework: onnxruntime
  • framework_args:
    • opset: 15
    • optimization_level: 1
  • aware_training: False
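
For reference, the fixed parameters above correspond roughly to the following export step. This is a minimal sketch assuming the `optimum` ONNX exporter API; the function signature and task name can vary between `optimum` versions, and the output directory name is hypothetical.

```python
# Sketch only: export the SST-2 checkpoint to ONNX at opset 15 (framework_args).
# `main_export` and its keyword arguments are assumed from recent optimum releases.
from optimum.exporters.onnx import main_export

main_export(
    model_name_or_path="Bhumika/roberta-base-finetuned-sst2",
    output="roberta-sst2-onnx",        # hypothetical output directory
    task="text-classification",
    opset=15,                          # framework_args: opset 15
)
```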

Benchmarked parameters:

  • operators_to_quantize: ['Add', 'MatMul'], ['Add']
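
The quantization step itself can be sketched with ONNX Runtime's dynamic quantization helper, applied once per benchmarked operator set. This is illustrative rather than the exact harness used to produce this card: file paths are hypothetical, and the numbers below come from the `optimum` benchmark tooling.

```python
# Sketch only: dynamic quantization with per_channel=False and no excluded nodes,
# run once for each benchmarked operators_to_quantize value.
from onnxruntime.quantization import QuantType, quantize_dynamic

for ops in (["Add", "MatMul"], ["Add"]):
    quantize_dynamic(
        model_input="roberta-sst2-onnx/model.onnx",   # hypothetical path (see export sketch above)
        model_output=f"roberta-sst2-onnx/model-quant-{'-'.join(ops)}.onnx",
        op_types_to_quantize=ops,    # benchmarked parameter
        per_channel=False,           # fixed parameter
        nodes_to_exclude=[],         # node_exclusion: []
        weight_type=QuantType.QInt8,
    )
```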

Evaluation

Below, time and accuracy metrics for:

  • Batch size: 8
  • Input length: 128

| operators_to_quantize | latency_mean (original, ms) | latency_mean (optimized, ms) | throughput (original, /s) | throughput (optimized, /s) | accuracy (original) | accuracy (optimized) |
|---|---|---|---|---|---|---|
| ['Add'] | 454.70 | 361.81 | 2.50 | 3.00 | 1.0 | 1.0 |
| ['Add', 'MatMul'] | 474.54 | 135.14 | 2.50 | 7.50 | 1.0 | 1.0 |
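
The latency and throughput figures above come from the card's own benchmark harness. An illustrative way to measure comparable numbers at batch size 8 and input length 128, with ONNX Runtime graph optimization level 1, is sketched below; the model path and run count are assumptions, and the harness's throughput measurement will not exactly equal 1000 / latency_mean.

```python
# Illustrative timing loop, not the original benchmark harness.
import time

import numpy as np
import onnxruntime as ort
from transformers import AutoTokenizer

opts = ort.SessionOptions()
opts.graph_optimization_level = ort.GraphOptimizationLevel.ORT_ENABLE_BASIC  # optimization_level: 1
session = ort.InferenceSession("roberta-sst2-onnx/model-quant-Add-MatMul.onnx", opts)  # hypothetical path

tokenizer = AutoTokenizer.from_pretrained("Bhumika/roberta-base-finetuned-sst2")
batch = tokenizer(
    ["a placeholder sentence"] * 8,   # batch size: 8
    padding="max_length",
    max_length=128,                   # input length: 128
    truncation=True,
    return_tensors="np",
)
feed = {
    "input_ids": batch["input_ids"].astype(np.int64),
    "attention_mask": batch["attention_mask"].astype(np.int64),
}

latencies_ms = []
for _ in range(50):                   # arbitrary number of timed runs
    start = time.perf_counter()
    session.run(None, feed)
    latencies_ms.append((time.perf_counter() - start) * 1000)

latency_mean = float(np.mean(latencies_ms))
print(f"latency_mean: {latency_mean:.2f} ms, throughput: {1000 / latency_mean:.2f} /s")
```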
