---
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:4893
  - loss:TripletLoss
base_model: microsoft/mpnet-base
widget:
  - source_sentence: >-
      I almost envy you your assignment. I see in your mind that you are tempted
      to take my place. Not correct, Doctor, although I am aware of your mind
      attempting to contact mine. Were you born a telepath? Yes. That is why I
      had to study on Vulcan. I understand. May I show you to your
      quarters?[SEP]I think I'll stay here a bit. Ambassador Kollos often finds
      the process of transport somewhat unsettling.
    sentences:
      - ' I don''t see anything, do you?'
      - >-
        I understand. Our ship's surgeon often makes the same complaint. Do call
        when you are ready.
      - Irrelevant, since we are here.
  - source_sentence: >-
      Aye, sir. Aye, sir. Return to your station, Sub-commander. The boarding
      action on the Enterprise will begin with my command. If they resist,
      destroy her. Execution of state criminals is both painful and unpleasant.
      I believe the details are unnecessary. The sentence will be carried out
      immediately after the charges have been recorded. I demand the Right of
      Statement first.[SEP]You understand Romulan tradition well. The right is
      granted.
    sentences:
      - ' No, I... it''s just... you''re just coming off the surgery and you''re not yourself yet and I work for you and Even though last year''s... [Frustrated sigh as House starts smiling smugly.] you''re smiling! I''m saying no and you''re smiling!'
      - I wish to ask a question. What of Sarek's family, his wife and son?
      - >-
        Thank you. I shall not require much time. No more than twenty minutes, I
        should say.
  - source_sentence: >-
      Mankind, ready to kill. That's the way it was in 1881. I wonder how
      humanity managed to survive. We overcame our instinct for violence. Some
      desk-bound Starfleet bureaucrat cut these cloak-and-dagger
      orders.[SEP]Aye, but why the secrecy? This star system's under Federation
      control.
    sentences:
      - >-
        It's in a border area, Mister Scott. The Klingons also claim
        jurisdiction.
      - ' That''s your argument? Better outcome?'
      - Are you all right, Captain?
  - source_sentence: >-
      We're trying to help you, Oxmyx. Nobody helps nobody but himself. Sir, you
      are employing a double negative. Huh? I fail to see why you do not
      understand us. You yourself have stated the need for unity of authority on
      this planet. We agree.[SEP]Yeah, but I got to be the unity.
    sentences:
      - Co-operation, sir, would inevitably result
      - >-
        Quite right, Mister Scott. There's somebody holding us down. All systems
        are go, but we're not moving.
      - ' So today I''m jailbait but in 22 weeks anybody can do anything to me. Will I be so different in 22 weeks?'
  - source_sentence: >-
      What happened? Where have I been? Right here, it seems. But that girl. She
      was so beautiful. So real. Do you remember anything else? No.[SEP]Good.
      Perhaps that explains why he's here. Nothing was real to him except the
      girl.
    sentences:
      - >-
        Sweeping the area of Outpost two. Sensor reading indefinite.
        Double-checking Outpost three. I read dust and debris. Both Earth
        outposts gone, and the asteroids they were constructed on, pulverised.
      - ' It''s killing you.'
      - Captain, the Melkotian object.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
  - cosine_accuracy
model-index:
  - name: SentenceTransformer based on microsoft/mpnet-base
    results:
      - task:
          type: triplet
          name: Triplet
        dataset:
          name: evaluator enc
          type: evaluator_enc
        metrics:
          - type: cosine_accuracy
            value: 0.9989781379699707
            name: Cosine Accuracy
      - task:
          type: triplet
          name: Triplet
        dataset:
          name: evaluator val
          type: evaluator_val
        metrics:
          - type: cosine_accuracy
            value: 0.9930555820465088
            name: Cosine Accuracy
---

SentenceTransformer based on microsoft/mpnet-base

This is a sentence-transformers model finetuned from microsoft/mpnet-base. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: microsoft/mpnet-base
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: https://sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: MPNetModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
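
The Pooling module performs masked mean pooling over the token embeddings (pooling_mode_mean_tokens). For reference, here is a sketch of the equivalent computation with the transformers library directly; it assumes, as is standard for sentence-transformers checkpoints, that the transformer weights and tokenizer sit at the root of the Hub repo:

from transformers import AutoTokenizer, AutoModel
import torch

tokenizer = AutoTokenizer.from_pretrained("greatakela/gennlp_hw1_encoder2025")
mpnet = AutoModel.from_pretrained("greatakela/gennlp_hw1_encoder2025")

sentences = ["Captain, the Melkotian object."]
# Truncate to the model's maximum sequence length of 256 tokens
encoded = tokenizer(sentences, padding=True, truncation=True, max_length=256, return_tensors="pt")

with torch.no_grad():
    token_embeddings = mpnet(**encoded).last_hidden_state  # (batch, seq_len, 768)

# Mean over real tokens only: zero out padding, sum, divide by token counts
mask = encoded["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
print(embeddings.shape)  # torch.Size([1, 768])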

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("greatakela/gennlp_hw1_encoder2025")
# Run inference
sentences = [
    "What happened? Where have I been? Right here, it seems. But that girl. She was so beautiful. So real. Do you remember anything else? No.[SEP]Good. Perhaps that explains why he's here. Nothing was real to him except the girl.",
    'Captain, the Melkotian object.',
    " It's killing you.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
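
The same embeddings support simple semantic search. A small sketch, with a made-up query and corpus for illustration:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("greatakela/gennlp_hw1_encoder2025")

# Hypothetical query and corpus
query = "Do you remember anything else?"
corpus = [
    "Captain, the Melkotian object.",
    "Nothing was real to him except the girl.",
]

# Rank corpus entries by cosine similarity to the query
scores = model.similarity(model.encode([query]), model.encode(corpus))  # shape (1, 2)
best = scores.argmax().item()
print(corpus[best], scores[0, best].item())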

Evaluation

Metrics

Triplet

Metric            evaluator_enc   evaluator_val
cosine_accuracy   0.999           0.9931
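
Cosine accuracy is the fraction of evaluation triplets for which the anchor embedding is closer to the positive than to the negative under cosine similarity. The evaluation splits themselves are not published with this card, but the same metric can be computed on your own triplets with TripletEvaluator; a sketch with placeholder data:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer("greatakela/gennlp_hw1_encoder2025")

# Placeholder triplets; substitute your own anchor/positive/negative lists
evaluator = TripletEvaluator(
    anchors=["What happened? Where have I been?"],
    positives=["Captain, the Melkotian object."],
    negatives=[" It's killing you."],
    name="evaluator_val",
)
print(evaluator(model))  # e.g. {'evaluator_val_cosine_accuracy': ...}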

Training Details

Training Dataset

Unnamed Dataset

  • Size: 4,893 training samples
  • Columns: sentence_0, sentence_1, and sentence_2
  • Approximate statistics based on the first 1000 samples:

             sentence_0          sentence_1          sentence_2
    type     string              string              string
    details  min: 2 tokens       min: 3 tokens       min: 4 tokens
             mean: 90.47 tokens  mean: 18.64 tokens  mean: 20.14 tokens
             max: 256 tokens     max: 98 tokens      max: 199 tokens
  • Samples:
    sentence_0 sentence_1 sentence_2
    Oh, well, if that's all. Mister Scott, transport the glommer over to the Klingon ship. Aye, sir. You can't do this to me. Under space salvage laws, he's mine. A planetary surface is not covered by space salvage laws. But if you want the little beastie that bad, Mister Jones, we'll transport you over with it. I withdraw my claim.[SEP]Well, at least we can report the stasis field is not as effective a weapon as we thought. The power drain is too high and takes too long for the Klingon ship to recover to make it practical. Agreed, Captain. Tribbles appear to be a much more effective weapon. [protesting] I give him...
    Do you mean that's what the Kelvans really are? Undoubtedly. Well, if they look that way normally, why did they adapt themselves to our bodies? Perhaps practicality. They chose the Enterprise as the best vessel for the trip. Immense beings with a hundred tentacles would have difficulty with the turbolift. We've got to stop them. We outnumber them. Their only hold on us is the paralysis field. Well, that's enough. One wrong move, and they jam all our neural circuits.[SEP]Jam. Spock, if you reverse the circuits on McCoy's neuro-analyser, can you set up a counter field to jam the paralysis projector? I'm dubious of the possibilities of success, Captain. The medical equipment is not designed to put out a great deal of power. The polarized elements would burn out quickly. The next step would be a type of brain surgery.
    Well, speculation isn't much help. We have to get in there. Perhaps there is a way open on the far side. There is much less activity there. That building in the centre. It seems to be important. You stand before the Ruling Tribunal of the Aquans. I am Domar, the High Tribune. I'm Captain Kirk of the starship Enterprise. This is my first officer, Mister Spock.[SEP]You are air-breather enemies from the surface. We have expected spies for a long time. We came here in peace, Tribune. Which is why we need to look at the nerve that you didn't biopsy.
  • Loss: TripletLoss with these parameters:
    {
        "distance_metric": "TripletDistanceMetric.EUCLIDEAN",
        "triplet_margin": 5
    }
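
A minimal sketch of constructing this loss in sentence-transformers with the parameters above:

from sentence_transformers import SentenceTransformer, losses
from sentence_transformers.losses import TripletDistanceMetric

model = SentenceTransformer("microsoft/mpnet-base")

# Euclidean distance with a margin of 5, matching the parameters above
loss = losses.TripletLoss(
    model=model,
    distance_metric=TripletDistanceMetric.EUCLIDEAN,
    triplet_margin=5,
)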
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • multi_dataset_batch_sampler: round_robin
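
Together with the loss and dataset format described above, these settings are enough to sketch how the run could be reproduced. A hypothetical, minimal reconstruction (the dataset contents and output path are placeholders; the original 4,893-triplet training set is not published):

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)
from sentence_transformers.losses import TripletDistanceMetric
from sentence_transformers.training_args import MultiDatasetBatchSamplers

model = SentenceTransformer("microsoft/mpnet-base")

# Placeholder triplets in the (anchor, positive, negative) column layout above
train_dataset = Dataset.from_dict({
    "sentence_0": ["anchor dialog context"],
    "sentence_1": ["positive next line"],
    "sentence_2": ["negative next line"],
})

loss = losses.TripletLoss(model, distance_metric=TripletDistanceMetric.EUCLIDEAN, triplet_margin=5)

args = SentenceTransformerTrainingArguments(
    output_dir="mpnet-base-triplet",  # hypothetical output path
    num_train_epochs=3,
    per_device_train_batch_size=8,
    eval_strategy="steps",
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=train_dataset,  # reuse the placeholder so eval_strategy="steps" has data
    loss=loss,
)
trainer.train()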

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin

Training Logs

Epoch   Step   Training Loss   evaluator_enc_cosine_accuracy   evaluator_val_cosine_accuracy
-1      -1     -               0.5494                          -
0.4902  300    -               0.9808                          -
0.8170  500    1.4249          -                               -
0.9804  600    -               0.9912                          -
1.0     612    -               0.9931                          -
1.4706  900    -               0.9963                          -
1.6340  1000   0.2269          -                               -
1.9608  1200   -               0.9990                          -
2.0     1224   -               0.9990                          -
2.4510  1500   0.1054          0.9990                          -
2.9412  1800   -               0.9990                          -
3.0     1836   -               0.9990                          -
-1      -1     -               -                               0.9931

Framework Versions

  • Python: 3.11.11
  • Sentence Transformers: 3.4.1
  • Transformers: 4.48.2
  • PyTorch: 2.5.1+cu124
  • Accelerate: 1.3.0
  • Datasets: 3.2.0
  • Tokenizers: 0.21.0

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

TripletLoss

@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}