---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:4893
- loss:TripletLoss
base_model: microsoft/mpnet-base
widget:
- source_sentence: >-
I almost envy you your assignment. I see in your mind that you are tempted
to take my place. Not correct, Doctor, although I am aware of your mind
attempting to contact mine. Were you born a telepath? Yes. That is why I
had to study on Vulcan. I understand. May I show you to your
quarters?[SEP]I think I'll stay here a bit. Ambassador Kollos often finds
the process of transport somewhat unsettling.
sentences:
- ' I don''t see anything, do you?'
- >-
I understand. Our ship's surgeon often makes the same complaint. Do call
when you are ready.
- Irrelevant, since we are here.
- source_sentence: >-
Aye, sir. Aye, sir. Return to your station, Sub-commander. The boarding
action on the Enterprise will begin with my command. If they resist,
destroy her. Execution of state criminals is both painful and unpleasant.
I believe the details are unnecessary. The sentence will be carried out
immediately after the charges have been recorded. I demand the Right of
Statement first.[SEP]You understand Romulan tradition well. The right is
granted.
sentences:
- ' No, I... it''s just... you''re just coming off the surgery and you''re not yourself yet and I work for you and Even though last year''s... [Frustrated sigh as House starts smiling smugly.] you''re smiling! I''m saying no and you''re smiling!'
- I wish to ask a question. What of Sarek's family, his wife and son?
- >-
Thank you. I shall not require much time. No more than twenty minutes, I
should say.
- source_sentence: >-
Mankind, ready to kill. That's the way it was in 1881. I wonder how
humanity managed to survive. We overcame our instinct for violence. Some
desk-bound Starfleet bureaucrat cut these cloak-and-dagger
orders.[SEP]Aye, but why the secrecy? This star system's under Federation
control.
sentences:
- >-
It's in a border area, Mister Scott. The Klingons also claim
jurisdiction.
- ' That''s your argument? Better outcome?'
- Are you all right, Captain?
- source_sentence: >-
We're trying to help you, Oxmyx. Nobody helps nobody but himself. Sir, you
are employing a double negative. Huh? I fail to see why you do not
understand us. You yourself have stated the need for unity of authority on
this planet. We agree.[SEP]Yeah, but I got to be the unity.
sentences:
- Co-operation, sir, would inevitably result
- >-
Quite right, Mister Scott. There's somebody holding us down. All systems
are go, but we're not moving.
- ' So today I''m jailbait but in 22 weeks anybody can do anything to me. Will I be so different in 22 weeks?'
- source_sentence: >-
What happened? Where have I been? Right here, it seems. But that girl. She
was so beautiful. So real. Do you remember anything else? No.[SEP]Good.
Perhaps that explains why he's here. Nothing was real to him except the
girl.
sentences:
- >-
Sweeping the area of Outpost two. Sensor reading indefinite.
Double-checking Outpost three. I read dust and debris. Both Earth
outposts gone, and the asteroids they were constructed on, pulverised.
- ' It''s killing you.'
- Captain, the Melkotian object.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy
model-index:
- name: SentenceTransformer based on microsoft/mpnet-base
results:
- task:
type: triplet
name: Triplet
dataset:
name: evaluator enc
type: evaluator_enc
metrics:
- type: cosine_accuracy
value: 0.9989781379699707
name: Cosine Accuracy
- task:
type: triplet
name: Triplet
dataset:
name: evaluator val
type: evaluator_val
metrics:
- type: cosine_accuracy
value: 0.9930555820465088
name: Cosine Accuracy
---
SentenceTransformer based on microsoft/mpnet-base
This is a sentence-transformers model finetuned from microsoft/mpnet-base. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: microsoft/mpnet-base
- Maximum Sequence Length: 256 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: [Sentence Transformers Documentation](https://www.sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
Full Model Architecture
```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False}) with Transformer model: MPNetModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
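As an illustration of what this stack computes (an MPNet encoder followed by mean pooling over token embeddings), here is a minimal sketch using `transformers` directly. The repo id and 256-token limit come from this card; the hand-written pooling helper is an assumption that mirrors the Pooling module above, and the canonical loading path is the Usage section below.

```python
import torch
from transformers import AutoModel, AutoTokenizer

repo = "greatakela/gennlp_hw1_encoder2025"
tokenizer = AutoTokenizer.from_pretrained(repo)
encoder = AutoModel.from_pretrained(repo)  # the underlying MPNetModel

def mean_pool(last_hidden_state: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # Average token embeddings, ignoring padding positions (what the Pooling module does).
    mask = attention_mask.unsqueeze(-1).float()
    return (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

batch = tokenizer(["Captain, the Melkotian object."], padding=True,
                  truncation=True, max_length=256, return_tensors="pt")
with torch.no_grad():
    out = encoder(**batch)
embeddings = mean_pool(out.last_hidden_state, batch["attention_mask"])
print(embeddings.shape)  # torch.Size([1, 768])
```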
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("greatakela/gennlp_hw1_encoder2025")

# Run inference
sentences = [
    "What happened? Where have I been? Right here, it seems. But that girl. She was so beautiful. So real. Do you remember anything else? No.[SEP]Good. Perhaps that explains why he's here. Nothing was real to him except the girl.",
    'Captain, the Melkotian object.',
    " It's killing you.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
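The same embeddings support semantic search: encode a query and a corpus, then rank corpus entries by cosine similarity. A minimal sketch; the query string is a made-up example, and the corpus lines are taken from this card's widget examples.

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("greatakela/gennlp_hw1_encoder2025")

corpus = [
    "I wish to ask a question. What of Sarek's family, his wife and son?",
    "Are you all right, Captain?",
    "Captain, the Melkotian object.",
]
query = "What became of Sarek's wife and son?"  # hypothetical query

# model.similarity defaults to cosine similarity for this model
scores = model.similarity(model.encode([query]), model.encode(corpus))
best = scores.argmax().item()
print(corpus[best], scores[0, best].item())
```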
Evaluation
Metrics
Triplet
- Datasets: evaluator_enc and evaluator_val
- Evaluated with TripletEvaluator
| Metric | evaluator_enc | evaluator_val |
|:---|:---|:---|
| cosine_accuracy | 0.999 | 0.9931 |
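The accuracies above come from TripletEvaluator, which measures how often the anchor embedding is closer to the positive than to the negative. The actual evaluator_enc / evaluator_val triplets are not shipped with this card, so the sketch below substitutes a toy triple built from the widget examples.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer("greatakela/gennlp_hw1_encoder2025")

# Toy triple for illustration only; the held-out evaluation splits are not published.
evaluator = TripletEvaluator(
    anchors=["I demand the Right of Statement first."],
    positives=["You understand Romulan tradition well. The right is granted."],
    negatives=["Are you all right, Captain?"],
    name="toy",
)
print(evaluator(model))  # e.g. {'toy_cosine_accuracy': ...}
```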
Training Details
Training Dataset
Unnamed Dataset
- Size: 4,893 training samples
- Columns: sentence_0, sentence_1, and sentence_2
- Approximate statistics based on the first 1000 samples:

| | sentence_0 | sentence_1 | sentence_2 |
|:---|:---|:---|:---|
| type | string | string | string |
| details | min: 2 tokens, mean: 90.47 tokens, max: 256 tokens | min: 3 tokens, mean: 18.64 tokens, max: 98 tokens | min: 4 tokens, mean: 20.14 tokens, max: 199 tokens |
- Samples:

| sentence_0 | sentence_1 | sentence_2 |
|:---|:---|:---|
| Oh, well, if that's all. Mister Scott, transport the glommer over to the Klingon ship. Aye, sir. You can't do this to me. Under space salvage laws, he's mine. A planetary surface is not covered by space salvage laws. But if you want the little beastie that bad, Mister Jones, we'll transport you over with it. I withdraw my claim.[SEP]Well, at least we can report the stasis field is not as effective a weapon as we thought. The power drain is too high and takes too long for the Klingon ship to recover to make it practical. | Agreed, Captain. Tribbles appear to be a much more effective weapon. | [protesting] I give him... |
| Do you mean that's what the Kelvans really are? Undoubtedly. Well, if they look that way normally, why did they adapt themselves to our bodies? Perhaps practicality. They chose the Enterprise as the best vessel for the trip. Immense beings with a hundred tentacles would have difficulty with the turbolift. We've got to stop them. We outnumber them. Their only hold on us is the paralysis field. Well, that's enough. One wrong move, and they jam all our neural circuits.[SEP]Jam. Spock, if you reverse the circuits on McCoy's neuro-analyser, can you set up a counter field to jam the paralysis projector? | I'm dubious of the possibilities of success, Captain. The medical equipment is not designed to put out a great deal of power. The polarized elements would burn out quickly. | The next step would be a type of brain surgery. |
| Well, speculation isn't much help. We have to get in there. Perhaps there is a way open on the far side. There is much less activity there. That building in the centre. It seems to be important. You stand before the Ruling Tribunal of the Aquans. I am Domar, the High Tribune. I'm Captain Kirk of the starship Enterprise. This is my first officer, Mister Spock.[SEP]You are air-breather enemies from the surface. We have expected spies for a long time. | We came here in peace, Tribune. | Which is why we need to look at the nerve that you didn't biopsy. |
- Loss: TripletLoss with these parameters (a sketch of the minimized quantity follows below):

```json
{
    "distance_metric": "TripletDistanceMetric.EUCLIDEAN",
    "triplet_margin": 5
}
```
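With these settings, TripletLoss penalizes any triplet where the anchor is not at least triplet_margin closer (in Euclidean distance) to the positive than to the negative. A minimal sketch of the quantity being minimized, assuming the standard triplet-loss formulation:

```python
import torch
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin: float = 5.0):
    # max(d(a, p) - d(a, n) + margin, 0), with d = Euclidean distance
    d_pos = F.pairwise_distance(anchor, positive, p=2)
    d_neg = F.pairwise_distance(anchor, negative, p=2)
    return F.relu(d_pos - d_neg + margin).mean()

a, p, n = (torch.randn(4, 768) for _ in range(3))
print(triplet_loss(a, p, n))
```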
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- multi_dataset_batch_sampler: round_robin
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 8
- per_device_eval_batch_size: 8
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 3
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin
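Put together, a training run with the non-default values above would look roughly like the sketch below. The output path and the one-row dataset are placeholders (the real 4,893-triplet dataset is unnamed in this card); the column names follow the sentence_0/1/2 scheme documented under Training Dataset.

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import TripletLoss, TripletDistanceMetric
from sentence_transformers.training_args import MultiDatasetBatchSamplers

model = SentenceTransformer("microsoft/mpnet-base")

# Placeholder triplet data; sentence_0 = anchor, sentence_1 = positive, sentence_2 = negative.
train_dataset = Dataset.from_dict({
    "sentence_0": ["I demand the Right of Statement first."],
    "sentence_1": ["You understand Romulan tradition well. The right is granted."],
    "sentence_2": ["Are you all right, Captain?"],
})

args = SentenceTransformerTrainingArguments(
    output_dir="outputs",  # hypothetical path; not stated in the card
    eval_strategy="steps",
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
    num_train_epochs=3,
    per_device_train_batch_size=8,
)

loss = TripletLoss(model, distance_metric=TripletDistanceMetric.EUCLIDEAN, triplet_margin=5)

trainer = SentenceTransformerTrainer(model=model, args=args,
                                     train_dataset=train_dataset, loss=loss)
trainer.train()
```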
Training Logs
| Epoch | Step | Training Loss | evaluator_enc_cosine_accuracy | evaluator_val_cosine_accuracy |
|:---|:---|:---|:---|:---|
| -1 | -1 | - | 0.5494 | - |
| 0.4902 | 300 | - | 0.9808 | - |
| 0.8170 | 500 | 1.4249 | - | - |
| 0.9804 | 600 | - | 0.9912 | - |
| 1.0 | 612 | - | 0.9931 | - |
| 1.4706 | 900 | - | 0.9963 | - |
| 1.6340 | 1000 | 0.2269 | - | - |
| 1.9608 | 1200 | - | 0.9990 | - |
| 2.0 | 1224 | - | 0.9990 | - |
| 2.4510 | 1500 | 0.1054 | 0.9990 | - |
| 2.9412 | 1800 | - | 0.9990 | - |
| 3.0 | 1836 | - | 0.9990 | - |
| -1 | -1 | - | - | 0.9931 |
Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.4.1
- Transformers: 4.48.2
- PyTorch: 2.5.1+cu124
- Accelerate: 1.3.0
- Datasets: 3.2.0
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
TripletLoss
```bibtex
@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
```