language:
  - en
license: apache-2.0
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:6190
  - loss:MatryoshkaLoss
  - loss:MultipleNegativesRankingLoss
base_model: nomic-ai/modernbert-embed-base
widget:
  - source_sentence: What is the duration of the period mentioned in the text?
    sentences:
      - >-
        . The only exception to the requirement that the plaintiff must be a
        lending institution in order to invoke the provisions of the Act is
        contained in Section 25, in terms of which a person who inter alia
        knowingly draws a cheque which is subsequently dishonoured by the bank
        for want of funds is guilty of an offence under the Act, and proceedings
        can be instituted against such person in the Magistrate’s
      - >-
        ? The 1st question of law is formulated on the basis that , the 1st
        Defendant is the licensee of the 2nd Defendant and therefore, the 1st
        Defendant cannot claim prescriptive title to the subject matter
      - >-
        .50,000/ - (that is , a period of 36 months) but such “Facility” is
        subject to review on 30 /09/2000”, (that is, a period of about only 5
        months from the date of P4)
  - source_sentence: >-
      What is the purpose of the disposition of the property by Lanka Tractors
      Limited as mentioned in the text?
    sentences:
      - >-
        . (3) is whether the said disposition of the property by Lanka Tractors
        Limited was done with the sole object of defrauding its creditors.
        Section 348 of the Companies Act which describes about Fraudulent
        reference would be relevant in this regard
      - >-
        . In the arbitration process, the Government is not involved; the court
        system is not involved (except as provided for in the Act); the parties
        do not have to rely on any Government institution for resolution of
        their dispute. Process of conducting the arbitration, venue, time, mode
        of adducing evidence are all decided by agreement of parties
      - >-
        . This is broadly similar to the provision in the summary procedure on
        liquid claims. The amendment in clause 8 of the Bill, repeals the
        definition of the term ‘debt’ in section 30. The substituted definition
        excludes the words referred to above which limit its applicability to
        money owed under a promise or agreement which is in writing
  - source_sentence: What is one of the topics covered in the training program?
    sentences:
      - >-
        . The resulting position is that the court would not have any written
        evidence of the commitment on the part of the debtor when it issues
        decree nisi in the first instance
      - >-
        ? Before this Court, there is no dispute on the manner in which the
        appellant obtained the title of the land in question
      - >-
        . Detail reporting procedures to government of Sri Lanka’s contact
        points. - 4 Weeks Phase 3 Training of Port Facility Security Officers
        SATHSINDU/BAGNOLD undertakes to design a training program and conducted
        aid program for up to ten persons. • Understanding the reasons for the
        ISPS code • ISPS Code content and requirements. • Understanding the ISPS
        Code
  - source_sentence: What type of action was taken by the Divisional Secretary?
    sentences:
      - >-
        .2020 was also sent by the Divisional Secretary of Thamankaduwa
        imposing similar restrictions as by the Polonnaruwa Pradeshiya Sabha
      - >-
        . When Seylan Bank published the resolution of its board of directors
        which exercised its powers of Parate Execution in the newspaper on 10th
        March 2006-, HNB had made the application dated 21st March [SC Appeal
        No. 85A /2009 ] Page 6 of 25 2006 to the District Court of Colombo in
        terms of Sections 260, 261, 348, 359 and 352 of the Companies Act No
      - >-
        . Having regard to the above -mentioned stipulated circumstances , I
        consider the facts put forward for the appellant , seeking a reduction
        of sentence. The offence was committed in 2004. The appellant had been
        in remand custody for more than three years and the appellant did not
        have any previous convictions
  - source_sentence: What is described in Section 25 of the Arbitration Act?
    sentences:
      - >-
        . But where a matter is within the plenary jurisdiction of the Court if
        no objection is taken, the Court will then have jurisdiction to proceed
        on with the matter and make a valid order.” 14 31. Further , in the case
        of Don Tilakaratne v
      - >-
        . (3) The provision of subsections (1) and (2) shall apply only to the
        extent agreed to by the parties. (4) The arbitral tribunal shall decide
        according to considerations of general justice and fairness or trade
        usages only if the parties have expressly authorised it to do so.
        Section 25 of the Arbitration Act describes the form and content of the
        arbitral award as follows: 25
      - >-
        . 9 and 10 based on the objection taken to them by the Counsel for HNB,
        despite the fact that they did not arise from the pleadings, and were
        altogether inconsistent with them, answered the afore-stated question of
        law (in respect of which this Court had granted Leave to Appeal in that
        case) in the affirmative and in favour of HNB, and stated as follows:
        “In conclusion, it needs to be emphasised
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
model-index:
  - name: Fine-tuned with [QuicKB](https://github.com/ALucek/QuicKB)
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 768
          type: dim_768
        metrics:
          - type: cosine_accuracy@1
            value: 0.5741279069767442
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.7616279069767442
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8197674418604651
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.8851744186046512
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.5741279069767442
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.25387596899224807
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.163953488372093
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.0885174418604651
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.5741279069767442
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.7616279069767442
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8197674418604651
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.8851744186046512
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.7308126785084815
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.6812459625322997
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.6852483059452662
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 512
          type: dim_512
        metrics:
          - type: cosine_accuracy@1
            value: 0.5741279069767442
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.7630813953488372
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.8212209302325582
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.875
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.5741279069767442
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.2543604651162791
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.16424418604651161
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.0875
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.5741279069767442
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.7630813953488372
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.8212209302325582
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.875
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.726227401269234
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.6782132475083055
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.6827936993080407
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 256
          type: dim_256
        metrics:
          - type: cosine_accuracy@1
            value: 0.5552325581395349
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.7281976744186046
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.7921511627906976
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.8619186046511628
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.5552325581395349
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.24273255813953487
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.15843023255813954
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.08619186046511627
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.5552325581395349
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.7281976744186046
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.7921511627906976
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.8619186046511628
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.7077790398550751
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.6585646225544481
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.6630890497309057
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 128
          type: dim_128
        metrics:
          - type: cosine_accuracy@1
            value: 0.49709302325581395
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.6758720930232558
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.7354651162790697
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.8241279069767442
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.49709302325581395
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.22529069767441862
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.14709302325581394
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.08241279069767442
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.49709302325581395
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.6758720930232558
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.7354651162790697
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.8241279069767442
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.6567813216281579
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.6037779162052417
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.6090388181529673
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 64
          type: dim_64
        metrics:
          - type: cosine_accuracy@1
            value: 0.39680232558139533
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.5581395348837209
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.622093023255814
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.7252906976744186
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.39680232558139533
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.18604651162790695
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.12441860465116278
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.07252906976744186
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.39680232558139533
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.5581395348837209
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.622093023255814
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.7252906976744186
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.5513541983050395
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.497020348837209
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.5050183064129367
            name: Cosine Map@100

Fine-tuned with QuicKB

This is a sentence-transformers model finetuned from nomic-ai/modernbert-embed-base. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: nomic-ai/modernbert-embed-base
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Language: en
  • License: apache-2.0

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: sentence-transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: ModernBertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'What is described in Section 25 of the Arbitration Act?',
    '. (3) The provision of subsections (1) and (2) shall apply only to the extent agreed to by the parties. (4) The arbitral tribunal shall decide according to considerations of general justice and fairness or trade usages only if the parties have expressly authorised it to do so. Section 25 of the Arbitration Act describes the form and content of the arbitral award as follows: 25',
    '. 9 and 10 based on the objection taken to them by the Counsel for HNB, despite the fact that they did not arise from the pleadings, and were altogether inconsistent with them, answered the afore-stated question of law (in respect of which this Court had granted Leave to Appeal in that case) in the affirmative and in favour of HNB, and stated as follows: “In conclusion, it needs to be emphasised',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
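
Since the model was trained with MatryoshkaLoss at dimensions 768, 512, 256, 128 and 64, the embeddings can also be truncated to a smaller dimensionality for cheaper storage and search, with only a modest drop in retrieval quality (see the Evaluation section below). A minimal sketch using the truncate_dim argument of Sentence Transformers; the model id is the same placeholder as above and the sentences are illustrative:

from sentence_transformers import SentenceTransformer

# Keep only the first 256 dimensions of every embedding
model = SentenceTransformer("sentence_transformers_model_id", truncate_dim=256)

sentences = [
    "What is described in Section 25 of the Arbitration Act?",
    "Section 25 of the Arbitration Act describes the form and content of the arbitral award.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (2, 256)

# Cosine similarity works on the truncated embeddings as well
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [2, 2]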

Evaluation

Metrics

Information Retrieval

Metric dim_768 dim_512 dim_256 dim_128 dim_64
cosine_accuracy@1 0.5741 0.5741 0.5552 0.4971 0.3968
cosine_accuracy@3 0.7616 0.7631 0.7282 0.6759 0.5581
cosine_accuracy@5 0.8198 0.8212 0.7922 0.7355 0.6221
cosine_accuracy@10 0.8852 0.875 0.8619 0.8241 0.7253
cosine_precision@1 0.5741 0.5741 0.5552 0.4971 0.3968
cosine_precision@3 0.2539 0.2544 0.2427 0.2253 0.186
cosine_precision@5 0.164 0.1642 0.1584 0.1471 0.1244
cosine_precision@10 0.0885 0.0875 0.0862 0.0824 0.0725
cosine_recall@1 0.5741 0.5741 0.5552 0.4971 0.3968
cosine_recall@3 0.7616 0.7631 0.7282 0.6759 0.5581
cosine_recall@5 0.8198 0.8212 0.7922 0.7355 0.6221
cosine_recall@10 0.8852 0.875 0.8619 0.8241 0.7253
cosine_ndcg@10 0.7308 0.7262 0.7078 0.6568 0.5514
cosine_mrr@10 0.6812 0.6782 0.6586 0.6038 0.497
cosine_map@100 0.6852 0.6828 0.6631 0.609 0.505
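
These metrics correspond to the output of the Sentence Transformers InformationRetrievalEvaluator, run once per Matryoshka dimension. A minimal sketch of computing a single column, assuming the placeholder model id from the Usage section and hypothetical query/corpus/relevance mappings in place of the actual held-out evaluation split:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

# Truncate embeddings to the dimension being evaluated, e.g. 256
model = SentenceTransformer("sentence_transformers_model_id", truncate_dim=256)

# Hypothetical evaluation data: id -> text, plus relevance judgements
queries = {"q1": "What is described in Section 25 of the Arbitration Act?"}
corpus = {"d1": "Section 25 of the Arbitration Act describes the form and content of the arbitral award."}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="dim_256",
)
results = evaluator(model)
print(results["dim_256_cosine_ndcg@10"])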

Training Details

Training Dataset

Unnamed Dataset

  • Size: 6,190 training samples
  • Columns: anchor and positive
  • Approximate statistics based on the first 1000 samples:
    • anchor: string; min: 7 tokens, mean: 15.11 tokens, max: 32 tokens
    • positive: string; min: 3 tokens, mean: 69.53 tokens, max: 214 tokens
  • Samples:
    • anchor: How must the District Court exercise its discretion?
      positive: imposition of ‘ a’ term; (5) It is not mandatory to impose security, as evinced by the use of the conjunction “or”; (6) In imposing terms, the District Court must be mindful of the objectives of the Act, and its discretion must be exercised judicially
    • anchor: What is the source of the observation made by Christian Appu?
      positive: . Christian Appu , (1895) 1 NLR 288 observed that , “possession is "disturbed" either by an action intended to remove the possessor from the land, or by acts which prevent the possessor from enjoying the free and full use of 12 the land of which he is in the course of acquiring the dominion, and which convert his continuous user into a disconnected and divided user ”
    • anchor: What must the defendant do regarding the plaintiff's claim?
      positive: . The Court of Appeal in Ramanayake v Sampath Bank Ltd and Others [(1993) 1 Sri LR 145 at page 153] has held that, “The defendant has to deal with the plaintiff’s claim on its merits; it is not competent for the defendant to merely set out technical objections. It is also incumbent on the defendant to reveal his defence, if he has any
  • Loss: MatryoshkaLoss with these parameters:
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            768,
            512,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
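
In code, this corresponds to wrapping MultipleNegativesRankingLoss (in-batch negatives over the anchor/positive pairs) in MatryoshkaLoss, so the same ranking objective is applied at every listed dimension. A minimal sketch, with an illustrative two-column dataset standing in for the actual 6,190 training pairs:

from datasets import Dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("nomic-ai/modernbert-embed-base")

# Same schema as the training data: one anchor question, one positive passage
train_dataset = Dataset.from_dict({
    "anchor": ["How must the District Court exercise its discretion?"],
    "positive": ["In imposing terms, the District Court must be mindful of the objectives of the Act."],
})

inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
    n_dims_per_step=-1,
)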
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 16
  • gradient_accumulation_steps: 8
  • learning_rate: 2e-05
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • tf32: True
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates
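
These map onto SentenceTransformerTrainingArguments roughly as follows (a sketch: the output directory is a placeholder, and save_strategy="epoch" is assumed so that load_best_model_at_end can pick an epoch checkpoint). The model, train_dataset and loss are the ones constructed in the sketch above:

from sentence_transformers import SentenceTransformerTrainer, SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder
    num_train_epochs=3,
    per_device_train_batch_size=16,
    gradient_accumulation_steps=8,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    tf32=True,
    eval_strategy="epoch",
    save_strategy="epoch",  # assumed, to match eval_strategy
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,                  # from the loss sketch above
    args=args,
    train_dataset=train_dataset,  # from the loss sketch above
    loss=loss,
)
trainer.train()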

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 8
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: True
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss dim_768_cosine_ndcg@10 dim_512_cosine_ndcg@10 dim_256_cosine_ndcg@10 dim_128_cosine_ndcg@10 dim_64_cosine_ndcg@10
0.1034 5 29.8712 - - - - -
0.2067 10 26.1323 - - - - -
0.3101 15 17.8585 - - - - -
0.4134 20 14.0232 - - - - -
0.5168 25 11.6897 - - - - -
0.6202 30 10.8431 - - - - -
0.7235 35 9.264 - - - - -
0.8269 40 11.2186 - - - - -
0.9302 45 9.9143 - - - - -
1.0 49 - 0.7134 0.7110 0.6902 0.6341 0.5282
1.0207 50 7.2581 - - - - -
1.1240 55 6.066 - - - - -
1.2274 60 6.3626 - - - - -
1.3307 65 6.8135 - - - - -
1.4341 70 5.5556 - - - - -
1.5375 75 6.0144 - - - - -
1.6408 80 6.1965 - - - - -
1.7442 85 5.596 - - - - -
1.8475 90 6.631 - - - - -
1.9509 95 6.3319 - - - - -
2.0 98 - 0.7331 0.7304 0.7074 0.6569 0.5477
2.0413 100 4.7382 - - - - -
2.1447 105 4.1516 - - - - -
2.2481 110 4.3517 - - - - -
2.3514 115 3.7044 - - - - -
2.4548 120 4.1593 - - - - -
2.5581 125 4.8081 - - - - -
2.6615 130 3.908 - - - - -
2.7649 135 3.7684 - - - - -
2.8682 140 3.8927 - - - - -
2.9509 144 - 0.7308 0.7262 0.7078 0.6568 0.5514
  • The bold row denotes the saved checkpoint.

Framework Versions

  • Python: 3.13.3
  • Sentence Transformers: 3.4.0
  • Transformers: 4.48.1
  • PyTorch: 2.6.0+cu126
  • Accelerate: 1.3.0
  • Datasets: 3.2.0
  • Tokenizers: 0.21.1
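
To approximate this environment when reproducing training or evaluation, the listed library versions can be pinned explicitly (a sketch; the PyTorch 2.6.0 build with CUDA 12.6 is installed separately following the official instructions for your platform):

pip install "sentence-transformers==3.4.0" "transformers==4.48.1" "accelerate==1.3.0" "datasets==3.2.0" "tokenizers==0.21.1"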

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}