SentenceTransformer based on Alibaba-NLP/gte-en-mlm-base

This is a sentence-transformers model finetuned from Alibaba-NLP/gte-en-mlm-base on the msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1 dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: Alibaba-NLP/gte-en-mlm-base
  • Maximum Sequence Length: 8192 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset: msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1

Model Sources

  • Model Repository: https://huggingface.co/joe32140/gte-en-mlm-base-msmarco
  • Sentence Transformers Documentation: https://www.sbert.net

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NewModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
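
The pooling layer applies mean pooling: the transformer's token embeddings are averaged, using the attention mask so padding tokens do not contribute. A minimal sketch of that operation in plain PyTorch (tensor and function names are illustrative, not the library's internals):

import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # token_embeddings: (batch, seq_len, 768); attention_mask: (batch, seq_len)
    mask = attention_mask.unsqueeze(-1).float()    # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(dim=1)  # zero out padding, then sum
    counts = mask.sum(dim=1).clamp(min=1e-9)       # number of real tokens per sentence
    return summed / counts                         # (batch, 768) sentence embeddings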

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("joe32140/gte-en-mlm-base-msmarco")
# Run inference
sentences = [
    'what county is hayden in',
    "Hayden is a city in Kootenai County, Idaho, United States. Located in the northern portion of the state, just north of Coeur d'Alene, its population was 13,294 at the 2010 census.",
    "According to the United States Census Bureau, the city has a total area of 9.61 square miles (24.89 km2), of which 9.60 square miles (24.86 km2) is land and 0.01 square miles (0.03 km2) is water. It lies at the southwestern end of Hayden Lake, and the elevation of the city is 2,287 feet (697 m) above sea level. Hayden is located on U.S. Route 95 at the junction of Route 41. It is also four miles (6 km) north of Interstate 90 and Coeur d'Alene. The Coeur d'Alene airport is northwest of Hayden.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
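
For semantic search, the same similarity call ranks passages against a query. A small sketch reusing the sentences above, treating the first entry as the query and the rest as the corpus:

# Rank the two passages against the query by cosine similarity
query_embedding = model.encode([sentences[0]])
passage_embeddings = model.encode(sentences[1:])
scores = model.similarity(query_embedding, passage_embeddings)
print(scores.shape)
# [1, 2]  (one score per passage; higher means more relevant)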

Evaluation

Metrics

Triplet

  • Dataset: msmarco-co-condenser-dev (see the Training Logs below)

  Metric            Value
  cosine_accuracy   0.983
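
cosine_accuracy is the fraction of (anchor, positive, negative) triplets for which the anchor is closer to the positive than to the negative under cosine similarity. A sketch of the computation on precomputed, L2-normalized embeddings (the function name is illustrative):

import numpy as np

def triplet_cosine_accuracy(anchors, positives, negatives):
    # each argument: (n, dim) array of L2-normalized embeddings
    pos_sim = (anchors * positives).sum(axis=1)  # cosine similarity anchor vs. positive
    neg_sim = (anchors * negatives).sum(axis=1)  # cosine similarity anchor vs. negative
    return float((pos_sim > neg_sim).mean())     # fraction of correctly ranked triplets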

Training Details

Training Dataset

msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1

  • Dataset: msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1 at 84ed2d3
  • Size: 11,662,655 training samples
  • Columns: query, positive, and negative
  • Approximate statistics based on the first 1000 samples:
             query          positive       negative
    type     string         string         string
    min      4 tokens       15 tokens      22 tokens
    mean     8.96 tokens    77.58 tokens   78.59 tokens
    max      40 tokens      222 tokens     325 tokens
  • Samples:
    Sample 1
      query: what is the meaning of menu planning
      positive: Menu planning is the selection of a menu for an event. Such as picking out the dinner for your wedding or even a meal at a Birthday Party. Menu planning is when you are preparing a calendar of meals and you have to sit down and decide what meat and veggies you want to serve on each certain day.
      negative: Menu Costs. In economics, a menu cost is the cost to a firm resulting from changing its prices. The name stems from the cost of restaurants literally printing new menus, but economists use it to refer to the costs of changing nominal prices in general.

    Sample 2
      query: how old is brett butler
      positive: Brett Butler is 59 years old. To be more precise (and nerdy), the current age as of right now is 21564 days or (even more geeky) 517536 hours. That's a lot of hours!
      negative: Passed in: St. John's, Newfoundland and Labrador, Canada. Passed on: 16/07/2016. Published in the St. John's Telegram. Passed away suddenly at the Health Sciences Centre surrounded by his loving family, on July 16, 2016 Robert (Bobby) Joseph Butler, age 52 years. Predeceased by his special aunt Geri Murrin and uncle Mike Mchugh; grandparents Joe and Margaret Murrin and Jack and Theresa Butler.

    Sample 3
      query: when was the last navajo treaty sign?
      positive: In Executive Session, Senate of the United States, July 25, 1868. Resolved, (two-thirds of the senators present concurring,) That the Senate advise and consent to the ratification of the treaty between the United States and the Navajo Indians, concluded at Fort Sumner, New Mexico, on the first day of June, 1868.
      negative: Share Treaty of Greenville. The Treaty of Greenville was signed August 3, 1795, between the United States, represented by Gen. Anthony Wayne, and chiefs of the Indian tribes located in the Northwest Territory, including the Wyandots, Delawares, Shawnees, Ottawas, Miamis, and others.
  • Loss: CachedMultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
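
With similarity_fct cos_sim and scale 20.0, this loss is a cross-entropy over scaled query-passage cosine similarities: each query's own positive is the target class, and every other passage in the batch serves as an in-batch negative. The cached variant (Gao et al., 2021, cited below) computes the same objective in chunks so that large batches, such as the 512 used here, fit in memory. A minimal sketch of the underlying objective, without the caching machinery or the explicit hard-negative column:

import torch
import torch.nn.functional as F

def mnrl_loss(query_emb, passage_emb, scale=20.0):
    # query_emb, passage_emb: (batch, dim); row i of passage_emb is query i's positive
    q = F.normalize(query_emb, dim=1)
    p = F.normalize(passage_emb, dim=1)
    scores = scale * (q @ p.T)                                # (batch, batch) scaled cosine sims
    labels = torch.arange(scores.size(0), device=scores.device)  # diagonal = positives
    return F.cross_entropy(scores, labels)                    # off-diagonal entries act as negatives

In the actual training setup, the negative column of each sample is appended to the candidate pool, adding one hard negative per query on top of the in-batch negatives.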
    

Evaluation Dataset

msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1

  • Dataset: msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1 at 84ed2d3
  • Size: 11,662,655 evaluation samples
  • Columns: query, positive, and negative
  • Approximate statistics based on the first 1000 samples:
             query          positive       negative
    type     string         string         string
    min      4 tokens       22 tokens      21 tokens
    mean     8.92 tokens    79.14 tokens   78.96 tokens
    max      26 tokens      223 tokens     233 tokens
  • Samples:
    Sample 1
      query: what county is holly springs nc in
      positive: Holly Springs, North Carolina. Holly Springs is a town in Wake County, North Carolina, United States. As of the 2010 census, the town population was 24,661, over 2½ times its population in 2000. Contents.
      negative: The Mt. Holly Springs Park & Resort. One of the numerous trolley routes that carried people around the county at the turn of the century was the Carlisle & Mt. Holly Railway Company. The “Holly Trolley” as it came to be known was put into service by Patricio Russo and made its first run on May 14, 1901.

    Sample 2
      query: how long does nyquil stay in your system
      positive: In order to understand exactly how long Nyquil lasts, it is absolutely vital to learn about the various ingredients in the drug. One of the ingredients found in Nyquil is Doxylamine, which is an antihistamine. This specific medication has a biological half-life or 6 to 12 hours. With this in mind, it is possible for the drug to remain in the system for a period of 12 to 24 hours. It should be known that the specifics will depend on a wide variety of different factors, including your age and metabolism.
      negative: I confirmed that NyQuil is about 10% alcohol, a higher content than most domestic beers. When I asked about the relatively high proof, I was told that the alcohol dilutes the active ingredients. The alcohol free version is there for customers with addiction issues.. also found that in that version there is twice the amount of DXM. When I asked if I could speak to a chemist or scientist, I was told they didn't have anyone who fit that description there. It’s been eight years since I kicked NyQuil. I've been sober from alcohol for four years.

    Sample 3
      query: what are mineral water
      positive: 1 Mineral water – water from a mineral spring that contains various minerals, such as salts and sulfur compounds. 2 It comes from a source tapped at one or more bore holes or spring, and originates from a geologically and physically protected underground water source. Mineral water – water from a mineral spring that contains various minerals, such as salts and sulfur compounds. 2 It comes from a source tapped at one or more bore holes or spring, and originates from a geologically and physically protected underground water source.
      negative: Minerals for Your Body. Drinking mineral water is beneficial to health and well-being. But it is not only the amount of water you drink that is important-what the water contains is even more essential.inerals for Your Body. Drinking mineral water is beneficial to health and well-being. But it is not only the amount of water you drink that is important-what the water contains is even more essential.
  • Loss: CachedMultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 512
  • per_device_eval_batch_size: 512
  • num_train_epochs: 1
  • warmup_ratio: 0.05
  • bf16: True
  • batch_sampler: no_duplicates
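
A sketch of how these values map onto the library's SentenceTransformerTrainingArguments (the output directory is illustrative):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output/gte-en-mlm-base-msmarco",  # illustrative path
    per_device_train_batch_size=512,
    per_device_eval_batch_size=512,
    num_train_epochs=1,
    warmup_ratio=0.05,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no duplicate texts within a batch
)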

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 512
  • per_device_eval_batch_size: 512
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.05
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss msmarco-co-condenser-dev_cosine_accuracy
0 0 - 0.649
0.0041 10 6.3669 -
0.0082 20 5.3993 -
0.0123 30 3.4256 -
0.0164 40 2.0704 -
0.0205 50 1.0275 -
0.0246 60 0.6803 -
0.0287 70 0.5813 -
0.0328 80 0.5144 -
0.0369 90 0.4714 -
0.0410 100 0.4089 -
0.0450 110 0.3974 -
0.0491 120 0.363 -
0.0532 130 0.348 -
0.0573 140 0.3307 -
0.0614 150 0.3171 -
0.0655 160 0.3188 -
0.0696 170 0.3024 -
0.0737 180 0.2971 -
0.0778 190 0.2786 -
0.0819 200 0.2851 -
0.0860 210 0.2798 -
0.0901 220 0.2796 -
0.0942 230 0.2683 -
0.0983 240 0.2591 -
0.1024 250 0.265 -
0.1065 260 0.2703 -
0.1106 270 0.2547 -
0.1147 280 0.257 -
0.1188 290 0.2437 -
0.1229 300 0.2417 -
0.1269 310 0.2444 -
0.1310 320 0.2358 -
0.1351 330 0.2336 -
0.1392 340 0.2297 -
0.1433 350 0.224 -
0.1474 360 0.221 -
0.1515 370 0.227 -
0.1556 380 0.224 -
0.1597 390 0.2226 -
0.1638 400 0.2101 -
0.1679 410 0.2222 -
0.1720 420 0.217 -
0.1761 430 0.2093 -
0.1802 440 0.2077 -
0.1843 450 0.2073 -
0.1884 460 0.2061 -
0.1925 470 0.2074 -
0.1966 480 0.2019 -
0.2007 490 0.1978 -
0.2048 500 0.2115 -
0.2088 510 0.2005 -
0.2129 520 0.2059 -
0.2170 530 0.1925 -
0.2211 540 0.1943 -
0.2252 550 0.1969 -
0.2293 560 0.1899 -
0.2334 570 0.2122 -
0.2375 580 0.188 -
0.2416 590 0.1921 -
0.2457 600 0.1803 -
0.2498 610 0.1983 -
0.2539 620 0.1889 -
0.2580 630 0.1887 -
0.2621 640 0.1833 -
0.2662 650 0.1843 -
0.2703 660 0.1844 -
0.2744 670 0.1843 -
0.2785 680 0.1837 -
0.2826 690 0.173 -
0.2867 700 0.1785 -
0.2907 710 0.1704 -
0.2948 720 0.1703 -
0.2989 730 0.1782 -
0.3030 740 0.1623 -
0.3071 750 0.1688 -
0.3112 760 0.1603 -
0.3153 770 0.1518 -
0.3194 780 0.1605 -
0.3235 790 0.1661 -
0.3276 800 0.1678 -
0.3317 810 0.1656 -
0.3358 820 0.1582 -
0.3399 830 0.1551 -
0.3440 840 0.1587 -
0.3481 850 0.1526 -
0.3522 860 0.1601 -
0.3563 870 0.1557 -
0.3604 880 0.1576 -
0.3645 890 0.1655 -
0.3686 900 0.1595 -
0.3726 910 0.1575 -
0.3767 920 0.1544 -
0.3808 930 0.1432 -
0.3849 940 0.1484 -
0.3890 950 0.1556 -
0.3931 960 0.1552 -
0.3972 970 0.1462 -
0.4013 980 0.1562 -
0.4054 990 0.1461 -
0.4095 1000 0.1597 -
0.4136 1010 0.1466 -
0.4177 1020 0.143 -
0.4218 1030 0.1515 -
0.4259 1040 0.1317 -
0.4300 1050 0.1414 -
0.4341 1060 0.1554 -
0.4382 1070 0.1484 -
0.4423 1080 0.1487 -
0.4464 1090 0.1533 -
0.4505 1100 0.1494 -
0.4545 1110 0.1381 -
0.4586 1120 0.1495 -
0.4627 1130 0.1422 -
0.4668 1140 0.1424 -
0.4709 1150 0.1422 -
0.4750 1160 0.1429 -
0.4791 1170 0.1297 -
0.4832 1180 0.135 -
0.4873 1190 0.1431 -
0.4914 1200 0.143 -
0.4955 1210 0.1399 -
0.4996 1220 0.1339 -
0.5037 1230 0.1309 -
0.5078 1240 0.1377 -
0.5119 1250 0.1361 -
0.5160 1260 0.1311 -
0.5201 1270 0.1363 -
0.5242 1280 0.1368 -
0.5283 1290 0.1376 -
0.5324 1300 0.1323 -
0.5364 1310 0.1302 -
0.5405 1320 0.1322 -
0.5446 1330 0.1294 -
0.5487 1340 0.1295 -
0.5528 1350 0.1341 -
0.5569 1360 0.1244 -
0.5610 1370 0.1287 -
0.5651 1380 0.1247 -
0.5692 1390 0.1265 -
0.5733 1400 0.1221 -
0.5774 1410 0.1245 -
0.5815 1420 0.1252 -
0.5856 1430 0.1275 -
0.5897 1440 0.1211 -
0.5938 1450 0.1256 -
0.5979 1460 0.1208 -
0.6020 1470 0.1203 -
0.6061 1480 0.1243 -
0.6102 1490 0.1201 -
0.6143 1500 0.1233 -
0.6183 1510 0.1325 -
0.6224 1520 0.127 -
0.6265 1530 0.1195 -
0.6306 1540 0.1272 -
0.6347 1550 0.1176 -
0.6388 1560 0.1189 -
0.6429 1570 0.1231 -
0.6470 1580 0.1159 -
0.6511 1590 0.1233 -
0.6552 1600 0.1178 -
0.6593 1610 0.119 -
0.6634 1620 0.119 -
0.6675 1630 0.121 -
0.6716 1640 0.1185 -
0.6757 1650 0.117 -
0.6798 1660 0.1171 -
0.6839 1670 0.1198 -
0.6880 1680 0.1175 -
0.6921 1690 0.1173 -
0.6962 1700 0.1211 -
0.7002 1710 0.1154 -
0.7043 1720 0.1155 -
0.7084 1730 0.124 -
0.7125 1740 0.1147 -
0.7166 1750 0.1185 -
0.7207 1760 0.109 -
0.7248 1770 0.1119 -
0.7289 1780 0.1134 -
0.7330 1790 0.1163 -
0.7371 1800 0.1109 -
0.7412 1810 0.1223 -
0.7453 1820 0.1192 -
0.7494 1830 0.1142 -
0.7535 1840 0.1133 -
0.7576 1850 0.1148 -
0.7617 1860 0.1111 -
0.7658 1870 0.1128 -
0.7699 1880 0.1114 -
0.7740 1890 0.1111 -
0.7781 1900 0.1128 -
0.7821 1910 0.1128 -
0.7862 1920 0.1144 -
0.7903 1930 0.1102 -
0.7944 1940 0.107 -
0.7985 1950 0.1104 -
0.8026 1960 0.1074 -
0.8067 1970 0.1084 -
0.8108 1980 0.1091 -
0.8149 1990 0.1161 -
0.8190 2000 0.1077 -
0.8231 2010 0.1088 -
0.8272 2020 0.1099 -
0.8313 2030 0.11 -
0.8354 2040 0.1102 -
0.8395 2050 0.1098 -
0.8436 2060 0.1076 -
0.8477 2070 0.1062 -
0.8518 2080 0.1078 -
0.8559 2090 0.1058 -
0.8600 2100 0.1067 -
0.8640 2110 0.1037 -
0.8681 2120 0.1147 -
0.8722 2130 0.1169 -
0.8763 2140 0.1054 -
0.8804 2150 0.101 -
0.8845 2160 0.1026 -
0.8886 2170 0.1028 -
0.8927 2180 0.1084 -
0.8968 2190 0.1091 -
0.9009 2200 0.1045 -
0.9050 2210 0.1076 -
0.9091 2220 0.1129 -
0.9132 2230 0.1099 -
0.9173 2240 0.0969 -
0.9214 2250 0.1101 -
0.9255 2260 0.107 -
0.9296 2270 0.1042 -
0.9337 2280 0.1073 -
0.9378 2290 0.1035 -
0.9419 2300 0.1056 -
0.9459 2310 0.1026 -
0.9500 2320 0.1044 -
0.9541 2330 0.106 -
0.9582 2340 0.1054 -
0.9623 2350 0.1032 -
0.9664 2360 0.1019 -
0.9705 2370 0.1106 -
0.9746 2380 0.1076 -
0.9787 2390 0.1018 -
0.9828 2400 0.1026 -
0.9869 2410 0.1015 -
0.9910 2420 0.1036 -
0.9951 2430 0.1104 -
0.9992 2440 0.097 -
1.0 2442 - 0.983

Framework Versions

  • Python: 3.11.9
  • Sentence Transformers: 3.3.0
  • Transformers: 4.48.0.dev0
  • PyTorch: 2.4.0
  • Accelerate: 1.2.1
  • Datasets: 2.21.0
  • Tokenizers: 0.21.0

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CachedMultipleNegativesRankingLoss

@misc{gao2021scaling,
    title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
    author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
    year={2021},
    eprint={2101.06983},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}