---
language:
- en
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:197462
- loss:MSELoss
base_model: Qwen/Qwen3-Embedding-0.6B
widget:
- source_sentence: >-
Instruct: Given a web search query, retrieve relevant passages that answer
the query
Query:who sings the song i don't want to work
sentences:
- >-
The Invisible Man Griffin is the surname of the story's protagonist. His
name is not mentioned until about halfway through the book. Consumed
with his greed for power and fame, he is the model of science without
humanity. A gifted young student, he becomes interested in the science
of refraction. During his experiments, he accidentally discovers
chemicals (combined with an unspecified kind of radiation) that would
make living tissue invisible. Obsessed with his discovery, he tries the
experiment on himself and becomes invisible. However, he does not know
how to reverse the process, and he slowly discovers that the advantages
of being invisible do not outweigh the disadvantages and the problems he
faces. Thus begins his downfall as he takes the road to crime for his
survival, revealing in the process his lack of conscience, inhumanity
and complete selfishness. He progresses from obsession to fanaticism, to
insanity, and finally to his fateful end.
- >-
Instruct: Given a web search query, retrieve relevant passages that
answer the query
Query:who did the united states become independent from
- >-
Jordan Belfort Jordan Ross Belfort (/ˈbɛlfɔːrt/; born July 9, 1962) is
an American author, motivational speaker, and former stockbroker. In
1999, he pleaded guilty to fraud and related crimes in connection with
stock-market manipulation and running a boiler room as part of a
penny-stock scam. Belfort spent 22 months in prison as part of an
agreement under which he gave testimony against numerous partners and
subordinates in his fraud scheme.[5] He published the memoir The Wolf of
Wall Street, which was adapted into a film and released in 2013.
- source_sentence: >-
London water supply infrastructure Most of London's water comes from
non-tidal parts of the Thames and Lea, with the remainder being abstracted
from underground sources.[22]
sentences:
- >-
Instruct: Given a web search query, retrieve relevant passages that
answer the query
Query:what is the number on the hogwarts express
- >-
Instruct: Given a web search query, retrieve relevant passages that
answer the query
Query:when did roughing the kicker become a rule
- >-
Agora Early in Greek history (18th century–8th century BC), free-born
citizens would gather in the agora for military duty or to hear
statements of the ruling king or council. Later, the Agora also served
as a marketplace where merchants kept stalls or shops to sell their
goods amid colonnades. This attracted artisans who built workshops
nearby.[2]
- source_sentence: >-
Instruct: Given a web search query, retrieve relevant passages that answer
the query
Query:what is meant by lagging and leading current in ac circuit
sentences:
- >-
.org The domain name org is a generic top-level domain (gTLD) of the
Domain Name System (DNS) used in the Internet. The name is truncated
from organization. It was one of the original domains established in
1985, and has been operated by the Public Interest Registry since 2003.
The domain was originally intended for non-profit entities, but this
restriction was not enforced and has been removed. The domain is
commonly used by schools, open-source projects, and communities, but
also by some for-profit entities. The number of registered domains in
org has increased from fewer than one million in the 1990s, to ten
million as of June 2013.
- >-
Instruct: Given a web search query, retrieve relevant passages that
answer the query
Query:how many episode in season 1 game of thrones
- >-
Instruct: Given a web search query, retrieve relevant passages that
answer the query
Query:when is season 11 of doctor who coming out
- source_sentence: >-
Gabriel Vlad (born April 9, 1969) in Bucharest, is a former Romanian
former rugby union football player.
sentences:
- >-
As of May 2013, The Jewish Tribune had a circulation of 60,500 copies a
week which made it, for a time, the largest Jewish weekly publication in
Canada.
- >-
Cunjamba Dima is a city and commune of Angola, located in the province
of Cuando Cubango.
- >-
He also acted in the National award winning Tamil movie Vazhakku Enn
18/9, directed by Balaji Sakthivel.
- source_sentence: The actress was thirteen when she was offered the role of Annie.
sentences:
- >-
All profits from the sale and streaming of the song go to music
education supported by the CMA Foundation.
- >-
Narsingh Temple is situated at the across of the village just across
confluence of Magri State village.
- >-
Contrasting significantly from other soccer leagues in the U.S., WLS
intends to be an open entry, promotion and relegation competition.
datasets:
- sentence-transformers/natural-questions
- sentence-transformers/gooaq
- sentence-transformers/wikipedia-en-sentences
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
- negative_mse
model-index:
- name: SentenceTransformer based on Qwen/Qwen3-Embedding-0.6B
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: NanoMSMARCO
type: NanoMSMARCO
metrics:
- type: cosine_accuracy@1
value: 0.42
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.64
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.76
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.82
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.42
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.21333333333333335
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.15200000000000002
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08199999999999999
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.42
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.64
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.76
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.82
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.620918816092183
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5567777777777778
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.5664067325709117
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: NanoNFCorpus
type: NanoNFCorpus
metrics:
- type: cosine_accuracy@1
value: 0.38
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.44
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.52
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.66
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.38
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.31333333333333335
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.29200000000000004
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.254
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.041275151654868704
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.06868331254409366
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.08524350018847202
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.11409038508225758
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.30429750607308503
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.44163492063492066
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.1254808602198398
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: NanoNQ
type: NanoNQ
metrics:
- type: cosine_accuracy@1
value: 0.4
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.72
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.76
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.82
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.4
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.24666666666666665
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.16
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.088
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.39
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.69
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.73
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.79
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.6214012092294585
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.571047619047619
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.564828869259454
name: Cosine Map@100
- task:
type: nano-beir
name: Nano BEIR
dataset:
name: NanoBEIR mean
type: NanoBEIR_mean
metrics:
- type: cosine_accuracy@1
value: 0.4000000000000001
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.6
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.68
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.7666666666666666
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.4000000000000001
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.25777777777777783
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.20133333333333336
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.1413333333333333
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.2837583838849562
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.4662277708480312
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5250811667294907
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.5746967950274192
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5155391771315755
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.5231534391534391
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.41890548735006855
name: Cosine Map@100
- task:
type: knowledge-distillation
name: Knowledge Distillation
dataset:
name: Unknown
type: unknown
metrics:
- type: negative_mse
value: -0.016825005412101746
name: Negative Mse
---

# SentenceTransformer based on Qwen/Qwen3-Embedding-0.6B

This is a sentence-transformers model finetuned from Qwen/Qwen3-Embedding-0.6B on the nq dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

This is a first experiment in distilling the powerful Qwen/Qwen3-Embedding-0.6B model from 28 layers down to a lower layer count, with the aim of speeding up inference at a minimal cost in quality.

Specifically, this early model drops from an average of ~0.55 NDCG@10 across NanoMSMARCO, NanoNFCorpus, and NanoNQ with the full 28 layers to 0.5155 NDCG@10 with just 18 layers. Early tests indicate that the 18-layer model is about 1.51x faster at inference than the full model.

This model was distilled using only 200k texts from a single domain; superior performance should be attainable, especially with stronger distillation techniques such as MarginMSE.
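In outline, the distillation keeps two copies of the base model: the full 28-layer teacher, and a student whose decoder stack is truncated, after which the student's embeddings are regressed onto precomputed teacher embeddings with `MSELoss`. Below is a minimal sketch of that setup using the standard Sentence Transformers training APIs; the placeholder text and the bare-bones trainer configuration are illustrative, not the exact script used for this model.

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MSELoss

# Teacher: the full 28-layer model; student: a copy truncated to 18 layers
teacher = SentenceTransformer("Qwen/Qwen3-Embedding-0.6B")
student = SentenceTransformer("Qwen/Qwen3-Embedding-0.6B")
qwen = student[0].auto_model          # the underlying Qwen3Model
qwen.layers = qwen.layers[:18]        # keep only the first 18 decoder layers
qwen.config.num_hidden_layers = 18

# Regression targets: the teacher's embeddings for the training texts
texts = ["The actress was thirteen when she was offered the role of Annie."]  # placeholder
train_dataset = Dataset.from_dict({"text": texts, "label": teacher.encode(texts).tolist()})

# MSELoss pulls the student's embeddings towards the teacher's
trainer = SentenceTransformerTrainer(
    model=student,
    train_dataset=train_dataset,
    loss=MSELoss(model=student),
)
trainer.train()
```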
## Model Details

### Model Description

- Model Type: Sentence Transformer
- Base model: [Qwen/Qwen3-Embedding-0.6B](https://huggingface.co/Qwen/Qwen3-Embedding-0.6B)
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset: nq
- Language: en
### Model Sources

- Documentation: [Sentence Transformers Documentation](https://sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: Qwen3Model
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': True, 'include_prompt': True})
  (2): Normalize()
)
```
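For intuition, the three modules amount to: run the text through the (truncated) Qwen3 decoder, take the hidden state of each sequence's final token, and L2-normalize it. Below is a rough plain-`transformers` equivalent of that pipeline, assuming right-padded batches (Sentence Transformers handles the padding details for you):

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("tomaarsen/Qwen3-Embedding-0.6B-18-layers")
model = AutoModel.from_pretrained("tomaarsen/Qwen3-Embedding-0.6B-18-layers")

batch = tokenizer(["The actress was thirteen."], return_tensors="pt", padding=True)
hidden = model(**batch).last_hidden_state            # (batch, seq_len, 1024)

# Last-token pooling: index of the final non-padding token (right padding assumed)
last = batch["attention_mask"].sum(dim=1) - 1
pooled = hidden[torch.arange(hidden.size(0)), last]  # (batch, 1024)

embedding = F.normalize(pooled, p=2, dim=1)          # the Normalize() module
```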
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("tomaarsen/Qwen3-Embedding-0.6B-18-layers")
# Run inference
sentences = [
    'The actress was thirteen when she was offered the role of Annie.',
    'Contrasting significantly from other soccer leagues in the U.S., WLS intends to be an open entry, promotion and relegation competition.',
    'Narsingh Temple is situated at the across of the village just across confluence of Magri State village.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 1024)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
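Note that the retrieval evaluation below prepends an instruction to every query. If you use this model for search, the same prompt can be applied at inference time via `encode`'s `prompt` argument, reusing the `model` loaded above; the query and passage strings here are placeholders:

```python
query_prompt = (
    "Instruct: Given a web search query, retrieve relevant passages that answer the query\n"
    "Query:"
)
query_embeddings = model.encode(
    ["who sings the song i don't want to work"],
    prompt=query_prompt,
)
document_embeddings = model.encode(["A candidate passage that might answer the query."])
print(model.similarity(query_embeddings, document_embeddings))
```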
## Evaluation

### Metrics

#### Information Retrieval

- Datasets: `NanoMSMARCO`, `NanoNFCorpus` and `NanoNQ`
- Evaluated with `InformationRetrievalEvaluator` with these parameters:
  ```json
  {
      "query_prompt": "Instruct: Given a web search query, retrieve relevant passages that answer the query\nQuery:"
  }
  ```
Metric | NanoMSMARCO | NanoNFCorpus | NanoNQ |
---|---|---|---|
cosine_accuracy@1 | 0.42 | 0.38 | 0.4 |
cosine_accuracy@3 | 0.64 | 0.44 | 0.72 |
cosine_accuracy@5 | 0.76 | 0.52 | 0.76 |
cosine_accuracy@10 | 0.82 | 0.66 | 0.82 |
cosine_precision@1 | 0.42 | 0.38 | 0.4 |
cosine_precision@3 | 0.2133 | 0.3133 | 0.2467 |
cosine_precision@5 | 0.152 | 0.292 | 0.16 |
cosine_precision@10 | 0.082 | 0.254 | 0.088 |
cosine_recall@1 | 0.42 | 0.0413 | 0.39 |
cosine_recall@3 | 0.64 | 0.0687 | 0.69 |
cosine_recall@5 | 0.76 | 0.0852 | 0.73 |
cosine_recall@10 | 0.82 | 0.1141 | 0.79 |
cosine_ndcg@10 | 0.6209 | 0.3043 | 0.6214 |
cosine_mrr@10 | 0.5568 | 0.4416 | 0.571 |
cosine_map@100 | 0.5664 | 0.1255 | 0.5648 |
#### Nano BEIR

- Dataset: `NanoBEIR_mean`
- Evaluated with `NanoBEIREvaluator` with these parameters:
  ```json
  {
      "dataset_names": ["msmarco", "nfcorpus", "nq"],
      "query_prompts": {
          "msmarco": "Instruct: Given a web search query, retrieve relevant passages that answer the query\nQuery:",
          "nfcorpus": "Instruct: Given a web search query, retrieve relevant passages that answer the query\nQuery:",
          "nq": "Instruct: Given a web search query, retrieve relevant passages that answer the query\nQuery:"
      }
  }
  ```
Metric | Value |
---|---|
cosine_accuracy@1 | 0.4 |
cosine_accuracy@3 | 0.6 |
cosine_accuracy@5 | 0.68 |
cosine_accuracy@10 | 0.7667 |
cosine_precision@1 | 0.4 |
cosine_precision@3 | 0.2578 |
cosine_precision@5 | 0.2013 |
cosine_precision@10 | 0.1413 |
cosine_recall@1 | 0.2838 |
cosine_recall@3 | 0.4662 |
cosine_recall@5 | 0.5251 |
cosine_recall@10 | 0.5747 |
cosine_ndcg@10 | 0.5155 |
cosine_mrr@10 | 0.5232 |
cosine_map@100 | 0.4189 |
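These aggregates can be reproduced by running `NanoBEIREvaluator` against the published model. A minimal sketch, assuming the result keys follow the `<dataset>_<metric>` naming used in the tables above:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

model = SentenceTransformer("tomaarsen/Qwen3-Embedding-0.6B-18-layers")

prompt = (
    "Instruct: Given a web search query, retrieve relevant passages that answer the query\n"
    "Query:"
)
evaluator = NanoBEIREvaluator(
    dataset_names=["msmarco", "nfcorpus", "nq"],
    query_prompts={name: prompt for name in ("msmarco", "nfcorpus", "nq")},
)
results = evaluator(model)
print(results["NanoBEIR_mean_cosine_ndcg@10"])  # assumed key format
```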
#### Knowledge Distillation

- Evaluated with `MSEEvaluator`
Metric | Value |
---|---|
negative_mse | -0.0168 |
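The reported value is the negative of the mean squared error between student and teacher embeddings on held-out texts (scaled by 100, following the evaluator's convention), so values closer to zero are better. A minimal sketch with a placeholder sentence:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import MSEEvaluator

teacher = SentenceTransformer("Qwen/Qwen3-Embedding-0.6B")
student = SentenceTransformer("tomaarsen/Qwen3-Embedding-0.6B-18-layers")

sentences = ["The actress was thirteen when she was offered the role of Annie."]  # placeholder
evaluator = MSEEvaluator(
    source_sentences=sentences,  # embedded by the teacher
    target_sentences=sentences,  # embedded by the student
    teacher_model=teacher,
)
print(evaluator(student))  # dict containing the negative MSE; higher is better
```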
## Training Details

### Training Dataset

#### nq

- Dataset: [nq](https://huggingface.co/datasets/sentence-transformers/natural-questions) at f9e894e
- Size: 197,462 training samples
- Columns: `text` and `label`
- Approximate statistics based on the first 1000 samples:
  | | text | label |
  |---|---|---|
  | type | string | list |
  | details | min: 27 tokens, mean: 89.38 tokens, max: 505 tokens | size: 1024 elements |
- Samples:
  | text | label |
  |---|---|
  | Instruct: Given a web search query, retrieve relevant passages that answer the query<br>Query:the movie bernie based on a true story | [-0.05126953125, -0.0020294189453125, 0.00152587890625, 0.060791015625, 0.022216796875, ...] |
  | College World Series The College World Series, or CWS, is an annual June baseball tournament held in Omaha, Nebraska. The CWS is the culmination of the National Collegiate Athletic Association (NCAA) Division I Baseball Championship tournament—featuring 64 teams in the first round—which determines the NCAA Division I college baseball champion. The eight participating teams are split into two, four-team, double-elimination brackets, with the winners of each bracket playing in a best-of-three championship series. | [0.033935546875, -0.0908203125, -0.010498046875, 0.0625, -0.01263427734375, ...] |
  | Instruct: Given a web search query, retrieve relevant passages that answer the query<br>Query:does the femoral nerve turn into the saphenous nerve | [0.052978515625, -0.0028228759765625, -0.0022430419921875, 0.0732421875, 0.044677734375, ...] |
- Loss: `MSELoss`
### Evaluation Datasets

#### nq

- Dataset: [nq](https://huggingface.co/datasets/sentence-transformers/natural-questions) at f9e894e
- Size: 3,000 evaluation samples
- Columns: `text` and `label`
- Approximate statistics based on the first 1000 samples:
  | | text | label |
  |---|---|---|
  | type | string | list |
  | details | min: 21 tokens, mean: 87.24 tokens, max: 410 tokens | size: 1024 elements |
- Samples:
  | text | label |
  |---|---|
  | Instruct: Given a web search query, retrieve relevant passages that answer the query<br>Query:who was the heir apparent of the austro-hungarian empire in 1914 | [0.0262451171875, 0.0556640625, -0.0, -0.03076171875, -0.05712890625, ...] |
  | Instruct: Given a web search query, retrieve relevant passages that answer the query<br>Query:who played tommy in coward of the county | [-0.00848388671875, -0.02294921875, -0.00182342529296875, 0.060546875, -0.021240234375, ...] |
  | Vertebra The vertebral arch is formed by pedicles and laminae. Two pedicles extend from the sides of the vertebral body to join the body to the arch. The pedicles are short thick processes that extend, one from each side, posteriorly, from the junctions of the posteriolateral surfaces of the centrum, on its upper surface. From each pedicle a broad plate, a lamina, projects backwards and medialwards to join and complete the vertebral arch and form the posterior border of the vertebral foramen, which completes the triangle of the vertebral foramen.[6] The upper surfaces of the laminae are rough to give attachment to the ligamenta flava. These ligaments connect the laminae of adjacent vertebra along the length of the spine from the level of the second cervical vertebra. Above and below the pedicles are shallow depressions called vertebral notches (superior and inferior). When the vertebrae articulate the notches align with those on adjacent vertebrae and these form the openings of the int... | [0.062255859375, -0.005706787109375, -0.009765625, 0.035400390625, -0.0125732421875, ...] |
- Loss: `MSELoss`
#### gooaq

- Dataset: [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) at b089f72
- Size: 3,000 evaluation samples
- Columns: `text` and `label`
- Approximate statistics based on the first 1000 samples:
  | | text | label |
  |---|---|---|
  | type | string | list |
  | details | min: 10 tokens, mean: 43.88 tokens, max: 117 tokens | size: 1024 elements |
- Samples:
  | text | label |
  |---|---|
  | Instruct: Given a web search query, retrieve relevant passages that answer the query<br>Query:what essential oils are soothing? | [-0.025146484375, 0.06591796875, -0.0025634765625, 0.0732421875, -0.046630859375, ...] |
  | Titles of books should be underlined or put in italics . (Titles of stories, essays and poems are in "quotation marks.") Refer to the text specifically as a novel, story, essay, memoir, or poem, depending on what it is. | [-0.006988525390625, -0.050537109375, -0.007476806640625, -0.07177734375, -0.049560546875, ...] |
  | Dakine Cyclone Wet/Dry 32L Backpack. Born from the legacy of our most iconic surf pack, the Cyclone Collection is a family of super-technical and durable wet/dry packs and bags. | [0.0016632080078125, 0.04150390625, -0.01324462890625, 0.0234375, 0.03173828125, ...] |
- Loss: `MSELoss`
#### wikipedia

- Dataset: [wikipedia](https://huggingface.co/datasets/sentence-transformers/wikipedia-en-sentences) at 4a0972d
- Size: 3,000 evaluation samples
- Columns: `text` and `label`
- Approximate statistics based on the first 1000 samples:
  | | text | label |
  |---|---|---|
  | type | string | list |
  | details | min: 5 tokens, mean: 28.1 tokens, max: 105 tokens | size: 1024 elements |
- Samples:
  | text | label |
  |---|---|
  | The daughter of Vice-admiral George Davies and Julia Hume, she spent her younger years on board the ship he was stationed, the Griper. | [0.0361328125, 0.01904296875, -0.003662109375, 0.0247802734375, 0.0140380859375, ...] |
  | The impetus for the project began when Amalgamated Dynamics, hired to provide the practical effects for The Thing, a prequel to John Carpenter's 1982 classic film-renowned for its almost exclusive use of practical effects-became disillusioned upon discovering the theatrical release had the bulk of their effects digitally replaced with computer-generated imagery. | [-0.0106201171875, -0.0439453125, -0.01104736328125, 0.00946044921875, 0.0322265625, ...] |
  | Lost Angeles, his second feature film, starring Joelle Carter and Kelly Blatz, had its world premiere at the Oldenburg International Film Festival in 2012. | [0.0272216796875, 0.0263671875, -0.007110595703125, 0.0294189453125, 0.01129150390625, ...] |
- Loss: `MSELoss`
### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `learning_rate`: 0.0001
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `bf16`: True
- `load_best_model_at_end`: True

These map directly onto `SentenceTransformerTrainingArguments`, as sketched below.
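A minimal sketch of expressing these overrides in code; `output_dir` is hypothetical, every other value is taken from the list above:

```python
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="models/qwen3-embedding-0.6b-18-layers",  # hypothetical path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    learning_rate=1e-4,
    num_train_epochs=1,
    warmup_ratio=0.1,
    bf16=True,
    load_best_model_at_end=True,
)
```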
#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 0.0001
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `tp_size`: 0
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional

</details>
### Training Logs
Epoch | Step | Training Loss | nq loss | gooaq loss | wikipedia loss | NanoMSMARCO_cosine_ndcg@10 | NanoNFCorpus_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 | negative_mse |
---|---|---|---|---|---|---|---|---|---|---|
-1 | -1 | - | - | - | - | 0.2033 | 0.0972 | 0.1638 | 0.1548 | -0.0985 |
0.0162 | 200 | 0.0008 | - | - | - | - | - | - | - | - |
0.0324 | 400 | 0.0004 | - | - | - | - | - | - | - | - |
0.0486 | 600 | 0.0003 | - | - | - | - | - | - | - | - |
0.0648 | 800 | 0.0003 | - | - | - | - | - | - | - | - |
0.0810 | 1000 | 0.0002 | 0.0002 | 0.0003 | 0.0003 | 0.5482 | 0.2864 | 0.5995 | 0.4780 | -0.0280 |
0.0972 | 1200 | 0.0002 | - | - | - | - | - | - | - | - |
0.1134 | 1400 | 0.0002 | - | - | - | - | - | - | - | - |
0.1296 | 1600 | 0.0002 | - | - | - | - | - | - | - | - |
0.1458 | 1800 | 0.0002 | - | - | - | - | - | - | - | - |
0.1620 | 2000 | 0.0002 | 0.0002 | 0.0003 | 0.0003 | 0.6136 | 0.2926 | 0.6028 | 0.5030 | -0.0218 |
0.1783 | 2200 | 0.0002 | - | - | - | - | - | - | - | - |
0.1945 | 2400 | 0.0001 | - | - | - | - | - | - | - | - |
0.2107 | 2600 | 0.0001 | - | - | - | - | - | - | - | - |
0.2269 | 2800 | 0.0001 | - | - | - | - | - | - | - | - |
0.2431 | 3000 | 0.0001 | 0.0001 | 0.0002 | 0.0002 | 0.6169 | 0.2990 | 0.5781 | 0.4980 | -0.0199 |
0.2593 | 3200 | 0.0001 | - | - | - | - | - | - | - | - |
0.2755 | 3400 | 0.0001 | - | - | - | - | - | - | - | - |
0.2917 | 3600 | 0.0001 | - | - | - | - | - | - | - | - |
0.3079 | 3800 | 0.0001 | - | - | - | - | - | - | - | - |
0.3241 | 4000 | 0.0001 | 0.0001 | 0.0002 | 0.0002 | 0.6137 | 0.3000 | 0.5987 | 0.5041 | -0.0187 |
0.3403 | 4200 | 0.0001 | - | - | - | - | - | - | - | - |
0.3565 | 4400 | 0.0001 | - | - | - | - | - | - | - | - |
0.3727 | 4600 | 0.0001 | - | - | - | - | - | - | - | - |
0.3889 | 4800 | 0.0001 | - | - | - | - | - | - | - | - |
0.4051 | 5000 | 0.0001 | 0.0001 | 0.0002 | 0.0002 | 0.6235 | 0.2945 | 0.6105 | 0.5095 | -0.0182 |
0.4213 | 5200 | 0.0001 | - | - | - | - | - | - | - | - |
0.4375 | 5400 | 0.0001 | - | - | - | - | - | - | - | - |
0.4537 | 5600 | 0.0001 | - | - | - | - | - | - | - | - |
0.4699 | 5800 | 0.0001 | - | - | - | - | - | - | - | - |
0.4861 | 6000 | 0.0001 | 0.0001 | 0.0002 | 0.0002 | 0.6183 | 0.2999 | 0.6141 | 0.5108 | -0.0175 |
0.5023 | 6200 | 0.0001 | - | - | - | - | - | - | - | - |
0.5186 | 6400 | 0.0001 | - | - | - | - | - | - | - | - |
0.5348 | 6600 | 0.0001 | - | - | - | - | - | - | - | - |
0.5510 | 6800 | 0.0001 | - | - | - | - | - | - | - | - |
0.5672 | 7000 | 0.0001 | 0.0001 | 0.0002 | 0.0002 | 0.6129 | 0.3005 | 0.6201 | 0.5112 | -0.0173 |
0.5834 | 7200 | 0.0001 | - | - | - | - | - | - | - | - |
0.5996 | 7400 | 0.0001 | - | - | - | - | - | - | - | - |
0.6158 | 7600 | 0.0001 | - | - | - | - | - | - | - | - |
0.6320 | 7800 | 0.0001 | - | - | - | - | - | - | - | - |
0.6482 | 8000 | 0.0001 | 0.0001 | 0.0002 | 0.0002 | 0.6258 | 0.3032 | 0.6099 | 0.5130 | -0.0170 |
0.6644 | 8200 | 0.0001 | - | - | - | - | - | - | - | - |
0.6806 | 8400 | 0.0001 | - | - | - | - | - | - | - | - |
0.6968 | 8600 | 0.0001 | - | - | - | - | - | - | - | - |
0.7130 | 8800 | 0.0001 | - | - | - | - | - | - | - | - |
**0.7292** | **9000** | **0.0001** | **0.0001** | **0.0002** | **0.0002** | **0.6209** | **0.3043** | **0.6214** | **0.5155** | **-0.0168** |
0.7454 | 9200 | 0.0001 | - | - | - | - | - | - | - | - |
0.7616 | 9400 | 0.0001 | - | - | - | - | - | - | - | - |
0.7778 | 9600 | 0.0001 | - | - | - | - | - | - | - | - |
0.7940 | 9800 | 0.0001 | - | - | - | - | - | - | - | - |
0.8102 | 10000 | 0.0001 | 0.0001 | 0.0002 | 0.0002 | 0.6224 | 0.3015 | 0.6183 | 0.5141 | -0.0168 |
0.8264 | 10200 | 0.0001 | - | - | - | - | - | - | - | - |
0.8427 | 10400 | 0.0001 | - | - | - | - | - | - | - | - |
0.8589 | 10600 | 0.0001 | - | - | - | - | - | - | - | - |
0.8751 | 10800 | 0.0001 | - | - | - | - | - | - | - | - |
0.8913 | 11000 | 0.0001 | 0.0001 | 0.0002 | 0.0002 | 0.6224 | 0.3014 | 0.6155 | 0.5131 | -0.0167 |
0.9075 | 11200 | 0.0001 | - | - | - | - | - | - | - | - |
0.9237 | 11400 | 0.0001 | - | - | - | - | - | - | - | - |
0.9399 | 11600 | 0.0001 | - | - | - | - | - | - | - | - |
0.9561 | 11800 | 0.0001 | - | - | - | - | - | - | - | - |
0.9723 | 12000 | 0.0001 | 0.0001 | 0.0002 | 0.0002 | 0.6247 | 0.3020 | 0.6133 | 0.5133 | -0.0167 |
0.9885 | 12200 | 0.0001 | - | - | - | - | - | - | - | - |
-1 | -1 | - | - | - | - | 0.6209 | 0.3043 | 0.6214 | 0.5155 | -0.0168 |
- The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.11.10
- Sentence Transformers: 4.2.0.dev0
- Transformers: 4.51.2
- PyTorch: 2.5.1+cu124
- Accelerate: 1.5.2
- Datasets: 3.5.0
- Tokenizers: 0.21.0
## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
#### MSELoss

```bibtex
@inproceedings{reimers-2020-multilingual-sentence-bert,
    title = "Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2020",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/2004.09813",
}
```