tags:
- ColBERT
- PyLate
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:3787118
- loss:Contrastive
base_model: answerdotai/ModernBERT-base
pipeline_tag: sentence-similarity
library_name: PyLate
metrics:
- accuracy
model-index:
- name: PyLate model based on answerdotai/ModernBERT-base
results:
- task:
type: col-berttriplet
name: Col BERTTriplet
dataset:
name: Unknown
type: unknown
metrics:
- type: accuracy
value: 0.5331999659538269
name: Accuracy
PyLate model based on answerdotai/ModernBERT-base
This is a PyLate model finetuned from answerdotai/ModernBERT-base. It maps sentences & paragraphs to sequences of 128-dimensional dense vectors and can be used for semantic textual similarity using the MaxSim operator.
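As a rough illustration of the MaxSim operator (not PyLate's internal implementation), the late-interaction score between a query and a document sums, over each query token embedding, its maximum dot product with any document token embedding. A minimal NumPy sketch, assuming two already-encoded token-embedding matrices:

import numpy as np

def maxsim(query_embeddings: np.ndarray, document_embeddings: np.ndarray) -> float:
    # query_embeddings: (num_query_tokens, 128), document_embeddings: (num_doc_tokens, 128)
    similarities = query_embeddings @ document_embeddings.T  # token-to-token dot products
    return float(similarities.max(axis=1).sum())             # best document match per query token, summed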
Model Details
Model Description
- Model Type: PyLate model
- Base model: answerdotai/ModernBERT-base
- Document Length: 180 tokens
- Query Length: 32 tokens
- Output Dimensionality: 128 dimensions
- Similarity Function: MaxSim
Model Sources
- Documentation: PyLate Documentation
- Repository: PyLate on GitHub
- Hugging Face: PyLate models on Hugging Face
Full Model Architecture
ColBERT(
(0): Transformer({'max_seq_length': 179, 'do_lower_case': False}) with Transformer model: ModernBertModel
(1): Dense({'in_features': 768, 'out_features': 128, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
)
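To make the two modules above concrete, here is a hedged toy sketch of the projection step: ModernBERT produces 768-dimensional token states, and the Dense module (a bias-free linear map with an identity activation) projects each token to 128 dimensions. The tensor shapes below are illustrative, not taken from PyLate internals:

import torch

dense = torch.nn.Linear(768, 128, bias=False)  # mirrors Dense({'in_features': 768, 'out_features': 128, 'bias': False})
token_states = torch.randn(180, 768)           # e.g. one document of 180 ModernBERT token states
token_embeddings = dense(token_states)         # -> shape (180, 128): one 128-dim vector per token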
Usage
First install the PyLate library:
pip install -U pylate
Retrieval
PyLate provides a streamlined interface to index and retrieve documents using ColBERT models. The index leverages the Voyager HNSW index to efficiently handle document embeddings and enable fast retrieval.
Indexing documents
First, load the ColBERT model and initialize the Voyager index, then encode and index your documents:
from pylate import indexes, models, retrieve

# Step 1: Load the ColBERT model
# pylate_model_id should be this model's Hugging Face Hub ID or a local path to it
model = models.ColBERT(
    model_name_or_path=pylate_model_id,
)

# Step 2: Initialize the Voyager index
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
    override=True,  # This overwrites the existing index if any
)

# Step 3: Encode the documents
documents_ids = ["1", "2", "3"]
documents = ["document 1 text", "document 2 text", "document 3 text"]

documents_embeddings = model.encode(
    documents,
    batch_size=32,
    is_query=False,  # Ensure that it is set to False to indicate that these are documents, not queries
    show_progress_bar=True,
)

# Step 4: Add document embeddings to the index by providing embeddings and corresponding ids
index.add_documents(
    documents_ids=documents_ids,
    documents_embeddings=documents_embeddings,
)
Note that you do not have to recreate the index and encode the documents every time. Once you have created an index and added the documents, you can re-use the index later by loading it:
# To load an index, simply instantiate it with the correct folder/name and without overriding it
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
)
Retrieving top-k documents for queries
Once the documents are indexed, you can retrieve the top-k most relevant documents for a given set of queries. To do so, initialize the ColBERT retriever with the index you want to search in, encode the queries, and then retrieve the top-k documents to get the top matching ids and relevance scores:
# Step 1: Initialize the ColBERT retriever
retriever = retrieve.ColBERT(index=index)

# Step 2: Encode the queries
queries_embeddings = model.encode(
    ["query for document 3", "query for document 1"],
    batch_size=32,
    is_query=True,  # Ensure that it is set to True to indicate that these are queries
    show_progress_bar=True,
)

# Step 3: Retrieve top-k documents
scores = retriever.retrieve(
    queries_embeddings=queries_embeddings,
    k=10,  # Retrieve the top 10 matches for each query
)
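The retrieve call returns, for each query, a ranked list of matches. In recent PyLate versions these are dictionaries holding a document id and a relevance score, but the exact structure may vary by version, so treat the snippet below as a hedged illustration:

# Print the top matches per query; verify the assumed {"id": ..., "score": ...} layout
# against your installed PyLate version.
for query, query_scores in zip(["query for document 3", "query for document 1"], scores):
    print(query)
    for match in query_scores:
        print("   ", match)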
Reranking
If you only want to use the ColBERT model to perform reranking on top of your first-stage retrieval pipeline without building an index, you can simply use the rank function and pass the queries and documents to rerank:
from pylate import rank, models

queries = [
    "query A",
    "query B",
]

documents = [
    ["document A", "document B"],
    ["document 1", "document C", "document B"],
]

documents_ids = [
    [1, 2],
    [1, 3, 2],
]

model = models.ColBERT(
    model_name_or_path=pylate_model_id,
)

queries_embeddings = model.encode(
    queries,
    is_query=True,
)

documents_embeddings = model.encode(
    documents,
    is_query=False,
)

reranked_documents = rank.rerank(
    documents_ids=documents_ids,
    queries_embeddings=queries_embeddings,
    documents_embeddings=documents_embeddings,
)
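rank.rerank returns, for each query, its candidate documents reordered by MaxSim relevance. The assumed per-document fields below (a document id and a score) mirror the retrieval output but are not guaranteed across versions, so this is only a hedged illustration:

# Inspect the reranked candidates for each query
for query, ranking in zip(queries, reranked_documents):
    print(query)
    for entry in ranking:
        print("   ", entry)  # expected to contain the document id and its score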
Evaluation
Metrics
ColBERTTriplet
- Evaluated with pylate.evaluation.colbert_triplet.ColBERTTripletEvaluator
Metric | Value |
---|---|
accuracy | 0.5332 |
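Triplet accuracy here is the fraction of (question, answer, negative) triplets for which the query scores the positive passage above the negative one under MaxSim. A minimal, hedged re-computation sketch (the maxsim and triplet_accuracy helpers are hypothetical, not part of PyLate's public API):

import numpy as np

def maxsim(query_embeddings: np.ndarray, document_embeddings: np.ndarray) -> float:
    # Sum over query tokens of the best-matching document token similarity
    return float((query_embeddings @ document_embeddings.T).max(axis=1).sum())

def triplet_accuracy(queries, positives, negatives, model) -> float:
    # Encode each side, then count how often the positive outscores the negative
    q = model.encode(queries, is_query=True)
    p = model.encode(positives, is_query=False)
    n = model.encode(negatives, is_query=False)
    correct = sum(maxsim(qe, pe) > maxsim(qe, ne) for qe, pe, ne in zip(q, p, n))
    return correct / len(queries)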
Training Details
Training Dataset
Unnamed Dataset
- Size: 3,787,118 training samples
- Columns: question, answer, and negative
- Approximate statistics based on the first 1000 samples:

 | question | answer | negative |
---|---|---|---|
type | string | string | string |
details | min: 9 tokens, mean: 13.17 tokens, max: 21 tokens | min: 17 tokens, mean: 31.79 tokens, max: 32 tokens | min: 15 tokens, mean: 31.69 tokens, max: 32 tokens |
- Samples:

question | answer | negative |
---|---|---|
what does it mean when you dream you are a vampire? | Dream Bible - Dream Interpretation of Vampires. To see a vampire in your dream symbolizes an aspect of your personality that is parasitic or selfishly feeds off others. A person or situation that drains you of time, energy, or resources. ... To dream of being a vampire represents a selfish need to use or feed off others. | Many people believe that witches, warlocks, vampires, werewolves are necessarily seen as a sign of something bad. ... If you dream that a vampire is chasing you, but it is not able to catch you, it means that you are afraid of someone, but your fear is groundless, this person will not hurt you. |
what does it mean when you dream you are a vampire? | Dream Bible - Dream Interpretation of Vampires. To see a vampire in your dream symbolizes an aspect of your personality that is parasitic or selfishly feeds off others. A person or situation that drains you of time, energy, or resources. ... To dream of being a vampire represents a selfish need to use or feed off others. | Blood on the ground in a dream indicates that the dreamer should be careful of unusual or new friendships. ... To lose blood in your dream represents that you may be tired in your waking life and that you feel emotional. If you dream you are in the hospital and you see blood, it means that past actions may haunt you. |
what is the difference between thrombosis and atherosclerosis? | Arterial thrombosis usually affects people whose arteries are clogged with fatty deposits. This is known as atherosclerosis. These deposits cause the arteries to harden and narrow over time and increase the risk of blood clots. | Advertisement. Atherosclerosis is a specific type of arteriosclerosis, but the terms are sometimes used interchangeably. Atherosclerosis refers to the buildup of fats, cholesterol and other substances in and on your artery walls (plaque), which can restrict blood flow. |
- Loss: pylate.losses.contrastive.Contrastive
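The Contrastive loss trains the model to score each question's own answer above the other passages it sees (the provided negative and, typically, the in-batch negatives), usually via a cross-entropy over MaxSim scores. The formulation below is a common, hedged sketch and may differ from pylate.losses.contrastive.Contrastive in its details:

import torch
import torch.nn.functional as F

def contrastive_loss(scores: torch.Tensor) -> torch.Tensor:
    # scores[i, j] = MaxSim(question_i, passage_j); passage_i is the positive for question_i,
    # every other column (answers of other questions and the mined negatives) acts as a negative
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)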
Evaluation Dataset
Unnamed Dataset
- Size: 5,000 evaluation samples
- Columns: question, answer, and negative_1
- Approximate statistics based on the first 1000 samples:

 | question | answer | negative_1 |
---|---|---|---|
type | string | string | string |
details | min: 9 tokens, mean: 12.96 tokens, max: 24 tokens | min: 16 tokens, mean: 31.73 tokens, max: 32 tokens | min: 14 tokens, mean: 31.43 tokens, max: 32 tokens |
- Samples:
question answer negative_1 are upc codes and barcodes the same?
The UPC-A barcode was the original format for product barcodes. ... The only major difference is the placement of the numbers below (human readable numbers) which are there only as a back-up in case the barcode doesn't scan properly and the information has to be manually entered into the point of sale system.
Barcodes do not support any of the mentioned attributes. Thus, QR Codes will not be replaced by barcodes.
what does it mean when it says your application is under review?
"Under review" is a phrase that typically means your application is being screened by human resources or the hiring manager. "Applicants being selected" indicates that hiring managers are selecting candidates for interviews. "Referred to hiring manager" means your application has passed initial HR screening.
There are two different types of review: In Review and Under Review. If your verification status is In Review, this means that your ID uploaded successfully and your submission is being reviewed automatically. Automated reviews typically take anywhere from 5 minutes to 2 hours.
when was the last time the kansas city chiefs won a playoff game?
The Chiefs were overwhelmed by the Bills and lost the game by a score of 30–13. The Chiefs' victory on January 16, 1994, against the Oilers remained the franchise's last post-season victory for 21 years until their 30–0 victory over the Houston Texans on January 9, 2016.
The Broncos have not beaten the Chiefs since a Week 2 game in Kansas City in 2015. Kansas City's streak is the second longest in the series' history. Only the Chiefs' 11-game winning streak from 1964 to 1969 lasted longer.
- Loss: pylate.losses.contrastive.Contrastive
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 128
- learning_rate: 3e-06
- num_train_epochs: 5
- warmup_ratio: 0.1
- seed: 12
- bf16: True
- dataloader_num_workers: 12
- load_best_model_at_end: True
All Hyperparameters
Click to expand
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 128
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 3e-06
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 5
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 12
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 12
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: True
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
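For readers who want to reproduce a comparable run, the sketch below shows how the non-default hyperparameters above could be passed to a Sentence Transformers training loop with PyLate's Contrastive loss. The dataset objects are placeholders (the training data in this card is unnamed), and the exact trainer wiring should be checked against the PyLate documentation:

from sentence_transformers import SentenceTransformerTrainer, SentenceTransformerTrainingArguments
from pylate import losses, models

model = models.ColBERT(model_name_or_path="answerdotai/ModernBERT-base")

args = SentenceTransformerTrainingArguments(
    output_dir="output",
    num_train_epochs=5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=3e-6,
    warmup_ratio=0.1,
    seed=12,
    bf16=True,
    dataloader_num_workers=12,
    eval_strategy="steps",
    load_best_model_at_end=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # placeholder: a dataset with question/answer/negative columns
    eval_dataset=eval_dataset,    # placeholder: the 5,000-sample evaluation split
    loss=losses.Contrastive(model=model),
)
trainer.train()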
Training Logs
Click to expand
Epoch | Step | Training Loss | Validation Loss | accuracy |
---|---|---|---|---|
0 | 0 | - | - | 0.4392 |
0.0000 | 1 | 22.1457 | - | - |
0.0068 | 200 | 19.4909 | - | - |
0.0135 | 400 | 13.7828 | - | - |
0.0203 | 600 | 9.9129 | - | - |
0.0270 | 800 | 7.6419 | - | - |
0.0338 | 1000 | 6.3482 | - | - |
0.0406 | 1200 | 5.7019 | - | - |
0.0473 | 1400 | 5.2986 | - | - |
0.0541 | 1600 | 4.741 | - | - |
0.0608 | 1800 | 3.7375 | - | - |
0.0676 | 2000 | 2.7031 | - | - |
0.0744 | 2200 | 2.0178 | - | - |
0.0811 | 2400 | 1.6433 | - | - |
0.0879 | 2600 | 1.3954 | - | - |
0.0946 | 2800 | 1.2457 | - | - |
0.1014 | 3000 | 1.1331 | - | - |
0.1082 | 3200 | 1.0455 | - | - |
0.1149 | 3400 | 0.9814 | - | - |
0.1217 | 3600 | 0.9191 | - | - |
0.1284 | 3800 | 0.8788 | - | - |
0.1352 | 4000 | 0.8551 | - | - |
0.1420 | 4200 | 0.8079 | - | - |
0.1487 | 4400 | 0.7789 | - | - |
0.1555 | 4600 | 0.7448 | - | - |
0.1622 | 4800 | 0.7439 | - | - |
0.1690 | 5000 | 0.7255 | - | - |
0.1758 | 5200 | 0.6902 | - | - |
0.1825 | 5400 | 0.6737 | - | - |
0.1893 | 5600 | 0.6621 | - | - |
0.1960 | 5800 | 0.6438 | - | - |
0.2028 | 6000 | 0.6099 | - | - |
0.2096 | 6200 | 0.6171 | - | - |
0.2163 | 6400 | 0.5861 | - | - |
0.2231 | 6600 | 0.5788 | - | - |
0.2298 | 6800 | 0.5736 | - | - |
0.2366 | 7000 | 0.5608 | - | - |
0.2434 | 7200 | 0.5457 | - | - |
0.2501 | 7400 | 0.5339 | - | - |
0.2569 | 7600 | 0.5046 | - | - |
0.2636 | 7800 | 0.513 | - | - |
0.2704 | 8000 | 0.5087 | - | - |
0.2771 | 8200 | 0.4826 | - | - |
0.2839 | 8400 | 0.4836 | - | - |
0.2907 | 8600 | 0.4761 | - | - |
0.2974 | 8800 | 0.4655 | - | - |
0.3042 | 9000 | 0.4421 | - | - |
0.3109 | 9200 | 0.4436 | - | - |
0.3177 | 9400 | 0.4418 | - | - |
0.3245 | 9600 | 0.4272 | - | - |
0.3312 | 9800 | 0.4334 | - | - |
0.3380 | 10000 | 0.417 | - | - |
0.3447 | 10200 | 0.4033 | - | - |
0.3515 | 10400 | 0.4008 | - | - |
0.3583 | 10600 | 0.3986 | - | - |
0.3650 | 10800 | 0.3865 | - | - |
0.3718 | 11000 | 0.3846 | - | - |
0.3785 | 11200 | 0.3803 | - | - |
0.3853 | 11400 | 0.373 | - | - |
0.3921 | 11600 | 0.3741 | - | - |
0.3988 | 11800 | 0.3588 | - | - |
0.4056 | 12000 | 0.3568 | - | - |
0.4123 | 12200 | 0.3486 | - | - |
0.4191 | 12400 | 0.3439 | - | - |
0.4259 | 12600 | 0.3373 | - | - |
0.4326 | 12800 | 0.3396 | - | - |
0.4394 | 13000 | 0.3283 | - | - |
0.4461 | 13200 | 0.3267 | - | - |
0.4529 | 13400 | 0.3235 | - | - |
0.4597 | 13600 | 0.3058 | - | - |
0.4664 | 13800 | 0.3059 | - | - |
0.4732 | 14000 | 0.3071 | - | - |
0.4799 | 14200 | 0.299 | - | - |
0.4867 | 14400 | 0.2963 | - | - |
0.4935 | 14600 | 0.2941 | - | - |
0.5002 | 14800 | 0.286 | - | - |
0.5070 | 15000 | 0.2894 | - | - |
0.5137 | 15200 | 0.276 | - | - |
0.5205 | 15400 | 0.2741 | - | - |
0.5273 | 15600 | 0.2772 | - | - |
0.5340 | 15800 | 0.2751 | - | - |
0.5408 | 16000 | 0.2613 | - | - |
0.5475 | 16200 | 0.2621 | - | - |
0.5543 | 16400 | 0.2615 | - | - |
0.5611 | 16600 | 0.2642 | - | - |
0.5678 | 16800 | 0.2527 | - | - |
0.5746 | 17000 | 0.248 | - | - |
0.5813 | 17200 | 0.2453 | - | - |
0.5881 | 17400 | 0.2478 | - | - |
0.5949 | 17600 | 0.2469 | - | - |
0.6016 | 17800 | 0.2478 | - | - |
0.6084 | 18000 | 0.2443 | - | - |
0.6151 | 18200 | 0.2473 | - | - |
0.6219 | 18400 | 0.2394 | - | - |
0.6287 | 18600 | 0.2343 | - | - |
0.6354 | 18800 | 0.2368 | - | - |
0.6422 | 19000 | 0.2317 | - | - |
0.6489 | 19200 | 0.2337 | - | - |
0.6557 | 19400 | 0.2325 | - | - |
0.6625 | 19600 | 0.2272 | - | - |
0.6692 | 19800 | 0.2303 | - | - |
0.6760 | 20000 | 0.2191 | - | - |
0 | 0 | - | - | 0.5022 |
0.6760 | 20000 | - | 1.0284 | - |
0.6827 | 20200 | 0.2258 | - | - |
0.6895 | 20400 | 0.2173 | - | - |
0.6963 | 20600 | 0.2183 | - | - |
0.7030 | 20800 | 0.2117 | - | - |
0.7098 | 21000 | 0.208 | - | - |
0.7165 | 21200 | 0.2173 | - | - |
0.7233 | 21400 | 0.2129 | - | - |
0.7301 | 21600 | 0.2092 | - | - |
0.7368 | 21800 | 0.212 | - | - |
0.7436 | 22000 | 0.2076 | - | - |
0.7503 | 22200 | 0.2113 | - | - |
0.7571 | 22400 | 0.2027 | - | - |
0.7638 | 22600 | 0.2069 | - | - |
0.7706 | 22800 | 0.2015 | - | - |
0.7774 | 23000 | 0.2041 | - | - |
0.7841 | 23200 | 0.2059 | - | - |
0.7909 | 23400 | 0.2073 | - | - |
0.7976 | 23600 | 0.2018 | - | - |
0.8044 | 23800 | 0.2006 | - | - |
0.8112 | 24000 | 0.1998 | - | - |
0.8179 | 24200 | 0.2009 | - | - |
0.8247 | 24400 | 0.1971 | - | - |
0.8314 | 24600 | 0.1956 | - | - |
0.8382 | 24800 | 0.1943 | - | - |
0.8450 | 25000 | 0.1876 | - | - |
0.8517 | 25200 | 0.2 | - | - |
0.8585 | 25400 | 0.1892 | - | - |
0.8652 | 25600 | 0.1923 | - | - |
0.8720 | 25800 | 0.1879 | - | - |
0.8788 | 26000 | 0.1874 | - | - |
0.8855 | 26200 | 0.1868 | - | - |
0.8923 | 26400 | 0.1853 | - | - |
0.8990 | 26600 | 0.1869 | - | - |
0.9058 | 26800 | 0.1879 | - | - |
0.9126 | 27000 | 0.1862 | - | - |
0.9193 | 27200 | 0.1891 | - | - |
0.9261 | 27400 | 0.1793 | - | - |
0.9328 | 27600 | 0.1786 | - | - |
0.9396 | 27800 | 0.1736 | - | - |
0.9464 | 28000 | 0.1787 | - | - |
0.9531 | 28200 | 0.1874 | - | - |
0.9599 | 28400 | 0.1798 | - | - |
0.9666 | 28600 | 0.1818 | - | - |
0.9734 | 28800 | 0.1799 | - | - |
0.9802 | 29000 | 0.1763 | - | - |
0.9869 | 29200 | 0.1758 | - | - |
0.9937 | 29400 | 0.1799 | - | - |
1.0004 | 29600 | 0.1737 | - | - |
1.0072 | 29800 | 0.17 | - | - |
1.0140 | 30000 | 0.1671 | - | - |
1.0207 | 30200 | 0.168 | - | - |
1.0275 | 30400 | 0.1703 | - | - |
1.0342 | 30600 | 0.1611 | - | - |
1.0410 | 30800 | 0.1629 | - | - |
1.0478 | 31000 | 0.1621 | - | - |
1.0545 | 31200 | 0.1612 | - | - |
1.0613 | 31400 | 0.1607 | - | - |
1.0680 | 31600 | 0.1623 | - | - |
1.0748 | 31800 | 0.1659 | - | - |
1.0816 | 32000 | 0.1619 | - | - |
1.0883 | 32200 | 0.1595 | - | - |
1.0951 | 32400 | 0.1649 | - | - |
1.1018 | 32600 | 0.16 | - | - |
1.1086 | 32800 | 0.1573 | - | - |
1.1154 | 33000 | 0.1624 | - | - |
1.1221 | 33200 | 0.159 | - | - |
1.1289 | 33400 | 0.1596 | - | - |
1.1356 | 33600 | 0.1573 | - | - |
1.1424 | 33800 | 0.1544 | - | - |
1.1492 | 34000 | 0.1512 | - | - |
1.1559 | 34200 | 0.1592 | - | - |
1.1627 | 34400 | 0.1602 | - | - |
1.1694 | 34600 | 0.1562 | - | - |
1.1762 | 34800 | 0.1531 | - | - |
1.1830 | 35000 | 0.1504 | - | - |
1.1897 | 35200 | 0.1533 | - | - |
1.1965 | 35400 | 0.1578 | - | - |
1.2032 | 35600 | 0.155 | - | - |
1.2100 | 35800 | 0.1539 | - | - |
1.2168 | 36000 | 0.1514 | - | - |
1.2235 | 36200 | 0.157 | - | - |
1.2303 | 36400 | 0.1586 | - | - |
1.2370 | 36600 | 0.151 | - | - |
1.2438 | 36800 | 0.1522 | - | - |
1.2505 | 37000 | 0.1521 | - | - |
1.2573 | 37200 | 0.1457 | - | - |
1.2641 | 37400 | 0.1465 | - | - |
1.2708 | 37600 | 0.1513 | - | - |
1.2776 | 37800 | 0.1478 | - | - |
1.2843 | 38000 | 0.1529 | - | - |
1.2911 | 38200 | 0.1486 | - | - |
1.2979 | 38400 | 0.149 | - | - |
1.3046 | 38600 | 0.1516 | - | - |
1.3114 | 38800 | 0.1486 | - | - |
1.3181 | 39000 | 0.1423 | - | - |
1.3249 | 39200 | 0.1479 | - | - |
1.3317 | 39400 | 0.1499 | - | - |
1.3384 | 39600 | 0.1455 | - | - |
1.3452 | 39800 | 0.1482 | - | - |
1.3519 | 40000 | 0.1471 | - | - |
0 | 0 | - | - | 0.5316 |
1.3519 | 40000 | - | 0.9581 | - |
1.3587 | 40200 | 0.1504 | - | - |
1.3655 | 40400 | 0.1494 | - | - |
1.3722 | 40600 | 0.147 | - | - |
1.3790 | 40800 | 0.1457 | - | - |
1.3857 | 41000 | 0.1494 | - | - |
1.3925 | 41200 | 0.1428 | - | - |
1.3993 | 41400 | 0.146 | - | - |
1.4060 | 41600 | 0.1487 | - | - |
1.4128 | 41800 | 0.1406 | - | - |
1.4195 | 42000 | 0.1483 | - | - |
1.4263 | 42200 | 0.1417 | - | - |
1.4331 | 42400 | 0.1408 | - | - |
1.4398 | 42600 | 0.1493 | - | - |
1.4466 | 42800 | 0.1465 | - | - |
1.4533 | 43000 | 0.1423 | - | - |
1.4601 | 43200 | 0.1438 | - | - |
1.4669 | 43400 | 0.1432 | - | - |
1.4736 | 43600 | 0.1426 | - | - |
1.4804 | 43800 | 0.1387 | - | - |
1.4871 | 44000 | 0.1455 | - | - |
1.4939 | 44200 | 0.1405 | - | - |
1.5007 | 44400 | 0.1391 | - | - |
1.5074 | 44600 | 0.1433 | - | - |
1.5142 | 44800 | 0.1424 | - | - |
1.5209 | 45000 | 0.1397 | - | - |
1.5277 | 45200 | 0.1391 | - | - |
1.5345 | 45400 | 0.1464 | - | - |
1.5412 | 45600 | 0.1351 | - | - |
1.5480 | 45800 | 0.1362 | - | - |
1.5547 | 46000 | 0.1366 | - | - |
1.5615 | 46200 | 0.1402 | - | - |
1.5683 | 46400 | 0.1339 | - | - |
1.5750 | 46600 | 0.1348 | - | - |
1.5818 | 46800 | 0.1394 | - | - |
1.5885 | 47000 | 0.1413 | - | - |
1.5953 | 47200 | 0.1394 | - | - |
1.6021 | 47400 | 0.14 | - | - |
1.6088 | 47600 | 0.138 | - | - |
1.6156 | 47800 | 0.1387 | - | - |
1.6223 | 48000 | 0.14 | - | - |
1.6291 | 48200 | 0.1372 | - | - |
1.6359 | 48400 | 0.1379 | - | - |
1.6426 | 48600 | 0.1332 | - | - |
1.6494 | 48800 | 0.1341 | - | - |
1.6561 | 49000 | 0.1319 | - | - |
1.6629 | 49200 | 0.1363 | - | - |
1.6697 | 49400 | 0.1327 | - | - |
1.6764 | 49600 | 0.1297 | - | - |
1.6832 | 49800 | 0.1345 | - | - |
1.6899 | 50000 | 0.1362 | - | - |
1.6967 | 50200 | 0.1323 | - | - |
1.7035 | 50400 | 0.1351 | - | - |
1.7102 | 50600 | 0.1334 | - | - |
1.7170 | 50800 | 0.136 | - | - |
1.7237 | 51000 | 0.1353 | - | - |
1.7305 | 51200 | 0.1271 | - | - |
1.7372 | 51400 | 0.1345 | - | - |
1.7440 | 51600 | 0.1347 | - | - |
1.7508 | 51800 | 0.1254 | - | - |
1.7575 | 52000 | 0.1308 | - | - |
1.7643 | 52200 | 0.132 | - | - |
1.7710 | 52400 | 0.1275 | - | - |
1.7778 | 52600 | 0.1306 | - | - |
1.7846 | 52800 | 0.1277 | - | - |
1.7913 | 53000 | 0.1304 | - | - |
1.7981 | 53200 | 0.1319 | - | - |
1.8048 | 53400 | 0.1339 | - | - |
1.8116 | 53600 | 0.1332 | - | - |
1.8184 | 53800 | 0.1293 | - | - |
1.8251 | 54000 | 0.13 | - | - |
1.8319 | 54200 | 0.1333 | - | - |
1.8386 | 54400 | 0.1273 | - | - |
1.8454 | 54600 | 0.1296 | - | - |
1.8522 | 54800 | 0.125 | - | - |
1.8589 | 55000 | 0.1332 | - | - |
1.8657 | 55200 | 0.1288 | - | - |
1.8724 | 55400 | 0.1233 | - | - |
1.8792 | 55600 | 0.1282 | - | - |
1.8860 | 55800 | 0.1341 | - | - |
1.8927 | 56000 | 0.1275 | - | - |
1.8995 | 56200 | 0.1286 | - | - |
1.9062 | 56400 | 0.1279 | - | - |
1.9130 | 56600 | 0.126 | - | - |
1.9198 | 56800 | 0.1306 | - | - |
1.9265 | 57000 | 0.1289 | - | - |
1.9333 | 57200 | 0.1219 | - | - |
1.9400 | 57400 | 0.1271 | - | - |
1.9468 | 57600 | 0.1262 | - | - |
1.9536 | 57800 | 0.1253 | - | - |
1.9603 | 58000 | 0.1283 | - | - |
1.9671 | 58200 | 0.1314 | - | - |
1.9738 | 58400 | 0.1237 | - | - |
1.9806 | 58600 | 0.1239 | - | - |
1.9874 | 58800 | 0.1242 | - | - |
1.9941 | 59000 | 0.1257 | - | - |
2.0009 | 59200 | 0.1228 | - | - |
2.0076 | 59400 | 0.1106 | - | - |
2.0144 | 59600 | 0.1143 | - | - |
2.0212 | 59800 | 0.1132 | - | - |
2.0279 | 60000 | 0.1122 | - | - |
0 | 0 | - | - | 0.5386 |
2.0279 | 60000 | - | 1.0542 | - |
2.0347 | 60200 | 0.116 | - | - |
2.0414 | 60400 | 0.1081 | - | - |
2.0482 | 60600 | 0.1134 | - | - |
2.0550 | 60800 | 0.1139 | - | - |
2.0617 | 61000 | 0.106 | - | - |
2.0685 | 61200 | 0.1131 | - | - |
2.0752 | 61400 | 0.1121 | - | - |
2.0820 | 61600 | 0.1077 | - | - |
2.0888 | 61800 | 0.112 | - | - |
2.0955 | 62000 | 0.1087 | - | - |
2.1023 | 62200 | 0.1155 | - | - |
2.1090 | 62400 | 0.1118 | - | - |
2.1158 | 62600 | 0.1107 | - | - |
2.1226 | 62800 | 0.1067 | - | - |
2.1293 | 63000 | 0.1079 | - | - |
2.1361 | 63200 | 0.1094 | - | - |
2.1428 | 63400 | 0.1114 | - | - |
2.1496 | 63600 | 0.1094 | - | - |
2.1564 | 63800 | 0.1056 | - | - |
2.1631 | 64000 | 0.1095 | - | - |
2.1699 | 64200 | 0.1074 | - | - |
2.1766 | 64400 | 0.107 | - | - |
2.1834 | 64600 | 0.1124 | - | - |
2.1902 | 64800 | 0.1121 | - | - |
2.1969 | 65000 | 0.1115 | - | - |
2.2037 | 65200 | 0.109 | - | - |
2.2104 | 65400 | 0.1158 | - | - |
2.2172 | 65600 | 0.1068 | - | - |
2.2239 | 65800 | 0.1095 | - | - |
2.2307 | 66000 | 0.1062 | - | - |
2.2375 | 66200 | 0.1063 | - | - |
2.2442 | 66400 | 0.108 | - | - |
2.2510 | 66600 | 0.1128 | - | - |
2.2577 | 66800 | 0.1082 | - | - |
2.2645 | 67000 | 0.1066 | - | - |
2.2713 | 67200 | 0.1101 | - | - |
2.2780 | 67400 | 0.1072 | - | - |
2.2848 | 67600 | 0.109 | - | - |
2.2915 | 67800 | 0.1069 | - | - |
2.2983 | 68000 | 0.1136 | - | - |
2.3051 | 68200 | 0.1108 | - | - |
2.3118 | 68400 | 0.109 | - | - |
2.3186 | 68600 | 0.1139 | - | - |
2.3253 | 68800 | 0.1081 | - | - |
2.3321 | 69000 | 0.1082 | - | - |
2.3389 | 69200 | 0.1064 | - | - |
2.3456 | 69400 | 0.1064 | - | - |
2.3524 | 69600 | 0.1084 | - | - |
2.3591 | 69800 | 0.1058 | - | - |
2.3659 | 70000 | 0.1109 | - | - |
2.3727 | 70200 | 0.105 | - | - |
2.3794 | 70400 | 0.1096 | - | - |
2.3862 | 70600 | 0.1064 | - | - |
2.3929 | 70800 | 0.1079 | - | - |
2.3997 | 71000 | 0.1038 | - | - |
2.4065 | 71200 | 0.1072 | - | - |
2.4132 | 71400 | 0.1097 | - | - |
2.4200 | 71600 | 0.1098 | - | - |
2.4267 | 71800 | 0.1067 | - | - |
2.4335 | 72000 | 0.1098 | - | - |
2.4403 | 72200 | 0.1088 | - | - |
2.4470 | 72400 | 0.1031 | - | - |
2.4538 | 72600 | 0.1034 | - | - |
2.4605 | 72800 | 0.1046 | - | - |
2.4673 | 73000 | 0.1065 | - | - |
2.4741 | 73200 | 0.1062 | - | - |
2.4808 | 73400 | 0.1064 | - | - |
2.4876 | 73600 | 0.1053 | - | - |
2.4943 | 73800 | 0.1055 | - | - |
2.5011 | 74000 | 0.1057 | - | - |
2.5079 | 74200 | 0.1095 | - | - |
2.5146 | 74400 | 0.1038 | - | - |
2.5214 | 74600 | 0.1025 | - | - |
2.5281 | 74800 | 0.1032 | - | - |
2.5349 | 75000 | 0.1036 | - | - |
2.5417 | 75200 | 0.1021 | - | - |
2.5484 | 75400 | 0.1066 | - | - |
2.5552 | 75600 | 0.1054 | - | - |
2.5619 | 75800 | 0.1028 | - | - |
2.5687 | 76000 | 0.1066 | - | - |
2.5755 | 76200 | 0.1079 | - | - |
2.5822 | 76400 | 0.1083 | - | - |
2.5890 | 76600 | 0.1065 | - | - |
2.5957 | 76800 | 0.1072 | - | - |
2.6025 | 77000 | 0.1031 | - | - |
2.6093 | 77200 | 0.1022 | - | - |
2.6160 | 77400 | 0.1084 | - | - |
2.6228 | 77600 | 0.1068 | - | - |
2.6295 | 77800 | 0.1009 | - | - |
2.6363 | 78000 | 0.098 | - | - |
2.6431 | 78200 | 0.1019 | - | - |
2.6498 | 78400 | 0.1038 | - | - |
2.6566 | 78600 | 0.1032 | - | - |
2.6633 | 78800 | 0.1059 | - | - |
2.6701 | 79000 | 0.1058 | - | - |
2.6769 | 79200 | 0.1049 | - | - |
2.6836 | 79400 | 0.1023 | - | - |
2.6904 | 79600 | 0.1079 | - | - |
2.6971 | 79800 | 0.1061 | - | - |
2.7039 | 80000 | 0.1078 | - | - |
0 | 0 | - | - | 0.5402 |
2.7039 | 80000 | - | 1.0363 | - |
2.7106 | 80200 | 0.1075 | - | - |
2.7174 | 80400 | 0.105 | - | - |
2.7242 | 80600 | 0.1053 | - | - |
2.7309 | 80800 | 0.1007 | - | - |
2.7377 | 81000 | 0.1009 | - | - |
2.7444 | 81200 | 0.1022 | - | - |
2.7512 | 81400 | 0.1017 | - | - |
2.7580 | 81600 | 0.1013 | - | - |
2.7647 | 81800 | 0.1044 | - | - |
2.7715 | 82000 | 0.1045 | - | - |
2.7782 | 82200 | 0.1003 | - | - |
2.7850 | 82400 | 0.1038 | - | - |
2.7918 | 82600 | 0.1049 | - | - |
2.7985 | 82800 | 0.1026 | - | - |
2.8053 | 83000 | 0.1082 | - | - |
2.8120 | 83200 | 0.1033 | - | - |
2.8188 | 83400 | 0.1044 | - | - |
2.8256 | 83600 | 0.1013 | - | - |
2.8323 | 83800 | 0.1057 | - | - |
2.8391 | 84000 | 0.0992 | - | - |
2.8458 | 84200 | 0.0973 | - | - |
2.8526 | 84400 | 0.102 | - | - |
2.8594 | 84600 | 0.1048 | - | - |
2.8661 | 84800 | 0.1013 | - | - |
2.8729 | 85000 | 0.1038 | - | - |
2.8796 | 85200 | 0.1027 | - | - |
2.8864 | 85400 | 0.1037 | - | - |
2.8932 | 85600 | 0.1007 | - | - |
2.8999 | 85800 | 0.1033 | - | - |
2.9067 | 86000 | 0.1016 | - | - |
2.9134 | 86200 | 0.1051 | - | - |
2.9202 | 86400 | 0.1001 | - | - |
2.9270 | 86600 | 0.1003 | - | - |
2.9337 | 86800 | 0.0969 | - | - |
2.9405 | 87000 | 0.1056 | - | - |
2.9472 | 87200 | 0.0997 | - | - |
2.9540 | 87400 | 0.1 | - | - |
2.9608 | 87600 | 0.1008 | - | - |
2.9675 | 87800 | 0.1003 | - | - |
2.9743 | 88000 | 0.0991 | - | - |
2.9810 | 88200 | 0.1022 | - | - |
2.9878 | 88400 | 0.1012 | - | - |
2.9946 | 88600 | 0.0992 | - | - |
3.0013 | 88800 | 0.1002 | - | - |
3.0081 | 89000 | 0.0885 | - | - |
3.0148 | 89200 | 0.0918 | - | - |
3.0216 | 89400 | 0.0888 | - | - |
3.0284 | 89600 | 0.088 | - | - |
3.0351 | 89800 | 0.0893 | - | - |
3.0419 | 90000 | 0.0896 | - | - |
3.0486 | 90200 | 0.0899 | - | - |
3.0554 | 90400 | 0.0918 | - | - |
3.0622 | 90600 | 0.0888 | - | - |
3.0689 | 90800 | 0.0961 | - | - |
3.0757 | 91000 | 0.0867 | - | - |
3.0824 | 91200 | 0.0927 | - | - |
3.0892 | 91400 | 0.0891 | - | - |
3.0960 | 91600 | 0.0864 | - | - |
3.1027 | 91800 | 0.0884 | - | - |
3.1095 | 92000 | 0.0879 | - | - |
3.1162 | 92200 | 0.088 | - | - |
3.1230 | 92400 | 0.086 | - | - |
3.1298 | 92600 | 0.0885 | - | - |
3.1365 | 92800 | 0.0918 | - | - |
3.1433 | 93000 | 0.0885 | - | - |
3.1500 | 93200 | 0.0873 | - | - |
3.1568 | 93400 | 0.0922 | - | - |
3.1636 | 93600 | 0.0878 | - | - |
3.1703 | 93800 | 0.0893 | - | - |
3.1771 | 94000 | 0.0865 | - | - |
3.1838 | 94200 | 0.0888 | - | - |
3.1906 | 94400 | 0.0871 | - | - |
3.1974 | 94600 | 0.0884 | - | - |
3.2041 | 94800 | 0.0858 | - | - |
3.2109 | 95000 | 0.0872 | - | - |
3.2176 | 95200 | 0.0871 | - | - |
3.2244 | 95400 | 0.0887 | - | - |
3.2311 | 95600 | 0.0903 | - | - |
3.2379 | 95800 | 0.0865 | - | - |
3.2447 | 96000 | 0.0901 | - | - |
3.2514 | 96200 | 0.0894 | - | - |
3.2582 | 96400 | 0.0945 | - | - |
3.2649 | 96600 | 0.0851 | - | - |
3.2717 | 96800 | 0.086 | - | - |
3.2785 | 97000 | 0.0884 | - | - |
3.2852 | 97200 | 0.0889 | - | - |
3.2920 | 97400 | 0.0878 | - | - |
3.2987 | 97600 | 0.0872 | - | - |
3.3055 | 97800 | 0.0885 | - | - |
3.3123 | 98000 | 0.0861 | - | - |
3.3190 | 98200 | 0.0854 | - | - |
3.3258 | 98400 | 0.0871 | - | - |
3.3325 | 98600 | 0.0884 | - | - |
3.3393 | 98800 | 0.0833 | - | - |
3.3461 | 99000 | 0.0912 | - | - |
3.3528 | 99200 | 0.0859 | - | - |
3.3596 | 99400 | 0.088 | - | - |
3.3663 | 99600 | 0.0861 | - | - |
3.3731 | 99800 | 0.0908 | - | - |
3.3799 | 100000 | 0.0852 | - | - |
0 | 0 | - | - | 0.5332 |
3.3799 | 100000 | - | 1.1603 | - |
Framework Versions
- Python: 3.11.0
- Sentence Transformers: 4.0.1
- PyLate: 1.1.7
- Transformers: 4.48.2
- PyTorch: 2.6.0+cu124
- Accelerate: 1.6.0
- Datasets: 3.5.0
- Tokenizers: 0.21.1
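To approximate this environment, the listed versions can be pinned at install time; a hedged example using the PyPI package names (PyTorch 2.6.0 with the cu124 build typically comes from the PyTorch index instead):

pip install "pylate==1.1.7" "sentence-transformers==4.0.1" "transformers==4.48.2" "accelerate==1.6.0" "datasets==3.5.0" "tokenizers==0.21.1"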
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084"
}
PyLate
@misc{PyLate,
title={PyLate: Flexible Training and Retrieval for Late Interaction Models},
author={Chaffin, Antoine and Sourty, Raphaël},
url={https://github.com/lightonai/pylate},
year={2024}
}