
Reason-ModernColBERT
Reason-ModernColBERT is a late-interaction model trained on the reasonir-hq dataset. It achieves extremely competitive performance on the BRIGHT benchmark, which evaluates reasoning-intensive retrieval: it outperforms all existing models up to 7B parameters (more than 45 times its size) and even, surprisingly, beats ReasonIR-8B (an 8B model trained on the same data) by more than 2.5 NDCG@10 on average on the Stack Exchange splits. We attribute these strong results to late interaction; see the Evaluation section.
License
Unfortunately, since the ReasonIR data has been released under a cc-by-nc-4.0 license, we cannot release this model under an Apache 2.0 license. However, the authors of ReasonIR released the code to generate the data. Anyone wishing to reproduce the data could then easily reproduce this model under an Apache 2.0 license by running a fine-tuning of less than 2 hours, using a boilerplate like the one sketched below.
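For reference, a minimal PyLate fine-tuning script might look like the following sketch. It is an approximation based on the training details reported further down this card, not the exact released script: the dataset path is a placeholder, and the CachedContrastive arguments are assumptions.

from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)

from pylate import losses, models, utils

# Start from the same base model as this card (see Model Details)
model = models.ColBERT(model_name_or_path="lightonai/GTE-ModernColBERT-v1")

# reasonir-hq training split with query/pos/neg columns; the path below is a
# placeholder, substitute the dataset you generated or downloaded
train_dataset = load_dataset("path/to/reasonir-hq", split="train")

args = SentenceTransformerTrainingArguments(
    output_dir="reason-moderncolbert",
    num_train_epochs=3,  # values mirror this card's Training Hyperparameters
    per_device_train_batch_size=256,
    learning_rate=1e-5,
    bf16=True,
    dataloader_num_workers=8,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=losses.CachedContrastive(model=model),  # loss reported in Training Details
    data_collator=utils.ColBERTCollator(model.tokenize),
)
trainer.train()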
PyLate model based on lightonai/GTE-ModernColBERT-v1
This is a PyLate model finetuned from lightonai/GTE-ModernColBERT-v1 on the reasonir-hq dataset. It maps sentences & paragraphs to sequences of 128-dimensional dense vectors and can be used for semantic textual similarity using the MaxSim operator.
Model Details
Model Description
- Model Type: PyLate model
- Base model: lightonai/GTE-ModernColBERT-v1
- Document Length: 8192 tokens
- Query Length: 128 tokens
- Output Dimensionality: 128 dimensions
- Similarity Function: MaxSim
- Training Dataset: reasonir-hq
- Language: en
Model Sources
- Documentation: PyLate Documentation
- Repository: PyLate on GitHub
- Hugging Face: PyLate models on Hugging Face
Full Model Architecture
ColBERT(
  (0): Transformer({'max_seq_length': 127, 'do_lower_case': False}) with Transformer model: ModernBertModel
  (1): Dense({'in_features': 768, 'out_features': 128, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
)
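To make the MaxSim similarity concrete, here is an illustrative sketch of the late-interaction score between one query and one document, assuming the 128-dimensional token embeddings produced by the model above (this is an illustration, not PyLate's internal implementation):

import torch

def maxsim_score(query_embeddings: torch.Tensor, doc_embeddings: torch.Tensor) -> torch.Tensor:
    # Similarity between every query token and every document token
    similarities = query_embeddings @ doc_embeddings.T  # (query_tokens, doc_tokens)
    # Keep the best-matching document token for each query token, then sum
    return similarities.max(dim=1).values.sum()

# Toy example with random unit vectors standing in for real token embeddings
query = torch.nn.functional.normalize(torch.randn(32, 128), dim=-1)
document = torch.nn.functional.normalize(torch.randn(180, 128), dim=-1)
print(maxsim_score(query, document))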
Usage
First install the PyLate library:
pip install -U pylate
Retrieval
PyLate provides a streamlined interface to index and retrieve documents using ColBERT models. The index leverages the Voyager HNSW index to efficiently handle document embeddings and enable fast retrieval.
Indexing documents
First, load the ColBERT model and initialize the Voyager index, then encode and index your documents:
from pylate import indexes, models, retrieve

# Step 1: Load the ColBERT model
model = models.ColBERT(
    model_name_or_path="lightonai/Reason-ModernColBERT",
)

# Step 2: Initialize the Voyager index
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
    override=True,  # This overwrites the existing index if any
)

# Step 3: Encode the documents
documents_ids = ["1", "2", "3"]
documents = ["document 1 text", "document 2 text", "document 3 text"]

documents_embeddings = model.encode(
    documents,
    batch_size=32,
    is_query=False,  # Ensure that it is set to False to indicate that these are documents, not queries
    show_progress_bar=True,
)

# Step 4: Add document embeddings to the index by providing embeddings and corresponding ids
index.add_documents(
    documents_ids=documents_ids,
    documents_embeddings=documents_embeddings,
)
Note that you do not have to recreate the index and encode the documents every time. Once you have created an index and added the documents, you can re-use the index later by loading it:
# To load an index, simply instantiate it with the correct folder/name and without overriding it
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
)
Retrieving top-k documents for queries
Once the documents are indexed, you can retrieve the top-k most relevant documents for a given set of queries. To do so, initialize the ColBERT retriever with the index you want to search in, encode the queries, and then retrieve the top-k documents to get the ids and relevance scores of the top matches:
# Step 1: Initialize the ColBERT retriever
retriever = retrieve.ColBERT(index=index)

# Step 2: Encode the queries
queries_embeddings = model.encode(
    ["query for document 3", "query for document 1"],
    batch_size=32,
    is_query=True,  # Ensure that it is set to True to indicate that these are queries
    show_progress_bar=True,
)

# Step 3: Retrieve top-k documents
scores = retriever.retrieve(
    queries_embeddings=queries_embeddings,
    k=10,  # Retrieve the top 10 matches for each query
)
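The retriever returns one ranked list per query. Assuming the id/score dictionary format returned by current PyLate versions, the results can be inspected like this:

# scores[i] holds the ranked matches for the i-th query
for query_idx, matches in enumerate(scores):
    print(f"Query {query_idx}:")
    for match in matches:
        print(f"  document {match['id']} -> score {match['score']:.2f}")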
Reranking
If you only want to use the ColBERT model to perform reranking on top of your first-stage retrieval pipeline without building an index, you can simply use the rank function and pass the queries and documents to rerank:
from pylate import rank, models

queries = [
    "query A",
    "query B",
]

documents = [
    ["document A", "document B"],
    ["document 1", "document C", "document B"],
]

documents_ids = [
    [1, 2],
    [1, 3, 2],
]

model = models.ColBERT(
    model_name_or_path="lightonai/Reason-ModernColBERT",
)

queries_embeddings = model.encode(
    queries,
    is_query=True,
)

documents_embeddings = model.encode(
    documents,
    is_query=False,
)

reranked_documents = rank.rerank(
    documents_ids=documents_ids,
    queries_embeddings=queries_embeddings,
    documents_embeddings=documents_embeddings,
)
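Each inner list of reranked_documents mirrors the corresponding candidate list, reordered by decreasing MaxSim score. Assuming the same id/score dictionary format as retrieval, the best candidate per query can be read off directly:

# Pick the top-ranked document id for each query (output format assumed)
best_ids = [ranking[0]["id"] for ranking in reranked_documents]
print(best_ids)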
Evaluation
BRIGHT Benchmark
The BRIGHT benchmark evaluates reasoning-intensive retrieval. Reason-ModernColBERT outperforms all existing models up to 7B parameters (more than 45 times its size) and even, surprisingly, beats ReasonIR-8B (an 8B model trained on the same data) by more than 2.5 NDCG@10 on average on the Stack Exchange splits. We attribute these strong results to late interaction, as opposed to the usual dense (single-vector) retrieval performed by the other models, as highlighted in the next section.
Model / Metric | Biology | Earth | Economics | Psychology | Robotics | Stackoverflow | Sustainable | Leetcode | Pony | AoPS | Theorem - Q | Theorem - T | Mean StackExchange | Mean coding | Mean theorem | Full mean |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
BM25 | 18.9 | 27.2 | 14.9 | 12.5 | 13.6 | 18.4 | 15 | 24.4 | 7.9 | 6.2 | 10.4 | 4.9 | 17.21 | 16.15 | 7.17 | 14.53 |
< 1B OS | ||||||||||||||||
BGE | 11.7 | 24.6 | 16.6 | 17.5 | 11.7 | 10.8 | 13.3 | 26.7 | 5.7 | 6 | 13 | 6.9 | 15.17 | 16.2 | 8.63 | 13.71 |
Inst-L | 15.2 | 21.2 | 14.7 | 22.3 | 11.4 | 13.3 | 13.5 | 19.5 | 1.3 | 8.1 | 20.9 | 9.1 | 15.94 | 10.4 | 12.7 | 14.21 |
SBERT | 15.1 | 20.4 | 16.6 | 22.7 | 8.2 | 11 | 15.3 | 26.4 | 7 | 5.3 | 20 | 10.8 | 15.61 | 16.7 | 12.03 | 14.9 |
> 1B OS | ||||||||||||||||
E5 | 18.6 | 26 | 15.5 | 15.8 | 16.3 | 11.2 | 18.1 | 28.7 | 4.9 | 7.1 | 26.1 | 26.8 | 17.36 | 16.8 | 20 | 17.93 |
SFR | 19.1 | 26.7 | 17.8 | 19 | 16.3 | 14.4 | 19.2 | 27.4 | 2 | 7.4 | 24.3 | 26 | 18.93 | 14.7 | 19.23 | 18.3 |
Inst-XL | 21.6 | 34.3 | 22.4 | 27.4 | 18.2 | 21.2 | 19.1 | 27.5 | 5 | 8.5 | 15.6 | 5.9 | 23.46 | 16.25 | 10 | 18.89 |
GritLM | 24.8 | 32.3 | 18.9 | 19.8 | 17.1 | 13.6 | 17.8 | 29.9 | 22 | 8.8 | 25.2 | 21.2 | 20.61 | 25.95 | 18.4 | 20.95 |
Qwen | 30.6 | 36.4 | 17.8 | 24.6 | 13.2 | 22.2 | 14.8 | 25.5 | 9.9 | 14.4 | 27.8 | 32.9 | 22.8 | 17.7 | 25.03 | 22.51 |
Proprietary | ||||||||||||||||
Cohere | 18.7 | 28.4 | 20.4 | 21.6 | 16.3 | 18.3 | 17.6 | 26.8 | 1.9 | 6.3 | 15.7 | 7.2 | 20.19 | 14.35 | 9.73 | 16.6 |
OpenAI | 23.3 | 26.7 | 19.5 | 27.6 | 12.8 | 14.3 | 20.5 | 23.6 | 2.4 | 8.5 | 23.5 | 11.7 | 20.67 | 13 | 14.57 | 17.87 |
Voyage | 23.1 | 25.4 | 19.9 | 24.9 | 10.8 | 16.8 | 15.4 | 30.6 | 1.5 | 7.5 | 27.4 | 11.6 | 19.47 | 16.05 | 15.5 | 17.91 |
Google | 22.7 | 34.8 | 19.6 | 27.8 | 15.7 | 20.1 | 17.1 | 29.6 | 3.6 | 9.3 | 23.8 | 15.9 | 22.54 | 16.6 | 16.33 | 20 |
ReasonIR data | ||||||||||||||||
ReasonIR-8B | 26.2 | 31.4 | 23.3 | 30 | 18 | 23.9 | 20.5 | 35 | 10.5 | 14.7 | 31.9 | 27.2 | 24.76 | 22.75 | 24.6 | 24.38 |
Reason-ModernColBERT (150M) | 33.25 | 41.02 | 24.93 | 30.73 | 21.12 | 20.62 | 20.31 | 31.07 | 8.51 | 9.17 | 19.51 | 11.24 | 27.43 | 19.79 | 15.38 | 22.62 |
Comparison with a dense model
A fair objection would be that the performance of Reason-ModernColBERT is mostly due to the ReasonIR data. Although the gap between ReasonIR-8B and Reason-ModernColBERT already suggests there is more to it than the data, we conducted a small experiment: we trained a dense (single-vector) model with Sentence Transformers in the same setup as the multi-vector model trained with PyLate. This experiment highlights a very large gap in performance. Obviously, more rigorous experiments are required to draw firm conclusions (e.g., both models could have been tuned further, and the training could have been enhanced; in particular, we did not gather negatives across GPUs in these experiments because Sentence Transformers does not support it for now), but the gap is very large, and it correlates well with Reason-ModernColBERT being competitive with ReasonIR-8B while being more than 50 times smaller.
Model/Split | Biology | Earth | Economics | Psychology | Robotics | Stackoverflow | Sustainable | Leetcode | Pony | AoPS | Theorem Q | Theorem T | Mean StackExchange | Mean coding | Mean theorem | Full mean |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Dense (single vector) model | 7.51 | 16.92 | 13.43 | 17.18 | 10.23 | 8.93 | 8.85 | 24.88 | 1.43 | 9.81 | 18.83 | 9.71 | 11.86 | 13.16 | 12.78 | 12.31 |
Late-interaction (multi vector model) | 28.02 | 39.25 | 21.51 | 27.05 | 19.86 | 17.23 | 21.1 | 27.37 | 3.76 | 6.87 | 16.06 | 7.21 | 24.86 | 15.57 | 10.05 | 19.61 |
Training Details
Training Dataset
reasonir-hq
- Dataset: train split at revision 0275f82
- Size: 100,521 training samples
- Columns: query, pos, and neg
- Approximate statistics based on the first 1000 samples:
 | query | pos | neg |
---|---|---|---|
type | string | string | string |
details | min: 38 tokens, mean: 97.84 tokens, max: 128 tokens | min: 85 tokens, mean: 127.63 tokens, max: 128 tokens | min: 81 tokens, mean: 127.77 tokens, max: 128 tokens |
- Samples:

Sample 1

query: Given this reasoning-intensive query, find relevant documents that could help answer the question. A researcher is analyzing a sound signal represented by the equation f(t) = 2sin(3πt) + sin(5πt) + 0.5sin(7πt). Using the Fourier transform, what are the frequencies, amplitudes, and phases of the individual sinusoidal components in the signal?

pos: A sound signal is given by the equation f(t) = sin(2πt) + sin(4πt) + sin(6πt) where t is time in seconds. Use the Fourier transform to find the frequencies, amplitudes, and phases of the individual sinusoidal components in the signal.
To find the frequencies, amplitudes, and phases of the individual sinusoidal components in the signal f(t) = sin(2πt) + sin(4πt) + sin(6πt), we can use the Fourier transform. The Fourier transform of a continuous function f(t) is given by:
F(ω) = ∫[f(t) * e^(-jωt)] dt
where F(ω) is the Fourier transform of f(t), ω is the angular frequency, and j is the imaginary unit (j^2 = -1). In this case, f(t) is already given as a sum of sinusoidal functions, so we can directly identify the frequencies, amplitudes, and phases of the individual components.
1. First component: sin(2πt)
- Frequency: The angular frequency is 2π, so the frequency is ω/(2π) = 1 Hz.
- Amplitude: The coefficient of the sine function is 1, so the amplitude is 1.
- Phase: There is no phase shi...

neg: The Fourier transform is widely used in various fields, including engineering, physics, and data analysis. It is a powerful tool for decomposing a signal into its constituent frequencies. In music, for example, the Fourier transform can be used to analyze the frequency components of a sound wave. By applying the Fourier transform to a sound signal, one can identify the different frequencies present in the signal, as well as their relative amplitudes. This information can be useful in a variety of applications, such as sound filtering and audio processing. The Fourier transform can also be used to analyze images and other types of data. In image processing, the Fourier transform can be used to filter out noise and other unwanted features from an image. It can also be used to compress images by representing them in the frequency domain. In addition to its many practical applications, the Fourier transform also has a number of interesting theoretical properties. For example, it has been ...

Sample 2

query: Given this reasoning-intensive query, find relevant documents that could help answer the question. A manufacturer is designing a cone-shaped container with a fixed volume of 200π cubic centimeters. The container's height is 12 centimeters, and the radius of the base is unknown. If the manufacturer wants to minimize the surface area of the container while maintaining its volume, what should be the radius of the base?

pos: A right circular cone has a radius of 6cm and a slant height of 10cm. Determine the surface area of the cone.
To find the surface area of a right circular cone, we need to calculate the area of the base and the lateral surface area, and then add them together.
The base of the cone is a circle with radius r = 6 cm. The area of the base (A_base) can be found using the formula for the area of a circle:
A_base = πr^2
A_base = π(6 cm)^2
A_base = 36π cm^2
The lateral surface area (A_lateral) can be found using the formula for the lateral surface area of a cone:
A_lateral = πrs, where r is the radius and s is the slant height.
Given that the slant height s = 10 cm, we can calculate the lateral surface area:
A_lateral = π(6 cm)(10 cm)
A_lateral = 60π cm^2
Now, we can find the total surface area (A_total) by adding the base area and the lateral surface area:
A_total = A_base + A_lateral
A_total = 36π cm^2 + 60π cm^2
A_total = 96π cm^2
The surface area of the cone is 96π cm^2.

neg: Torus-Shaped Containers in Chemical Engineering - New Designs and Applications. Torus-shaped containers are commonly used in chemical engineering for storing and transporting fluids. These containers have a distinctive doughnut shape, with a central hole and a circular cross-section. In this article, we will explore the design and applications of torus-shaped containers in chemical engineering. One of the main advantages of torus-shaped containers is their high volume-to-surface-area ratio. This makes them ideal for storing large quantities of fluids while minimizing the amount of material needed for construction. Additionally, the curved shape of the container provides added strength and stability, making it less prone to rupture or leakage. The design of torus-shaped containers typically involves the use of computer-aided design (CAD) software to create detailed models of the container's geometry. Engineers can then use these models to simulate various scenarios, such as fluid flow and ...

Sample 3

query: Given this reasoning-intensive query, find relevant documents that could help answer the question. On the xy-coordinate plane, points A and B are given as A(2, 4) and B(8, -3). Determine the coordinates of the point on line segment AB that is three times as far from A as it is from B.

pos: On the xy co-ordinate plane, point C is (5,-2) and point D is (-1,1.5). The point on line segment CD that is twice as far from C as from D is:
Answer Choices: (A) (1,-1) (B) (1,1) (C) (2,0.25) (D) (3,0.5) (E) (3,1)
Let's think about the multi-choice question step by step.
We want the point on the line that is twice as far from C as it is from D. We can examine the x and y coordinates separately since they are independent.
*It should be noted that there are two solutions to this problem, one point between C and D, and another point with D in the middle of C and the point. We can quickly look at the answer choices and see that all the points are between C and D, therefore we can search for that point using the following method:
Taking the x-coordinate first, the distance between C and D is (x-coordinate of C - (x-coordinate of D ...

- Loss: pylate.losses.cached_contrastive.CachedContrastive
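CachedContrastive is PyLate's gradient-cached in-batch contrastive loss (see the CachedContrastive citation below): it chunks the forward pass so that a large batch, here 256 per device, fits in memory while keeping the full batch as in-batch negatives. A minimal instantiation sketch, assuming the mini_batch_size keyword mirrors the cached losses in Sentence Transformers:

from pylate import losses, models

model = models.ColBERT(model_name_or_path="lightonai/GTE-ModernColBERT-v1")

# mini_batch_size sets the chunk size for the cached forward/backward passes;
# it trades speed for memory without changing the effective batch size
loss = losses.CachedContrastive(model=model, mini_batch_size=32)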
Training Hyperparameters
Non-Default Hyperparameters
- per_device_train_batch_size: 256
- per_device_eval_batch_size: 256
- learning_rate: 1e-05
- bf16: True
- dataloader_num_workers: 8
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: no
- prediction_loss_only: True
- per_device_train_batch_size: 256
- per_device_eval_batch_size: 256
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 1e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 3
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 8
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
Training Logs
Epoch | Step | Training Loss |
---|---|---|
0.0025 | 1 | 4.9684 |
0.0051 | 2 | 4.6956 |
0.0076 | 3 | 4.5076 |
0.0102 | 4 | 4.3723 |
0.0127 | 5 | 4.3305 |
0.0153 | 6 | 4.0355 |
0.0178 | 7 | 3.7886 |
0.0204 | 8 | 3.6133 |
0.0229 | 9 | 3.2395 |
0.0254 | 10 | 3.1481 |
0.0280 | 11 | 2.7444 |
0.0305 | 12 | 2.4946 |
0.0331 | 13 | 2.333 |
0.0356 | 14 | 2.2471 |
0.0382 | 15 | 1.9117 |
0.0407 | 16 | 1.6753 |
0.0433 | 17 | 1.2413 |
0.0458 | 18 | 1.1201 |
0.0483 | 19 | 1.0335 |
0.0509 | 20 | 1.0583 |
0.0534 | 21 | 1.067 |
0.0560 | 22 | 0.7056 |
0.0585 | 23 | 0.761 |
0.0611 | 24 | 0.5501 |
0.0636 | 25 | 0.6486 |
0.0662 | 26 | 0.4639 |
0.0687 | 27 | 0.3885 |
0.0712 | 28 | 0.4982 |
0.0738 | 29 | 0.4784 |
0.0763 | 30 | 0.5189 |
0.0789 | 31 | 0.4824 |
0.0814 | 32 | 0.4183 |
0.0840 | 33 | 0.4945 |
0.0865 | 34 | 0.2579 |
0.0891 | 35 | 0.3312 |
0.0916 | 36 | 0.4035 |
0.0941 | 37 | 0.305 |
0.0967 | 38 | 0.2898 |
0.0992 | 39 | 0.2899 |
0.1018 | 40 | 0.2713 |
0.1043 | 41 | 0.3017 |
0.1069 | 42 | 0.2395 |
0.1094 | 43 | 0.1548 |
0.1120 | 44 | 0.2468 |
0.1145 | 45 | 0.1876 |
0.1170 | 46 | 0.2322 |
0.1196 | 47 | 0.2823 |
0.1221 | 48 | 0.2158 |
0.1247 | 49 | 0.2679 |
0.1272 | 50 | 0.273 |
0.1298 | 51 | 0.2876 |
0.1323 | 52 | 0.197 |
0.1349 | 53 | 0.1282 |
0.1374 | 54 | 0.3355 |
0.1399 | 55 | 0.1941 |
0.1425 | 56 | 0.1873 |
0.1450 | 57 | 0.2288 |
0.1476 | 58 | 0.2802 |
0.1501 | 59 | 0.2087 |
0.1527 | 60 | 0.2239 |
0.1552 | 61 | 0.225 |
0.1578 | 62 | 0.1582 |
0.1603 | 63 | 0.1972 |
0.1628 | 64 | 0.1632 |
0.1654 | 65 | 0.2101 |
0.1679 | 66 | 0.2084 |
0.1705 | 67 | 0.1499 |
0.1730 | 68 | 0.1467 |
0.1756 | 69 | 0.1428 |
0.1781 | 70 | 0.2298 |
0.1807 | 71 | 0.1883 |
0.1832 | 72 | 0.22 |
0.1858 | 73 | 0.1988 |
0.1883 | 74 | 0.2091 |
0.1908 | 75 | 0.1948 |
0.1934 | 76 | 0.1348 |
0.1959 | 77 | 0.112 |
0.1985 | 78 | 0.1474 |
0.2010 | 79 | 0.1949 |
0.2036 | 80 | 0.1664 |
0.2061 | 81 | 0.1807 |
0.2087 | 82 | 0.1403 |
0.2112 | 83 | 0.1225 |
0.2137 | 84 | 0.1919 |
0.2163 | 85 | 0.1403 |
0.2188 | 86 | 0.1402 |
0.2214 | 87 | 0.0981 |
0.2239 | 88 | 0.1214 |
0.2265 | 89 | 0.1755 |
0.2290 | 90 | 0.1509 |
0.2316 | 91 | 0.1551 |
0.2341 | 92 | 0.176 |
0.2366 | 93 | 0.1648 |
0.2392 | 94 | 0.1622 |
0.2417 | 95 | 0.1372 |
0.2443 | 96 | 0.1016 |
0.2468 | 97 | 0.1134 |
0.2494 | 98 | 0.1436 |
0.2519 | 99 | 0.1478 |
0.2545 | 100 | 0.2065 |
0.2570 | 101 | 0.1901 |
0.2595 | 102 | 0.1859 |
0.2621 | 103 | 0.212 |
0.2646 | 104 | 0.2179 |
0.2672 | 105 | 0.2471 |
0.2697 | 106 | 0.1769 |
0.2723 | 107 | 0.1593 |
0.2748 | 108 | 0.204 |
0.2774 | 109 | 0.1496 |
0.2799 | 110 | 0.1212 |
0.2824 | 111 | 0.1282 |
0.2850 | 112 | 0.1126 |
0.2875 | 113 | 0.1254 |
0.2901 | 114 | 0.1422 |
0.2926 | 115 | 0.1266 |
0.2952 | 116 | 0.1305 |
0.2977 | 117 | 0.1283 |
0.3003 | 118 | 0.0737 |
0.3028 | 119 | 0.1237 |
0.3053 | 120 | 0.1185 |
0.3079 | 121 | 0.0891 |
0.3104 | 122 | 0.2312 |
0.3130 | 123 | 0.2384 |
0.3155 | 124 | 0.155 |
0.3181 | 125 | 0.1118 |
0.3206 | 126 | 0.1575 |
0.3232 | 127 | 0.2115 |
0.3257 | 128 | 0.098 |
0.3282 | 129 | 0.1811 |
0.3308 | 130 | 0.1704 |
0.3333 | 131 | 0.1494 |
0.3359 | 132 | 0.1531 |
0.3384 | 133 | 0.1032 |
0.3410 | 134 | 0.1137 |
0.3435 | 135 | 0.1271 |
0.3461 | 136 | 0.1591 |
0.3486 | 137 | 0.1586 |
0.3511 | 138 | 0.1292 |
0.3537 | 139 | 0.1115 |
0.3562 | 140 | 0.1337 |
0.3588 | 141 | 0.1298 |
0.3613 | 142 | 0.1649 |
0.3639 | 143 | 0.0855 |
0.3664 | 144 | 0.1124 |
0.3690 | 145 | 0.0764 |
0.3715 | 146 | 0.1402 |
0.3740 | 147 | 0.137 |
0.3766 | 148 | 0.0736 |
0.3791 | 149 | 0.0772 |
0.3817 | 150 | 0.1689 |
0.3842 | 151 | 0.1371 |
0.3868 | 152 | 0.1195 |
0.3893 | 153 | 0.1536 |
0.3919 | 154 | 0.1421 |
0.3944 | 155 | 0.1222 |
0.3969 | 156 | 0.1121 |
0.3995 | 157 | 0.0892 |
0.4020 | 158 | 0.1516 |
0.4046 | 159 | 0.1071 |
0.4071 | 160 | 0.1593 |
0.4097 | 161 | 0.1078 |
0.4122 | 162 | 0.1112 |
0.4148 | 163 | 0.2101 |
0.4173 | 164 | 0.2096 |
0.4198 | 165 | 0.1337 |
0.4224 | 166 | 0.1501 |
0.4249 | 167 | 0.0989 |
0.4275 | 168 | 0.0992 |
0.4300 | 169 | 0.0926 |
0.4326 | 170 | 0.0692 |
0.4351 | 171 | 0.1235 |
0.4377 | 172 | 0.1029 |
0.4402 | 173 | 0.1351 |
0.4427 | 174 | 0.0899 |
0.4453 | 175 | 0.0844 |
0.4478 | 176 | 0.1167 |
0.4504 | 177 | 0.1355 |
0.4529 | 178 | 0.092 |
0.4555 | 179 | 0.1005 |
0.4580 | 180 | 0.0891 |
0.4606 | 181 | 0.1396 |
0.4631 | 182 | 0.1024 |
0.4656 | 183 | 0.1325 |
0.4682 | 184 | 0.1061 |
0.4707 | 185 | 0.1657 |
0.4733 | 186 | 0.1141 |
0.4758 | 187 | 0.149 |
0.4784 | 188 | 0.1125 |
0.4809 | 189 | 0.1524 |
0.4835 | 190 | 0.1129 |
0.4860 | 191 | 0.1089 |
0.4885 | 192 | 0.1333 |
0.4911 | 193 | 0.1377 |
0.4936 | 194 | 0.0547 |
0.4962 | 195 | 0.1057 |
0.4987 | 196 | 0.1321 |
0.5013 | 197 | 0.0979 |
0.5038 | 198 | 0.1706 |
0.5064 | 199 | 0.1559 |
0.5089 | 200 | 0.1111 |
0.5115 | 201 | 0.1258 |
0.5140 | 202 | 0.0816 |
0.5165 | 203 | 0.1362 |
0.5191 | 204 | 0.1604 |
0.5216 | 205 | 0.1104 |
0.5242 | 206 | 0.1494 |
0.5267 | 207 | 0.1402 |
0.5293 | 208 | 0.1282 |
0.5318 | 209 | 0.1543 |
0.5344 | 210 | 0.1576 |
0.5369 | 211 | 0.2071 |
0.5394 | 212 | 0.1248 |
0.5420 | 213 | 0.1237 |
0.5445 | 214 | 0.0592 |
0.5471 | 215 | 0.1769 |
0.5496 | 216 | 0.1118 |
0.5522 | 217 | 0.1608 |
0.5547 | 218 | 0.1192 |
0.5573 | 219 | 0.0551 |
0.5598 | 220 | 0.1401 |
0.5623 | 221 | 0.2046 |
0.5649 | 222 | 0.1273 |
0.5674 | 223 | 0.1319 |
0.5700 | 224 | 0.1518 |
0.5725 | 225 | 0.0929 |
0.5751 | 226 | 0.1262 |
0.5776 | 227 | 0.1566 |
0.5802 | 228 | 0.1128 |
0.5827 | 229 | 0.1467 |
0.5852 | 230 | 0.1513 |
0.5878 | 231 | 0.1989 |
0.5903 | 232 | 0.0594 |
0.5929 | 233 | 0.0838 |
0.5954 | 234 | 0.0711 |
0.5980 | 235 | 0.0854 |
0.6005 | 236 | 0.1775 |
0.6031 | 237 | 0.118 |
0.6056 | 238 | 0.1297 |
0.6081 | 239 | 0.1092 |
0.6107 | 240 | 0.1469 |
0.6132 | 241 | 0.1203 |
0.6158 | 242 | 0.0901 |
0.6183 | 243 | 0.1179 |
0.6209 | 244 | 0.0864 |
0.6234 | 245 | 0.1277 |
0.6260 | 246 | 0.1313 |
0.6285 | 247 | 0.089 |
0.6310 | 248 | 0.0727 |
0.6336 | 249 | 0.0556 |
0.6361 | 250 | 0.0782 |
0.6387 | 251 | 0.0869 |
0.6412 | 252 | 0.0988 |
0.6438 | 253 | 0.0818 |
0.6463 | 254 | 0.1013 |
0.6489 | 255 | 0.096 |
0.6514 | 256 | 0.0622 |
0.6539 | 257 | 0.1561 |
0.6565 | 258 | 0.1282 |
0.6590 | 259 | 0.1087 |
0.6616 | 260 | 0.1312 |
0.6641 | 261 | 0.1343 |
0.6667 | 262 | 0.0955 |
0.6692 | 263 | 0.0844 |
0.6718 | 264 | 0.1209 |
0.6743 | 265 | 0.0858 |
0.6768 | 266 | 0.0714 |
0.6794 | 267 | 0.1431 |
0.6819 | 268 | 0.0632 |
0.6845 | 269 | 0.115 |
0.6870 | 270 | 0.1115 |
0.6896 | 271 | 0.1239 |
0.6921 | 272 | 0.1206 |
0.6947 | 273 | 0.1894 |
0.6972 | 274 | 0.0755 |
0.6997 | 275 | 0.0709 |
0.7023 | 276 | 0.1304 |
0.7048 | 277 | 0.1476 |
0.7074 | 278 | 0.1497 |
0.7099 | 279 | 0.113 |
0.7125 | 280 | 0.1676 |
0.7150 | 281 | 0.0999 |
0.7176 | 282 | 0.2044 |
0.7201 | 283 | 0.1125 |
0.7226 | 284 | 0.0956 |
0.7252 | 285 | 0.0956 |
0.7277 | 286 | 0.0771 |
0.7303 | 287 | 0.0712 |
0.7328 | 288 | 0.0525 |
0.7354 | 289 | 0.0689 |
0.7379 | 290 | 0.0964 |
0.7405 | 291 | 0.1068 |
0.7430 | 292 | 0.0536 |
0.7455 | 293 | 0.0861 |
0.7481 | 294 | 0.0813 |
0.7506 | 295 | 0.0885 |
0.7532 | 296 | 0.1083 |
0.7557 | 297 | 0.1124 |
0.7583 | 298 | 0.1095 |
0.7608 | 299 | 0.08 |
0.7634 | 300 | 0.1081 |
0.7659 | 301 | 0.0719 |
0.7684 | 302 | 0.0933 |
0.7710 | 303 | 0.1143 |
0.7735 | 304 | 0.065 |
0.7761 | 305 | 0.1276 |
0.7786 | 306 | 0.102 |
0.7812 | 307 | 0.186 |
0.7837 | 308 | 0.0778 |
0.7863 | 309 | 0.1419 |
0.7888 | 310 | 0.0895 |
0.7913 | 311 | 0.1154 |
0.7939 | 312 | 0.1037 |
0.7964 | 313 | 0.0711 |
0.7990 | 314 | 0.1559 |
0.8015 | 315 | 0.0755 |
0.8041 | 316 | 0.0799 |
0.8066 | 317 | 0.1137 |
0.8092 | 318 | 0.0837 |
0.8117 | 319 | 0.1052 |
0.8142 | 320 | 0.0846 |
0.8168 | 321 | 0.0715 |
0.8193 | 322 | 0.0923 |
0.8219 | 323 | 0.1397 |
0.8244 | 324 | 0.0899 |
0.8270 | 325 | 0.1414 |
0.8295 | 326 | 0.0422 |
0.8321 | 327 | 0.0748 |
0.8346 | 328 | 0.0739 |
0.8372 | 329 | 0.0855 |
0.8397 | 330 | 0.071 |
0.8422 | 331 | 0.0557 |
0.8448 | 332 | 0.1055 |
0.8473 | 333 | 0.096 |
0.8499 | 334 | 0.1083 |
0.8524 | 335 | 0.133 |
0.8550 | 336 | 0.1308 |
0.8575 | 337 | 0.0661 |
0.8601 | 338 | 0.0974 |
0.8626 | 339 | 0.1027 |
0.8651 | 340 | 0.1068 |
0.8677 | 341 | 0.1653 |
0.8702 | 342 | 0.097 |
0.8728 | 343 | 0.0845 |
0.8753 | 344 | 0.0546 |
0.8779 | 345 | 0.1273 |
0.8804 | 346 | 0.0982 |
0.8830 | 347 | 0.0893 |
0.8855 | 348 | 0.1222 |
0.8880 | 349 | 0.1072 |
0.8906 | 350 | 0.1254 |
0.8931 | 351 | 0.0679 |
0.8957 | 352 | 0.0995 |
0.8982 | 353 | 0.0878 |
0.9008 | 354 | 0.0564 |
0.9033 | 355 | 0.113 |
0.9059 | 356 | 0.0567 |
0.9084 | 357 | 0.0968 |
0.9109 | 358 | 0.1023 |
0.9135 | 359 | 0.1106 |
0.9160 | 360 | 0.091 |
0.9186 | 361 | 0.0988 |
0.9211 | 362 | 0.1374 |
0.9237 | 363 | 0.0855 |
0.9262 | 364 | 0.0824 |
0.9288 | 365 | 0.058 |
0.9313 | 366 | 0.0776 |
0.9338 | 367 | 0.1195 |
0.9364 | 368 | 0.0506 |
0.9389 | 369 | 0.0893 |
0.9415 | 370 | 0.1145 |
0.9440 | 371 | 0.0695 |
0.9466 | 372 | 0.0805 |
0.9491 | 373 | 0.0824 |
0.9517 | 374 | 0.0841 |
0.9542 | 375 | 0.0919 |
0.9567 | 376 | 0.064 |
0.9593 | 377 | 0.2194 |
0.9618 | 378 | 0.1165 |
0.9644 | 379 | 0.0888 |
0.9669 | 380 | 0.0826 |
0.9695 | 381 | 0.0687 |
0.9720 | 382 | 0.0933 |
0.9746 | 383 | 0.1337 |
0.9771 | 384 | 0.0738 |
0.9796 | 385 | 0.0749 |
0.9822 | 386 | 0.0742 |
0.9847 | 387 | 0.1111 |
0.9873 | 388 | 0.093 |
0.9898 | 389 | 0.0877 |
0.9924 | 390 | 0.0637 |
0.9949 | 391 | 0.0897 |
0.9975 | 392 | 0.0818 |
1.0 | 393 | 0.0362 |
1.0025 | 394 | 0.0561 |
1.0051 | 395 | 0.0847 |
1.0076 | 396 | 0.0752 |
1.0102 | 397 | 0.0951 |
1.0127 | 398 | 0.1069 |
1.0153 | 399 | 0.0553 |
1.0178 | 400 | 0.0929 |
1.0204 | 401 | 0.0876 |
1.0229 | 402 | 0.0381 |
1.0254 | 403 | 0.1074 |
1.0280 | 404 | 0.0763 |
1.0305 | 405 | 0.0881 |
1.0331 | 406 | 0.0481 |
1.0356 | 407 | 0.1398 |
1.0382 | 408 | 0.09 |
1.0407 | 409 | 0.1045 |
1.0433 | 410 | 0.088 |
1.0458 | 411 | 0.0751 |
1.0483 | 412 | 0.0781 |
1.0509 | 413 | 0.0844 |
1.0534 | 414 | 0.0949 |
1.0560 | 415 | 0.0467 |
1.0585 | 416 | 0.1159 |
1.0611 | 417 | 0.0511 |
1.0636 | 418 | 0.0659 |
1.0662 | 419 | 0.043 |
1.0687 | 420 | 0.0468 |
1.0712 | 421 | 0.068 |
1.0738 | 422 | 0.1022 |
1.0763 | 423 | 0.1096 |
1.0789 | 424 | 0.1113 |
1.0814 | 425 | 0.1219 |
1.0840 | 426 | 0.0852 |
1.0865 | 427 | 0.0413 |
1.0891 | 428 | 0.0797 |
1.0916 | 429 | 0.1048 |
1.0941 | 430 | 0.0494 |
1.0967 | 431 | 0.079 |
1.0992 | 432 | 0.0698 |
1.1018 | 433 | 0.0908 |
1.1043 | 434 | 0.0993 |
1.1069 | 435 | 0.0397 |
1.1094 | 436 | 0.0312 |
1.1120 | 437 | 0.089 |
1.1145 | 438 | 0.0318 |
1.1170 | 439 | 0.0356 |
1.1196 | 440 | 0.0588 |
1.1221 | 441 | 0.0311 |
1.1247 | 442 | 0.0578 |
1.1272 | 443 | 0.1313 |
1.1298 | 444 | 0.0897 |
1.1323 | 445 | 0.0798 |
1.1349 | 446 | 0.0326 |
1.1374 | 447 | 0.143 |
1.1399 | 448 | 0.0661 |
1.1425 | 449 | 0.0433 |
1.1450 | 450 | 0.0782 |
1.1476 | 451 | 0.08 |
1.1501 | 452 | 0.0505 |
1.1527 | 453 | 0.0542 |
1.1552 | 454 | 0.0755 |
1.1578 | 455 | 0.0315 |
1.1603 | 456 | 0.0667 |
1.1628 | 457 | 0.0329 |
1.1654 | 458 | 0.0791 |
1.1679 | 459 | 0.0698 |
1.1705 | 460 | 0.0194 |
1.1730 | 461 | 0.0501 |
1.1756 | 462 | 0.0449 |
1.1781 | 463 | 0.0903 |
1.1807 | 464 | 0.0503 |
1.1832 | 465 | 0.0664 |
1.1858 | 466 | 0.0457 |
1.1883 | 467 | 0.0568 |
1.1908 | 468 | 0.064 |
1.1934 | 469 | 0.0253 |
1.1959 | 470 | 0.046 |
1.1985 | 471 | 0.0279 |
1.2010 | 472 | 0.0733 |
1.2036 | 473 | 0.0463 |
1.2061 | 474 | 0.07 |
1.2087 | 475 | 0.0281 |
1.2112 | 476 | 0.0373 |
1.2137 | 477 | 0.0738 |
1.2163 | 478 | 0.0412 |
1.2188 | 479 | 0.0545 |
1.2214 | 480 | 0.0247 |
1.2239 | 481 | 0.0293 |
1.2265 | 482 | 0.0845 |
1.2290 | 483 | 0.055 |
1.2316 | 484 | 0.072 |
1.2341 | 485 | 0.0481 |
1.2366 | 486 | 0.0443 |
1.2392 | 487 | 0.0807 |
1.2417 | 488 | 0.0421 |
1.2443 | 489 | 0.0237 |
1.2468 | 490 | 0.0189 |
1.2494 | 491 | 0.0604 |
1.2519 | 492 | 0.0428 |
1.2545 | 493 | 0.061 |
1.2570 | 494 | 0.0723 |
1.2595 | 495 | 0.0539 |
1.2621 | 496 | 0.0747 |
1.2646 | 497 | 0.0917 |
1.2672 | 498 | 0.1161 |
1.2697 | 499 | 0.087 |
1.2723 | 500 | 0.0616 |
1.2748 | 501 | 0.0756 |
1.2774 | 502 | 0.0674 |
1.2799 | 503 | 0.04 |
1.2824 | 504 | 0.0354 |
1.2850 | 505 | 0.0403 |
1.2875 | 506 | 0.0596 |
1.2901 | 507 | 0.0359 |
1.2926 | 508 | 0.0648 |
1.2952 | 509 | 0.0424 |
1.2977 | 510 | 0.0605 |
1.3003 | 511 | 0.0136 |
1.3028 | 512 | 0.0547 |
1.3053 | 513 | 0.0385 |
1.3079 | 514 | 0.0191 |
1.3104 | 515 | 0.1222 |
1.3130 | 516 | 0.0906 |
1.3155 | 517 | 0.0603 |
1.3181 | 518 | 0.0366 |
1.3206 | 519 | 0.0416 |
1.3232 | 520 | 0.0832 |
1.3257 | 521 | 0.0355 |
1.3282 | 522 | 0.0614 |
1.3308 | 523 | 0.0539 |
1.3333 | 524 | 0.0566 |
1.3359 | 525 | 0.0727 |
1.3384 | 526 | 0.0311 |
1.3410 | 527 | 0.0254 |
1.3435 | 528 | 0.0376 |
1.3461 | 529 | 0.0652 |
1.3486 | 530 | 0.0717 |
1.3511 | 531 | 0.0521 |
1.3537 | 532 | 0.0404 |
1.3562 | 533 | 0.041 |
1.3588 | 534 | 0.0435 |
1.3613 | 535 | 0.0842 |
1.3639 | 536 | 0.0203 |
1.3664 | 537 | 0.072 |
1.3690 | 538 | 0.0277 |
1.3715 | 539 | 0.0575 |
1.3740 | 540 | 0.0665 |
1.3766 | 541 | 0.024 |
1.3791 | 542 | 0.0202 |
1.3817 | 543 | 0.052 |
1.3842 | 544 | 0.0532 |
1.3868 | 545 | 0.0623 |
1.3893 | 546 | 0.0643 |
1.3919 | 547 | 0.0694 |
1.3944 | 548 | 0.0582 |
1.3969 | 549 | 0.0411 |
1.3995 | 550 | 0.0245 |
1.4020 | 551 | 0.0714 |
1.4046 | 552 | 0.0489 |
1.4071 | 553 | 0.0696 |
1.4097 | 554 | 0.0316 |
1.4122 | 555 | 0.0554 |
1.4148 | 556 | 0.097 |
1.4173 | 557 | 0.0665 |
1.4198 | 558 | 0.0578 |
1.4224 | 559 | 0.0746 |
1.4249 | 560 | 0.0347 |
1.4275 | 561 | 0.0471 |
1.4300 | 562 | 0.0237 |
1.4326 | 563 | 0.0269 |
1.4351 | 564 | 0.068 |
1.4377 | 565 | 0.0362 |
1.4402 | 566 | 0.059 |
1.4427 | 567 | 0.0321 |
1.4453 | 568 | 0.0469 |
1.4478 | 569 | 0.0445 |
1.4504 | 570 | 0.0804 |
1.4529 | 571 | 0.0387 |
1.4555 | 572 | 0.0358 |
1.4580 | 573 | 0.0322 |
1.4606 | 574 | 0.0673 |
1.4631 | 575 | 0.0302 |
1.4656 | 576 | 0.0612 |
1.4682 | 577 | 0.0553 |
1.4707 | 578 | 0.0998 |
1.4733 | 579 | 0.0396 |
1.4758 | 580 | 0.0764 |
1.4784 | 581 | 0.0427 |
1.4809 | 582 | 0.0785 |
1.4835 | 583 | 0.0419 |
1.4860 | 584 | 0.0584 |
1.4885 | 585 | 0.0437 |
1.4911 | 586 | 0.0561 |
1.4936 | 587 | 0.0131 |
1.4962 | 588 | 0.0472 |
1.4987 | 589 | 0.0479 |
1.5013 | 590 | 0.0477 |
1.5038 | 591 | 0.0745 |
1.5064 | 592 | 0.0918 |
1.5089 | 593 | 0.041 |
1.5115 | 594 | 0.0463 |
1.5140 | 595 | 0.0227 |
1.5165 | 596 | 0.0427 |
1.5191 | 597 | 0.0754 |
1.5216 | 598 | 0.0489 |
1.5242 | 599 | 0.0765 |
1.5267 | 600 | 0.0651 |
1.5293 | 601 | 0.0544 |
1.5318 | 602 | 0.0777 |
1.5344 | 603 | 0.0638 |
1.5369 | 604 | 0.1198 |
1.5394 | 605 | 0.0882 |
1.5420 | 606 | 0.0236 |
1.5445 | 607 | 0.0202 |
1.5471 | 608 | 0.0955 |
1.5496 | 609 | 0.0366 |
1.5522 | 610 | 0.1021 |
1.5547 | 611 | 0.0669 |
1.5573 | 612 | 0.0185 |
1.5598 | 613 | 0.0575 |
1.5623 | 614 | 0.1001 |
1.5649 | 615 | 0.0664 |
1.5674 | 616 | 0.0617 |
1.5700 | 617 | 0.0661 |
1.5725 | 618 | 0.0425 |
1.5751 | 619 | 0.0445 |
1.5776 | 620 | 0.0773 |
1.5802 | 621 | 0.0504 |
1.5827 | 622 | 0.0785 |
1.5852 | 623 | 0.0802 |
1.5878 | 624 | 0.0882 |
1.5903 | 625 | 0.0125 |
1.5929 | 626 | 0.0305 |
1.5954 | 627 | 0.0275 |
1.5980 | 628 | 0.0245 |
1.6005 | 629 | 0.0897 |
1.6031 | 630 | 0.0444 |
1.6056 | 631 | 0.0589 |
1.6081 | 632 | 0.0337 |
1.6107 | 633 | 0.0889 |
1.6132 | 634 | 0.0556 |
1.6158 | 635 | 0.0426 |
1.6183 | 636 | 0.046 |
1.6209 | 637 | 0.0342 |
1.6234 | 638 | 0.0573 |
1.6260 | 639 | 0.0569 |
1.6285 | 640 | 0.0248 |
1.6310 | 641 | 0.0214 |
1.6336 | 642 | 0.0147 |
1.6361 | 643 | 0.0203 |
1.6387 | 644 | 0.0366 |
1.6412 | 645 | 0.0484 |
1.6438 | 646 | 0.0301 |
1.6463 | 647 | 0.0314 |
1.6489 | 648 | 0.0369 |
1.6514 | 649 | 0.0168 |
1.6539 | 650 | 0.0645 |
1.6565 | 651 | 0.0755 |
1.6590 | 652 | 0.0448 |
1.6616 | 653 | 0.0795 |
1.6641 | 654 | 0.0673 |
1.6667 | 655 | 0.0431 |
1.6692 | 656 | 0.0265 |
1.6718 | 657 | 0.0567 |
1.6743 | 658 | 0.0235 |
1.6768 | 659 | 0.034 |
1.6794 | 660 | 0.0812 |
1.6819 | 661 | 0.0157 |
1.6845 | 662 | 0.0448 |
1.6870 | 663 | 0.0488 |
1.6896 | 664 | 0.0515 |
1.6921 | 665 | 0.0531 |
1.6947 | 666 | 0.1166 |
1.6972 | 667 | 0.0264 |
1.6997 | 668 | 0.0325 |
1.7023 | 669 | 0.0784 |
1.7048 | 670 | 0.0859 |
1.7074 | 671 | 0.0981 |
1.7099 | 672 | 0.0411 |
1.7125 | 673 | 0.0915 |
1.7150 | 674 | 0.0396 |
1.7176 | 675 | 0.1381 |
1.7201 | 676 | 0.0547 |
1.7226 | 677 | 0.0436 |
1.7252 | 678 | 0.0519 |
1.7277 | 679 | 0.0305 |
1.7303 | 680 | 0.0356 |
1.7328 | 681 | 0.0173 |
1.7354 | 682 | 0.0299 |
1.7379 | 683 | 0.0424 |
1.7405 | 684 | 0.038 |
1.7430 | 685 | 0.0159 |
1.7455 | 686 | 0.0273 |
1.7481 | 687 | 0.0301 |
1.7506 | 688 | 0.0315 |
1.7532 | 689 | 0.0566 |
1.7557 | 690 | 0.0478 |
1.7583 | 691 | 0.0533 |
1.7608 | 692 | 0.0248 |
1.7634 | 693 | 0.0454 |
1.7659 | 694 | 0.0252 |
1.7684 | 695 | 0.0326 |
1.7710 | 696 | 0.0501 |
1.7735 | 697 | 0.0196 |
1.7761 | 698 | 0.0487 |
1.7786 | 699 | 0.0445 |
1.7812 | 700 | 0.1264 |
1.7837 | 701 | 0.0312 |
1.7863 | 702 | 0.1022 |
1.7888 | 703 | 0.0293 |
1.7913 | 704 | 0.0671 |
1.7939 | 705 | 0.051 |
1.7964 | 706 | 0.0246 |
1.7990 | 707 | 0.1115 |
1.8015 | 708 | 0.0203 |
1.8041 | 709 | 0.0359 |
1.8066 | 710 | 0.0699 |
1.8092 | 711 | 0.0435 |
1.8117 | 712 | 0.0689 |
1.8142 | 713 | 0.0359 |
1.8168 | 714 | 0.0321 |
1.8193 | 715 | 0.0439 |
1.8219 | 716 | 0.0652 |
1.8244 | 717 | 0.0494 |
1.8270 | 718 | 0.0864 |
1.8295 | 719 | 0.0119 |
1.8321 | 720 | 0.0284 |
1.8346 | 721 | 0.0344 |
1.8372 | 722 | 0.0454 |
1.8397 | 723 | 0.0267 |
1.8422 | 724 | 0.0152 |
1.8448 | 725 | 0.0512 |
1.8473 | 726 | 0.0537 |
1.8499 | 727 | 0.0873 |
1.8524 | 728 | 0.0934 |
1.8550 | 729 | 0.0583 |
1.8575 | 730 | 0.0206 |
1.8601 | 731 | 0.0308 |
1.8626 | 732 | 0.0443 |
1.8651 | 733 | 0.0435 |
1.8677 | 734 | 0.1254 |
1.8702 | 735 | 0.0525 |
1.8728 | 736 | 0.039 |
1.8753 | 737 | 0.0157 |
1.8779 | 738 | 0.0621 |
1.8804 | 739 | 0.0405 |
1.8830 | 740 | 0.0369 |
1.8855 | 741 | 0.0568 |
1.8880 | 742 | 0.0451 |
1.8906 | 743 | 0.0657 |
1.8931 | 744 | 0.0304 |
1.8957 | 745 | 0.047 |
1.8982 | 746 | 0.0457 |
1.9008 | 747 | 0.0239 |
1.9033 | 748 | 0.0669 |
1.9059 | 749 | 0.0252 |
1.9084 | 750 | 0.061 |
1.9109 | 751 | 0.0429 |
1.9135 | 752 | 0.0611 |
1.9160 | 753 | 0.0482 |
1.9186 | 754 | 0.0381 |
1.9211 | 755 | 0.0749 |
1.9237 | 756 | 0.0481 |
1.9262 | 757 | 0.0405 |
1.9288 | 758 | 0.0248 |
1.9313 | 759 | 0.0377 |
1.9338 | 760 | 0.061 |
1.9364 | 761 | 0.0203 |
1.9389 | 762 | 0.0315 |
1.9415 | 763 | 0.0534 |
1.9440 | 764 | 0.0383 |
1.9466 | 765 | 0.0431 |
1.9491 | 766 | 0.0509 |
1.9517 | 767 | 0.0361 |
1.9542 | 768 | 0.054 |
1.9567 | 769 | 0.0248 |
1.9593 | 770 | 0.1599 |
1.9618 | 771 | 0.0657 |
1.9644 | 772 | 0.0373 |
1.9669 | 773 | 0.0632 |
1.9695 | 774 | 0.0385 |
1.9720 | 775 | 0.0456 |
1.9746 | 776 | 0.0857 |
1.9771 | 777 | 0.0253 |
1.9796 | 778 | 0.0378 |
1.9822 | 779 | 0.0366 |
1.9847 | 780 | 0.0646 |
1.9873 | 781 | 0.062 |
1.9898 | 782 | 0.0513 |
1.9924 | 783 | 0.0291 |
1.9949 | 784 | 0.0466 |
1.9975 | 785 | 0.0345 |
2.0 | 786 | 0.0108 |
2.0025 | 787 | 0.0196 |
2.0051 | 788 | 0.0402 |
2.0076 | 789 | 0.034 |
2.0102 | 790 | 0.0606 |
2.0127 | 791 | 0.0677 |
2.0153 | 792 | 0.0174 |
2.0178 | 793 | 0.0548 |
2.0204 | 794 | 0.0385 |
2.0229 | 795 | 0.0146 |
2.0254 | 796 | 0.0716 |
2.0280 | 797 | 0.0304 |
2.0305 | 798 | 0.0512 |
2.0331 | 799 | 0.0158 |
2.0356 | 800 | 0.0973 |
2.0382 | 801 | 0.0394 |
2.0407 | 802 | 0.0724 |
2.0433 | 803 | 0.0518 |
2.0458 | 804 | 0.0385 |
2.0483 | 805 | 0.0464 |
2.0509 | 806 | 0.0501 |
2.0534 | 807 | 0.051 |
2.0560 | 808 | 0.0232 |
2.0585 | 809 | 0.0631 |
2.0611 | 810 | 0.0192 |
2.0636 | 811 | 0.0301 |
2.0662 | 812 | 0.0177 |
2.0687 | 813 | 0.0172 |
2.0712 | 814 | 0.0313 |
2.0738 | 815 | 0.0653 |
2.0763 | 816 | 0.0715 |
2.0789 | 817 | 0.0548 |
2.0814 | 818 | 0.0729 |
2.0840 | 819 | 0.0399 |
2.0865 | 820 | 0.0208 |
2.0891 | 821 | 0.0476 |
2.0916 | 822 | 0.054 |
2.0941 | 823 | 0.0174 |
2.0967 | 824 | 0.0431 |
2.0992 | 825 | 0.0361 |
2.1018 | 826 | 0.0514 |
2.1043 | 827 | 0.0513 |
2.1069 | 828 | 0.0099 |
2.1094 | 829 | 0.0137 |
2.1120 | 830 | 0.0493 |
2.1145 | 831 | 0.0133 |
2.1170 | 832 | 0.0087 |
2.1196 | 833 | 0.0306 |
2.1221 | 834 | 0.0092 |
2.1247 | 835 | 0.0242 |
2.1272 | 836 | 0.0905 |
2.1298 | 837 | 0.0544 |
2.1323 | 838 | 0.0462 |
2.1349 | 839 | 0.0107 |
2.1374 | 840 | 0.0846 |
2.1399 | 841 | 0.031 |
2.1425 | 842 | 0.027 |
2.1450 | 843 | 0.05 |
2.1476 | 844 | 0.0468 |
2.1501 | 845 | 0.0251 |
2.1527 | 846 | 0.031 |
2.1552 | 847 | 0.0343 |
2.1578 | 848 | 0.0149 |
2.1603 | 849 | 0.0347 |
2.1628 | 850 | 0.014 |
2.1654 | 851 | 0.0471 |
2.1679 | 852 | 0.0413 |
2.1705 | 853 | 0.0047 |
2.1730 | 854 | 0.0232 |
2.1756 | 855 | 0.025 |
2.1781 | 856 | 0.0621 |
2.1807 | 857 | 0.0198 |
2.1832 | 858 | 0.0346 |
2.1858 | 859 | 0.0177 |
2.1883 | 860 | 0.0298 |
2.1908 | 861 | 0.0325 |
2.1934 | 862 | 0.0075 |
2.1959 | 863 | 0.0224 |
2.1985 | 864 | 0.0085 |
2.2010 | 865 | 0.0498 |
2.2036 | 866 | 0.0222 |
2.2061 | 867 | 0.0309 |
2.2087 | 868 | 0.0074 |
2.2112 | 869 | 0.0126 |
2.2137 | 870 | 0.0372 |
2.2163 | 871 | 0.0232 |
2.2188 | 872 | 0.033 |
2.2214 | 873 | 0.0111 |
2.2239 | 874 | 0.0121 |
2.2265 | 875 | 0.0552 |
2.2290 | 876 | 0.0305 |
2.2316 | 877 | 0.042 |
2.2341 | 878 | 0.0147 |
2.2366 | 879 | 0.0222 |
2.2392 | 880 | 0.0341 |
2.2417 | 881 | 0.0163 |
2.2443 | 882 | 0.0084 |
2.2468 | 883 | 0.0081 |
2.2494 | 884 | 0.0312 |
2.2519 | 885 | 0.0153 |
2.2545 | 886 | 0.0262 |
2.2570 | 887 | 0.0404 |
2.2595 | 888 | 0.0198 |
2.2621 | 889 | 0.0304 |
2.2646 | 890 | 0.0544 |
2.2672 | 891 | 0.065 |
2.2697 | 892 | 0.0473 |
2.2723 | 893 | 0.0291 |
2.2748 | 894 | 0.0415 |
2.2774 | 895 | 0.0398 |
2.2799 | 896 | 0.018 |
2.2824 | 897 | 0.0158 |
2.2850 | 898 | 0.0161 |
2.2875 | 899 | 0.0347 |
2.2901 | 900 | 0.0104 |
2.2926 | 901 | 0.044 |
2.2952 | 902 | 0.019 |
2.2977 | 903 | 0.0416 |
2.3003 | 904 | 0.0039 |
2.3028 | 905 | 0.0246 |
2.3053 | 906 | 0.0133 |
2.3079 | 907 | 0.0053 |
2.3104 | 908 | 0.0992 |
2.3130 | 909 | 0.0569 |
2.3155 | 910 | 0.0326 |
2.3181 | 911 | 0.0189 |
2.3206 | 912 | 0.0115 |
2.3232 | 913 | 0.0417 |
2.3257 | 914 | 0.0161 |
2.3282 | 915 | 0.0308 |
2.3308 | 916 | 0.0234 |
2.3333 | 917 | 0.027 |
2.3359 | 918 | 0.0391 |
2.3384 | 919 | 0.0107 |
2.3410 | 920 | 0.0092 |
2.3435 | 921 | 0.016 |
2.3461 | 922 | 0.0299 |
2.3486 | 923 | 0.0493 |
2.3511 | 924 | 0.025 |
2.3537 | 925 | 0.0127 |
2.3562 | 926 | 0.0131 |
2.3588 | 927 | 0.0214 |
2.3613 | 928 | 0.0538 |
2.3639 | 929 | 0.0082 |
2.3664 | 930 | 0.043 |
2.3690 | 931 | 0.0074 |
2.3715 | 932 | 0.042 |
2.3740 | 933 | 0.044 |
2.3766 | 934 | 0.01 |
2.3791 | 935 | 0.0055 |
2.3817 | 936 | 0.0215 |
2.3842 | 937 | 0.0258 |
2.3868 | 938 | 0.0302 |
2.3893 | 939 | 0.0326 |
2.3919 | 940 | 0.0348 |
2.3944 | 941 | 0.0444 |
2.3969 | 942 | 0.019 |
2.3995 | 943 | 0.0098 |
2.4020 | 944 | 0.0283 |
2.4046 | 945 | 0.0306 |
2.4071 | 946 | 0.0316 |
2.4097 | 947 | 0.01 |
2.4122 | 948 | 0.0253 |
2.4148 | 949 | 0.0664 |
2.4173 | 950 | 0.0366 |
2.4198 | 951 | 0.0307 |
2.4224 | 952 | 0.0422 |
2.4249 | 953 | 0.0133 |
2.4275 | 954 | 0.0209 |
2.4300 | 955 | 0.0065 |
2.4326 | 956 | 0.0107 |
2.4351 | 957 | 0.0396 |
2.4377 | 958 | 0.0137 |
2.4402 | 959 | 0.0258 |
2.4427 | 960 | 0.0138 |
2.4453 | 961 | 0.0275 |
2.4478 | 962 | 0.0208 |
2.4504 | 963 | 0.0302 |
2.4529 | 964 | 0.0292 |
2.4555 | 965 | 0.018 |
2.4580 | 966 | 0.0168 |
2.4606 | 967 | 0.0365 |
2.4631 | 968 | 0.0141 |
2.4656 | 969 | 0.0348 |
2.4682 | 970 | 0.022 |
2.4707 | 971 | 0.0677 |
2.4733 | 972 | 0.0156 |
2.4758 | 973 | 0.0424 |
2.4784 | 974 | 0.0188 |
2.4809 | 975 | 0.0494 |
2.4835 | 976 | 0.0192 |
2.4860 | 977 | 0.0346 |
2.4885 | 978 | 0.0167 |
2.4911 | 979 | 0.0274 |
2.4936 | 980 | 0.0046 |
2.4962 | 981 | 0.0301 |
2.4987 | 982 | 0.0246 |
2.5013 | 983 | 0.0222 |
2.5038 | 984 | 0.0346 |
2.5064 | 985 | 0.0595 |
2.5089 | 986 | 0.0221 |
2.5115 | 987 | 0.0211 |
2.5140 | 988 | 0.0092 |
2.5165 | 989 | 0.0225 |
2.5191 | 990 | 0.0452 |
2.5216 | 991 | 0.0288 |
2.5242 | 992 | 0.044 |
2.5267 | 993 | 0.0308 |
2.5293 | 994 | 0.0309 |
2.5318 | 995 | 0.0495 |
2.5344 | 996 | 0.0384 |
2.5369 | 997 | 0.0834 |
2.5394 | 998 | 0.0866 |
2.5420 | 999 | 0.0076 |
2.5445 | 1000 | 0.0071 |
2.5471 | 1001 | 0.0634 |
2.5496 | 1002 | 0.0144 |
2.5522 | 1003 | 0.077 |
2.5547 | 1004 | 0.0347 |
2.5573 | 1005 | 0.0081 |
2.5598 | 1006 | 0.0216 |
2.5623 | 1007 | 0.0437 |
2.5649 | 1008 | 0.0367 |
2.5674 | 1009 | 0.0281 |
2.5700 | 1010 | 0.0312 |
2.5725 | 1011 | 0.0181 |
2.5751 | 1012 | 0.0226 |
2.5776 | 1013 | 0.0558 |
2.5802 | 1014 | 0.0267 |
2.5827 | 1015 | 0.0596 |
2.5852 | 1016 | 0.046 |
2.5878 | 1017 | 0.0465 |
2.5903 | 1018 | 0.0035 |
2.5929 | 1019 | 0.019 |
2.5954 | 1020 | 0.0118 |
2.5980 | 1021 | 0.0128 |
2.6005 | 1022 | 0.0458 |
2.6031 | 1023 | 0.0185 |
2.6056 | 1024 | 0.0309 |
2.6081 | 1025 | 0.0142 |
2.6107 | 1026 | 0.0732 |
2.6132 | 1027 | 0.0327 |
2.6158 | 1028 | 0.0296 |
2.6183 | 1029 | 0.0237 |
2.6209 | 1030 | 0.0169 |
2.6234 | 1031 | 0.0306 |
2.6260 | 1032 | 0.0235 |
2.6285 | 1033 | 0.009 |
2.6310 | 1034 | 0.0118 |
2.6336 | 1035 | 0.0067 |
2.6361 | 1036 | 0.008 |
2.6387 | 1037 | 0.0202 |
2.6412 | 1038 | 0.0241 |
2.6438 | 1039 | 0.0118 |
2.6463 | 1040 | 0.0161 |
2.6489 | 1041 | 0.0242 |
2.6514 | 1042 | 0.0072 |
2.6539 | 1043 | 0.037 |
2.6565 | 1044 | 0.0362 |
2.6590 | 1045 | 0.0213 |
2.6616 | 1046 | 0.0458 |
2.6641 | 1047 | 0.0358 |
2.6667 | 1048 | 0.024 |
2.6692 | 1049 | 0.0093 |
2.6718 | 1050 | 0.0306 |
2.6743 | 1051 | 0.0075 |
2.6768 | 1052 | 0.0193 |
2.6794 | 1053 | 0.048 |
2.6819 | 1054 | 0.0058 |
2.6845 | 1055 | 0.0233 |
2.6870 | 1056 | 0.0264 |
2.6896 | 1057 | 0.0276 |
2.6921 | 1058 | 0.0346 |
2.6947 | 1059 | 0.0854 |
2.6972 | 1060 | 0.0119 |
2.6997 | 1061 | 0.0174 |
2.7023 | 1062 | 0.0514 |
2.7048 | 1063 | 0.0628 |
2.7074 | 1064 | 0.0721 |
2.7099 | 1065 | 0.0246 |
2.7125 | 1066 | 0.049 |
2.7150 | 1067 | 0.0148 |
2.7176 | 1068 | 0.1024 |
2.7201 | 1069 | 0.0312 |
2.7226 | 1070 | 0.029 |
2.7252 | 1071 | 0.0352 |
2.7277 | 1072 | 0.0131 |
2.7303 | 1073 | 0.0195 |
2.7328 | 1074 | 0.0064 |
2.7354 | 1075 | 0.0169 |
2.7379 | 1076 | 0.0232 |
2.7405 | 1077 | 0.0216 |
2.7430 | 1078 | 0.0058 |
2.7455 | 1079 | 0.0089 |
2.7481 | 1080 | 0.0143 |
2.7506 | 1081 | 0.0168 |
2.7532 | 1082 | 0.0331 |
2.7557 | 1083 | 0.0255 |
2.7583 | 1084 | 0.0312 |
2.7608 | 1085 | 0.0125 |
2.7634 | 1086 | 0.0228 |
2.7659 | 1087 | 0.0083 |
2.7684 | 1088 | 0.0141 |
2.7710 | 1089 | 0.0189 |
2.7735 | 1090 | 0.0109 |
2.7761 | 1091 | 0.0195 |
2.7786 | 1092 | 0.0169 |
2.7812 | 1093 | 0.0937 |
2.7837 | 1094 | 0.019 |
2.7863 | 1095 | 0.0856 |
2.7888 | 1096 | 0.0155 |
2.7913 | 1097 | 0.0408 |
2.7939 | 1098 | 0.0279 |
2.7964 | 1099 | 0.008 |
2.7990 | 1100 | 0.086 |
2.8015 | 1101 | 0.0078 |
2.8041 | 1102 | 0.0186 |
2.8066 | 1103 | 0.0468 |
2.8092 | 1104 | 0.0255 |
2.8117 | 1105 | 0.0418 |
2.8142 | 1106 | 0.0188 |
2.8168 | 1107 | 0.0197 |
2.8193 | 1108 | 0.023 |
2.8219 | 1109 | 0.0421 |
2.8244 | 1110 | 0.0301 |
2.8270 | 1111 | 0.0627 |
2.8295 | 1112 | 0.0052 |
2.8321 | 1113 | 0.0163 |
2.8346 | 1114 | 0.0209 |
2.8372 | 1115 | 0.0277 |
2.8397 | 1116 | 0.0211 |
2.8422 | 1117 | 0.0066 |
2.8448 | 1118 | 0.0263 |
2.8473 | 1119 | 0.0408 |
2.8499 | 1120 | 0.0516 |
2.8524 | 1121 | 0.0748 |
2.8550 | 1122 | 0.0309 |
2.8575 | 1123 | 0.007 |
2.8601 | 1124 | 0.014 |
2.8626 | 1125 | 0.0284 |
2.8651 | 1126 | 0.0165 |
2.8677 | 1127 | 0.0975 |
2.8702 | 1128 | 0.0354 |
2.8728 | 1129 | 0.0235 |
2.8753 | 1130 | 0.0074 |
2.8779 | 1131 | 0.0386 |
2.8804 | 1132 | 0.0173 |
2.8830 | 1133 | 0.0211 |
2.8855 | 1134 | 0.0305 |
2.8880 | 1135 | 0.0219 |
2.8906 | 1136 | 0.0454 |
2.8931 | 1137 | 0.0176 |
2.8957 | 1138 | 0.0261 |
2.8982 | 1139 | 0.0274 |
2.9008 | 1140 | 0.0131 |
2.9033 | 1141 | 0.0485 |
2.9059 | 1142 | 0.0129 |
2.9084 | 1143 | 0.05 |
2.9109 | 1144 | 0.0306 |
2.9135 | 1145 | 0.0352 |
2.9160 | 1146 | 0.0271 |
2.9186 | 1147 | 0.0216 |
2.9211 | 1148 | 0.0567 |
2.9237 | 1149 | 0.0258 |
2.9262 | 1150 | 0.0221 |
2.9288 | 1151 | 0.0112 |
2.9313 | 1152 | 0.0199 |
2.9338 | 1153 | 0.0388 |
2.9364 | 1154 | 0.0101 |
2.9389 | 1155 | 0.0179 |
2.9415 | 1156 | 0.0358 |
2.9440 | 1157 | 0.0247 |
2.9466 | 1158 | 0.031 |
2.9491 | 1159 | 0.0367 |
2.9517 | 1160 | 0.0198 |
2.9542 | 1161 | 0.0346 |
2.9567 | 1162 | 0.011 |
2.9593 | 1163 | 0.139 |
2.9618 | 1164 | 0.0555 |
2.9644 | 1165 | 0.0228 |
2.9669 | 1166 | 0.0377 |
2.9695 | 1167 | 0.024 |
2.9720 | 1168 | 0.0331 |
2.9746 | 1169 | 0.0815 |
2.9771 | 1170 | 0.0116 |
2.9796 | 1171 | 0.0186 |
2.9822 | 1172 | 0.0153 |
2.9847 | 1173 | 0.0557 |
2.9873 | 1174 | 0.0406 |
2.9898 | 1175 | 0.0334 |
2.9924 | 1176 | 0.0265 |
2.9949 | 1177 | 0.0333 |
2.9975 | 1178 | 0.0177 |
3.0 | 1179 | 0.0028 |
Framework Versions
- Python: 3.11.10
- Sentence Transformers: 4.0.2
- PyLate: 1.1.7
- Transformers: 4.48.2
- PyTorch: 2.5.1+cu124
- Accelerate: 1.1.1
- Datasets: 2.21.0
- Tokenizers: 0.21.0
Citation
BibTeX
Reason-ModernColBERT
@misc{Reason-ModernColBERT,
title={Reason-ModernColBERT},
author={Chaffin, Antoine},
url={https://huggingface.co/lightonai/Reason-ModernColBERT},
year={2025}
}
GTE-ModernColBERT
@misc{GTE-ModernColBERT,
title={GTE-ModernColBERT},
author={Chaffin, Antoine},
url={https://huggingface.co/lightonai/GTE-ModernColBERT-v1},
year={2025}
}
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084"
}
PyLate
@misc{PyLate,
title={PyLate: Flexible Training and Retrieval for Late Interaction Models},
author={Chaffin, Antoine and Sourty, Raphaël},
url={https://github.com/lightonai/pylate},
year={2024}
}
CachedContrastive
@misc{gao2021scaling,
title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
year={2021},
eprint={2101.06983},
archivePrefix={arXiv},
primaryClass={cs.LG}
}