NohTow committed
Commit d98da6a (verified)
1 Parent(s): 79c54f7

Add new SentenceTransformer model

1_Dense/config.json ADDED
@@ -0,0 +1 @@
1
+ {"in_features": 768, "out_features": 128, "bias": false, "activation_function": "torch.nn.modules.linear.Identity"}
1_Dense/model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:37fe595979a3bb75600e83b6d8e9db88983417366acb47a2c0915a89562843fd
3
+ size 393304
README.md ADDED
@@ -0,0 +1,1683 @@
1
+ ---
2
+ language:
3
+ - en
4
+ tags:
5
+ - ColBERT
6
+ - PyLate
7
+ - sentence-transformers
8
+ - sentence-similarity
9
+ - feature-extraction
10
+ - generated_from_trainer
11
+ - dataset_size:100521
12
+ - loss:CachedContrastive
13
+ base_model: lightonai/GTE-ModernColBERT-v1
14
+ datasets:
15
+ - reasonir/reasonir-data
16
+ pipeline_tag: sentence-similarity
17
+ library_name: PyLate
18
+ license: cc-by-nc-4.0
19
+ ---
20
+ <img src="https://cdn-uploads.huggingface.co/production/uploads/609bbe2f4932693ca2009d6a/_XuHfcsUg_uV7vGU42lvE.png" width="500" height="auto">
21
+
22
+ # Reason-ModernColBERT
23
+ Reason-ModernColBERT is a late interaction model trained on the [reasonir-hq](https://huggingface.co/datasets/reasonir/reasonir-data) dataset.
24
+ It achieves extremely competitive performance on the [BRIGHT benchmark](https://huggingface.co/datasets/xlangai/BRIGHT), which is aimed at evaluating reasoning-intensive retrieval: it outperforms all existing models up to 7B parameters (more than 45 times its size) and even, surprisingly, beats [ReasonIR-8B](https://huggingface.co/reasonir/ReasonIR-8B) (an 8B model trained on the same data) by more than 2.5 NDCG@10 on average on the Stack Exchange splits. We attribute these strong results to late interaction; see the [evaluation section](#evaluation).
25
+
26
+ # License
27
+ Unfortunately, since the [ReasonIR data](https://huggingface.co/datasets/reasonir/reasonir-data) has been released under a cc-by-nc-4.0 license, we cannot release this model under an Apache 2.0 license. However, the authors of ReasonIR [released the code to generate the data](https://github.com/facebookresearch/ReasonIR/tree/main/synthetic_data_generation). Anyone willing to regenerate the data could then easily reproduce this model under an Apache 2.0 license by running a fine-tuning that lasts less than 2 hours using [this boilerplate](https://gist.github.com/NohTow/d563244596548bf387f19fcd790664d3).
28
+ # PyLate model based on lightonai/GTE-ModernColBERT-v1
29
+
30
+ This is a [PyLate](https://github.com/lightonai/pylate) model finetuned from [lightonai/GTE-ModernColBERT-v1](https://huggingface.co/lightonai/GTE-ModernColBERT-v1) on the [reasonir-hq](https://huggingface.co/datasets/reasonir/reasonir-data) dataset. It maps sentences & paragraphs to sequences of 128-dimensional dense vectors and can be used for semantic textual similarity using the MaxSim operator.
31
+
32
+ ## Model Details
33
+
34
+ ### Model Description
35
+ - **Model Type:** PyLate model
36
+ - **Base model:** [lightonai/GTE-ModernColBERT-v1](https://huggingface.co/lightonai/GTE-ModernColBERT-v1) <!-- at revision 78d50a162b04dfdc45c3af6b4294ba77c24888a3 -->
37
+ - **Document Length:** 8192 tokens
38
+ - **Query Length:** 128 tokens
39
+ - **Output Dimensionality:** 128 dimensions
40
+ - **Similarity Function:** MaxSim
41
+ - **Training Dataset:**
42
+ - [reasonir-hq](https://huggingface.co/datasets/reasonir/reasonir-data)
43
+ - **Language:** en
44
+ - **License:** cc-by-nc-4.0
45
+
46
+ ### Model Sources
47
+
48
+ - **Documentation:** [PyLate Documentation](https://lightonai.github.io/pylate/)
49
+ - **Repository:** [PyLate on GitHub](https://github.com/lightonai/pylate)
50
+ - **Hugging Face:** [PyLate models on Hugging Face](https://huggingface.co/models?library=PyLate)
51
+
52
+ ### Full Model Architecture
53
+
54
+ ```
55
+ ColBERT(
56
+ (0): Transformer({'max_seq_length': 127, 'do_lower_case': False}) with Transformer model: ModernBertModel
57
+ (1): Dense({'in_features': 768, 'out_features': 128, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
58
+ )
59
+ ```
60
+
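For intuition, the MaxSim operator used by this architecture scores a query–document pair by matching each query token embedding against its most similar document token embedding and summing those maxima. Below is a minimal sketch of the scoring rule (illustrative only, not PyLate's internal implementation):

```python
import torch


def maxsim(query_embeddings: torch.Tensor, document_embeddings: torch.Tensor) -> torch.Tensor:
    """Late-interaction MaxSim score between one query and one document.

    query_embeddings: (num_query_tokens, 128), document_embeddings: (num_document_tokens, 128).
    """
    # Token-to-token similarity matrix (dot products of the projected token embeddings).
    similarities = query_embeddings @ document_embeddings.T
    # For each query token, keep its best-matching document token, then sum over query tokens.
    return similarities.max(dim=1).values.sum()
```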
61
+ ## Usage
62
+ First install the PyLate library:
63
+
64
+ ```bash
65
+ pip install -U pylate
66
+ ```
67
+
68
+ ### Retrieval
69
+
70
+ PyLate provides a streamlined interface to index and retrieve documents using ColBERT models. Indexing relies on the Voyager HNSW index to efficiently handle document embeddings and enable fast retrieval.
71
+
72
+ #### Indexing documents
73
+
74
+ First, load the ColBERT model and initialize the Voyager index, then encode and index your documents:
75
+
76
+ ```python
77
+ from pylate import indexes, models, retrieve
78
+
79
+ # Step 1: Load the ColBERT model
80
+ model = models.ColBERT(
81
+ model_name_or_path="lightonai/Reason-ModernColBERT",
82
+ )
83
+
84
+ # Step 2: Initialize the Voyager index
85
+ index = indexes.Voyager(
86
+ index_folder="pylate-index",
87
+ index_name="index",
88
+ override=True, # This overwrites the existing index if any
89
+ )
90
+
91
+ # Step 3: Encode the documents
92
+ documents_ids = ["1", "2", "3"]
93
+ documents = ["document 1 text", "document 2 text", "document 3 text"]
94
+
95
+ documents_embeddings = model.encode(
96
+ documents,
97
+ batch_size=32,
98
+ is_query=False, # Ensure that it is set to False to indicate that these are documents, not queries
99
+ show_progress_bar=True,
100
+ )
101
+
102
+ # Step 4: Add document embeddings to the index by providing embeddings and corresponding ids
103
+ index.add_documents(
104
+ documents_ids=documents_ids,
105
+ documents_embeddings=documents_embeddings,
106
+ )
107
+ ```
108
+
109
+ Note that you do not have to recreate the index and encode the documents every time. Once you have created an index and added the documents, you can re-use the index later by loading it:
110
+
111
+ ```python
112
+ # To load an index, simply instantiate it with the correct folder/name and without overriding it
113
+ index = indexes.Voyager(
114
+ index_folder="pylate-index",
115
+ index_name="index",
116
+ )
117
+ ```
118
+
119
+ #### Retrieving top-k documents for queries
120
+
121
+ Once the documents are indexed, you can retrieve the top-k most relevant documents for a given set of queries.
122
+ To do so, initialize the ColBERT retriever with the index you want to search in, encode the queries, and then retrieve the top-k documents to get the top matching ids and relevance scores:
123
+
124
+ ```python
125
+ # Step 1: Initialize the ColBERT retriever
126
+ retriever = retrieve.ColBERT(index=index)
127
+
128
+ # Step 2: Encode the queries
129
+ queries_embeddings = model.encode(
130
+ ["query for document 3", "query for document 1"],
131
+ batch_size=32,
132
+ is_query=True, # Ensure that it is set to True to indicate that these are queries
133
+ show_progress_bar=True,
134
+ )
135
+
136
+ # Step 3: Retrieve top-k documents
137
+ scores = retriever.retrieve(
138
+ queries_embeddings=queries_embeddings,
139
+ k=10, # Retrieve the top 10 matches for each query
140
+ )
141
+ ```
142
+
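The retriever returns one ranked list per query; with the Voyager index, each match is a dictionary holding the document id and its MaxSim score. A small sketch of reading the results (assuming the ids indexed above):

```python
# scores[i] holds the ranked matches for the i-th query.
for query_matches in scores:
    for match in query_matches:
        print(match["id"], match["score"])
```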
143
+ ### Reranking
144
+ If you only want to use the ColBERT model to perform reranking on top of your first-stage retrieval pipeline without building an index, you can simply use the `rank` function and pass the queries and documents to rerank:
145
+
146
+ ```python
147
+ from pylate import rank, models
148
+
149
+ queries = [
150
+ "query A",
151
+ "query B",
152
+ ]
153
+
154
+ documents = [
155
+ ["document A", "document B"],
156
+ ["document 1", "document C", "document B"],
157
+ ]
158
+
159
+ documents_ids = [
160
+ [1, 2],
161
+ [1, 3, 2],
162
+ ]
163
+
164
+ model = models.ColBERT(
165
+ model_name_or_path="lightonai/Reason-ModernColBERT",
166
+ )
167
+
168
+ queries_embeddings = model.encode(
169
+ queries,
170
+ is_query=True,
171
+ )
172
+
173
+ documents_embeddings = model.encode(
174
+ documents,
175
+ is_query=False,
176
+ )
177
+
178
+ reranked_documents = rank.rerank(
179
+ documents_ids=documents_ids,
180
+ queries_embeddings=queries_embeddings,
181
+ documents_embeddings=documents_embeddings,
182
+ )
183
+ ```
184
+
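The output mirrors the input queries: for each query, the candidate documents come back re-ordered by decreasing MaxSim score (a sketch, assuming each result exposes the document id and its score):

```python
# One list of results per query, re-ordered by decreasing MaxSim score.
for query_results in reranked_documents:
    print([(result["id"], result["score"]) for result in query_results])
```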
185
+ <!--
186
+ ### Direct Usage (Transformers)
187
+
188
+ <details><summary>Click to see the direct usage in Transformers</summary>
189
+
190
+ </details>
191
+ -->
192
+
193
+ <!--
194
+ ### Downstream Usage (Sentence Transformers)
195
+
196
+ You can finetune this model on your own dataset.
197
+
198
+ <details><summary>Click to expand</summary>
199
+
200
+ </details>
201
+ -->
202
+
203
+ <!--
204
+ ### Out-of-Scope Use
205
+
206
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
207
+ -->
208
+
209
+ <!--
210
+ ## Bias, Risks and Limitations
211
+
212
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
213
+ -->
214
+
215
+ <!--
216
+ ### Recommendations
217
+
218
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
219
+ -->
220
+ ## Evaluation
221
+ ### BRIGHT Benchmark
222
+ The [BRIGHT benchmark](https://huggingface.co/datasets/xlangai/BRIGHT) is aimed at evaluating reasoning-intensive retrieval performance. Reason-ModernColBERT outperforms all existing models up to 7B parameters (more than 45 times its size) and even, surprisingly, beats [ReasonIR-8B](https://huggingface.co/reasonir/ReasonIR-8B) (an 8B model trained on the same data) by more than 2.5 NDCG@10 on average on the Stack Exchange splits. We attribute these strong results to late interaction, as opposed to the usual dense (single-vector) retrieval performed by other models, as highlighted in the next section.
223
+
224
+
225
+ | Model / Metric | Biology | Earth | Economics | Psychology | Robotics | Stackoverflow | Sustainable | Leetcode | Pony | AoPS | Theorem - Q | Theorem - T | Mean StackExchange | Mean coding | Mean theorem | Full mean |
226
+ |----------------------------------------------------------|---------|-------|-----------|------------|----------|---------------|-------------|----------|------|------|-----------|-----------|-------------------|-------------|--------------|-----------|
227
+ | BM25 | 18.9 | 27.2 | 14.9 | 12.5 | 13.6 | 18.4 | 15 | 24.4 | 7.9 | 6.2 | 10.4 | 4.9 | 17.21 | 16.15 | 7.17 | 14.53 |
228
+ | **< 1B OS** | | | | | | | | | | | | | | | | |
229
+ | BGE | 11.7 | 24.6 | 16.6 | 17.5 | 11.7 | 10.8 | 13.3 | 26.7 | 5.7 | 6 | 13 | 6.9 | 15.17 | 16.2 | 8.63 | 13.71 |
230
+ | Inst-L | 15.2 | 21.2 | 14.7 | 22.3 | 11.4 | 13.3 | 13.5 | 19.5 | 1.3 | 8.1 | 20.9 | 9.1 | 15.94 | 10.4 | 12.7 | 14.21 |
231
+ | SBERT | 15.1 | 20.4 | 16.6 | 22.7 | 8.2 | 11 | 15.3 | 26.4 | 7 | 5.3 | 20 | 10.8 | 15.61 | 16.7 | 12.03 | 14.9 |
232
+ | **> 1B OS** | | | | | | | | | | | | | | | | |
233
+ | E5 | 18.6 | 26 | 15.5 | 15.8 | 16.3 | 11.2 | 18.1 | 28.7 | 4.9 | 7.1 | 26.1 | 26.8 | 17.36 | 16.8 | 20 | 17.93 |
234
+ | SFR | 19.1 | 26.7 | 17.8 | 19 | 16.3 | 14.4 | 19.2 | 27.4 | 2 | 7.4 | 24.3 | 26 | 18.93 | 14.7 | 19.23 | 18.3 |
235
+ | Inst-XL | 21.6 | 34.3 | 22.4 | 27.4 | 18.2 | 21.2 | 19.1 | 27.5 | 5 | 8.5 | 15.6 | 5.9 | 23.46 | 16.25 | 10 | 18.89 |
236
+ | GritLM | 24.8 | 32.3 | 18.9 | 19.8 | 17.1 | 13.6 | 17.8 | 29.9 | 22 | 8.8 | 25.2 | 21.2 | 20.61 | 25.95 | 18.4 | 20.95 |
237
+ | Qwen | 30.6 | 36.4 | 17.8 | 24.6 | 13.2 | 22.2 | 14.8 | 25.5 | 9.9 | 14.4 | 27.8 | 32.9 | 22.8 | 17.7 | 25.03 | 22.51 |
238
+ | **Proprietary** | | | | | | | | | | | | | | | | |
239
+ | Cohere | 18.7 | 28.4 | 20.4 | 21.6 | 16.3 | 18.3 | 17.6 | 26.8 | 1.9 | 6.3 | 15.7 | 7.2 | 20.19 | 14.35 | 9.73 | 16.6 |
240
+ | OpenAI | 23.3 | 26.7 | 19.5 | 27.6 | 12.8 | 14.3 | 20.5 | 23.6 | 2.4 | 8.5 | 23.5 | 11.7 | 20.67 | 13 | 14.57 | 17.87 |
241
+ | Voyage | 23.1 | 25.4 | 19.9 | 24.9 | 10.8 | 16.8 | 15.4 | 30.6 | 1.5 | 7.5 | 27.4 | 11.6 | 19.47 | 16.05 | 15.5 | 17.91 |
242
+ | Google | 22.7 | 34.8 | 19.6 | 27.8 | 15.7 | 20.1 | 17.1 | 29.6 | 3.6 | 9.3 | 23.8 | 15.9 | 22.54 | 16.6 | 16.33 | 20 |
243
+ | **ReasonIR data** | | | | | | | | | | | | | | | | |
244
+ | ReasonIR-8B | 26.2 | 31.4 | 23.3 | 30 | 18 | **23.9** | **20.5** | **35** | **10.5** | **14.7** | **31.9** | **27.2** | 24.76 | **22.75** | **24.6** | **24.38** |
245
+ | Reason-ModernColBERT (150M) | **33.25** | **41.02** | **24.93** | **30.73** | **21.12** | 20.62 | 20.31 | 31.07 | 8.51 | 9.17 | 19.51 | 11.24 | **27.43** | 19.79 | 15.38 | 22.62 |
246
+
247
+ ### Comparison with a dense model
248
+ A fair claim would be that the performance of Reason-ModernColBERT is mostly due to the [ReasonIR data](https://huggingface.co/datasets/reasonir/reasonir-data). Although the gap between ReasonIR-8B and Reason-ModernColBERT already hints that it is most likely more than just that, we ran a small experiment: we trained a dense (single-vector) model with Sentence Transformers in the same setup as the multi-vector model trained with PyLate. This experiment highlights a very large gap in performance.
249
+ Obviously, more rigorous experiments are required to draw conclusions (e.g., both models could have been tuned further, and the training could have been enhanced; for instance, we did not gather negatives across GPUs in these experiments because Sentence Transformers does not support it for now), but the gap is very large, and it correlates well with Reason-ModernColBERT being competitive with ReasonIR-8B while being more than 50 times smaller.
250
+
251
+ | Model/Split | Biology | Earth | Economics | Psychology | Robotics | Stackoverflow | Sustainable | Leetcode | Pony | AoPS | Theorem Q | Theorem T | Mean StackExchange | Mean coding | Mean theorem | Full mean |
252
+ | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- | :-- |
253
+ | Dense (single vector) model | 7.51 | 16.92 | 13.43 | 17.18 | 10.23 | 8.93 | 8.85 | 24.88 | 1.43 | 9.81 | **18.83** | **9.71** | 11.86 | 13.16 | **12.78** | 12.31 |
254
+ | Late-interaction (multi vector model) | **28.02** | **39.25** | **21.51** | **27.05** | **19.86** | **17.23** | **21.1** | **27.37** | **3.76** | **6.87** | 16.06 | 7.21 | **24.86** | **15.57** | 10.05 | **19.61** |
255
+
256
+
257
+
258
+ ## Training Details
259
+
260
+ ### Training Dataset
261
+
262
+ #### reasonir-hq
263
+
264
+ * Dataset: [train](https://huggingface.co/datasets/reasonir/reasonir-data) at [0275f82](https://huggingface.co/datasets/reasonir/reasonir-data/tree/0275f825929b206d4ead23d34b4f8a50d4eddbc8)
265
+ * Size: 100,521 training samples
266
+ * Columns: <code>query</code>, <code>pos</code>, and <code>neg</code>
267
+ * Approximate statistics based on the first 1000 samples:
268
+ | | query | pos | neg |
269
+ |:--------|:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
270
+ | type | string | string | string |
271
+ | details | <ul><li>min: 38 tokens</li><li>mean: 97.84 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 85 tokens</li><li>mean: 127.63 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 81 tokens</li><li>mean: 127.77 tokens</li><li>max: 128 tokens</li></ul> |
272
+ * Samples:
273
+ | query | pos | neg |
274
+ |:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
275
+ | <code>Given this reasoning-intensive query, find relevant documents that could help answer the question. A researcher is analyzing a sound signal represented by the equation f(t) = 2sin(3πt) + sin(5πt) + 0.5sin(7πt). Using the Fourier transform, what are the frequencies, amplitudes, and phases of the individual sinusoidal components in the signal?</code> | <code> A sound signal is given by the equation f(t) = sin(2πt) + sin(4πt) + sin(6πt) where t is time in seconds. Use Fourier transform to find the frequencies, amplitudes, and phases of the individual sinusoidal components in the signal.<br>To find the frequencies, amplitudes, and phases of the individual sinusoidal components in the signal f(t) = sin(2πt) + sin(4πt) + sin(6πt), we can use the Fourier transform. The Fourier transform of a continuous function f(t) is given by:<br><br>F(ω) = ∫[f(t) * e^(-jωt)] dt<br><br>where F(ω) is the Fourier transform of f(t), ω is the angular frequency, and j is the imaginary unit (j^2 = -1). In this case, f(t) is already given as a sum of sinusoidal functions, so we can directly identify the frequencies, amplitudes, and phases of the individual components.<br><br>1. First component: sin(2πt)<br>- Frequency: The angular frequency is 2π, so the frequency is ω/(2π) = 1 Hz.<br>- Amplitude: The coefficient of the sine function is 1, so the amplitude is 1.<br>- Phase: There is no phase shi...</code> | <code> The Fourier transform is widely used in various fields, including engineering, physics, and data analysis. It is a powerful tool for decomposing a signal into its constituent frequencies. In music, for example, the Fourier transform can be used to analyze the frequency components of a sound wave. By applying the Fourier transform to a sound signal, one can identify the different frequencies present in the signal, as well as their relative amplitudes. This information can be useful in a variety of applications, such as sound filtering and audio processing. The Fourier transform can also be used to analyze images and other types of data. In image processing, the Fourier transform can be used to filter out noise and other unwanted features from an image. It can also be used to compress images by representing them in the frequency domain. In addition to its many practical applications, the Fourier transform also has a number of interesting theoretical properties. For example, it has been ...</code> |
276
+ | <code>Given this reasoning-intensive query, find relevant documents that could help answer the question. A manufacturer is designing a cone-shaped container with a fixed volume of 200π cubic centimeters. The container's height is 12 centimeters, and the radius of the base is unknown. If the manufacturer wants to minimize the surface area of the container while maintaining its volume, what should be the radius of the base?</code> | <code> A right circular cone has a radius of 6cm and a slant height of 10cm. Determine the surface area of the cone.<br>To find the surface area of a right circular cone, we need to calculate the area of the base and the lateral surface area, and then add them together.<br><br>The base of the cone is a circle with radius r = 6 cm. The area of the base (A_base) can be found using the formula for the area of a circle:<br><br>A_base = πr^2<br>A_base = π(6 cm)^2<br>A_base = 36π cm^2<br><br>The lateral surface area (A_lateral) can be found using the formula for the lateral surface area of a cone:<br><br>A_lateral = πrs, where r is the radius and s is the slant height.<br><br>Given that the slant height s = 10 cm, we can calculate the lateral surface area:<br><br>A_lateral = π(6 cm)(10 cm)<br>A_lateral = 60π cm^2<br><br>Now, we can find the total surface area (A_total) by adding the base area and the lateral surface area:<br><br>A_total = A_base + A_lateral<br>A_total = 36π cm^2 + 60π cm^2<br>A_total = 96π cm^2<br><br>The surface area of the cone is 96π cm^2.</code> | <code> Torus-Shaped Containers in Chemical Engineering - New Designs and ApplicationsTorus-shaped containers are commonly used in chemical engineering for storing and transporting fluids. These containers have a distinctive doughnut shape, with a central hole and a circular cross-section. In this article, we will explore the design and applications of torus-shaped containers in chemical engineering.One of the main advantages of torus-shaped containers is their high volume-to-surface-area ratio. This makes them ideal for storing large quantities of fluids while minimizing the amount of material needed for construction. Additionally, the curved shape of the container provides added strength and stability, making it less prone to rupture or leakage.The design of torus-shaped containers typically involves the use of computer-aided design (CAD) software to create detailed models of the container's geometry. Engineers can then use these models to simulate various scenarios, such as fluid flow and ...</code> |
277
+ | <code>Given this reasoning-intensive query, find relevant documents that could help answer the question. On the xy-coordinate plane, points A and B are given as A(2, 4) and B(8, -3). Determine the coordinates of the point on line segment AB that is three times as far from A as it is from B.</code> | <code> On the xy co-ordinate plane, point C is (5,-2) and point D is (-1,1.5). The point on line segment CD that is twice as far from C as from D is:<br>Answer Choices: (A) (1,-1) (B) (1,1) (C) (2,0.25) (D) (3,0.5) (E) (3,1) <br>Let's think about the multi-choice question step by step.<br>We want the point on the line that is twice as far from C as it is from D. We can examine the x and y coordinates separately since they are independent.<br>*It should be noted that there are two solutions to this problem, one point between C and D, and another point with D in the middle of C and the point. We can quickly look at the answer choices and see that all the points are between C and D, therefore we can search for that point using the following method:<br>Taking the x-coordinate first, the distance between C and D is |(x-coordinate ofC - (x-coordinate ofD|= |5 - (-1)| = 6<br>The x-coordinate that is twice as far from C as it is from D (and in between C andD will be 4 units from C and 2 units from D. So the ...</code> | <code> The concept of midpoint is often useful in various mathematical problems, but sometimes we need to find other points that divide a line segment in a particular ratio. One common scenario is when we need to find the point that divides the line segment in the ratio of the other two points. Let's consider an example to understand this better. Suppose we have two points E(3, 4) and F(7, -2) on the xy-coordinate plane, and we want to find the point G on the line segment EF such that EG:GF = 2:5. To solve this problem, we can use the concept of section formula, which states that if a point P(x, y) divides the line segment joining the points A(x1, y1) and B(x2, y2) in the ratio m:n, then the coordinates of P are ((mx2+nx1)/(m+n), (my2+ny1)/(m+n)). Using this formula, we can find the coordinates of point G. First, we need to find the difference in x-coordinates and y-coordinates of points E and F. The difference in x-coordinates is 7 - 3 = 4, and the difference in y-coordinates is -2 - 4 = -6...</code> |
278
+ * Loss: <code>pylate.losses.cached_contrastive.CachedContrastive</code>
279
+
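For reference, a comparable training run can be assembled from these components. The following is a minimal sketch, assuming PyLate's documented training utilities (`losses.CachedContrastive`, `utils.ColBERTCollator`) together with the standard Sentence Transformers trainer; the boilerplate linked in the License section is the authoritative script.

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)

from pylate import losses, models, utils

# Base model and training data (columns: query / pos / neg, as described above).
# The "hq" subset name is assumed from the dataset card.
model = models.ColBERT(model_name_or_path="lightonai/GTE-ModernColBERT-v1")
train_dataset = load_dataset("reasonir/reasonir-data", "hq", split="train")

args = SentenceTransformerTrainingArguments(
    output_dir="reason-moderncolbert",
    num_train_epochs=3,
    per_device_train_batch_size=256,
    learning_rate=1e-5,
    bf16=True,
    dataloader_num_workers=8,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=losses.CachedContrastive(model=model),
    data_collator=utils.ColBERTCollator(tokenize_fn=model.tokenize),
)
trainer.train()
```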
280
+ ### Training Hyperparameters
281
+ #### Non-Default Hyperparameters
282
+
283
+ - `per_device_train_batch_size`: 256
284
+ - `per_device_eval_batch_size`: 256
285
+ - `learning_rate`: 1e-05
286
+ - `bf16`: True
287
+ - `dataloader_num_workers`: 8
288
+
289
+ #### All Hyperparameters
290
+ <details><summary>Click to expand</summary>
291
+
292
+ - `overwrite_output_dir`: False
293
+ - `do_predict`: False
294
+ - `eval_strategy`: no
295
+ - `prediction_loss_only`: True
296
+ - `per_device_train_batch_size`: 256
297
+ - `per_device_eval_batch_size`: 256
298
+ - `per_gpu_train_batch_size`: None
299
+ - `per_gpu_eval_batch_size`: None
300
+ - `gradient_accumulation_steps`: 1
301
+ - `eval_accumulation_steps`: None
302
+ - `torch_empty_cache_steps`: None
303
+ - `learning_rate`: 1e-05
304
+ - `weight_decay`: 0.0
305
+ - `adam_beta1`: 0.9
306
+ - `adam_beta2`: 0.999
307
+ - `adam_epsilon`: 1e-08
308
+ - `max_grad_norm`: 1.0
309
+ - `num_train_epochs`: 3
310
+ - `max_steps`: -1
311
+ - `lr_scheduler_type`: linear
312
+ - `lr_scheduler_kwargs`: {}
313
+ - `warmup_ratio`: 0.0
314
+ - `warmup_steps`: 0
315
+ - `log_level`: passive
316
+ - `log_level_replica`: warning
317
+ - `log_on_each_node`: True
318
+ - `logging_nan_inf_filter`: True
319
+ - `save_safetensors`: True
320
+ - `save_on_each_node`: False
321
+ - `save_only_model`: False
322
+ - `restore_callback_states_from_checkpoint`: False
323
+ - `no_cuda`: False
324
+ - `use_cpu`: False
325
+ - `use_mps_device`: False
326
+ - `seed`: 42
327
+ - `data_seed`: None
328
+ - `jit_mode_eval`: False
329
+ - `use_ipex`: False
330
+ - `bf16`: True
331
+ - `fp16`: False
332
+ - `fp16_opt_level`: O1
333
+ - `half_precision_backend`: auto
334
+ - `bf16_full_eval`: False
335
+ - `fp16_full_eval`: False
336
+ - `tf32`: None
337
+ - `local_rank`: 0
338
+ - `ddp_backend`: None
339
+ - `tpu_num_cores`: None
340
+ - `tpu_metrics_debug`: False
341
+ - `debug`: []
342
+ - `dataloader_drop_last`: False
343
+ - `dataloader_num_workers`: 8
344
+ - `dataloader_prefetch_factor`: None
345
+ - `past_index`: -1
346
+ - `disable_tqdm`: False
347
+ - `remove_unused_columns`: True
348
+ - `label_names`: None
349
+ - `load_best_model_at_end`: False
350
+ - `ignore_data_skip`: False
351
+ - `fsdp`: []
352
+ - `fsdp_min_num_params`: 0
353
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
354
+ - `fsdp_transformer_layer_cls_to_wrap`: None
355
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
356
+ - `deepspeed`: None
357
+ - `label_smoothing_factor`: 0.0
358
+ - `optim`: adamw_torch
359
+ - `optim_args`: None
360
+ - `adafactor`: False
361
+ - `group_by_length`: False
362
+ - `length_column_name`: length
363
+ - `ddp_find_unused_parameters`: None
364
+ - `ddp_bucket_cap_mb`: None
365
+ - `ddp_broadcast_buffers`: False
366
+ - `dataloader_pin_memory`: True
367
+ - `dataloader_persistent_workers`: False
368
+ - `skip_memory_metrics`: True
369
+ - `use_legacy_prediction_loop`: False
370
+ - `push_to_hub`: False
371
+ - `resume_from_checkpoint`: None
372
+ - `hub_model_id`: None
373
+ - `hub_strategy`: every_save
374
+ - `hub_private_repo`: None
375
+ - `hub_always_push`: False
376
+ - `gradient_checkpointing`: False
377
+ - `gradient_checkpointing_kwargs`: None
378
+ - `include_inputs_for_metrics`: False
379
+ - `include_for_metrics`: []
380
+ - `eval_do_concat_batches`: True
381
+ - `fp16_backend`: auto
382
+ - `push_to_hub_model_id`: None
383
+ - `push_to_hub_organization`: None
384
+ - `mp_parameters`:
385
+ - `auto_find_batch_size`: False
386
+ - `full_determinism`: False
387
+ - `torchdynamo`: None
388
+ - `ray_scope`: last
389
+ - `ddp_timeout`: 1800
390
+ - `torch_compile`: False
391
+ - `torch_compile_backend`: None
392
+ - `torch_compile_mode`: None
393
+ - `dispatch_batches`: None
394
+ - `split_batches`: None
395
+ - `include_tokens_per_second`: False
396
+ - `include_num_input_tokens_seen`: False
397
+ - `neftune_noise_alpha`: None
398
+ - `optim_target_modules`: None
399
+ - `batch_eval_metrics`: False
400
+ - `eval_on_start`: False
401
+ - `use_liger_kernel`: False
402
+ - `eval_use_gather_object`: False
403
+ - `average_tokens_across_devices`: False
404
+ - `prompts`: None
405
+ - `batch_sampler`: batch_sampler
406
+ - `multi_dataset_batch_sampler`: proportional
407
+
408
+ </details>
409
+
410
+ ### Training Logs
411
+ <details><summary>Click to expand</summary>
412
+
413
+ | Epoch | Step | Training Loss |
414
+ |:------:|:----:|:-------------:|
415
+ | 0.0025 | 1 | 4.9684 |
416
+ | 0.0051 | 2 | 4.6956 |
417
+ | 0.0076 | 3 | 4.5076 |
418
+ | 0.0102 | 4 | 4.3723 |
419
+ | 0.0127 | 5 | 4.3305 |
420
+ | 0.0153 | 6 | 4.0355 |
421
+ | 0.0178 | 7 | 3.7886 |
422
+ | 0.0204 | 8 | 3.6133 |
423
+ | 0.0229 | 9 | 3.2395 |
424
+ | 0.0254 | 10 | 3.1481 |
425
+ | 0.0280 | 11 | 2.7444 |
426
+ | 0.0305 | 12 | 2.4946 |
427
+ | 0.0331 | 13 | 2.333 |
428
+ | 0.0356 | 14 | 2.2471 |
429
+ | 0.0382 | 15 | 1.9117 |
430
+ | 0.0407 | 16 | 1.6753 |
431
+ | 0.0433 | 17 | 1.2413 |
432
+ | 0.0458 | 18 | 1.1201 |
433
+ | 0.0483 | 19 | 1.0335 |
434
+ | 0.0509 | 20 | 1.0583 |
435
+ | 0.0534 | 21 | 1.067 |
436
+ | 0.0560 | 22 | 0.7056 |
437
+ | 0.0585 | 23 | 0.761 |
438
+ | 0.0611 | 24 | 0.5501 |
439
+ | 0.0636 | 25 | 0.6486 |
440
+ | 0.0662 | 26 | 0.4639 |
441
+ | 0.0687 | 27 | 0.3885 |
442
+ | 0.0712 | 28 | 0.4982 |
443
+ | 0.0738 | 29 | 0.4784 |
444
+ | 0.0763 | 30 | 0.5189 |
445
+ | 0.0789 | 31 | 0.4824 |
446
+ | 0.0814 | 32 | 0.4183 |
447
+ | 0.0840 | 33 | 0.4945 |
448
+ | 0.0865 | 34 | 0.2579 |
449
+ | 0.0891 | 35 | 0.3312 |
450
+ | 0.0916 | 36 | 0.4035 |
451
+ | 0.0941 | 37 | 0.305 |
452
+ | 0.0967 | 38 | 0.2898 |
453
+ | 0.0992 | 39 | 0.2899 |
454
+ | 0.1018 | 40 | 0.2713 |
455
+ | 0.1043 | 41 | 0.3017 |
456
+ | 0.1069 | 42 | 0.2395 |
457
+ | 0.1094 | 43 | 0.1548 |
458
+ | 0.1120 | 44 | 0.2468 |
459
+ | 0.1145 | 45 | 0.1876 |
460
+ | 0.1170 | 46 | 0.2322 |
461
+ | 0.1196 | 47 | 0.2823 |
462
+ | 0.1221 | 48 | 0.2158 |
463
+ | 0.1247 | 49 | 0.2679 |
464
+ | 0.1272 | 50 | 0.273 |
465
+ | 0.1298 | 51 | 0.2876 |
466
+ | 0.1323 | 52 | 0.197 |
467
+ | 0.1349 | 53 | 0.1282 |
468
+ | 0.1374 | 54 | 0.3355 |
469
+ | 0.1399 | 55 | 0.1941 |
470
+ | 0.1425 | 56 | 0.1873 |
471
+ | 0.1450 | 57 | 0.2288 |
472
+ | 0.1476 | 58 | 0.2802 |
473
+ | 0.1501 | 59 | 0.2087 |
474
+ | 0.1527 | 60 | 0.2239 |
475
+ | 0.1552 | 61 | 0.225 |
476
+ | 0.1578 | 62 | 0.1582 |
477
+ | 0.1603 | 63 | 0.1972 |
478
+ | 0.1628 | 64 | 0.1632 |
479
+ | 0.1654 | 65 | 0.2101 |
480
+ | 0.1679 | 66 | 0.2084 |
481
+ | 0.1705 | 67 | 0.1499 |
482
+ | 0.1730 | 68 | 0.1467 |
483
+ | 0.1756 | 69 | 0.1428 |
484
+ | 0.1781 | 70 | 0.2298 |
485
+ | 0.1807 | 71 | 0.1883 |
486
+ | 0.1832 | 72 | 0.22 |
487
+ | 0.1858 | 73 | 0.1988 |
488
+ | 0.1883 | 74 | 0.2091 |
489
+ | 0.1908 | 75 | 0.1948 |
490
+ | 0.1934 | 76 | 0.1348 |
491
+ | 0.1959 | 77 | 0.112 |
492
+ | 0.1985 | 78 | 0.1474 |
493
+ | 0.2010 | 79 | 0.1949 |
494
+ | 0.2036 | 80 | 0.1664 |
495
+ | 0.2061 | 81 | 0.1807 |
496
+ | 0.2087 | 82 | 0.1403 |
497
+ | 0.2112 | 83 | 0.1225 |
498
+ | 0.2137 | 84 | 0.1919 |
499
+ | 0.2163 | 85 | 0.1403 |
500
+ | 0.2188 | 86 | 0.1402 |
501
+ | 0.2214 | 87 | 0.0981 |
502
+ | 0.2239 | 88 | 0.1214 |
503
+ | 0.2265 | 89 | 0.1755 |
504
+ | 0.2290 | 90 | 0.1509 |
505
+ | 0.2316 | 91 | 0.1551 |
506
+ | 0.2341 | 92 | 0.176 |
507
+ | 0.2366 | 93 | 0.1648 |
508
+ | 0.2392 | 94 | 0.1622 |
509
+ | 0.2417 | 95 | 0.1372 |
510
+ | 0.2443 | 96 | 0.1016 |
511
+ | 0.2468 | 97 | 0.1134 |
512
+ | 0.2494 | 98 | 0.1436 |
513
+ | 0.2519 | 99 | 0.1478 |
514
+ | 0.2545 | 100 | 0.2065 |
515
+ | 0.2570 | 101 | 0.1901 |
516
+ | 0.2595 | 102 | 0.1859 |
517
+ | 0.2621 | 103 | 0.212 |
518
+ | 0.2646 | 104 | 0.2179 |
519
+ | 0.2672 | 105 | 0.2471 |
520
+ | 0.2697 | 106 | 0.1769 |
521
+ | 0.2723 | 107 | 0.1593 |
522
+ | 0.2748 | 108 | 0.204 |
523
+ | 0.2774 | 109 | 0.1496 |
524
+ | 0.2799 | 110 | 0.1212 |
525
+ | 0.2824 | 111 | 0.1282 |
526
+ | 0.2850 | 112 | 0.1126 |
527
+ | 0.2875 | 113 | 0.1254 |
528
+ | 0.2901 | 114 | 0.1422 |
529
+ | 0.2926 | 115 | 0.1266 |
530
+ | 0.2952 | 116 | 0.1305 |
531
+ | 0.2977 | 117 | 0.1283 |
532
+ | 0.3003 | 118 | 0.0737 |
533
+ | 0.3028 | 119 | 0.1237 |
534
+ | 0.3053 | 120 | 0.1185 |
535
+ | 0.3079 | 121 | 0.0891 |
536
+ | 0.3104 | 122 | 0.2312 |
537
+ | 0.3130 | 123 | 0.2384 |
538
+ | 0.3155 | 124 | 0.155 |
539
+ | 0.3181 | 125 | 0.1118 |
540
+ | 0.3206 | 126 | 0.1575 |
541
+ | 0.3232 | 127 | 0.2115 |
542
+ | 0.3257 | 128 | 0.098 |
543
+ | 0.3282 | 129 | 0.1811 |
544
+ | 0.3308 | 130 | 0.1704 |
545
+ | 0.3333 | 131 | 0.1494 |
546
+ | 0.3359 | 132 | 0.1531 |
547
+ | 0.3384 | 133 | 0.1032 |
548
+ | 0.3410 | 134 | 0.1137 |
549
+ | 0.3435 | 135 | 0.1271 |
550
+ | 0.3461 | 136 | 0.1591 |
551
+ | 0.3486 | 137 | 0.1586 |
552
+ | 0.3511 | 138 | 0.1292 |
553
+ | 0.3537 | 139 | 0.1115 |
554
+ | 0.3562 | 140 | 0.1337 |
555
+ | 0.3588 | 141 | 0.1298 |
556
+ | 0.3613 | 142 | 0.1649 |
557
+ | 0.3639 | 143 | 0.0855 |
558
+ | 0.3664 | 144 | 0.1124 |
559
+ | 0.3690 | 145 | 0.0764 |
560
+ | 0.3715 | 146 | 0.1402 |
561
+ | 0.3740 | 147 | 0.137 |
562
+ | 0.3766 | 148 | 0.0736 |
563
+ | 0.3791 | 149 | 0.0772 |
564
+ | 0.3817 | 150 | 0.1689 |
565
+ | 0.3842 | 151 | 0.1371 |
566
+ | 0.3868 | 152 | 0.1195 |
567
+ | 0.3893 | 153 | 0.1536 |
568
+ | 0.3919 | 154 | 0.1421 |
569
+ | 0.3944 | 155 | 0.1222 |
570
+ | 0.3969 | 156 | 0.1121 |
571
+ | 0.3995 | 157 | 0.0892 |
572
+ | 0.4020 | 158 | 0.1516 |
573
+ | 0.4046 | 159 | 0.1071 |
574
+ | 0.4071 | 160 | 0.1593 |
575
+ | 0.4097 | 161 | 0.1078 |
576
+ | 0.4122 | 162 | 0.1112 |
577
+ | 0.4148 | 163 | 0.2101 |
578
+ | 0.4173 | 164 | 0.2096 |
579
+ | 0.4198 | 165 | 0.1337 |
580
+ | 0.4224 | 166 | 0.1501 |
581
+ | 0.4249 | 167 | 0.0989 |
582
+ | 0.4275 | 168 | 0.0992 |
583
+ | 0.4300 | 169 | 0.0926 |
584
+ | 0.4326 | 170 | 0.0692 |
585
+ | 0.4351 | 171 | 0.1235 |
586
+ | 0.4377 | 172 | 0.1029 |
587
+ | 0.4402 | 173 | 0.1351 |
588
+ | 0.4427 | 174 | 0.0899 |
589
+ | 0.4453 | 175 | 0.0844 |
590
+ | 0.4478 | 176 | 0.1167 |
591
+ | 0.4504 | 177 | 0.1355 |
592
+ | 0.4529 | 178 | 0.092 |
593
+ | 0.4555 | 179 | 0.1005 |
594
+ | 0.4580 | 180 | 0.0891 |
595
+ | 0.4606 | 181 | 0.1396 |
596
+ | 0.4631 | 182 | 0.1024 |
597
+ | 0.4656 | 183 | 0.1325 |
598
+ | 0.4682 | 184 | 0.1061 |
599
+ | 0.4707 | 185 | 0.1657 |
600
+ | 0.4733 | 186 | 0.1141 |
601
+ | 0.4758 | 187 | 0.149 |
602
+ | 0.4784 | 188 | 0.1125 |
603
+ | 0.4809 | 189 | 0.1524 |
604
+ | 0.4835 | 190 | 0.1129 |
605
+ | 0.4860 | 191 | 0.1089 |
606
+ | 0.4885 | 192 | 0.1333 |
607
+ | 0.4911 | 193 | 0.1377 |
608
+ | 0.4936 | 194 | 0.0547 |
609
+ | 0.4962 | 195 | 0.1057 |
610
+ | 0.4987 | 196 | 0.1321 |
611
+ | 0.5013 | 197 | 0.0979 |
612
+ | 0.5038 | 198 | 0.1706 |
613
+ | 0.5064 | 199 | 0.1559 |
614
+ | 0.5089 | 200 | 0.1111 |
615
+ | 0.5115 | 201 | 0.1258 |
616
+ | 0.5140 | 202 | 0.0816 |
617
+ | 0.5165 | 203 | 0.1362 |
618
+ | 0.5191 | 204 | 0.1604 |
619
+ | 0.5216 | 205 | 0.1104 |
620
+ | 0.5242 | 206 | 0.1494 |
621
+ | 0.5267 | 207 | 0.1402 |
622
+ | 0.5293 | 208 | 0.1282 |
623
+ | 0.5318 | 209 | 0.1543 |
624
+ | 0.5344 | 210 | 0.1576 |
625
+ | 0.5369 | 211 | 0.2071 |
626
+ | 0.5394 | 212 | 0.1248 |
627
+ | 0.5420 | 213 | 0.1237 |
628
+ | 0.5445 | 214 | 0.0592 |
629
+ | 0.5471 | 215 | 0.1769 |
630
+ | 0.5496 | 216 | 0.1118 |
631
+ | 0.5522 | 217 | 0.1608 |
632
+ | 0.5547 | 218 | 0.1192 |
633
+ | 0.5573 | 219 | 0.0551 |
634
+ | 0.5598 | 220 | 0.1401 |
635
+ | 0.5623 | 221 | 0.2046 |
636
+ | 0.5649 | 222 | 0.1273 |
637
+ | 0.5674 | 223 | 0.1319 |
638
+ | 0.5700 | 224 | 0.1518 |
639
+ | 0.5725 | 225 | 0.0929 |
640
+ | 0.5751 | 226 | 0.1262 |
641
+ | 0.5776 | 227 | 0.1566 |
642
+ | 0.5802 | 228 | 0.1128 |
643
+ | 0.5827 | 229 | 0.1467 |
644
+ | 0.5852 | 230 | 0.1513 |
645
+ | 0.5878 | 231 | 0.1989 |
646
+ | 0.5903 | 232 | 0.0594 |
647
+ | 0.5929 | 233 | 0.0838 |
648
+ | 0.5954 | 234 | 0.0711 |
649
+ | 0.5980 | 235 | 0.0854 |
650
+ | 0.6005 | 236 | 0.1775 |
651
+ | 0.6031 | 237 | 0.118 |
652
+ | 0.6056 | 238 | 0.1297 |
653
+ | 0.6081 | 239 | 0.1092 |
654
+ | 0.6107 | 240 | 0.1469 |
655
+ | 0.6132 | 241 | 0.1203 |
656
+ | 0.6158 | 242 | 0.0901 |
657
+ | 0.6183 | 243 | 0.1179 |
658
+ | 0.6209 | 244 | 0.0864 |
659
+ | 0.6234 | 245 | 0.1277 |
660
+ | 0.6260 | 246 | 0.1313 |
661
+ | 0.6285 | 247 | 0.089 |
662
+ | 0.6310 | 248 | 0.0727 |
663
+ | 0.6336 | 249 | 0.0556 |
664
+ | 0.6361 | 250 | 0.0782 |
665
+ | 0.6387 | 251 | 0.0869 |
666
+ | 0.6412 | 252 | 0.0988 |
667
+ | 0.6438 | 253 | 0.0818 |
668
+ | 0.6463 | 254 | 0.1013 |
669
+ | 0.6489 | 255 | 0.096 |
670
+ | 0.6514 | 256 | 0.0622 |
671
+ | 0.6539 | 257 | 0.1561 |
672
+ | 0.6565 | 258 | 0.1282 |
673
+ | 0.6590 | 259 | 0.1087 |
674
+ | 0.6616 | 260 | 0.1312 |
675
+ | 0.6641 | 261 | 0.1343 |
676
+ | 0.6667 | 262 | 0.0955 |
677
+ | 0.6692 | 263 | 0.0844 |
678
+ | 0.6718 | 264 | 0.1209 |
679
+ | 0.6743 | 265 | 0.0858 |
680
+ | 0.6768 | 266 | 0.0714 |
681
+ | 0.6794 | 267 | 0.1431 |
682
+ | 0.6819 | 268 | 0.0632 |
683
+ | 0.6845 | 269 | 0.115 |
684
+ | 0.6870 | 270 | 0.1115 |
685
+ | 0.6896 | 271 | 0.1239 |
686
+ | 0.6921 | 272 | 0.1206 |
687
+ | 0.6947 | 273 | 0.1894 |
688
+ | 0.6972 | 274 | 0.0755 |
689
+ | 0.6997 | 275 | 0.0709 |
690
+ | 0.7023 | 276 | 0.1304 |
691
+ | 0.7048 | 277 | 0.1476 |
692
+ | 0.7074 | 278 | 0.1497 |
693
+ | 0.7099 | 279 | 0.113 |
694
+ | 0.7125 | 280 | 0.1676 |
695
+ | 0.7150 | 281 | 0.0999 |
696
+ | 0.7176 | 282 | 0.2044 |
697
+ | 0.7201 | 283 | 0.1125 |
698
+ | 0.7226 | 284 | 0.0956 |
699
+ | 0.7252 | 285 | 0.0956 |
700
+ | 0.7277 | 286 | 0.0771 |
701
+ | 0.7303 | 287 | 0.0712 |
702
+ | 0.7328 | 288 | 0.0525 |
703
+ | 0.7354 | 289 | 0.0689 |
704
+ | 0.7379 | 290 | 0.0964 |
705
+ | 0.7405 | 291 | 0.1068 |
706
+ | 0.7430 | 292 | 0.0536 |
707
+ | 0.7455 | 293 | 0.0861 |
708
+ | 0.7481 | 294 | 0.0813 |
709
+ | 0.7506 | 295 | 0.0885 |
710
+ | 0.7532 | 296 | 0.1083 |
711
+ | 0.7557 | 297 | 0.1124 |
712
+ | 0.7583 | 298 | 0.1095 |
713
+ | 0.7608 | 299 | 0.08 |
714
+ | 0.7634 | 300 | 0.1081 |
715
+ | 0.7659 | 301 | 0.0719 |
716
+ | 0.7684 | 302 | 0.0933 |
717
+ | 0.7710 | 303 | 0.1143 |
718
+ | 0.7735 | 304 | 0.065 |
719
+ | 0.7761 | 305 | 0.1276 |
720
+ | 0.7786 | 306 | 0.102 |
721
+ | 0.7812 | 307 | 0.186 |
722
+ | 0.7837 | 308 | 0.0778 |
723
+ | 0.7863 | 309 | 0.1419 |
724
+ | 0.7888 | 310 | 0.0895 |
725
+ | 0.7913 | 311 | 0.1154 |
726
+ | 0.7939 | 312 | 0.1037 |
727
+ | 0.7964 | 313 | 0.0711 |
728
+ | 0.7990 | 314 | 0.1559 |
729
+ | 0.8015 | 315 | 0.0755 |
730
+ | 0.8041 | 316 | 0.0799 |
731
+ | 0.8066 | 317 | 0.1137 |
732
+ | 0.8092 | 318 | 0.0837 |
733
+ | 0.8117 | 319 | 0.1052 |
734
+ | 0.8142 | 320 | 0.0846 |
735
+ | 0.8168 | 321 | 0.0715 |
736
+ | 0.8193 | 322 | 0.0923 |
737
+ | 0.8219 | 323 | 0.1397 |
738
+ | 0.8244 | 324 | 0.0899 |
739
+ | 0.8270 | 325 | 0.1414 |
740
+ | 0.8295 | 326 | 0.0422 |
741
+ | 0.8321 | 327 | 0.0748 |
742
+ | 0.8346 | 328 | 0.0739 |
743
+ | 0.8372 | 329 | 0.0855 |
744
+ | 0.8397 | 330 | 0.071 |
745
+ | 0.8422 | 331 | 0.0557 |
746
+ | 0.8448 | 332 | 0.1055 |
747
+ | 0.8473 | 333 | 0.096 |
748
+ | 0.8499 | 334 | 0.1083 |
749
+ | 0.8524 | 335 | 0.133 |
750
+ | 0.8550 | 336 | 0.1308 |
751
+ | 0.8575 | 337 | 0.0661 |
752
+ | 0.8601 | 338 | 0.0974 |
753
+ | 0.8626 | 339 | 0.1027 |
754
+ | 0.8651 | 340 | 0.1068 |
755
+ | 0.8677 | 341 | 0.1653 |
756
+ | 0.8702 | 342 | 0.097 |
757
+ | 0.8728 | 343 | 0.0845 |
758
+ | 0.8753 | 344 | 0.0546 |
759
+ | 0.8779 | 345 | 0.1273 |
760
+ | 0.8804 | 346 | 0.0982 |
761
+ | 0.8830 | 347 | 0.0893 |
762
+ | 0.8855 | 348 | 0.1222 |
763
+ | 0.8880 | 349 | 0.1072 |
764
+ | 0.8906 | 350 | 0.1254 |
765
+ | 0.8931 | 351 | 0.0679 |
766
+ | 0.8957 | 352 | 0.0995 |
767
+ | 0.8982 | 353 | 0.0878 |
768
+ | 0.9008 | 354 | 0.0564 |
769
+ | 0.9033 | 355 | 0.113 |
770
+ | 0.9059 | 356 | 0.0567 |
771
+ | 0.9084 | 357 | 0.0968 |
772
+ | 0.9109 | 358 | 0.1023 |
773
+ | 0.9135 | 359 | 0.1106 |
774
+ | 0.9160 | 360 | 0.091 |
775
+ | 0.9186 | 361 | 0.0988 |
776
+ | 0.9211 | 362 | 0.1374 |
777
+ | 0.9237 | 363 | 0.0855 |
778
+ | 0.9262 | 364 | 0.0824 |
779
+ | 0.9288 | 365 | 0.058 |
780
+ | 0.9313 | 366 | 0.0776 |
781
+ | 0.9338 | 367 | 0.1195 |
782
+ | 0.9364 | 368 | 0.0506 |
783
+ | 0.9389 | 369 | 0.0893 |
784
+ | 0.9415 | 370 | 0.1145 |
785
+ | 0.9440 | 371 | 0.0695 |
786
+ | 0.9466 | 372 | 0.0805 |
787
+ | 0.9491 | 373 | 0.0824 |
788
+ | 0.9517 | 374 | 0.0841 |
789
+ | 0.9542 | 375 | 0.0919 |
790
+ | 0.9567 | 376 | 0.064 |
791
+ | 0.9593 | 377 | 0.2194 |
792
+ | 0.9618 | 378 | 0.1165 |
793
+ | 0.9644 | 379 | 0.0888 |
794
+ | 0.9669 | 380 | 0.0826 |
795
+ | 0.9695 | 381 | 0.0687 |
796
+ | 0.9720 | 382 | 0.0933 |
797
+ | 0.9746 | 383 | 0.1337 |
798
+ | 0.9771 | 384 | 0.0738 |
799
+ | 0.9796 | 385 | 0.0749 |
800
+ | 0.9822 | 386 | 0.0742 |
801
+ | 0.9847 | 387 | 0.1111 |
802
+ | 0.9873 | 388 | 0.093 |
803
+ | 0.9898 | 389 | 0.0877 |
804
+ | 0.9924 | 390 | 0.0637 |
805
+ | 0.9949 | 391 | 0.0897 |
806
+ | 0.9975 | 392 | 0.0818 |
807
+ | 1.0 | 393 | 0.0362 |
808
+ | 1.0025 | 394 | 0.0561 |
809
+ | 1.0051 | 395 | 0.0847 |
810
+ | 1.0076 | 396 | 0.0752 |
811
+ | 1.0102 | 397 | 0.0951 |
812
+ | 1.0127 | 398 | 0.1069 |
813
+ | 1.0153 | 399 | 0.0553 |
814
+ | 1.0178 | 400 | 0.0929 |
815
+ | 1.0204 | 401 | 0.0876 |
816
+ | 1.0229 | 402 | 0.0381 |
817
+ | 1.0254 | 403 | 0.1074 |
818
+ | 1.0280 | 404 | 0.0763 |
819
+ | 1.0305 | 405 | 0.0881 |
820
+ | 1.0331 | 406 | 0.0481 |
821
+ | 1.0356 | 407 | 0.1398 |
822
+ | 1.0382 | 408 | 0.09 |
823
+ | 1.0407 | 409 | 0.1045 |
824
+ | 1.0433 | 410 | 0.088 |
825
+ | 1.0458 | 411 | 0.0751 |
826
+ | 1.0483 | 412 | 0.0781 |
827
+ | 1.0509 | 413 | 0.0844 |
828
+ | 1.0534 | 414 | 0.0949 |
829
+ | 1.0560 | 415 | 0.0467 |
830
+ | 1.0585 | 416 | 0.1159 |
831
+ | 1.0611 | 417 | 0.0511 |
832
+ | 1.0636 | 418 | 0.0659 |
833
+ | 1.0662 | 419 | 0.043 |
834
+ | 1.0687 | 420 | 0.0468 |
835
+ | 1.0712 | 421 | 0.068 |
836
+ | 1.0738 | 422 | 0.1022 |
837
+ | 1.0763 | 423 | 0.1096 |
838
+ | 1.0789 | 424 | 0.1113 |
839
+ | 1.0814 | 425 | 0.1219 |
840
+ | 1.0840 | 426 | 0.0852 |
841
+ | 1.0865 | 427 | 0.0413 |
842
+ | 1.0891 | 428 | 0.0797 |
843
+ | 1.0916 | 429 | 0.1048 |
844
+ | 1.0941 | 430 | 0.0494 |
845
+ | 1.0967 | 431 | 0.079 |
846
+ | 1.0992 | 432 | 0.0698 |
847
+ | 1.1018 | 433 | 0.0908 |
848
+ | 1.1043 | 434 | 0.0993 |
849
+ | 1.1069 | 435 | 0.0397 |
850
+ | 1.1094 | 436 | 0.0312 |
851
+ | 1.1120 | 437 | 0.089 |
852
+ | 1.1145 | 438 | 0.0318 |
853
+ | 1.1170 | 439 | 0.0356 |
854
+ | 1.1196 | 440 | 0.0588 |
855
+ | 1.1221 | 441 | 0.0311 |
856
+ | 1.1247 | 442 | 0.0578 |
857
+ | 1.1272 | 443 | 0.1313 |
858
+ | 1.1298 | 444 | 0.0897 |
859
+ | 1.1323 | 445 | 0.0798 |
860
+ | 1.1349 | 446 | 0.0326 |
861
+ | 1.1374 | 447 | 0.143 |
862
+ | 1.1399 | 448 | 0.0661 |
863
+ | 1.1425 | 449 | 0.0433 |
864
+ | 1.1450 | 450 | 0.0782 |
865
+ | 1.1476 | 451 | 0.08 |
866
+ | 1.1501 | 452 | 0.0505 |
867
+ | 1.1527 | 453 | 0.0542 |
868
+ | 1.1552 | 454 | 0.0755 |
869
+ | 1.1578 | 455 | 0.0315 |
870
+ | 1.1603 | 456 | 0.0667 |
871
+ | 1.1628 | 457 | 0.0329 |
872
+ | 1.1654 | 458 | 0.0791 |
873
+ | 1.1679 | 459 | 0.0698 |
874
+ | 1.1705 | 460 | 0.0194 |
875
+ | 1.1730 | 461 | 0.0501 |
876
+ | 1.1756 | 462 | 0.0449 |
877
+ | 1.1781 | 463 | 0.0903 |
878
+ | 1.1807 | 464 | 0.0503 |
879
+ | 1.1832 | 465 | 0.0664 |
880
+ | 1.1858 | 466 | 0.0457 |
881
+ | 1.1883 | 467 | 0.0568 |
882
+ | 1.1908 | 468 | 0.064 |
883
+ | 1.1934 | 469 | 0.0253 |
884
+ | 1.1959 | 470 | 0.046 |
885
+ | 1.1985 | 471 | 0.0279 |
886
+ | 1.2010 | 472 | 0.0733 |
887
+ | 1.2036 | 473 | 0.0463 |
888
+ | 1.2061 | 474 | 0.07 |
889
+ | 1.2087 | 475 | 0.0281 |
890
+ | 1.2112 | 476 | 0.0373 |
891
+ | 1.2137 | 477 | 0.0738 |
892
+ | 1.2163 | 478 | 0.0412 |
893
+ | 1.2188 | 479 | 0.0545 |
894
+ | 1.2214 | 480 | 0.0247 |
895
+ | 1.2239 | 481 | 0.0293 |
896
+ | 1.2265 | 482 | 0.0845 |
897
+ | 1.2290 | 483 | 0.055 |
898
+ | 1.2316 | 484 | 0.072 |
899
+ | 1.2341 | 485 | 0.0481 |
900
+ | 1.2366 | 486 | 0.0443 |
901
+ | 1.2392 | 487 | 0.0807 |
902
+ | 1.2417 | 488 | 0.0421 |
903
+ | 1.2443 | 489 | 0.0237 |
904
+ | 1.2468 | 490 | 0.0189 |
905
+ | 1.2494 | 491 | 0.0604 |
906
+ | 1.2519 | 492 | 0.0428 |
907
+ | 1.2545 | 493 | 0.061 |
908
+ | 1.2570 | 494 | 0.0723 |
909
+ | 1.2595 | 495 | 0.0539 |
910
+ | 1.2621 | 496 | 0.0747 |
911
+ | 1.2646 | 497 | 0.0917 |
912
+ | 1.2672 | 498 | 0.1161 |
913
+ | 1.2697 | 499 | 0.087 |
914
+ | 1.2723 | 500 | 0.0616 |
915
+ | 1.2748 | 501 | 0.0756 |
916
+ | 1.2774 | 502 | 0.0674 |
917
+ | 1.2799 | 503 | 0.04 |
918
+ | 1.2824 | 504 | 0.0354 |
919
+ | 1.2850 | 505 | 0.0403 |
920
+ | 1.2875 | 506 | 0.0596 |
921
+ | 1.2901 | 507 | 0.0359 |
922
+ | 1.2926 | 508 | 0.0648 |
923
+ | 1.2952 | 509 | 0.0424 |
924
+ | 1.2977 | 510 | 0.0605 |
925
+ | 1.3003 | 511 | 0.0136 |
926
+ | 1.3028 | 512 | 0.0547 |
927
+ | 1.3053 | 513 | 0.0385 |
928
+ | 1.3079 | 514 | 0.0191 |
929
+ | 1.3104 | 515 | 0.1222 |
930
+ | 1.3130 | 516 | 0.0906 |
931
+ | 1.3155 | 517 | 0.0603 |
932
+ | 1.3181 | 518 | 0.0366 |
933
+ | 1.3206 | 519 | 0.0416 |
934
+ | 1.3232 | 520 | 0.0832 |
935
+ | 1.3257 | 521 | 0.0355 |
936
+ | 1.3282 | 522 | 0.0614 |
937
+ | 1.3308 | 523 | 0.0539 |
938
+ | 1.3333 | 524 | 0.0566 |
939
+ | 1.3359 | 525 | 0.0727 |
940
+ | 1.3384 | 526 | 0.0311 |
941
+ | 1.3410 | 527 | 0.0254 |
942
+ | 1.3435 | 528 | 0.0376 |
943
+ | 1.3461 | 529 | 0.0652 |
944
+ | 1.3486 | 530 | 0.0717 |
945
+ | 1.3511 | 531 | 0.0521 |
946
+ | 1.3537 | 532 | 0.0404 |
947
+ | 1.3562 | 533 | 0.041 |
948
+ | 1.3588 | 534 | 0.0435 |
949
+ | 1.3613 | 535 | 0.0842 |
950
+ | 1.3639 | 536 | 0.0203 |
951
+ | 1.3664 | 537 | 0.072 |
952
+ | 1.3690 | 538 | 0.0277 |
953
+ | 1.3715 | 539 | 0.0575 |
954
+ | 1.3740 | 540 | 0.0665 |
955
+ | 1.3766 | 541 | 0.024 |
956
+ | 1.3791 | 542 | 0.0202 |
957
+ | 1.3817 | 543 | 0.052 |
958
+ | 1.3842 | 544 | 0.0532 |
959
+ | 1.3868 | 545 | 0.0623 |
960
+ | 1.3893 | 546 | 0.0643 |
961
+ | 1.3919 | 547 | 0.0694 |
962
+ | 1.3944 | 548 | 0.0582 |
963
+ | 1.3969 | 549 | 0.0411 |
964
+ | 1.3995 | 550 | 0.0245 |
965
+ | 1.4020 | 551 | 0.0714 |
966
+ | 1.4046 | 552 | 0.0489 |
967
+ | 1.4071 | 553 | 0.0696 |
968
+ | 1.4097 | 554 | 0.0316 |
969
+ | 1.4122 | 555 | 0.0554 |
970
+ | 1.4148 | 556 | 0.097 |
971
+ | 1.4173 | 557 | 0.0665 |
972
+ | 1.4198 | 558 | 0.0578 |
973
+ | 1.4224 | 559 | 0.0746 |
974
+ | 1.4249 | 560 | 0.0347 |
975
+ | 1.4275 | 561 | 0.0471 |
976
+ | 1.4300 | 562 | 0.0237 |
977
+ | 1.4326 | 563 | 0.0269 |
978
+ | 1.4351 | 564 | 0.068 |
979
+ | 1.4377 | 565 | 0.0362 |
980
+ | 1.4402 | 566 | 0.059 |
981
+ | 1.4427 | 567 | 0.0321 |
982
+ | 1.4453 | 568 | 0.0469 |
983
+ | 1.4478 | 569 | 0.0445 |
984
+ | 1.4504 | 570 | 0.0804 |
985
+ | 1.4529 | 571 | 0.0387 |
986
+ | 1.4555 | 572 | 0.0358 |
987
+ | 1.4580 | 573 | 0.0322 |
988
+ | 1.4606 | 574 | 0.0673 |
989
+ | 1.4631 | 575 | 0.0302 |
990
+ | 1.4656 | 576 | 0.0612 |
991
+ | 1.4682 | 577 | 0.0553 |
992
+ | 1.4707 | 578 | 0.0998 |
993
+ | 1.4733 | 579 | 0.0396 |
994
+ | 1.4758 | 580 | 0.0764 |
995
+ | 1.4784 | 581 | 0.0427 |
996
+ | 1.4809 | 582 | 0.0785 |
997
+ | 1.4835 | 583 | 0.0419 |
998
+ | 1.4860 | 584 | 0.0584 |
999
+ | 1.4885 | 585 | 0.0437 |
1000
+ | 1.4911 | 586 | 0.0561 |
1001
+ | 1.4936 | 587 | 0.0131 |
1002
+ | 1.4962 | 588 | 0.0472 |
1003
+ | 1.4987 | 589 | 0.0479 |
1004
+ | 1.5013 | 590 | 0.0477 |
1005
+ | 1.5038 | 591 | 0.0745 |
1006
+ | 1.5064 | 592 | 0.0918 |
1007
+ | 1.5089 | 593 | 0.041 |
1008
+ | 1.5115 | 594 | 0.0463 |
1009
+ | 1.5140 | 595 | 0.0227 |
1010
+ | 1.5165 | 596 | 0.0427 |
1011
+ | 1.5191 | 597 | 0.0754 |
1012
+ | 1.5216 | 598 | 0.0489 |
1013
+ | 1.5242 | 599 | 0.0765 |
1014
+ | 1.5267 | 600 | 0.0651 |
1015
+ | 1.5293 | 601 | 0.0544 |
1016
+ | 1.5318 | 602 | 0.0777 |
1017
+ | 1.5344 | 603 | 0.0638 |
1018
+ | 1.5369 | 604 | 0.1198 |
1019
+ | 1.5394 | 605 | 0.0882 |
1020
+ | 1.5420 | 606 | 0.0236 |
1021
+ | 1.5445 | 607 | 0.0202 |
1022
+ | 1.5471 | 608 | 0.0955 |
1023
+ | 1.5496 | 609 | 0.0366 |
1024
+ | 1.5522 | 610 | 0.1021 |
1025
+ | 1.5547 | 611 | 0.0669 |
1026
+ | 1.5573 | 612 | 0.0185 |
1027
+ | 1.5598 | 613 | 0.0575 |
1028
+ | 1.5623 | 614 | 0.1001 |
1029
+ | 1.5649 | 615 | 0.0664 |
1030
+ | 1.5674 | 616 | 0.0617 |
1031
+ | 1.5700 | 617 | 0.0661 |
1032
+ | 1.5725 | 618 | 0.0425 |
1033
+ | 1.5751 | 619 | 0.0445 |
1034
+ | 1.5776 | 620 | 0.0773 |
1035
+ | 1.5802 | 621 | 0.0504 |
1036
+ | 1.5827 | 622 | 0.0785 |
1037
+ | 1.5852 | 623 | 0.0802 |
1038
+ | 1.5878 | 624 | 0.0882 |
1039
+ | 1.5903 | 625 | 0.0125 |
1040
+ | 1.5929 | 626 | 0.0305 |
1041
+ | 1.5954 | 627 | 0.0275 |
1042
+ | 1.5980 | 628 | 0.0245 |
1043
+ | 1.6005 | 629 | 0.0897 |
1044
+ | 1.6031 | 630 | 0.0444 |
1045
+ | 1.6056 | 631 | 0.0589 |
1046
+ | 1.6081 | 632 | 0.0337 |
1047
+ | 1.6107 | 633 | 0.0889 |
1048
+ | 1.6132 | 634 | 0.0556 |
1049
+ | 1.6158 | 635 | 0.0426 |
1050
+ | 1.6183 | 636 | 0.046 |
1051
+ | 1.6209 | 637 | 0.0342 |
1052
+ | 1.6234 | 638 | 0.0573 |
1053
+ | 1.6260 | 639 | 0.0569 |
1054
+ | 1.6285 | 640 | 0.0248 |
1055
+ | 1.6310 | 641 | 0.0214 |
1056
+ | 1.6336 | 642 | 0.0147 |
1057
+ | 1.6361 | 643 | 0.0203 |
1058
+ | 1.6387 | 644 | 0.0366 |
1059
+ | 1.6412 | 645 | 0.0484 |
1060
+ | 1.6438 | 646 | 0.0301 |
1061
+ | 1.6463 | 647 | 0.0314 |
1062
+ | 1.6489 | 648 | 0.0369 |
1063
+ | 1.6514 | 649 | 0.0168 |
1064
+ | 1.6539 | 650 | 0.0645 |
1065
+ | 1.6565 | 651 | 0.0755 |
1066
+ | 1.6590 | 652 | 0.0448 |
1067
+ | 1.6616 | 653 | 0.0795 |
1068
+ | 1.6641 | 654 | 0.0673 |
1069
+ | 1.6667 | 655 | 0.0431 |
1070
+ | 1.6692 | 656 | 0.0265 |
1071
+ | 1.6718 | 657 | 0.0567 |
1072
+ | 1.6743 | 658 | 0.0235 |
1073
+ | 1.6768 | 659 | 0.034 |
1074
+ | 1.6794 | 660 | 0.0812 |
1075
+ | 1.6819 | 661 | 0.0157 |
1076
+ | 1.6845 | 662 | 0.0448 |
1077
+ | 1.6870 | 663 | 0.0488 |
1078
+ | 1.6896 | 664 | 0.0515 |
1079
+ | 1.6921 | 665 | 0.0531 |
1080
+ | 1.6947 | 666 | 0.1166 |
1081
+ | 1.6972 | 667 | 0.0264 |
1082
+ | 1.6997 | 668 | 0.0325 |
1083
+ | 1.7023 | 669 | 0.0784 |
1084
+ | 1.7048 | 670 | 0.0859 |
1085
+ | 1.7074 | 671 | 0.0981 |
1086
+ | 1.7099 | 672 | 0.0411 |
1087
+ | 1.7125 | 673 | 0.0915 |
1088
+ | 1.7150 | 674 | 0.0396 |
1089
+ | 1.7176 | 675 | 0.1381 |
1090
+ | 1.7201 | 676 | 0.0547 |
1091
+ | 1.7226 | 677 | 0.0436 |
1092
+ | 1.7252 | 678 | 0.0519 |
1093
+ | 1.7277 | 679 | 0.0305 |
1094
+ | 1.7303 | 680 | 0.0356 |
1095
+ | 1.7328 | 681 | 0.0173 |
1096
+ | 1.7354 | 682 | 0.0299 |
1097
+ | 1.7379 | 683 | 0.0424 |
1098
+ | 1.7405 | 684 | 0.038 |
1099
+ | 1.7430 | 685 | 0.0159 |
1100
+ | 1.7455 | 686 | 0.0273 |
1101
+ | 1.7481 | 687 | 0.0301 |
1102
+ | 1.7506 | 688 | 0.0315 |
1103
+ | 1.7532 | 689 | 0.0566 |
1104
+ | 1.7557 | 690 | 0.0478 |
1105
+ | 1.7583 | 691 | 0.0533 |
1106
+ | 1.7608 | 692 | 0.0248 |
1107
+ | 1.7634 | 693 | 0.0454 |
1108
+ | 1.7659 | 694 | 0.0252 |
1109
+ | 1.7684 | 695 | 0.0326 |
1110
+ | 1.7710 | 696 | 0.0501 |
1111
+ | 1.7735 | 697 | 0.0196 |
1112
+ | 1.7761 | 698 | 0.0487 |
1113
+ | 1.7786 | 699 | 0.0445 |
1114
+ | 1.7812 | 700 | 0.1264 |
1115
+ | 1.7837 | 701 | 0.0312 |
1116
+ | 1.7863 | 702 | 0.1022 |
1117
+ | 1.7888 | 703 | 0.0293 |
1118
+ | 1.7913 | 704 | 0.0671 |
1119
+ | 1.7939 | 705 | 0.051 |
1120
+ | 1.7964 | 706 | 0.0246 |
1121
+ | 1.7990 | 707 | 0.1115 |
1122
+ | 1.8015 | 708 | 0.0203 |
1123
+ | 1.8041 | 709 | 0.0359 |
1124
+ | 1.8066 | 710 | 0.0699 |
1125
+ | 1.8092 | 711 | 0.0435 |
1126
+ | 1.8117 | 712 | 0.0689 |
1127
+ | 1.8142 | 713 | 0.0359 |
1128
+ | 1.8168 | 714 | 0.0321 |
1129
+ | 1.8193 | 715 | 0.0439 |
1130
+ | 1.8219 | 716 | 0.0652 |
1131
+ | 1.8244 | 717 | 0.0494 |
1132
+ | 1.8270 | 718 | 0.0864 |
1133
+ | 1.8295 | 719 | 0.0119 |
1134
+ | 1.8321 | 720 | 0.0284 |
1135
+ | 1.8346 | 721 | 0.0344 |
1136
+ | 1.8372 | 722 | 0.0454 |
1137
+ | 1.8397 | 723 | 0.0267 |
1138
+ | 1.8422 | 724 | 0.0152 |
1139
+ | 1.8448 | 725 | 0.0512 |
1140
+ | 1.8473 | 726 | 0.0537 |
1141
+ | 1.8499 | 727 | 0.0873 |
1142
+ | 1.8524 | 728 | 0.0934 |
1143
+ | 1.8550 | 729 | 0.0583 |
1144
+ | 1.8575 | 730 | 0.0206 |
1145
+ | 1.8601 | 731 | 0.0308 |
1146
+ | 1.8626 | 732 | 0.0443 |
1147
+ | 1.8651 | 733 | 0.0435 |
1148
+ | 1.8677 | 734 | 0.1254 |
1149
+ | 1.8702 | 735 | 0.0525 |
1150
+ | 1.8728 | 736 | 0.039 |
1151
+ | 1.8753 | 737 | 0.0157 |
1152
+ | 1.8779 | 738 | 0.0621 |
1153
+ | 1.8804 | 739 | 0.0405 |
1154
+ | 1.8830 | 740 | 0.0369 |
1155
+ | 1.8855 | 741 | 0.0568 |
1156
+ | 1.8880 | 742 | 0.0451 |
1157
+ | 1.8906 | 743 | 0.0657 |
1158
+ | 1.8931 | 744 | 0.0304 |
1159
+ | 1.8957 | 745 | 0.047 |
1160
+ | 1.8982 | 746 | 0.0457 |
1161
+ | 1.9008 | 747 | 0.0239 |
1162
+ | 1.9033 | 748 | 0.0669 |
1163
+ | 1.9059 | 749 | 0.0252 |
1164
+ | 1.9084 | 750 | 0.061 |
1165
+ | 1.9109 | 751 | 0.0429 |
1166
+ | 1.9135 | 752 | 0.0611 |
1167
+ | 1.9160 | 753 | 0.0482 |
1168
+ | 1.9186 | 754 | 0.0381 |
1169
+ | 1.9211 | 755 | 0.0749 |
1170
+ | 1.9237 | 756 | 0.0481 |
1171
+ | 1.9262 | 757 | 0.0405 |
1172
+ | 1.9288 | 758 | 0.0248 |
1173
+ | 1.9313 | 759 | 0.0377 |
1174
+ | 1.9338 | 760 | 0.061 |
1175
+ | 1.9364 | 761 | 0.0203 |
1176
+ | 1.9389 | 762 | 0.0315 |
1177
+ | 1.9415 | 763 | 0.0534 |
1178
+ | 1.9440 | 764 | 0.0383 |
1179
+ | 1.9466 | 765 | 0.0431 |
1180
+ | 1.9491 | 766 | 0.0509 |
1181
+ | 1.9517 | 767 | 0.0361 |
1182
+ | 1.9542 | 768 | 0.054 |
1183
+ | 1.9567 | 769 | 0.0248 |
1184
+ | 1.9593 | 770 | 0.1599 |
1185
+ | 1.9618 | 771 | 0.0657 |
1186
+ | 1.9644 | 772 | 0.0373 |
1187
+ | 1.9669 | 773 | 0.0632 |
1188
+ | 1.9695 | 774 | 0.0385 |
1189
+ | 1.9720 | 775 | 0.0456 |
1190
+ | 1.9746 | 776 | 0.0857 |
1191
+ | 1.9771 | 777 | 0.0253 |
1192
+ | 1.9796 | 778 | 0.0378 |
1193
+ | 1.9822 | 779 | 0.0366 |
1194
+ | 1.9847 | 780 | 0.0646 |
1195
+ | 1.9873 | 781 | 0.062 |
1196
+ | 1.9898 | 782 | 0.0513 |
1197
+ | 1.9924 | 783 | 0.0291 |
1198
+ | 1.9949 | 784 | 0.0466 |
1199
+ | 1.9975 | 785 | 0.0345 |
1200
+ | 2.0 | 786 | 0.0108 |
1201
+ | 2.0025 | 787 | 0.0196 |
1202
+ | 2.0051 | 788 | 0.0402 |
1203
+ | 2.0076 | 789 | 0.034 |
1204
+ | 2.0102 | 790 | 0.0606 |
1205
+ | 2.0127 | 791 | 0.0677 |
1206
+ | 2.0153 | 792 | 0.0174 |
1207
+ | 2.0178 | 793 | 0.0548 |
1208
+ | 2.0204 | 794 | 0.0385 |
1209
+ | 2.0229 | 795 | 0.0146 |
1210
+ | 2.0254 | 796 | 0.0716 |
1211
+ | 2.0280 | 797 | 0.0304 |
1212
+ | 2.0305 | 798 | 0.0512 |
1213
+ | 2.0331 | 799 | 0.0158 |
1214
+ | 2.0356 | 800 | 0.0973 |
1215
+ | 2.0382 | 801 | 0.0394 |
1216
+ | 2.0407 | 802 | 0.0724 |
1217
+ | 2.0433 | 803 | 0.0518 |
1218
+ | 2.0458 | 804 | 0.0385 |
1219
+ | 2.0483 | 805 | 0.0464 |
1220
+ | 2.0509 | 806 | 0.0501 |
1221
+ | 2.0534 | 807 | 0.051 |
1222
+ | 2.0560 | 808 | 0.0232 |
1223
+ | 2.0585 | 809 | 0.0631 |
1224
+ | 2.0611 | 810 | 0.0192 |
1225
+ | 2.0636 | 811 | 0.0301 |
1226
+ | 2.0662 | 812 | 0.0177 |
1227
+ | 2.0687 | 813 | 0.0172 |
1228
+ | 2.0712 | 814 | 0.0313 |
1229
+ | 2.0738 | 815 | 0.0653 |
1230
+ | 2.0763 | 816 | 0.0715 |
1231
+ | 2.0789 | 817 | 0.0548 |
1232
+ | 2.0814 | 818 | 0.0729 |
1233
+ | 2.0840 | 819 | 0.0399 |
1234
+ | 2.0865 | 820 | 0.0208 |
1235
+ | 2.0891 | 821 | 0.0476 |
1236
+ | 2.0916 | 822 | 0.054 |
1237
+ | 2.0941 | 823 | 0.0174 |
1238
+ | 2.0967 | 824 | 0.0431 |
1239
+ | 2.0992 | 825 | 0.0361 |
1240
+ | 2.1018 | 826 | 0.0514 |
1241
+ | 2.1043 | 827 | 0.0513 |
1242
+ | 2.1069 | 828 | 0.0099 |
1243
+ | 2.1094 | 829 | 0.0137 |
1244
+ | 2.1120 | 830 | 0.0493 |
1245
+ | 2.1145 | 831 | 0.0133 |
1246
+ | 2.1170 | 832 | 0.0087 |
1247
+ | 2.1196 | 833 | 0.0306 |
1248
+ | 2.1221 | 834 | 0.0092 |
1249
+ | 2.1247 | 835 | 0.0242 |
1250
+ | 2.1272 | 836 | 0.0905 |
1251
+ | 2.1298 | 837 | 0.0544 |
1252
+ | 2.1323 | 838 | 0.0462 |
1253
+ | 2.1349 | 839 | 0.0107 |
1254
+ | 2.1374 | 840 | 0.0846 |
1255
+ | 2.1399 | 841 | 0.031 |
1256
+ | 2.1425 | 842 | 0.027 |
1257
+ | 2.1450 | 843 | 0.05 |
1258
+ | 2.1476 | 844 | 0.0468 |
1259
+ | 2.1501 | 845 | 0.0251 |
1260
+ | 2.1527 | 846 | 0.031 |
1261
+ | 2.1552 | 847 | 0.0343 |
1262
+ | 2.1578 | 848 | 0.0149 |
1263
+ | 2.1603 | 849 | 0.0347 |
1264
+ | 2.1628 | 850 | 0.014 |
1265
+ | 2.1654 | 851 | 0.0471 |
1266
+ | 2.1679 | 852 | 0.0413 |
1267
+ | 2.1705 | 853 | 0.0047 |
1268
+ | 2.1730 | 854 | 0.0232 |
1269
+ | 2.1756 | 855 | 0.025 |
1270
+ | 2.1781 | 856 | 0.0621 |
1271
+ | 2.1807 | 857 | 0.0198 |
1272
+ | 2.1832 | 858 | 0.0346 |
1273
+ | 2.1858 | 859 | 0.0177 |
1274
+ | 2.1883 | 860 | 0.0298 |
1275
+ | 2.1908 | 861 | 0.0325 |
1276
+ | 2.1934 | 862 | 0.0075 |
1277
+ | 2.1959 | 863 | 0.0224 |
1278
+ | 2.1985 | 864 | 0.0085 |
1279
+ | 2.2010 | 865 | 0.0498 |
1280
+ | 2.2036 | 866 | 0.0222 |
1281
+ | 2.2061 | 867 | 0.0309 |
1282
+ | 2.2087 | 868 | 0.0074 |
1283
+ | 2.2112 | 869 | 0.0126 |
1284
+ | 2.2137 | 870 | 0.0372 |
1285
+ | 2.2163 | 871 | 0.0232 |
1286
+ | 2.2188 | 872 | 0.033 |
1287
+ | 2.2214 | 873 | 0.0111 |
1288
+ | 2.2239 | 874 | 0.0121 |
1289
+ | 2.2265 | 875 | 0.0552 |
1290
+ | 2.2290 | 876 | 0.0305 |
1291
+ | 2.2316 | 877 | 0.042 |
1292
+ | 2.2341 | 878 | 0.0147 |
1293
+ | 2.2366 | 879 | 0.0222 |
1294
+ | 2.2392 | 880 | 0.0341 |
1295
+ | 2.2417 | 881 | 0.0163 |
1296
+ | 2.2443 | 882 | 0.0084 |
1297
+ | 2.2468 | 883 | 0.0081 |
1298
+ | 2.2494 | 884 | 0.0312 |
1299
+ | 2.2519 | 885 | 0.0153 |
1300
+ | 2.2545 | 886 | 0.0262 |
1301
+ | 2.2570 | 887 | 0.0404 |
1302
+ | 2.2595 | 888 | 0.0198 |
1303
+ | 2.2621 | 889 | 0.0304 |
1304
+ | 2.2646 | 890 | 0.0544 |
1305
+ | 2.2672 | 891 | 0.065 |
1306
+ | 2.2697 | 892 | 0.0473 |
1307
+ | 2.2723 | 893 | 0.0291 |
1308
+ | 2.2748 | 894 | 0.0415 |
1309
+ | 2.2774 | 895 | 0.0398 |
1310
+ | 2.2799 | 896 | 0.018 |
1311
+ | 2.2824 | 897 | 0.0158 |
1312
+ | 2.2850 | 898 | 0.0161 |
1313
+ | 2.2875 | 899 | 0.0347 |
1314
+ | 2.2901 | 900 | 0.0104 |
1315
+ | 2.2926 | 901 | 0.044 |
1316
+ | 2.2952 | 902 | 0.019 |
1317
+ | 2.2977 | 903 | 0.0416 |
1318
+ | 2.3003 | 904 | 0.0039 |
1319
+ | 2.3028 | 905 | 0.0246 |
1320
+ | 2.3053 | 906 | 0.0133 |
1321
+ | 2.3079 | 907 | 0.0053 |
1322
+ | 2.3104 | 908 | 0.0992 |
1323
+ | 2.3130 | 909 | 0.0569 |
1324
+ | 2.3155 | 910 | 0.0326 |
1325
+ | 2.3181 | 911 | 0.0189 |
1326
+ | 2.3206 | 912 | 0.0115 |
1327
+ | 2.3232 | 913 | 0.0417 |
1328
+ | 2.3257 | 914 | 0.0161 |
1329
+ | 2.3282 | 915 | 0.0308 |
1330
+ | 2.3308 | 916 | 0.0234 |
1331
+ | 2.3333 | 917 | 0.027 |
1332
+ | 2.3359 | 918 | 0.0391 |
1333
+ | 2.3384 | 919 | 0.0107 |
1334
+ | 2.3410 | 920 | 0.0092 |
1335
+ | 2.3435 | 921 | 0.016 |
1336
+ | 2.3461 | 922 | 0.0299 |
1337
+ | 2.3486 | 923 | 0.0493 |
1338
+ | 2.3511 | 924 | 0.025 |
1339
+ | 2.3537 | 925 | 0.0127 |
1340
+ | 2.3562 | 926 | 0.0131 |
1341
+ | 2.3588 | 927 | 0.0214 |
1342
+ | 2.3613 | 928 | 0.0538 |
1343
+ | 2.3639 | 929 | 0.0082 |
1344
+ | 2.3664 | 930 | 0.043 |
1345
+ | 2.3690 | 931 | 0.0074 |
1346
+ | 2.3715 | 932 | 0.042 |
1347
+ | 2.3740 | 933 | 0.044 |
1348
+ | 2.3766 | 934 | 0.01 |
1349
+ | 2.3791 | 935 | 0.0055 |
1350
+ | 2.3817 | 936 | 0.0215 |
1351
+ | 2.3842 | 937 | 0.0258 |
1352
+ | 2.3868 | 938 | 0.0302 |
1353
+ | 2.3893 | 939 | 0.0326 |
1354
+ | 2.3919 | 940 | 0.0348 |
1355
+ | 2.3944 | 941 | 0.0444 |
1356
+ | 2.3969 | 942 | 0.019 |
1357
+ | 2.3995 | 943 | 0.0098 |
1358
+ | 2.4020 | 944 | 0.0283 |
1359
+ | 2.4046 | 945 | 0.0306 |
1360
+ | 2.4071 | 946 | 0.0316 |
1361
+ | 2.4097 | 947 | 0.01 |
1362
+ | 2.4122 | 948 | 0.0253 |
1363
+ | 2.4148 | 949 | 0.0664 |
1364
+ | 2.4173 | 950 | 0.0366 |
1365
+ | 2.4198 | 951 | 0.0307 |
1366
+ | 2.4224 | 952 | 0.0422 |
1367
+ | 2.4249 | 953 | 0.0133 |
1368
+ | 2.4275 | 954 | 0.0209 |
1369
+ | 2.4300 | 955 | 0.0065 |
1370
+ | 2.4326 | 956 | 0.0107 |
1371
+ | 2.4351 | 957 | 0.0396 |
1372
+ | 2.4377 | 958 | 0.0137 |
1373
+ | 2.4402 | 959 | 0.0258 |
1374
+ | 2.4427 | 960 | 0.0138 |
1375
+ | 2.4453 | 961 | 0.0275 |
1376
+ | 2.4478 | 962 | 0.0208 |
1377
+ | 2.4504 | 963 | 0.0302 |
1378
+ | 2.4529 | 964 | 0.0292 |
1379
+ | 2.4555 | 965 | 0.018 |
1380
+ | 2.4580 | 966 | 0.0168 |
1381
+ | 2.4606 | 967 | 0.0365 |
1382
+ | 2.4631 | 968 | 0.0141 |
1383
+ | 2.4656 | 969 | 0.0348 |
1384
+ | 2.4682 | 970 | 0.022 |
1385
+ | 2.4707 | 971 | 0.0677 |
1386
+ | 2.4733 | 972 | 0.0156 |
1387
+ | 2.4758 | 973 | 0.0424 |
1388
+ | 2.4784 | 974 | 0.0188 |
1389
+ | 2.4809 | 975 | 0.0494 |
1390
+ | 2.4835 | 976 | 0.0192 |
1391
+ | 2.4860 | 977 | 0.0346 |
1392
+ | 2.4885 | 978 | 0.0167 |
1393
+ | 2.4911 | 979 | 0.0274 |
1394
+ | 2.4936 | 980 | 0.0046 |
1395
+ | 2.4962 | 981 | 0.0301 |
1396
+ | 2.4987 | 982 | 0.0246 |
1397
+ | 2.5013 | 983 | 0.0222 |
1398
+ | 2.5038 | 984 | 0.0346 |
1399
+ | 2.5064 | 985 | 0.0595 |
1400
+ | 2.5089 | 986 | 0.0221 |
1401
+ | 2.5115 | 987 | 0.0211 |
1402
+ | 2.5140 | 988 | 0.0092 |
1403
+ | 2.5165 | 989 | 0.0225 |
1404
+ | 2.5191 | 990 | 0.0452 |
1405
+ | 2.5216 | 991 | 0.0288 |
1406
+ | 2.5242 | 992 | 0.044 |
1407
+ | 2.5267 | 993 | 0.0308 |
1408
+ | 2.5293 | 994 | 0.0309 |
1409
+ | 2.5318 | 995 | 0.0495 |
1410
+ | 2.5344 | 996 | 0.0384 |
1411
+ | 2.5369 | 997 | 0.0834 |
1412
+ | 2.5394 | 998 | 0.0866 |
1413
+ | 2.5420 | 999 | 0.0076 |
1414
+ | 2.5445 | 1000 | 0.0071 |
1415
+ | 2.5471 | 1001 | 0.0634 |
1416
+ | 2.5496 | 1002 | 0.0144 |
1417
+ | 2.5522 | 1003 | 0.077 |
1418
+ | 2.5547 | 1004 | 0.0347 |
1419
+ | 2.5573 | 1005 | 0.0081 |
1420
+ | 2.5598 | 1006 | 0.0216 |
1421
+ | 2.5623 | 1007 | 0.0437 |
1422
+ | 2.5649 | 1008 | 0.0367 |
1423
+ | 2.5674 | 1009 | 0.0281 |
1424
+ | 2.5700 | 1010 | 0.0312 |
1425
+ | 2.5725 | 1011 | 0.0181 |
1426
+ | 2.5751 | 1012 | 0.0226 |
1427
+ | 2.5776 | 1013 | 0.0558 |
1428
+ | 2.5802 | 1014 | 0.0267 |
1429
+ | 2.5827 | 1015 | 0.0596 |
1430
+ | 2.5852 | 1016 | 0.046 |
1431
+ | 2.5878 | 1017 | 0.0465 |
1432
+ | 2.5903 | 1018 | 0.0035 |
1433
+ | 2.5929 | 1019 | 0.019 |
1434
+ | 2.5954 | 1020 | 0.0118 |
1435
+ | 2.5980 | 1021 | 0.0128 |
1436
+ | 2.6005 | 1022 | 0.0458 |
1437
+ | 2.6031 | 1023 | 0.0185 |
1438
+ | 2.6056 | 1024 | 0.0309 |
1439
+ | 2.6081 | 1025 | 0.0142 |
1440
+ | 2.6107 | 1026 | 0.0732 |
1441
+ | 2.6132 | 1027 | 0.0327 |
1442
+ | 2.6158 | 1028 | 0.0296 |
1443
+ | 2.6183 | 1029 | 0.0237 |
1444
+ | 2.6209 | 1030 | 0.0169 |
1445
+ | 2.6234 | 1031 | 0.0306 |
1446
+ | 2.6260 | 1032 | 0.0235 |
1447
+ | 2.6285 | 1033 | 0.009 |
1448
+ | 2.6310 | 1034 | 0.0118 |
1449
+ | 2.6336 | 1035 | 0.0067 |
1450
+ | 2.6361 | 1036 | 0.008 |
1451
+ | 2.6387 | 1037 | 0.0202 |
1452
+ | 2.6412 | 1038 | 0.0241 |
1453
+ | 2.6438 | 1039 | 0.0118 |
1454
+ | 2.6463 | 1040 | 0.0161 |
1455
+ | 2.6489 | 1041 | 0.0242 |
1456
+ | 2.6514 | 1042 | 0.0072 |
1457
+ | 2.6539 | 1043 | 0.037 |
1458
+ | 2.6565 | 1044 | 0.0362 |
1459
+ | 2.6590 | 1045 | 0.0213 |
1460
+ | 2.6616 | 1046 | 0.0458 |
1461
+ | 2.6641 | 1047 | 0.0358 |
1462
+ | 2.6667 | 1048 | 0.024 |
1463
+ | 2.6692 | 1049 | 0.0093 |
1464
+ | 2.6718 | 1050 | 0.0306 |
1465
+ | 2.6743 | 1051 | 0.0075 |
1466
+ | 2.6768 | 1052 | 0.0193 |
1467
+ | 2.6794 | 1053 | 0.048 |
1468
+ | 2.6819 | 1054 | 0.0058 |
1469
+ | 2.6845 | 1055 | 0.0233 |
1470
+ | 2.6870 | 1056 | 0.0264 |
1471
+ | 2.6896 | 1057 | 0.0276 |
1472
+ | 2.6921 | 1058 | 0.0346 |
1473
+ | 2.6947 | 1059 | 0.0854 |
1474
+ | 2.6972 | 1060 | 0.0119 |
1475
+ | 2.6997 | 1061 | 0.0174 |
1476
+ | 2.7023 | 1062 | 0.0514 |
1477
+ | 2.7048 | 1063 | 0.0628 |
1478
+ | 2.7074 | 1064 | 0.0721 |
1479
+ | 2.7099 | 1065 | 0.0246 |
1480
+ | 2.7125 | 1066 | 0.049 |
1481
+ | 2.7150 | 1067 | 0.0148 |
1482
+ | 2.7176 | 1068 | 0.1024 |
1483
+ | 2.7201 | 1069 | 0.0312 |
1484
+ | 2.7226 | 1070 | 0.029 |
1485
+ | 2.7252 | 1071 | 0.0352 |
1486
+ | 2.7277 | 1072 | 0.0131 |
1487
+ | 2.7303 | 1073 | 0.0195 |
1488
+ | 2.7328 | 1074 | 0.0064 |
1489
+ | 2.7354 | 1075 | 0.0169 |
1490
+ | 2.7379 | 1076 | 0.0232 |
1491
+ | 2.7405 | 1077 | 0.0216 |
1492
+ | 2.7430 | 1078 | 0.0058 |
1493
+ | 2.7455 | 1079 | 0.0089 |
1494
+ | 2.7481 | 1080 | 0.0143 |
1495
+ | 2.7506 | 1081 | 0.0168 |
1496
+ | 2.7532 | 1082 | 0.0331 |
1497
+ | 2.7557 | 1083 | 0.0255 |
1498
+ | 2.7583 | 1084 | 0.0312 |
1499
+ | 2.7608 | 1085 | 0.0125 |
1500
+ | 2.7634 | 1086 | 0.0228 |
1501
+ | 2.7659 | 1087 | 0.0083 |
1502
+ | 2.7684 | 1088 | 0.0141 |
1503
+ | 2.7710 | 1089 | 0.0189 |
1504
+ | 2.7735 | 1090 | 0.0109 |
1505
+ | 2.7761 | 1091 | 0.0195 |
1506
+ | 2.7786 | 1092 | 0.0169 |
1507
+ | 2.7812 | 1093 | 0.0937 |
1508
+ | 2.7837 | 1094 | 0.019 |
1509
+ | 2.7863 | 1095 | 0.0856 |
1510
+ | 2.7888 | 1096 | 0.0155 |
1511
+ | 2.7913 | 1097 | 0.0408 |
1512
+ | 2.7939 | 1098 | 0.0279 |
1513
+ | 2.7964 | 1099 | 0.008 |
1514
+ | 2.7990 | 1100 | 0.086 |
1515
+ | 2.8015 | 1101 | 0.0078 |
1516
+ | 2.8041 | 1102 | 0.0186 |
1517
+ | 2.8066 | 1103 | 0.0468 |
1518
+ | 2.8092 | 1104 | 0.0255 |
1519
+ | 2.8117 | 1105 | 0.0418 |
1520
+ | 2.8142 | 1106 | 0.0188 |
1521
+ | 2.8168 | 1107 | 0.0197 |
1522
+ | 2.8193 | 1108 | 0.023 |
1523
+ | 2.8219 | 1109 | 0.0421 |
1524
+ | 2.8244 | 1110 | 0.0301 |
1525
+ | 2.8270 | 1111 | 0.0627 |
1526
+ | 2.8295 | 1112 | 0.0052 |
1527
+ | 2.8321 | 1113 | 0.0163 |
1528
+ | 2.8346 | 1114 | 0.0209 |
1529
+ | 2.8372 | 1115 | 0.0277 |
1530
+ | 2.8397 | 1116 | 0.0211 |
1531
+ | 2.8422 | 1117 | 0.0066 |
1532
+ | 2.8448 | 1118 | 0.0263 |
1533
+ | 2.8473 | 1119 | 0.0408 |
1534
+ | 2.8499 | 1120 | 0.0516 |
1535
+ | 2.8524 | 1121 | 0.0748 |
1536
+ | 2.8550 | 1122 | 0.0309 |
1537
+ | 2.8575 | 1123 | 0.007 |
1538
+ | 2.8601 | 1124 | 0.014 |
1539
+ | 2.8626 | 1125 | 0.0284 |
1540
+ | 2.8651 | 1126 | 0.0165 |
1541
+ | 2.8677 | 1127 | 0.0975 |
1542
+ | 2.8702 | 1128 | 0.0354 |
1543
+ | 2.8728 | 1129 | 0.0235 |
1544
+ | 2.8753 | 1130 | 0.0074 |
1545
+ | 2.8779 | 1131 | 0.0386 |
1546
+ | 2.8804 | 1132 | 0.0173 |
1547
+ | 2.8830 | 1133 | 0.0211 |
1548
+ | 2.8855 | 1134 | 0.0305 |
1549
+ | 2.8880 | 1135 | 0.0219 |
1550
+ | 2.8906 | 1136 | 0.0454 |
1551
+ | 2.8931 | 1137 | 0.0176 |
1552
+ | 2.8957 | 1138 | 0.0261 |
1553
+ | 2.8982 | 1139 | 0.0274 |
1554
+ | 2.9008 | 1140 | 0.0131 |
1555
+ | 2.9033 | 1141 | 0.0485 |
1556
+ | 2.9059 | 1142 | 0.0129 |
1557
+ | 2.9084 | 1143 | 0.05 |
1558
+ | 2.9109 | 1144 | 0.0306 |
1559
+ | 2.9135 | 1145 | 0.0352 |
1560
+ | 2.9160 | 1146 | 0.0271 |
1561
+ | 2.9186 | 1147 | 0.0216 |
1562
+ | 2.9211 | 1148 | 0.0567 |
1563
+ | 2.9237 | 1149 | 0.0258 |
1564
+ | 2.9262 | 1150 | 0.0221 |
1565
+ | 2.9288 | 1151 | 0.0112 |
1566
+ | 2.9313 | 1152 | 0.0199 |
1567
+ | 2.9338 | 1153 | 0.0388 |
1568
+ | 2.9364 | 1154 | 0.0101 |
1569
+ | 2.9389 | 1155 | 0.0179 |
1570
+ | 2.9415 | 1156 | 0.0358 |
1571
+ | 2.9440 | 1157 | 0.0247 |
1572
+ | 2.9466 | 1158 | 0.031 |
1573
+ | 2.9491 | 1159 | 0.0367 |
1574
+ | 2.9517 | 1160 | 0.0198 |
1575
+ | 2.9542 | 1161 | 0.0346 |
1576
+ | 2.9567 | 1162 | 0.011 |
1577
+ | 2.9593 | 1163 | 0.139 |
1578
+ | 2.9618 | 1164 | 0.0555 |
1579
+ | 2.9644 | 1165 | 0.0228 |
1580
+ | 2.9669 | 1166 | 0.0377 |
1581
+ | 2.9695 | 1167 | 0.024 |
1582
+ | 2.9720 | 1168 | 0.0331 |
1583
+ | 2.9746 | 1169 | 0.0815 |
1584
+ | 2.9771 | 1170 | 0.0116 |
1585
+ | 2.9796 | 1171 | 0.0186 |
1586
+ | 2.9822 | 1172 | 0.0153 |
1587
+ | 2.9847 | 1173 | 0.0557 |
1588
+ | 2.9873 | 1174 | 0.0406 |
1589
+ | 2.9898 | 1175 | 0.0334 |
1590
+ | 2.9924 | 1176 | 0.0265 |
1591
+ | 2.9949 | 1177 | 0.0333 |
1592
+ | 2.9975 | 1178 | 0.0177 |
1593
+ | 3.0 | 1179 | 0.0028 |
1594
+
1595
+ </details>
1596
+
1597
+ ### Framework Versions
1598
+ - Python: 3.11.10
1599
+ - Sentence Transformers: 4.0.2
1600
+ - PyLate: 1.1.7
1601
+ - Transformers: 4.48.2
1602
+ - PyTorch: 2.5.1+cu124
1603
+ - Accelerate: 1.1.1
1604
+ - Datasets: 2.21.0
1605
+ - Tokenizers: 0.21.0
1606
+
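+ The versions above can be sanity-checked in a live environment with a minimal sketch like the one below (it assumes each package exposes the usual `__version__` attribute):
+
+ ```python
+ import pylate
+ import sentence_transformers
+ import torch
+ import transformers
+
+ print(sentence_transformers.__version__)  # expected: 4.0.2
+ print(pylate.__version__)                 # expected: 1.1.7
+ print(transformers.__version__)           # expected: 4.48.2
+ print(torch.__version__)                  # expected: 2.5.1+cu124
+ ```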
1607
+
1608
+ ## Citation
1609
+
1610
+ ### BibTeX
1611
+
1612
+ #### Reason-ModernColBERT
1613
+ ```bibtex
1614
+ @misc{Reason-ModernColBERT,
1615
+ title={Reason-ModernColBERT},
1616
+ author={Chaffin, Antoine},
1617
+ url={https://huggingface.co/lightonai/Reason-ModernColBERT},
1618
+ year={2025}
1619
+ }
1620
+ ```
1621
+
1622
+ #### GTE-ModernColBERT
1623
+ ```bibtex
1624
+ @misc{GTE-ModernColBERT,
1625
+ title={GTE-ModernColBERT},
1626
+ author={Chaffin, Antoine},
1627
+ url={https://huggingface.co/lightonai/GTE-ModernColBERT-v1},
1628
+ year={2025}
1629
+ }
1630
+ ```
1631
+
1632
+ #### Sentence Transformers
1633
+ ```bibtex
1634
+ @inproceedings{reimers-2019-sentence-bert,
1635
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
1636
+ author = "Reimers, Nils and Gurevych, Iryna",
1637
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
1638
+ month = "11",
1639
+ year = "2019",
1640
+ publisher = "Association for Computational Linguistics",
1641
+ url = "https://arxiv.org/abs/1908.10084"
1642
+ }
1643
+ ```
1644
+
1645
+ #### PyLate
1646
+ ```bibtex
1647
+ @misc{PyLate,
1648
+ title={PyLate: Flexible Training and Retrieval for Late Interaction Models},
1649
+ author={Chaffin, Antoine and Sourty, Raphaël},
1650
+ url={https://github.com/lightonai/pylate},
1651
+ year={2024}
1652
+ }
1653
+ ```
1654
+
1655
+ #### CachedContrastive
1656
+ ```bibtex
1657
+ @misc{gao2021scaling,
1658
+ title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
1659
+ author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
1660
+ year={2021},
1661
+ eprint={2101.06983},
1662
+ archivePrefix={arXiv},
1663
+ primaryClass={cs.LG}
1664
+ }
1665
+ ```
1666
+
1667
+ <!--
1668
+ ## Glossary
1669
+
1670
+ *Clearly define terms in order to be accessible across audiences.*
1671
+ -->
1672
+
1673
+ <!--
1674
+ ## Model Card Authors
1675
+
1676
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
1677
+ -->
1678
+
1679
+ <!--
1680
+ ## Model Card Contact
1681
+
1682
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
1683
+ -->
config.json ADDED
@@ -0,0 +1,47 @@
1
+ {
2
+ "_name_or_path": "/opt/home/nohtow/pylate/examples/train/output/GTE-ModernColBERT-v1/GTE-ModernColBERT-v1-ReasonIR_temp_1.0_noskiplist_1e-05_3epoch/final",
3
+ "architectures": [
4
+ "ModernBertModel"
5
+ ],
6
+ "attention_bias": false,
7
+ "attention_dropout": 0.0,
8
+ "bos_token_id": 50281,
9
+ "classifier_activation": "gelu",
10
+ "classifier_bias": false,
11
+ "classifier_dropout": 0.0,
12
+ "classifier_pooling": "mean",
13
+ "cls_token_id": 50281,
14
+ "decoder_bias": true,
15
+ "deterministic_flash_attn": false,
16
+ "embedding_dropout": 0.0,
17
+ "eos_token_id": 50282,
18
+ "global_attn_every_n_layers": 3,
19
+ "global_rope_theta": 160000.0,
20
+ "gradient_checkpointing": false,
21
+ "hidden_activation": "gelu",
22
+ "hidden_size": 768,
23
+ "initializer_cutoff_factor": 2.0,
24
+ "initializer_range": 0.02,
25
+ "intermediate_size": 1152,
26
+ "layer_norm_eps": 1e-05,
27
+ "local_attention": 128,
28
+ "local_rope_theta": 10000.0,
29
+ "max_position_embeddings": 8192,
30
+ "mlp_bias": false,
31
+ "mlp_dropout": 0.0,
32
+ "model_type": "modernbert",
33
+ "norm_bias": false,
34
+ "norm_eps": 1e-05,
35
+ "num_attention_heads": 12,
36
+ "num_hidden_layers": 22,
37
+ "pad_token_id": 50283,
38
+ "position_embedding_type": "absolute",
39
+ "reference_compile": false,
40
+ "repad_logits_with_grad": false,
41
+ "sep_token_id": 50282,
42
+ "sparse_pred_ignore_index": -100,
43
+ "sparse_prediction": false,
44
+ "torch_dtype": "float32",
45
+ "transformers_version": "4.48.2",
46
+ "vocab_size": 50370
47
+ }
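The backbone configuration recorded above can be inspected programmatically. A minimal sketch, assuming this repository has been cloned to the current working directory:

```python
# Minimal sketch: load the ModernBERT backbone config shipped in config.json
# and print the fields that determine the encoder's size and context window.
from transformers import AutoConfig

config = AutoConfig.from_pretrained(".")  # "." = local checkout (assumption)
print(config.model_type)               # modernbert
print(config.hidden_size)              # 768
print(config.num_hidden_layers)        # 22
print(config.max_position_embeddings)  # 8192
```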
config_sentence_transformers.json ADDED
@@ -0,0 +1,49 @@
1
+ {
2
+ "__version__": {
3
+ "sentence_transformers": "4.0.2",
4
+ "transformers": "4.48.2",
5
+ "pytorch": "2.5.1+cu124"
6
+ },
7
+ "prompts": {},
8
+ "default_prompt_name": null,
9
+ "similarity_fn_name": "MaxSim",
10
+ "query_prefix": "[Q] ",
11
+ "document_prefix": "[D] ",
12
+ "query_length": 128,
13
+ "document_length": 8192,
14
+ "attend_to_expansion_tokens": false,
15
+ "skiplist_words": [
16
+ "!",
17
+ "\"",
18
+ "#",
19
+ "$",
20
+ "%",
21
+ "&",
22
+ "'",
23
+ "(",
24
+ ")",
25
+ "*",
26
+ "+",
27
+ ",",
28
+ "-",
29
+ ".",
30
+ "/",
31
+ ":",
32
+ ";",
33
+ "<",
34
+ "=",
35
+ ">",
36
+ "?",
37
+ "@",
38
+ "[",
39
+ "\\",
40
+ "]",
41
+ "^",
42
+ "_",
43
+ "`",
44
+ "{",
45
+ "|",
46
+ "}",
47
+ "~"
48
+ ]
49
+ }
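These settings are consumed automatically by PyLate: the `[Q] `/`[D] ` prefixes are prepended to queries and documents respectively, and `MaxSim` scores the resulting multi-vector embeddings. A minimal, hedged sketch of that flow (the local-path load and the toy texts are illustrative, not taken from this repository):

```python
# Sketch: encode one query and one candidate document with this checkpoint,
# then rerank with PyLate's MaxSim-based scorer. "." assumes a local checkout.
from pylate import models, rank

model = models.ColBERT(model_name_or_path=".")

queries_embeddings = model.encode(["what is late interaction?"], is_query=True)
documents_embeddings = model.encode(
    [["Late interaction scores query and document token embeddings with MaxSim."]],
    is_query=False,
)

scores = rank.rerank(
    documents_ids=[["doc-0"]],
    queries_embeddings=queries_embeddings,
    documents_embeddings=documents_embeddings,
)
print(scores)
```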
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:a62051d1a9c889955b3035c2f5b7157eec1a5d3de25e15966ab80192f666a934
3
+ size 596076280
modules.json ADDED
@@ -0,0 +1,14 @@
1
+ [
2
+ {
3
+ "idx": 0,
4
+ "name": "0",
5
+ "path": "",
6
+ "type": "sentence_transformers.models.Transformer"
7
+ },
8
+ {
9
+ "idx": 1,
10
+ "name": "1",
11
+ "path": "1_Dense",
12
+ "type": "pylate.models.Dense.Dense"
13
+ }
14
+ ]
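`modules.json` declares the two-stage pipeline: a sentence-transformers `Transformer` backbone followed by the PyLate `Dense` projection whose weights live under `1_Dense/`. A short sketch to confirm the composition after loading (local-path load is an assumption):

```python
# Sketch: print the module stack declared in modules.json; the Dense head
# projects the backbone's 768-d token embeddings to the final per-token vectors.
from pylate import models

model = models.ColBERT(model_name_or_path=".")
print(model)  # shows the Transformer module followed by the Dense module
```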
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
1
+ {
2
+ "max_seq_length": 127,
3
+ "do_lower_case": false
4
+ }
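Note how this file interacts with `config_sentence_transformers.json` above: `max_seq_length` is 127 here, while PyLate applies its own `query_length` of 128 and `document_length` of 8192 at encode time (the one-token difference plausibly leaves room for the prepended marker token). A hedged sketch:

```python
# Sketch (assumption: PyLate overrides sequence lengths when loading):
# pass document_length explicitly to make the full 8192-token context usable.
from pylate import models

model = models.ColBERT(model_name_or_path=".", document_length=8192)
print(model.tokenizer.model_max_length)  # 127 per the file above
```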
special_tokens_map.json ADDED
@@ -0,0 +1,31 @@
1
+ {
2
+ "cls_token": {
3
+ "content": "[CLS]",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "mask_token": {
10
+ "content": "[MASK]",
11
+ "lstrip": true,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": "[MASK]",
17
+ "sep_token": {
18
+ "content": "[SEP]",
19
+ "lstrip": false,
20
+ "normalized": false,
21
+ "rstrip": false,
22
+ "single_word": false
23
+ },
24
+ "unk_token": {
25
+ "content": "[UNK]",
26
+ "lstrip": false,
27
+ "normalized": false,
28
+ "rstrip": false,
29
+ "single_word": false
30
+ }
31
+ }
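One detail worth flagging: `pad_token` is mapped to `[MASK]` rather than `[PAD]`. This matches the usual ColBERT query-expansion trick, in which queries are padded with mask tokens that serve as learned expansion slots (our reading of the config; the behaviour itself is implemented in PyLate). A quick check:

```python
# Sketch: confirm the padding token mapped in special_tokens_map.json.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(".")  # local checkout assumed
print(tokenizer.pad_token)   # [MASK]
print(tokenizer.mask_token)  # [MASK]
```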
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,968 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "|||IP_ADDRESS|||",
5
+ "lstrip": false,
6
+ "normalized": true,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": false
10
+ },
11
+ "1": {
12
+ "content": "<|padding|>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "50254": {
20
+ "content": " ",
21
+ "lstrip": false,
22
+ "normalized": true,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": false
26
+ },
27
+ "50255": {
28
+ "content": " ",
29
+ "lstrip": false,
30
+ "normalized": true,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": false
34
+ },
35
+ "50256": {
36
+ "content": " ",
37
+ "lstrip": false,
38
+ "normalized": true,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": false
42
+ },
43
+ "50257": {
44
+ "content": " ",
45
+ "lstrip": false,
46
+ "normalized": true,
47
+ "rstrip": false,
48
+ "single_word": false,
49
+ "special": false
50
+ },
51
+ "50258": {
52
+ "content": " ",
53
+ "lstrip": false,
54
+ "normalized": true,
55
+ "rstrip": false,
56
+ "single_word": false,
57
+ "special": false
58
+ },
59
+ "50259": {
60
+ "content": " ",
61
+ "lstrip": false,
62
+ "normalized": true,
63
+ "rstrip": false,
64
+ "single_word": false,
65
+ "special": false
66
+ },
67
+ "50260": {
68
+ "content": " ",
69
+ "lstrip": false,
70
+ "normalized": true,
71
+ "rstrip": false,
72
+ "single_word": false,
73
+ "special": false
74
+ },
75
+ "50261": {
76
+ "content": " ",
77
+ "lstrip": false,
78
+ "normalized": true,
79
+ "rstrip": false,
80
+ "single_word": false,
81
+ "special": false
82
+ },
83
+ "50262": {
84
+ "content": " ",
85
+ "lstrip": false,
86
+ "normalized": true,
87
+ "rstrip": false,
88
+ "single_word": false,
89
+ "special": false
90
+ },
91
+ "50263": {
92
+ "content": " ",
93
+ "lstrip": false,
94
+ "normalized": true,
95
+ "rstrip": false,
96
+ "single_word": false,
97
+ "special": false
98
+ },
99
+ "50264": {
100
+ "content": " ",
101
+ "lstrip": false,
102
+ "normalized": true,
103
+ "rstrip": false,
104
+ "single_word": false,
105
+ "special": false
106
+ },
107
+ "50265": {
108
+ "content": " ",
109
+ "lstrip": false,
110
+ "normalized": true,
111
+ "rstrip": false,
112
+ "single_word": false,
113
+ "special": false
114
+ },
115
+ "50266": {
116
+ "content": " ",
117
+ "lstrip": false,
118
+ "normalized": true,
119
+ "rstrip": false,
120
+ "single_word": false,
121
+ "special": false
122
+ },
123
+ "50267": {
124
+ "content": " ",
125
+ "lstrip": false,
126
+ "normalized": true,
127
+ "rstrip": false,
128
+ "single_word": false,
129
+ "special": false
130
+ },
131
+ "50268": {
132
+ "content": " ",
133
+ "lstrip": false,
134
+ "normalized": true,
135
+ "rstrip": false,
136
+ "single_word": false,
137
+ "special": false
138
+ },
139
+ "50269": {
140
+ "content": " ",
141
+ "lstrip": false,
142
+ "normalized": true,
143
+ "rstrip": false,
144
+ "single_word": false,
145
+ "special": false
146
+ },
147
+ "50270": {
148
+ "content": " ",
149
+ "lstrip": false,
150
+ "normalized": true,
151
+ "rstrip": false,
152
+ "single_word": false,
153
+ "special": false
154
+ },
155
+ "50271": {
156
+ "content": " ",
157
+ "lstrip": false,
158
+ "normalized": true,
159
+ "rstrip": false,
160
+ "single_word": false,
161
+ "special": false
162
+ },
163
+ "50272": {
164
+ "content": " ",
165
+ "lstrip": false,
166
+ "normalized": true,
167
+ "rstrip": false,
168
+ "single_word": false,
169
+ "special": false
170
+ },
171
+ "50273": {
172
+ "content": " ",
173
+ "lstrip": false,
174
+ "normalized": true,
175
+ "rstrip": false,
176
+ "single_word": false,
177
+ "special": false
178
+ },
179
+ "50274": {
180
+ "content": " ",
181
+ "lstrip": false,
182
+ "normalized": true,
183
+ "rstrip": false,
184
+ "single_word": false,
185
+ "special": false
186
+ },
187
+ "50275": {
188
+ "content": " ",
189
+ "lstrip": false,
190
+ "normalized": true,
191
+ "rstrip": false,
192
+ "single_word": false,
193
+ "special": false
194
+ },
195
+ "50276": {
196
+ "content": " ",
197
+ "lstrip": false,
198
+ "normalized": true,
199
+ "rstrip": false,
200
+ "single_word": false,
201
+ "special": false
202
+ },
203
+ "50277": {
204
+ "content": "|||EMAIL_ADDRESS|||",
205
+ "lstrip": false,
206
+ "normalized": true,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": false
210
+ },
211
+ "50278": {
212
+ "content": "|||PHONE_NUMBER|||",
213
+ "lstrip": false,
214
+ "normalized": true,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": false
218
+ },
219
+ "50279": {
220
+ "content": "<|endoftext|>",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "50280": {
228
+ "content": "[UNK]",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "50281": {
236
+ "content": "[CLS]",
237
+ "lstrip": false,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "50282": {
244
+ "content": "[SEP]",
245
+ "lstrip": false,
246
+ "normalized": false,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": true
250
+ },
251
+ "50283": {
252
+ "content": "[PAD]",
253
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "50284": {
260
+ "content": "[MASK]",
261
+ "lstrip": true,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "50285": {
268
+ "content": "[unused0]",
269
+ "lstrip": false,
270
+ "normalized": true,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": false
274
+ },
275
+ "50286": {
276
+ "content": "[unused1]",
277
+ "lstrip": false,
278
+ "normalized": true,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": false
282
+ },
283
+ "50287": {
284
+ "content": "[unused2]",
285
+ "lstrip": false,
286
+ "normalized": true,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": false
290
+ },
291
+ "50288": {
292
+ "content": "[unused3]",
293
+ "lstrip": false,
294
+ "normalized": true,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": false
298
+ },
299
+ "50289": {
300
+ "content": "[unused4]",
301
+ "lstrip": false,
302
+ "normalized": true,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": false
306
+ },
307
+ "50290": {
308
+ "content": "[unused5]",
309
+ "lstrip": false,
310
+ "normalized": true,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": false
314
+ },
315
+ "50291": {
316
+ "content": "[unused6]",
317
+ "lstrip": false,
318
+ "normalized": true,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": false
322
+ },
323
+ "50292": {
324
+ "content": "[unused7]",
325
+ "lstrip": false,
326
+ "normalized": true,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": false
330
+ },
331
+ "50293": {
332
+ "content": "[unused8]",
333
+ "lstrip": false,
334
+ "normalized": true,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": false
338
+ },
339
+ "50294": {
340
+ "content": "[unused9]",
341
+ "lstrip": false,
342
+ "normalized": true,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": false
346
+ },
347
+ "50295": {
348
+ "content": "[unused10]",
349
+ "lstrip": false,
350
+ "normalized": true,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": false
354
+ },
355
+ "50296": {
356
+ "content": "[unused11]",
357
+ "lstrip": false,
358
+ "normalized": true,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": false
362
+ },
363
+ "50297": {
364
+ "content": "[unused12]",
365
+ "lstrip": false,
366
+ "normalized": true,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": false
370
+ },
371
+ "50298": {
372
+ "content": "[unused13]",
373
+ "lstrip": false,
374
+ "normalized": true,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": false
378
+ },
379
+ "50299": {
380
+ "content": "[unused14]",
381
+ "lstrip": false,
382
+ "normalized": true,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": false
386
+ },
387
+ "50300": {
388
+ "content": "[unused15]",
389
+ "lstrip": false,
390
+ "normalized": true,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": false
394
+ },
395
+ "50301": {
396
+ "content": "[unused16]",
397
+ "lstrip": false,
398
+ "normalized": true,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": false
402
+ },
403
+ "50302": {
404
+ "content": "[unused17]",
405
+ "lstrip": false,
406
+ "normalized": true,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": false
410
+ },
411
+ "50303": {
412
+ "content": "[unused18]",
413
+ "lstrip": false,
414
+ "normalized": true,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": false
418
+ },
419
+ "50304": {
420
+ "content": "[unused19]",
421
+ "lstrip": false,
422
+ "normalized": true,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": false
426
+ },
427
+ "50305": {
428
+ "content": "[unused20]",
429
+ "lstrip": false,
430
+ "normalized": true,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": false
434
+ },
435
+ "50306": {
436
+ "content": "[unused21]",
437
+ "lstrip": false,
438
+ "normalized": true,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": false
442
+ },
443
+ "50307": {
444
+ "content": "[unused22]",
445
+ "lstrip": false,
446
+ "normalized": true,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": false
450
+ },
451
+ "50308": {
452
+ "content": "[unused23]",
453
+ "lstrip": false,
454
+ "normalized": true,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": false
458
+ },
459
+ "50309": {
460
+ "content": "[unused24]",
461
+ "lstrip": false,
462
+ "normalized": true,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": false
466
+ },
467
+ "50310": {
468
+ "content": "[unused25]",
469
+ "lstrip": false,
470
+ "normalized": true,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": false
474
+ },
475
+ "50311": {
476
+ "content": "[unused26]",
477
+ "lstrip": false,
478
+ "normalized": true,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": false
482
+ },
483
+ "50312": {
484
+ "content": "[unused27]",
485
+ "lstrip": false,
486
+ "normalized": true,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": false
490
+ },
491
+ "50313": {
492
+ "content": "[unused28]",
493
+ "lstrip": false,
494
+ "normalized": true,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": false
498
+ },
499
+ "50314": {
500
+ "content": "[unused29]",
501
+ "lstrip": false,
502
+ "normalized": true,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": false
506
+ },
507
+ "50315": {
508
+ "content": "[unused30]",
509
+ "lstrip": false,
510
+ "normalized": true,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": false
514
+ },
515
+ "50316": {
516
+ "content": "[unused31]",
517
+ "lstrip": false,
518
+ "normalized": true,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": false
522
+ },
523
+ "50317": {
524
+ "content": "[unused32]",
525
+ "lstrip": false,
526
+ "normalized": true,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": false
530
+ },
531
+ "50318": {
532
+ "content": "[unused33]",
533
+ "lstrip": false,
534
+ "normalized": true,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": false
538
+ },
539
+ "50319": {
540
+ "content": "[unused34]",
541
+ "lstrip": false,
542
+ "normalized": true,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": false
546
+ },
547
+ "50320": {
548
+ "content": "[unused35]",
549
+ "lstrip": false,
550
+ "normalized": true,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": false
554
+ },
555
+ "50321": {
556
+ "content": "[unused36]",
557
+ "lstrip": false,
558
+ "normalized": true,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": false
562
+ },
563
+ "50322": {
564
+ "content": "[unused37]",
565
+ "lstrip": false,
566
+ "normalized": true,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": false
570
+ },
571
+ "50323": {
572
+ "content": "[unused38]",
573
+ "lstrip": false,
574
+ "normalized": true,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": false
578
+ },
579
+ "50324": {
580
+ "content": "[unused39]",
581
+ "lstrip": false,
582
+ "normalized": true,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": false
586
+ },
587
+ "50325": {
588
+ "content": "[unused40]",
589
+ "lstrip": false,
590
+ "normalized": true,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": false
594
+ },
595
+ "50326": {
596
+ "content": "[unused41]",
597
+ "lstrip": false,
598
+ "normalized": true,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": false
602
+ },
603
+ "50327": {
604
+ "content": "[unused42]",
605
+ "lstrip": false,
606
+ "normalized": true,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": false
610
+ },
611
+ "50328": {
612
+ "content": "[unused43]",
613
+ "lstrip": false,
614
+ "normalized": true,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": false
618
+ },
619
+ "50329": {
620
+ "content": "[unused44]",
621
+ "lstrip": false,
622
+ "normalized": true,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": false
626
+ },
627
+ "50330": {
628
+ "content": "[unused45]",
629
+ "lstrip": false,
630
+ "normalized": true,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": false
634
+ },
635
+ "50331": {
636
+ "content": "[unused46]",
637
+ "lstrip": false,
638
+ "normalized": true,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": false
642
+ },
643
+ "50332": {
644
+ "content": "[unused47]",
645
+ "lstrip": false,
646
+ "normalized": true,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": false
650
+ },
651
+ "50333": {
652
+ "content": "[unused48]",
653
+ "lstrip": false,
654
+ "normalized": true,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": false
658
+ },
659
+ "50334": {
660
+ "content": "[unused49]",
661
+ "lstrip": false,
662
+ "normalized": true,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": false
666
+ },
667
+ "50335": {
668
+ "content": "[unused50]",
669
+ "lstrip": false,
670
+ "normalized": true,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": false
674
+ },
675
+ "50336": {
676
+ "content": "[unused51]",
677
+ "lstrip": false,
678
+ "normalized": true,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": false
682
+ },
683
+ "50337": {
684
+ "content": "[unused52]",
685
+ "lstrip": false,
686
+ "normalized": true,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": false
690
+ },
691
+ "50338": {
692
+ "content": "[unused53]",
693
+ "lstrip": false,
694
+ "normalized": true,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": false
698
+ },
699
+ "50339": {
700
+ "content": "[unused54]",
701
+ "lstrip": false,
702
+ "normalized": true,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": false
706
+ },
707
+ "50340": {
708
+ "content": "[unused55]",
709
+ "lstrip": false,
710
+ "normalized": true,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": false
714
+ },
715
+ "50341": {
716
+ "content": "[unused56]",
717
+ "lstrip": false,
718
+ "normalized": true,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": false
722
+ },
723
+ "50342": {
724
+ "content": "[unused57]",
725
+ "lstrip": false,
726
+ "normalized": true,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": false
730
+ },
731
+ "50343": {
732
+ "content": "[unused58]",
733
+ "lstrip": false,
734
+ "normalized": true,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": false
738
+ },
739
+ "50344": {
740
+ "content": "[unused59]",
741
+ "lstrip": false,
742
+ "normalized": true,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": false
746
+ },
747
+ "50345": {
748
+ "content": "[unused60]",
749
+ "lstrip": false,
750
+ "normalized": true,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": false
754
+ },
755
+ "50346": {
756
+ "content": "[unused61]",
757
+ "lstrip": false,
758
+ "normalized": true,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": false
762
+ },
763
+ "50347": {
764
+ "content": "[unused62]",
765
+ "lstrip": false,
766
+ "normalized": true,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": false
770
+ },
771
+ "50348": {
772
+ "content": "[unused63]",
773
+ "lstrip": false,
774
+ "normalized": true,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": false
778
+ },
779
+ "50349": {
780
+ "content": "[unused64]",
781
+ "lstrip": false,
782
+ "normalized": true,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": false
786
+ },
787
+ "50350": {
788
+ "content": "[unused65]",
789
+ "lstrip": false,
790
+ "normalized": true,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": false
794
+ },
795
+ "50351": {
796
+ "content": "[unused66]",
797
+ "lstrip": false,
798
+ "normalized": true,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": false
802
+ },
803
+ "50352": {
804
+ "content": "[unused67]",
805
+ "lstrip": false,
806
+ "normalized": true,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": false
810
+ },
811
+ "50353": {
812
+ "content": "[unused68]",
813
+ "lstrip": false,
814
+ "normalized": true,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": false
818
+ },
819
+ "50354": {
820
+ "content": "[unused69]",
821
+ "lstrip": false,
822
+ "normalized": true,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": false
826
+ },
827
+ "50355": {
828
+ "content": "[unused70]",
829
+ "lstrip": false,
830
+ "normalized": true,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": false
834
+ },
835
+ "50356": {
836
+ "content": "[unused71]",
837
+ "lstrip": false,
838
+ "normalized": true,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": false
842
+ },
843
+ "50357": {
844
+ "content": "[unused72]",
845
+ "lstrip": false,
846
+ "normalized": true,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": false
850
+ },
851
+ "50358": {
852
+ "content": "[unused73]",
853
+ "lstrip": false,
854
+ "normalized": true,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": false
858
+ },
859
+ "50359": {
860
+ "content": "[unused74]",
861
+ "lstrip": false,
862
+ "normalized": true,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": false
866
+ },
867
+ "50360": {
868
+ "content": "[unused75]",
869
+ "lstrip": false,
870
+ "normalized": true,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": false
874
+ },
875
+ "50361": {
876
+ "content": "[unused76]",
877
+ "lstrip": false,
878
+ "normalized": true,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": false
882
+ },
883
+ "50362": {
884
+ "content": "[unused77]",
885
+ "lstrip": false,
886
+ "normalized": true,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": false
890
+ },
891
+ "50363": {
892
+ "content": "[unused78]",
893
+ "lstrip": false,
894
+ "normalized": true,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": false
898
+ },
899
+ "50364": {
900
+ "content": "[unused79]",
901
+ "lstrip": false,
902
+ "normalized": true,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": false
906
+ },
907
+ "50365": {
908
+ "content": "[unused80]",
909
+ "lstrip": false,
910
+ "normalized": true,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": false
914
+ },
915
+ "50366": {
916
+ "content": "[unused81]",
917
+ "lstrip": false,
918
+ "normalized": true,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": false
922
+ },
923
+ "50367": {
924
+ "content": "[unused82]",
925
+ "lstrip": false,
926
+ "normalized": true,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": false
930
+ },
931
+ "50368": {
932
+ "content": "[Q] ",
933
+ "lstrip": false,
934
+ "normalized": true,
935
+ "rstrip": false,
936
+ "single_word": false,
937
+ "special": false
938
+ },
939
+ "50369": {
940
+ "content": "[D] ",
941
+ "lstrip": false,
942
+ "normalized": true,
943
+ "rstrip": false,
944
+ "single_word": false,
945
+ "special": false
946
+ }
947
+ },
948
+ "clean_up_tokenization_spaces": true,
949
+ "cls_token": "[CLS]",
950
+ "extra_special_tokens": {},
951
+ "mask_token": "[MASK]",
952
+ "max_length": 299,
953
+ "model_input_names": [
954
+ "input_ids",
955
+ "attention_mask"
956
+ ],
957
+ "model_max_length": 127,
958
+ "pad_to_multiple_of": null,
959
+ "pad_token": "[MASK]",
960
+ "pad_token_type_id": 0,
961
+ "padding_side": "right",
962
+ "sep_token": "[SEP]",
963
+ "stride": 0,
964
+ "tokenizer_class": "PreTrainedTokenizerFast",
965
+ "truncation_side": "right",
966
+ "truncation_strategy": "longest_first",
967
+ "unk_token": "[UNK]"
968
+ }
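Finally, the two entries appended after the reserved `[unusedN]` range, `[Q] ` (id 50368) and `[D] ` (id 50369), are the query and document markers referenced by `config_sentence_transformers.json`. A minimal check (local checkout assumed):

```python
# Sketch: look up the marker tokens defined at the end of tokenizer_config.json.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(".")
print(tokenizer.convert_tokens_to_ids("[Q] "))  # 50368
print(tokenizer.convert_tokens_to_ids("[D] "))  # 50369
```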