tomaarsen (HF staff) committed
Commit daebe35 · verified · 1 parent: 75d2ed8

Add new CrossEncoder model

Files changed (6)
  1. README.md +431 -0
  2. config.json +53 -0
  3. model.safetensors +3 -0
  4. special_tokens_map.json +37 -0
  5. tokenizer.json +0 -0
  6. tokenizer_config.json +945 -0
README.md ADDED
@@ -0,0 +1,431 @@
1
+ ---
2
+ language:
3
+ - en
4
+ tags:
5
+ - sentence-transformers
6
+ - cross-encoder
7
+ - text-classification
8
+ - generated_from_trainer
9
+ - dataset_size:522240
10
+ - loss:BinaryCrossEntropyLoss
11
+ base_model: answerdotai/ModernBERT-base
12
+ datasets:
13
+ - sentence-transformers/natural-questions
14
+ pipeline_tag: text-classification
15
+ library_name: sentence-transformers
16
+ metrics:
17
+ - map
18
+ - mrr@10
19
+ - ndcg@10
20
+ model-index:
21
+ - name: CrossEncoder based on answerdotai/ModernBERT-base
22
+ results: []
23
+ ---
24
+
25
+ # CrossEncoder based on answerdotai/ModernBERT-base
26
+
27
+ This is a [Cross Encoder](https://www.sbert.net/docs/cross_encoder/usage/usage.html) model finetuned from [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) using the [sentence-transformers](https://www.SBERT.net) library. It computes scores for pairs of texts, which can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
28
+
29
+ ## Model Details
30
+
31
+ ### Model Description
32
+ - **Model Type:** Cross Encoder
33
+ - **Base model:** [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) <!-- at revision 8949b909ec900327062f0ebf497f51aef5e6f0c8 -->
34
+ - **Maximum Sequence Length:** 8192 tokens
35
+ - **Number of Output Labels:** 1 label
36
+ <!-- - **Training Dataset:** Unknown -->
37
+ - **Language:** en
38
+ <!-- - **License:** Unknown -->
39
+
40
+ ### Model Sources
41
+
42
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
43
+ - **Documentation:** [Cross Encoder Documentation](https://www.sbert.net/docs/cross_encoder/usage/usage.html)
44
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
45
+ - **Hugging Face:** [Cross Encoders on Hugging Face](https://huggingface.co/models?library=sentence-transformers&other=cross-encoder)
46
+
47
+ ## Usage
48
+
49
+ ### Direct Usage (Sentence Transformers)
50
+
51
+ First install the Sentence Transformers library:
52
+
53
+ ```bash
54
+ pip install -U sentence-transformers
55
+ ```
56
+
57
+ Then you can load this model and run inference.
58
+ ```python
59
+ from sentence_transformers import CrossEncoder
60
+
61
+ # Download from the 🤗 Hub
62
+ model = CrossEncoder("tomaarsen/reranker-ModernBERT-base-nq-bce-static-retriever")
63
+ # Get scores for pairs of texts
64
+ pairs = [
65
+ ['difference between russian blue and british blue cat', 'Russian Blue The coat is known as a "double coat", with the undercoat being soft, downy and equal in length to the guard hairs, which are an even blue with silver tips. However, the tail may have a few very dull, almost unnoticeable stripes. The coat is described as thick, plush and soft to the touch. The feeling is softer than the softest silk. The silver tips give the coat a shimmering appearance. Its eyes are almost always a dark and vivid green. Any white patches of fur or yellow eyes in adulthood are seen as flaws in show cats.[3] Russian Blues should not be confused with British Blues (which are not a distinct breed, but rather a British Shorthair with a blue coat as the British Shorthair breed itself comes in a wide variety of colors and patterns), nor the Chartreux or Korat which are two other naturally occurring breeds of blue cats, although they have similar traits.'],
66
+ ['who played the little girl on mrs doubtfire', 'Mara Wilson Mara Elizabeth Wilson[2] (born July 24, 1987) is an American writer and former child actress. She is known for playing Natalie Hillard in Mrs. Doubtfire (1993), Susan Walker in Miracle on 34th Street (1994), Matilda Wormwood in Matilda (1996) and Lily Stone in Thomas and the Magic Railroad (2000). Since retiring from film acting, Wilson has focused on writing.'],
67
+ ['what year did the movie the sound of music come out', 'The Sound of Music (film) The film was released on March 2, 1965 in the United States, initially as a limited roadshow theatrical release. Although critical response to the film was widely mixed, the film was a major commercial success, becoming the number one box office movie after four weeks, and the highest-grossing film of 1965. By November 1966, The Sound of Music had become the highest-grossing film of all-time—surpassing Gone with the Wind—and held that distinction for five years. The film was just as popular throughout the world, breaking previous box-office records in twenty-nine countries. Following an initial theatrical release that lasted four and a half years, and two successful re-releases, the film sold 283 million admissions worldwide and earned a total worldwide gross of $286,000,000.'],
68
+ ['where was the movie dawn of the dead filmed', 'Dawn of the Dead (2004 film) The mall scenes and rooftop scenes were shot in the former Thornhill Square Shopping Centre in Thornhill, Ontario, and the other scenes were shot in the Aileen-Willowbrook neighborhood of Thornhill. The set for Ana and Luis\'s bedroom was constructed in a back room of the mall.[7] The mall was defunct, which is the reason the production used it; the movie crew completely renovated the structure, and stocked it with fictitious stores after Starbucks and numerous other corporations refused to let their names be used[7] (two exceptions to this are Roots and Panasonic). Most of the mall was demolished shortly after the film was shot. The fictitious stores include a coffee shop called Hallowed Grounds (a lyric from Johnny Cash\'s song "The Man Comes Around", which was used over the opening credits), and an upscale department store called Gaylen Ross (an in-joke reference to one of the stars of the original 1978 film).'],
69
+ ['where is the 2018 nba draft being held', "2018 NBA draft The 2018 NBA draft was held on June 21, 2018, at Barclays Center in Brooklyn, New York. National Basketball Association (NBA) teams took turns selecting amateur United States college basketball players and other eligible players, including international players. It was televised nationally by ESPN. This draft was the last to use the original weighted lottery system that gives teams near the bottom of the NBA draft better odds at the top three picks of the draft while teams higher up had worse odds in the process; the rule was agreed upon by the NBA on September 28, 2017, but would not be implemented until the 2019 draft.[2] With the last year of what was, at the time, the most recent lottery system (with the NBA draft lottery being held in Chicago instead of in New York), the Phoenix Suns won the first overall pick on May 15, 2018, with the Sacramento Kings at the second overall pick and the Atlanta Hawks at third overall pick.[3] The Suns' selection is their first No. 1 overall selection in franchise history. They would use that selection on the Bahamian center DeAndre Ayton from the nearby University of Arizona."],
70
+ ]
71
+ scores = model.predict(pairs)
72
+ print(scores.shape)
73
+ # (5,)
74
+
75
+ # Or rank different texts based on similarity to a single text
76
+ ranks = model.rank(
77
+ 'difference between russian blue and british blue cat',
78
+ [
79
+ 'Russian Blue The coat is known as a "double coat", with the undercoat being soft, downy and equal in length to the guard hairs, which are an even blue with silver tips. However, the tail may have a few very dull, almost unnoticeable stripes. The coat is described as thick, plush and soft to the touch. The feeling is softer than the softest silk. The silver tips give the coat a shimmering appearance. Its eyes are almost always a dark and vivid green. Any white patches of fur or yellow eyes in adulthood are seen as flaws in show cats.[3] Russian Blues should not be confused with British Blues (which are not a distinct breed, but rather a British Shorthair with a blue coat as the British Shorthair breed itself comes in a wide variety of colors and patterns), nor the Chartreux or Korat which are two other naturally occurring breeds of blue cats, although they have similar traits.',
80
+ 'Mara Wilson Mara Elizabeth Wilson[2] (born July 24, 1987) is an American writer and former child actress. She is known for playing Natalie Hillard in Mrs. Doubtfire (1993), Susan Walker in Miracle on 34th Street (1994), Matilda Wormwood in Matilda (1996) and Lily Stone in Thomas and the Magic Railroad (2000). Since retiring from film acting, Wilson has focused on writing.',
81
+ 'The Sound of Music (film) The film was released on March 2, 1965 in the United States, initially as a limited roadshow theatrical release. Although critical response to the film was widely mixed, the film was a major commercial success, becoming the number one box office movie after four weeks, and the highest-grossing film of 1965. By November 1966, The Sound of Music had become the highest-grossing film of all-time—surpassing Gone with the Wind—and held that distinction for five years. The film was just as popular throughout the world, breaking previous box-office records in twenty-nine countries. Following an initial theatrical release that lasted four and a half years, and two successful re-releases, the film sold 283 million admissions worldwide and earned a total worldwide gross of $286,000,000.',
82
+ 'Dawn of the Dead (2004 film) The mall scenes and rooftop scenes were shot in the former Thornhill Square Shopping Centre in Thornhill, Ontario, and the other scenes were shot in the Aileen-Willowbrook neighborhood of Thornhill. The set for Ana and Luis\'s bedroom was constructed in a back room of the mall.[7] The mall was defunct, which is the reason the production used it; the movie crew completely renovated the structure, and stocked it with fictitious stores after Starbucks and numerous other corporations refused to let their names be used[7] (two exceptions to this are Roots and Panasonic). Most of the mall was demolished shortly after the film was shot. The fictitious stores include a coffee shop called Hallowed Grounds (a lyric from Johnny Cash\'s song "The Man Comes Around", which was used over the opening credits), and an upscale department store called Gaylen Ross (an in-joke reference to one of the stars of the original 1978 film).',
83
+ "2018 NBA draft The 2018 NBA draft was held on June 21, 2018, at Barclays Center in Brooklyn, New York. National Basketball Association (NBA) teams took turns selecting amateur United States college basketball players and other eligible players, including international players. It was televised nationally by ESPN. This draft was the last to use the original weighted lottery system that gives teams near the bottom of the NBA draft better odds at the top three picks of the draft while teams higher up had worse odds in the process; the rule was agreed upon by the NBA on September 28, 2017, but would not be implemented until the 2019 draft.[2] With the last year of what was, at the time, the most recent lottery system (with the NBA draft lottery being held in Chicago instead of in New York), the Phoenix Suns won the first overall pick on May 15, 2018, with the Sacramento Kings at the second overall pick and the Atlanta Hawks at third overall pick.[3] The Suns' selection is their first No. 1 overall selection in franchise history. They would use that selection on the Bahamian center DeAndre Ayton from the nearby University of Arizona.",
84
+ ]
85
+ )
86
+ # [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]
87
+ ```
88
+
89
+ <!--
90
+ ### Direct Usage (Transformers)
91
+
92
+ <details><summary>Click to see the direct usage in Transformers</summary>
93
+
94
+ </details>
95
+ -->
96
+
97
+ <!--
98
+ ### Downstream Usage (Sentence Transformers)
99
+
100
+ You can finetune this model on your own dataset.
101
+
102
+ <details><summary>Click to expand</summary>
103
+
104
+ </details>
105
+ -->
106
+
107
+ <!--
108
+ ### Out-of-Scope Use
109
+
110
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
111
+ -->
112
+
113
+ ## Evaluation
114
+
115
+ ### Metrics
116
+
117
+ #### Cross Encoder Reranking
118
+
119
+ * Datasets: `nq-dev`, `NanoMSMARCO`, `NanoNFCorpus` and `NanoNQ`
120
+ * Evaluated with [<code>CERerankingEvaluator</code>](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CERerankingEvaluator)
121
+
122
+ | Metric | nq-dev | NanoMSMARCO | NanoNFCorpus | NanoNQ |
123
+ |:------------|:---------------------|:---------------------|:---------------------|:---------------------|
124
+ | map | 0.7173 (+0.2201) | 0.5918 (+0.1023) | 0.3481 (+0.0777) | 0.6627 (+0.2420) |
125
+ | mrr@10 | 0.7152 (+0.2280) | 0.5830 (+0.1055) | 0.5808 (+0.0810) | 0.6765 (+0.2498) |
126
+ | **ndcg@10** | **0.7762 (+0.2164)** | **0.6454 (+0.1049)** | **0.4131 (+0.0880)** | **0.7091 (+0.2084)** |
127
+
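
The reranking numbers above were produced with the `CERerankingEvaluator` linked above, which scores each query against its relevant and irrelevant passages and reports MAP, MRR@10, and NDCG@10. As a minimal, hedged sketch (the sample format follows the Sentence Transformers documentation; exact keyword names may differ between releases), an equivalent evaluation on your own data could look like this:

```python
from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder.evaluation import CERerankingEvaluator

model = CrossEncoder("tomaarsen/reranker-ModernBERT-base-nq-bce-static-retriever")

# One sample per query: its relevant passage(s) plus some irrelevant ones.
samples = [
    {
        "query": "difference between russian blue and british blue cat",
        "positive": ['Russian Blue The coat is known as a "double coat" ...'],
        "negative": [
            "Mara Wilson Mara Elizabeth Wilson ...",
            "The Sound of Music (film) The film was released on March 2, 1965 ...",
        ],
    },
    # ... more samples
]

evaluator = CERerankingEvaluator(samples, name="nq-dev")
# Depending on the library version this returns either a primary score
# or a dict of metrics such as MAP, MRR@10 and NDCG@10.
print(evaluator(model))
```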
128
+ #### Cross Encoder Nano BEIR
129
+
130
+ * Dataset: `NanoBEIR_mean`
131
+ * Evaluated with [<code>CENanoBEIREvaluator</code>](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CENanoBEIREvaluator)
132
+
133
+ | Metric | Value |
134
+ |:------------|:---------------------|
135
+ | map | 0.5342 (+0.1407) |
136
+ | mrr@10 | 0.6135 (+0.1454) |
137
+ | **ndcg@10** | **0.5892 (+0.1338)** |
138
+
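
The NanoBEIR mean is computed with the `CENanoBEIREvaluator` linked above over the three Nano datasets from the previous section. A minimal sketch, assuming the evaluator accepts `dataset_names` like its bi-encoder counterpart (the datasets are fetched from the Hugging Face Hub on first use):

```python
from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder.evaluation import CENanoBEIREvaluator

model = CrossEncoder("tomaarsen/reranker-ModernBERT-base-nq-bce-static-retriever")

# NanoMSMARCO, NanoNFCorpus and NanoNQ, plus their mean, as reported above.
evaluator = CENanoBEIREvaluator(dataset_names=["msmarco", "nfcorpus", "nq"])
results = evaluator(model)
print(results)  # per-dataset and mean MAP, MRR@10, NDCG@10
```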
139
+ <!--
140
+ ## Bias, Risks and Limitations
141
+
142
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
143
+ -->
144
+
145
+ <!--
146
+ ### Recommendations
147
+
148
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
149
+ -->
150
+
151
+ ## Training Details
152
+
153
+ ### Training Dataset
154
+
155
+ #### Unnamed Dataset
156
+
157
+ * Size: 522,240 training samples
158
+ * Columns: <code>query</code>, <code>response</code>, and <code>label</code>
159
+ * Approximate statistics based on the first 1000 samples:
160
+ | | query | response | label |
161
+ |:--------|:-----------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|:-----------------------------|
162
+ | type | string | string | int |
163
+ | details | <ul><li>min: 27 characters</li><li>mean: 47.59 characters</li><li>max: 99 characters</li></ul> | <ul><li>min: 87 characters</li><li>mean: 606.97 characters</li><li>max: 2412 characters</li></ul> | <ul><li>1: 100.00%</li></ul> |
164
+ * Samples:
165
+ | query | response | label |
166
+ |:----------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------|
167
+ | <code>who said my enemy's enemy is my friend</code> | <code>The enemy of my enemy is my friend The enemy of my enemy is my friend is an ancient proverb which suggests that two opposing parties can or should work together against a common enemy. The earliest known expression of this concept is found in a Sanskrit treatise on statecraft, the Arthashastra, which dates to around the 4th century BC, while the first recorded use of the current English version came in 1884.[1][2]</code> | <code>1</code> |
168
+ | <code>when does the nba season start and end</code> | <code>2017–18 NBA season The 2017–18 NBA season is the 72nd season of the National Basketball Association (NBA). The regular season began on October 17, 2017, earlier than previous seasons to reduce the number of "back-to-back" games teams are scheduled to play,[1] with the 2017 Eastern Conference champion (and Finals runner–up) Cleveland Cavaliers hosting a game against the Boston Celtics at Quicken Loans Arena in Cleveland, Ohio[2] Christmas games were played on December 25, 2017. The 2018 NBA All-Star Game was played on February 18, 2018, at the Staples Center in Los Angeles, California. LeBron James of the Cleveland Cavaliers was named the All-Star Game Most Valuable Player. The regular season will end on April 11, 2018 and the playoffs will begin on April 14, 2018.[3]</code> | <code>1</code> |
169
+ | <code>what is the basis of supreme court decisions</code> | <code>Supreme court A supreme court is the highest court within the hierarchy of courts in many legal jurisdictions. Other descriptions for such courts include court of last resort, apex court, and highest (or final) court of appeal. Broadly speaking, the decisions of a supreme court are not subject to further review by any other court. Supreme courts typically function primarily as appellate courts, hearing appeals from decisions of lower trial courts, or from intermediate-level appellate courts.[1]</code> | <code>1</code> |
170
+ * Loss: [<code>BinaryCrossEntropyLoss</code>](https://sbert.net/docs/package_reference/cross_encoder/losses.html#binarycrossentropyloss) with these parameters:
171
+ ```json
172
+ {
173
+ "activation_fct": "torch.nn.modules.linear.Identity",
174
+ "pos_weight": 5
175
+ }
176
+ ```
177
+
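
The `activation_fct` / `pos_weight` values above mean the raw logit is passed straight into a BCE-with-logits objective, with positive (query, relevant passage) pairs weighted 5x relative to negatives. A minimal sketch of constructing the same loss, assuming the current `sentence_transformers.cross_encoder.losses` API (with `pos_weight` given as a tensor):

```python
import torch
from torch import nn
from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder.losses import BinaryCrossEntropyLoss

# Start from the base model with a single-logit classification head.
model = CrossEncoder("answerdotai/ModernBERT-base", num_labels=1)

# Identity activation: the logit goes directly into BCE-with-logits;
# pos_weight=5 upweights the positive pairs.
loss = BinaryCrossEntropyLoss(
    model,
    activation_fct=nn.Identity(),
    pos_weight=torch.tensor(5.0),
)
```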
178
+ ### Evaluation Dataset
179
+
180
+ #### natural-questions
181
+
182
+ * Dataset: [natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
183
+ * Size: 100,231 evaluation samples
184
+ * Columns: <code>query</code>, <code>response</code>, and <code>label</code>
185
+ * Approximate statistics based on the first 1000 samples:
186
+ | | query | response | label |
187
+ |:--------|:-----------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|:-----------------------------|
188
+ | type | string | string | int |
189
+ | details | <ul><li>min: 27 characters</li><li>mean: 47.03 characters</li><li>max: 96 characters</li></ul> | <ul><li>min: 26 characters</li><li>mean: 608.17 characters</li><li>max: 2639 characters</li></ul> | <ul><li>1: 100.00%</li></ul> |
190
+ * Samples:
191
+ | query | response | label |
192
+ |:------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------|
193
+ | <code>difference between russian blue and british blue cat</code> | <code>Russian Blue The coat is known as a "double coat", with the undercoat being soft, downy and equal in length to the guard hairs, which are an even blue with silver tips. However, the tail may have a few very dull, almost unnoticeable stripes. The coat is described as thick, plush and soft to the touch. The feeling is softer than the softest silk. The silver tips give the coat a shimmering appearance. Its eyes are almost always a dark and vivid green. Any white patches of fur or yellow eyes in adulthood are seen as flaws in show cats.[3] Russian Blues should not be confused with British Blues (which are not a distinct breed, but rather a British Shorthair with a blue coat as the British Shorthair breed itself comes in a wide variety of colors and patterns), nor the Chartreux or Korat which are two other naturally occurring breeds of blue cats, although they have similar traits.</code> | <code>1</code> |
194
+ | <code>who played the little girl on mrs doubtfire</code> | <code>Mara Wilson Mara Elizabeth Wilson[2] (born July 24, 1987) is an American writer and former child actress. She is known for playing Natalie Hillard in Mrs. Doubtfire (1993), Susan Walker in Miracle on 34th Street (1994), Matilda Wormwood in Matilda (1996) and Lily Stone in Thomas and the Magic Railroad (2000). Since retiring from film acting, Wilson has focused on writing.</code> | <code>1</code> |
195
+ | <code>what year did the movie the sound of music come out</code> | <code>The Sound of Music (film) The film was released on March 2, 1965 in the United States, initially as a limited roadshow theatrical release. Although critical response to the film was widely mixed, the film was a major commercial success, becoming the number one box office movie after four weeks, and the highest-grossing film of 1965. By November 1966, The Sound of Music had become the highest-grossing film of all-time—surpassing Gone with the Wind—and held that distinction for five years. The film was just as popular throughout the world, breaking previous box-office records in twenty-nine countries. Following an initial theatrical release that lasted four and a half years, and two successful re-releases, the film sold 283 million admissions worldwide and earned a total worldwide gross of $286,000,000.</code> | <code>1</code> |
196
+ * Loss: [<code>BinaryCrossEntropyLoss</code>](https://sbert.net/docs/package_reference/cross_encoder/losses.html#binarycrossentropyloss) with these parameters:
197
+ ```json
198
+ {
199
+ "activation_fct": "torch.nn.modules.linear.Identity",
200
+ "pos_weight": 5
201
+ }
202
+ ```
203
+
204
+ ### Training Hyperparameters
205
+ #### Non-Default Hyperparameters
206
+
207
+ - `eval_strategy`: steps
208
+ - `per_device_train_batch_size`: 64
209
+ - `per_device_eval_batch_size`: 64
210
+ - `learning_rate`: 2e-05
211
+ - `num_train_epochs`: 1
212
+ - `warmup_ratio`: 0.1
213
+ - `seed`: 12
214
+ - `bf16`: True
215
+ - `dataloader_num_workers`: 4
216
+ - `load_best_model_at_end`: True
217
+ - `batch_sampler`: no_duplicates
218
+
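
A minimal sketch of how these non-default values map onto a training run, assuming the `CrossEncoderTrainer` / `CrossEncoderTrainingArguments` API from the current Sentence Transformers documentation and reusing the `model` and `loss` from the loss sketch above; `train_dataset`, `eval_dataset` and `evaluator` stand for the dataset and evaluator objects described earlier, and the output directory is hypothetical:

```python
from sentence_transformers.cross_encoder import (
    CrossEncoderTrainer,
    CrossEncoderTrainingArguments,
)

args = CrossEncoderTrainingArguments(
    output_dir="reranker-ModernBERT-base-nq-bce",  # hypothetical output directory
    num_train_epochs=1,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    learning_rate=2e-5,
    warmup_ratio=0.1,
    seed=12,
    bf16=True,
    dataloader_num_workers=4,
    eval_strategy="steps",
    eval_steps=1000,   # cadence inferred from the evaluation rows in the training logs below
    save_steps=1000,   # kept in step with eval_steps so the best checkpoint can be reloaded
    load_best_model_at_end=True,
    batch_sampler="no_duplicates",
)

trainer = CrossEncoderTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
    evaluator=evaluator,
)
trainer.train()
```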
219
+ #### All Hyperparameters
220
+ <details><summary>Click to expand</summary>
221
+
222
+ - `overwrite_output_dir`: False
223
+ - `do_predict`: False
224
+ - `eval_strategy`: steps
225
+ - `prediction_loss_only`: True
226
+ - `per_device_train_batch_size`: 64
227
+ - `per_device_eval_batch_size`: 64
228
+ - `per_gpu_train_batch_size`: None
229
+ - `per_gpu_eval_batch_size`: None
230
+ - `gradient_accumulation_steps`: 1
231
+ - `eval_accumulation_steps`: None
232
+ - `torch_empty_cache_steps`: None
233
+ - `learning_rate`: 2e-05
234
+ - `weight_decay`: 0.0
235
+ - `adam_beta1`: 0.9
236
+ - `adam_beta2`: 0.999
237
+ - `adam_epsilon`: 1e-08
238
+ - `max_grad_norm`: 1.0
239
+ - `num_train_epochs`: 1
240
+ - `max_steps`: -1
241
+ - `lr_scheduler_type`: linear
242
+ - `lr_scheduler_kwargs`: {}
243
+ - `warmup_ratio`: 0.1
244
+ - `warmup_steps`: 0
245
+ - `log_level`: passive
246
+ - `log_level_replica`: warning
247
+ - `log_on_each_node`: True
248
+ - `logging_nan_inf_filter`: True
249
+ - `save_safetensors`: True
250
+ - `save_on_each_node`: False
251
+ - `save_only_model`: False
252
+ - `restore_callback_states_from_checkpoint`: False
253
+ - `no_cuda`: False
254
+ - `use_cpu`: False
255
+ - `use_mps_device`: False
256
+ - `seed`: 12
257
+ - `data_seed`: None
258
+ - `jit_mode_eval`: False
259
+ - `use_ipex`: False
260
+ - `bf16`: True
261
+ - `fp16`: False
262
+ - `fp16_opt_level`: O1
263
+ - `half_precision_backend`: auto
264
+ - `bf16_full_eval`: False
265
+ - `fp16_full_eval`: False
266
+ - `tf32`: None
267
+ - `local_rank`: 0
268
+ - `ddp_backend`: None
269
+ - `tpu_num_cores`: None
270
+ - `tpu_metrics_debug`: False
271
+ - `debug`: []
272
+ - `dataloader_drop_last`: False
273
+ - `dataloader_num_workers`: 4
274
+ - `dataloader_prefetch_factor`: None
275
+ - `past_index`: -1
276
+ - `disable_tqdm`: False
277
+ - `remove_unused_columns`: True
278
+ - `label_names`: None
279
+ - `load_best_model_at_end`: True
280
+ - `ignore_data_skip`: False
281
+ - `fsdp`: []
282
+ - `fsdp_min_num_params`: 0
283
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
284
+ - `fsdp_transformer_layer_cls_to_wrap`: None
285
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
286
+ - `deepspeed`: None
287
+ - `label_smoothing_factor`: 0.0
288
+ - `optim`: adamw_torch
289
+ - `optim_args`: None
290
+ - `adafactor`: False
291
+ - `group_by_length`: False
292
+ - `length_column_name`: length
293
+ - `ddp_find_unused_parameters`: None
294
+ - `ddp_bucket_cap_mb`: None
295
+ - `ddp_broadcast_buffers`: False
296
+ - `dataloader_pin_memory`: True
297
+ - `dataloader_persistent_workers`: False
298
+ - `skip_memory_metrics`: True
299
+ - `use_legacy_prediction_loop`: False
300
+ - `push_to_hub`: False
301
+ - `resume_from_checkpoint`: None
302
+ - `hub_model_id`: None
303
+ - `hub_strategy`: every_save
304
+ - `hub_private_repo`: None
305
+ - `hub_always_push`: False
306
+ - `gradient_checkpointing`: False
307
+ - `gradient_checkpointing_kwargs`: None
308
+ - `include_inputs_for_metrics`: False
309
+ - `include_for_metrics`: []
310
+ - `eval_do_concat_batches`: True
311
+ - `fp16_backend`: auto
312
+ - `push_to_hub_model_id`: None
313
+ - `push_to_hub_organization`: None
314
+ - `mp_parameters`:
315
+ - `auto_find_batch_size`: False
316
+ - `full_determinism`: False
317
+ - `torchdynamo`: None
318
+ - `ray_scope`: last
319
+ - `ddp_timeout`: 1800
320
+ - `torch_compile`: False
321
+ - `torch_compile_backend`: None
322
+ - `torch_compile_mode`: None
323
+ - `dispatch_batches`: None
324
+ - `split_batches`: None
325
+ - `include_tokens_per_second`: False
326
+ - `include_num_input_tokens_seen`: False
327
+ - `neftune_noise_alpha`: None
328
+ - `optim_target_modules`: None
329
+ - `batch_eval_metrics`: False
330
+ - `eval_on_start`: False
331
+ - `use_liger_kernel`: False
332
+ - `eval_use_gather_object`: False
333
+ - `average_tokens_across_devices`: False
334
+ - `prompts`: None
335
+ - `batch_sampler`: no_duplicates
336
+ - `multi_dataset_batch_sampler`: proportional
337
+
338
+ </details>
339
+
340
+ ### Training Logs
341
+ | Epoch | Step | Training Loss | Validation Loss | nq-dev_ndcg@10 | NanoMSMARCO_ndcg@10 | NanoNFCorpus_ndcg@10 | NanoNQ_ndcg@10 | NanoBEIR_mean_ndcg@10 |
342
+ |:----------:|:--------:|:-------------:|:---------------:|:--------------------:|:--------------------:|:--------------------:|:--------------------:|:---------------------:|
343
+ | -1 | -1 | - | - | 0.1601 (-0.3997) | 0.0058 (-0.5346) | 0.2751 (-0.0500) | 0.0181 (-0.4826) | 0.0996 (-0.3557) |
344
+ | 0.0001 | 1 | 1.4341 | - | - | - | - | - | - |
345
+ | 0.0245 | 200 | 1.151 | - | - | - | - | - | - |
346
+ | 0.0490 | 400 | 0.9215 | - | - | - | - | - | - |
347
+ | 0.0735 | 600 | 0.3631 | - | - | - | - | - | - |
348
+ | 0.0980 | 800 | 0.2496 | - | - | - | - | - | - |
349
+ | 0.1225 | 1000 | 0.2019 | 0.9056 | 0.7381 (+0.1783) | 0.5730 (+0.0326) | 0.3626 (+0.0376) | 0.6462 (+0.1456) | 0.5273 (+0.0719) |
350
+ | 0.1471 | 1200 | 0.1735 | - | - | - | - | - | - |
351
+ | 0.1716 | 1400 | 0.1771 | - | - | - | - | - | - |
352
+ | 0.1961 | 1600 | 0.1619 | - | - | - | - | - | - |
353
+ | 0.2206 | 1800 | 0.127 | - | - | - | - | - | - |
354
+ | 0.2451 | 2000 | 0.1351 | 2.1863 | 0.7491 (+0.1892) | 0.6261 (+0.0856) | 0.3637 (+0.0386) | 0.6990 (+0.1984) | 0.5629 (+0.1076) |
355
+ | 0.2696 | 2200 | 0.131 | - | - | - | - | - | - |
356
+ | 0.2941 | 2400 | 0.1382 | - | - | - | - | - | - |
357
+ | 0.3186 | 2600 | 0.1322 | - | - | - | - | - | - |
358
+ | 0.3431 | 2800 | 0.1129 | - | - | - | - | - | - |
359
+ | 0.3676 | 3000 | 0.1297 | 1.3222 | 0.7722 (+0.2124) | 0.6252 (+0.0847) | 0.3888 (+0.0638) | 0.6977 (+0.1971) | 0.5706 (+0.1152) |
360
+ | 0.3922 | 3200 | 0.1137 | - | - | - | - | - | - |
361
+ | 0.4167 | 3400 | 0.115 | - | - | - | - | - | - |
362
+ | 0.4412 | 3600 | 0.108 | - | - | - | - | - | - |
363
+ | 0.4657 | 3800 | 0.1066 | - | - | - | - | - | - |
364
+ | 0.4902 | 4000 | 0.1036 | 2.0361 | 0.7707 (+0.2109) | 0.6332 (+0.0928) | 0.3763 (+0.0513) | 0.6758 (+0.1752) | 0.5618 (+0.1064) |
365
+ | 0.5147 | 4200 | 0.0914 | - | - | - | - | - | - |
366
+ | 0.5392 | 4400 | 0.099 | - | - | - | - | - | - |
367
+ | 0.5637 | 4600 | 0.1086 | - | - | - | - | - | - |
368
+ | 0.5882 | 4800 | 0.1055 | - | - | - | - | - | - |
369
+ | 0.6127 | 5000 | 0.1081 | 1.8660 | 0.7736 (+0.2138) | 0.6228 (+0.0824) | 0.4040 (+0.0790) | 0.6995 (+0.1988) | 0.5754 (+0.1201) |
370
+ | 0.6373 | 5200 | 0.0934 | - | - | - | - | - | - |
371
+ | 0.6618 | 5400 | 0.0972 | - | - | - | - | - | - |
372
+ | 0.6863 | 5600 | 0.093 | - | - | - | - | - | - |
373
+ | 0.7108 | 5800 | 0.0859 | - | - | - | - | - | - |
374
+ | **0.7353** | **6000** | **0.0791** | **1.4301** | **0.7762 (+0.2164)** | **0.6454 (+0.1049)** | **0.4131 (+0.0880)** | **0.7091 (+0.2084)** | **0.5892 (+0.1338)** |
375
+ | 0.7598 | 6200 | 0.0809 | - | - | - | - | - | - |
376
+ | 0.7843 | 6400 | 0.0775 | - | - | - | - | - | - |
377
+ | 0.8088 | 6600 | 0.093 | - | - | - | - | - | - |
378
+ | 0.8333 | 6800 | 0.0812 | - | - | - | - | - | - |
379
+ | 0.8578 | 7000 | 0.0922 | 1.8314 | 0.7868 (+0.2270) | 0.6344 (+0.0940) | 0.4028 (+0.0777) | 0.7228 (+0.2221) | 0.5867 (+0.1313) |
380
+ | 0.8824 | 7200 | 0.0755 | - | - | - | - | - | - |
381
+ | 0.9069 | 7400 | 0.0743 | - | - | - | - | - | - |
382
+ | 0.9314 | 7600 | 0.0738 | - | - | - | - | - | - |
383
+ | 0.9559 | 7800 | 0.0746 | - | - | - | - | - | - |
384
+ | 0.9804 | 8000 | 0.081 | 1.5947 | 0.7891 (+0.2293) | 0.6256 (+0.0852) | 0.3994 (+0.0743) | 0.7256 (+0.2250) | 0.5835 (+0.1282) |
385
+ | -1 | -1 | - | - | 0.7762 (+0.2164) | 0.6454 (+0.1049) | 0.4131 (+0.0880) | 0.7091 (+0.2084) | 0.5892 (+0.1338) |
386
+
387
+ * The bold row denotes the saved checkpoint.
388
+
389
+ ### Framework Versions
390
+ - Python: 3.11.10
391
+ - Sentence Transformers: 3.5.0.dev0
392
+ - Transformers: 4.49.0.dev0
393
+ - PyTorch: 2.6.0.dev20241112+cu121
394
+ - Accelerate: 1.2.0
395
+ - Datasets: 3.2.0
396
+ - Tokenizers: 0.21.0
397
+
398
+ ## Citation
399
+
400
+ ### BibTeX
401
+
402
+ #### Sentence Transformers
403
+ ```bibtex
404
+ @inproceedings{reimers-2019-sentence-bert,
405
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
406
+ author = "Reimers, Nils and Gurevych, Iryna",
407
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
408
+ month = "11",
409
+ year = "2019",
410
+ publisher = "Association for Computational Linguistics",
411
+ url = "https://arxiv.org/abs/1908.10084",
412
+ }
413
+ ```
414
+
415
+ <!--
416
+ ## Glossary
417
+
418
+ *Clearly define terms in order to be accessible across audiences.*
419
+ -->
420
+
421
+ <!--
422
+ ## Model Card Authors
423
+
424
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
425
+ -->
426
+
427
+ <!--
428
+ ## Model Card Contact
429
+
430
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
431
+ -->
config.json ADDED
@@ -0,0 +1,53 @@
1
+ {
2
+ "_name_or_path": "answerdotai/ModernBERT-base",
3
+ "architectures": [
4
+ "ModernBertForSequenceClassification"
5
+ ],
6
+ "attention_bias": false,
7
+ "attention_dropout": 0.0,
8
+ "bos_token_id": 50281,
9
+ "classifier_activation": "gelu",
10
+ "classifier_bias": false,
11
+ "classifier_dropout": 0.0,
12
+ "classifier_pooling": "mean",
13
+ "cls_token_id": 50281,
14
+ "decoder_bias": true,
15
+ "deterministic_flash_attn": false,
16
+ "embedding_dropout": 0.0,
17
+ "eos_token_id": 50282,
18
+ "global_attn_every_n_layers": 3,
19
+ "global_rope_theta": 160000.0,
20
+ "gradient_checkpointing": false,
21
+ "hidden_activation": "gelu",
22
+ "hidden_size": 768,
23
+ "id2label": {
24
+ "0": "LABEL_0"
25
+ },
26
+ "initializer_cutoff_factor": 2.0,
27
+ "initializer_range": 0.02,
28
+ "intermediate_size": 1152,
29
+ "label2id": {
30
+ "LABEL_0": 0
31
+ },
32
+ "layer_norm_eps": 1e-05,
33
+ "local_attention": 128,
34
+ "local_rope_theta": 10000.0,
35
+ "max_position_embeddings": 8192,
36
+ "mlp_bias": false,
37
+ "mlp_dropout": 0.0,
38
+ "model_type": "modernbert",
39
+ "norm_bias": false,
40
+ "norm_eps": 1e-05,
41
+ "num_attention_heads": 12,
42
+ "num_hidden_layers": 22,
43
+ "pad_token_id": 50283,
44
+ "position_embedding_type": "absolute",
45
+ "reference_compile": true,
46
+ "repad_logits_with_grad": false,
47
+ "sep_token_id": 50282,
48
+ "sparse_pred_ignore_index": -100,
49
+ "sparse_prediction": false,
50
+ "torch_dtype": "float32",
51
+ "transformers_version": "4.49.0.dev0",
52
+ "vocab_size": 50368
53
+ }
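
The config above corresponds to a `ModernBertForSequenceClassification` head with a single output logit and an 8192-token context. As a hedged sketch of using the checkpoint directly through 🤗 Transformers rather than the `CrossEncoder` wrapper shown in the README (assuming a recent `transformers` release with ModernBERT support; the sigmoid mirrors the binary cross-entropy training objective):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "tomaarsen/reranker-ModernBERT-base-nq-bce-static-retriever"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

query = "difference between russian blue and british blue cat"
passage = 'Russian Blue The coat is known as a "double coat" ...'

# The cross-encoder scores the concatenated (query, passage) pair in one forward pass.
inputs = tokenizer(query, passage, truncation=True, max_length=8192, return_tensors="pt")
with torch.no_grad():
    logit = model(**inputs).logits[0, 0]
print(torch.sigmoid(logit).item())  # relevance score in [0, 1]
```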
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:3ef33a27a63003c30ecf3c835bc86e44105e63d82faedcdc936142b30ae1e654
3
+ size 598436708
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
1
+ {
2
+ "cls_token": {
3
+ "content": "[CLS]",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "mask_token": {
10
+ "content": "[MASK]",
11
+ "lstrip": true,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": {
17
+ "content": "[PAD]",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "sep_token": {
24
+ "content": "[SEP]",
25
+ "lstrip": false,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ },
30
+ "unk_token": {
31
+ "content": "[UNK]",
32
+ "lstrip": false,
33
+ "normalized": false,
34
+ "rstrip": false,
35
+ "single_word": false
36
+ }
37
+ }
tokenizer.json ADDED
The diff for this file is too large to render.
 
tokenizer_config.json ADDED
@@ -0,0 +1,945 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "|||IP_ADDRESS|||",
5
+ "lstrip": false,
6
+ "normalized": true,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": false
10
+ },
11
+ "1": {
12
+ "content": "<|padding|>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "50254": {
20
+ "content": " ",
21
+ "lstrip": false,
22
+ "normalized": true,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": false
26
+ },
27
+ "50255": {
28
+ "content": " ",
29
+ "lstrip": false,
30
+ "normalized": true,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": false
34
+ },
35
+ "50256": {
36
+ "content": " ",
37
+ "lstrip": false,
38
+ "normalized": true,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": false
42
+ },
43
+ "50257": {
44
+ "content": " ",
45
+ "lstrip": false,
46
+ "normalized": true,
47
+ "rstrip": false,
48
+ "single_word": false,
49
+ "special": false
50
+ },
51
+ "50258": {
52
+ "content": " ",
53
+ "lstrip": false,
54
+ "normalized": true,
55
+ "rstrip": false,
56
+ "single_word": false,
57
+ "special": false
58
+ },
59
+ "50259": {
60
+ "content": " ",
61
+ "lstrip": false,
62
+ "normalized": true,
63
+ "rstrip": false,
64
+ "single_word": false,
65
+ "special": false
66
+ },
67
+ "50260": {
68
+ "content": " ",
69
+ "lstrip": false,
70
+ "normalized": true,
71
+ "rstrip": false,
72
+ "single_word": false,
73
+ "special": false
74
+ },
75
+ "50261": {
76
+ "content": " ",
77
+ "lstrip": false,
78
+ "normalized": true,
79
+ "rstrip": false,
80
+ "single_word": false,
81
+ "special": false
82
+ },
83
+ "50262": {
84
+ "content": " ",
85
+ "lstrip": false,
86
+ "normalized": true,
87
+ "rstrip": false,
88
+ "single_word": false,
89
+ "special": false
90
+ },
91
+ "50263": {
92
+ "content": " ",
93
+ "lstrip": false,
94
+ "normalized": true,
95
+ "rstrip": false,
96
+ "single_word": false,
97
+ "special": false
98
+ },
99
+ "50264": {
100
+ "content": " ",
101
+ "lstrip": false,
102
+ "normalized": true,
103
+ "rstrip": false,
104
+ "single_word": false,
105
+ "special": false
106
+ },
107
+ "50265": {
108
+ "content": " ",
109
+ "lstrip": false,
110
+ "normalized": true,
111
+ "rstrip": false,
112
+ "single_word": false,
113
+ "special": false
114
+ },
115
+ "50266": {
116
+ "content": " ",
117
+ "lstrip": false,
118
+ "normalized": true,
119
+ "rstrip": false,
120
+ "single_word": false,
121
+ "special": false
122
+ },
123
+ "50267": {
124
+ "content": " ",
125
+ "lstrip": false,
126
+ "normalized": true,
127
+ "rstrip": false,
128
+ "single_word": false,
129
+ "special": false
130
+ },
131
+ "50268": {
132
+ "content": " ",
133
+ "lstrip": false,
134
+ "normalized": true,
135
+ "rstrip": false,
136
+ "single_word": false,
137
+ "special": false
138
+ },
139
+ "50269": {
140
+ "content": " ",
141
+ "lstrip": false,
142
+ "normalized": true,
143
+ "rstrip": false,
144
+ "single_word": false,
145
+ "special": false
146
+ },
147
+ "50270": {
148
+ "content": " ",
149
+ "lstrip": false,
150
+ "normalized": true,
151
+ "rstrip": false,
152
+ "single_word": false,
153
+ "special": false
154
+ },
155
+ "50271": {
156
+ "content": " ",
157
+ "lstrip": false,
158
+ "normalized": true,
159
+ "rstrip": false,
160
+ "single_word": false,
161
+ "special": false
162
+ },
163
+ "50272": {
164
+ "content": " ",
165
+ "lstrip": false,
166
+ "normalized": true,
167
+ "rstrip": false,
168
+ "single_word": false,
169
+ "special": false
170
+ },
171
+ "50273": {
172
+ "content": " ",
173
+ "lstrip": false,
174
+ "normalized": true,
175
+ "rstrip": false,
176
+ "single_word": false,
177
+ "special": false
178
+ },
179
+ "50274": {
180
+ "content": " ",
181
+ "lstrip": false,
182
+ "normalized": true,
183
+ "rstrip": false,
184
+ "single_word": false,
185
+ "special": false
186
+ },
187
+ "50275": {
188
+ "content": " ",
189
+ "lstrip": false,
190
+ "normalized": true,
191
+ "rstrip": false,
192
+ "single_word": false,
193
+ "special": false
194
+ },
195
+ "50276": {
196
+ "content": " ",
197
+ "lstrip": false,
198
+ "normalized": true,
199
+ "rstrip": false,
200
+ "single_word": false,
201
+ "special": false
202
+ },
203
+ "50277": {
204
+ "content": "|||EMAIL_ADDRESS|||",
205
+ "lstrip": false,
206
+ "normalized": true,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": false
210
+ },
211
+ "50278": {
212
+ "content": "|||PHONE_NUMBER|||",
213
+ "lstrip": false,
214
+ "normalized": true,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": false
218
+ },
219
+ "50279": {
220
+ "content": "<|endoftext|>",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "50280": {
228
+ "content": "[UNK]",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "50281": {
236
+ "content": "[CLS]",
237
+ "lstrip": false,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "50282": {
244
+ "content": "[SEP]",
245
+ "lstrip": false,
246
+ "normalized": false,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": true
250
+ },
251
+ "50283": {
252
+ "content": "[PAD]",
253
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "50284": {
260
+ "content": "[MASK]",
261
+ "lstrip": true,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "50285": {
268
+ "content": "[unused0]",
269
+ "lstrip": false,
270
+ "normalized": true,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": false
274
+ },
275
+ "50286": {
276
+ "content": "[unused1]",
277
+ "lstrip": false,
278
+ "normalized": true,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": false
282
+ },
283
+ "50287": {
284
+ "content": "[unused2]",
285
+ "lstrip": false,
286
+ "normalized": true,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": false
290
+ },
291
+ "50288": {
292
+ "content": "[unused3]",
293
+ "lstrip": false,
294
+ "normalized": true,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": false
298
+ },
299
+ "50289": {
300
+ "content": "[unused4]",
301
+ "lstrip": false,
302
+ "normalized": true,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": false
306
+ },
307
+ "50290": {
308
+ "content": "[unused5]",
309
+ "lstrip": false,
310
+ "normalized": true,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": false
314
+ },
315
+ "50291": {
316
+ "content": "[unused6]",
317
+ "lstrip": false,
318
+ "normalized": true,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": false
322
+ },
323
+ "50292": {
324
+ "content": "[unused7]",
325
+ "lstrip": false,
326
+ "normalized": true,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": false
330
+ },
331
+ "50293": {
332
+ "content": "[unused8]",
333
+ "lstrip": false,
334
+ "normalized": true,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": false
338
+ },
339
+ "50294": {
340
+ "content": "[unused9]",
341
+ "lstrip": false,
342
+ "normalized": true,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": false
346
+ },
347
+ "50295": {
348
+ "content": "[unused10]",
349
+ "lstrip": false,
350
+ "normalized": true,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": false
354
+ },
355
+ "50296": {
356
+ "content": "[unused11]",
357
+ "lstrip": false,
358
+ "normalized": true,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": false
362
+ },
363
+ "50297": {
364
+ "content": "[unused12]",
365
+ "lstrip": false,
366
+ "normalized": true,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": false
370
+ },
371
+ "50298": {
372
+ "content": "[unused13]",
373
+ "lstrip": false,
374
+ "normalized": true,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": false
378
+ },
379
+ "50299": {
380
+ "content": "[unused14]",
381
+ "lstrip": false,
382
+ "normalized": true,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": false
386
+ },
387
+ "50300": {
388
+ "content": "[unused15]",
389
+ "lstrip": false,
390
+ "normalized": true,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": false
394
+ },
395
+ "50301": {
396
+ "content": "[unused16]",
397
+ "lstrip": false,
398
+ "normalized": true,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": false
402
+ },
403
+ "50302": {
404
+ "content": "[unused17]",
405
+ "lstrip": false,
406
+ "normalized": true,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": false
410
+ },
411
+ "50303": {
412
+ "content": "[unused18]",
413
+ "lstrip": false,
414
+ "normalized": true,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": false
418
+ },
419
+ "50304": {
420
+ "content": "[unused19]",
421
+ "lstrip": false,
422
+ "normalized": true,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": false
426
+ },
427
+ "50305": {
428
+ "content": "[unused20]",
429
+ "lstrip": false,
430
+ "normalized": true,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": false
434
+ },
435
+ "50306": {
436
+ "content": "[unused21]",
437
+ "lstrip": false,
438
+ "normalized": true,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": false
442
+ },
443
+ "50307": {
444
+ "content": "[unused22]",
445
+ "lstrip": false,
446
+ "normalized": true,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": false
450
+ },
451
+ "50308": {
452
+ "content": "[unused23]",
453
+ "lstrip": false,
454
+ "normalized": true,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": false
458
+ },
459
+ "50309": {
460
+ "content": "[unused24]",
461
+ "lstrip": false,
462
+ "normalized": true,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": false
466
+ },
467
+ "50310": {
468
+ "content": "[unused25]",
469
+ "lstrip": false,
470
+ "normalized": true,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": false
474
+ },
475
+ "50311": {
476
+ "content": "[unused26]",
477
+ "lstrip": false,
478
+ "normalized": true,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": false
482
+ },
483
+ "50312": {
484
+ "content": "[unused27]",
485
+ "lstrip": false,
486
+ "normalized": true,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": false
490
+ },
491
+ "50313": {
492
+ "content": "[unused28]",
493
+ "lstrip": false,
494
+ "normalized": true,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": false
498
+ },
499
+ "50314": {
500
+ "content": "[unused29]",
501
+ "lstrip": false,
502
+ "normalized": true,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": false
506
+ },
507
+ "50315": {
508
+ "content": "[unused30]",
509
+ "lstrip": false,
510
+ "normalized": true,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": false
514
+ },
515
+ "50316": {
516
+ "content": "[unused31]",
517
+ "lstrip": false,
518
+ "normalized": true,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": false
522
+ },
523
+ "50317": {
524
+ "content": "[unused32]",
525
+ "lstrip": false,
526
+ "normalized": true,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": false
530
+ },
531
+ "50318": {
532
+ "content": "[unused33]",
533
+ "lstrip": false,
534
+ "normalized": true,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": false
538
+ },
539
+ "50319": {
540
+ "content": "[unused34]",
541
+ "lstrip": false,
542
+ "normalized": true,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": false
546
+ },
547
+ "50320": {
548
+ "content": "[unused35]",
549
+ "lstrip": false,
550
+ "normalized": true,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": false
554
+ },
555
+ "50321": {
556
+ "content": "[unused36]",
557
+ "lstrip": false,
558
+ "normalized": true,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": false
562
+ },
563
+ "50322": {
564
+ "content": "[unused37]",
565
+ "lstrip": false,
566
+ "normalized": true,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": false
570
+ },
571
+ "50323": {
572
+ "content": "[unused38]",
573
+ "lstrip": false,
574
+ "normalized": true,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": false
578
+ },
579
+ "50324": {
580
+ "content": "[unused39]",
581
+ "lstrip": false,
582
+ "normalized": true,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": false
586
+ },
587
+ "50325": {
588
+ "content": "[unused40]",
589
+ "lstrip": false,
590
+ "normalized": true,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": false
594
+ },
595
+ "50326": {
596
+ "content": "[unused41]",
597
+ "lstrip": false,
598
+ "normalized": true,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": false
602
+ },
603
+ "50327": {
604
+ "content": "[unused42]",
605
+ "lstrip": false,
606
+ "normalized": true,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": false
610
+ },
611
+ "50328": {
612
+ "content": "[unused43]",
613
+ "lstrip": false,
614
+ "normalized": true,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": false
618
+ },
619
+ "50329": {
620
+ "content": "[unused44]",
621
+ "lstrip": false,
622
+ "normalized": true,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": false
626
+ },
627
+ "50330": {
628
+ "content": "[unused45]",
629
+ "lstrip": false,
630
+ "normalized": true,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": false
634
+ },
635
+ "50331": {
636
+ "content": "[unused46]",
637
+ "lstrip": false,
638
+ "normalized": true,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": false
642
+ },
643
+ "50332": {
644
+ "content": "[unused47]",
645
+ "lstrip": false,
646
+ "normalized": true,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": false
650
+ },
651
+ "50333": {
652
+ "content": "[unused48]",
653
+ "lstrip": false,
654
+ "normalized": true,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": false
658
+ },
659
+ "50334": {
660
+ "content": "[unused49]",
661
+ "lstrip": false,
662
+ "normalized": true,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": false
666
+ },
667
+ "50335": {
668
+ "content": "[unused50]",
669
+ "lstrip": false,
670
+ "normalized": true,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": false
674
+ },
675
+ "50336": {
676
+ "content": "[unused51]",
677
+ "lstrip": false,
678
+ "normalized": true,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": false
682
+ },
683
+ "50337": {
684
+ "content": "[unused52]",
685
+ "lstrip": false,
686
+ "normalized": true,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": false
690
+ },
691
+ "50338": {
692
+ "content": "[unused53]",
693
+ "lstrip": false,
694
+ "normalized": true,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": false
698
+ },
699
+ "50339": {
700
+ "content": "[unused54]",
701
+ "lstrip": false,
702
+ "normalized": true,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": false
706
+ },
707
+ "50340": {
708
+ "content": "[unused55]",
709
+ "lstrip": false,
710
+ "normalized": true,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": false
714
+ },
715
+ "50341": {
716
+ "content": "[unused56]",
717
+ "lstrip": false,
718
+ "normalized": true,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": false
722
+ },
723
+ "50342": {
724
+ "content": "[unused57]",
725
+ "lstrip": false,
726
+ "normalized": true,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": false
730
+ },
731
+ "50343": {
732
+ "content": "[unused58]",
733
+ "lstrip": false,
734
+ "normalized": true,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": false
738
+ },
739
+ "50344": {
740
+ "content": "[unused59]",
741
+ "lstrip": false,
742
+ "normalized": true,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": false
746
+ },
747
+ "50345": {
748
+ "content": "[unused60]",
749
+ "lstrip": false,
750
+ "normalized": true,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": false
754
+ },
755
+ "50346": {
756
+ "content": "[unused61]",
757
+ "lstrip": false,
758
+ "normalized": true,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": false
762
+ },
763
+ "50347": {
764
+ "content": "[unused62]",
765
+ "lstrip": false,
766
+ "normalized": true,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": false
770
+ },
771
+ "50348": {
772
+ "content": "[unused63]",
773
+ "lstrip": false,
774
+ "normalized": true,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": false
778
+ },
779
+ "50349": {
780
+ "content": "[unused64]",
781
+ "lstrip": false,
782
+ "normalized": true,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": false
786
+ },
787
+ "50350": {
788
+ "content": "[unused65]",
789
+ "lstrip": false,
790
+ "normalized": true,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": false
794
+ },
795
+ "50351": {
796
+ "content": "[unused66]",
797
+ "lstrip": false,
798
+ "normalized": true,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": false
802
+ },
803
+ "50352": {
804
+ "content": "[unused67]",
805
+ "lstrip": false,
806
+ "normalized": true,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": false
810
+ },
811
+ "50353": {
812
+ "content": "[unused68]",
813
+ "lstrip": false,
814
+ "normalized": true,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": false
818
+ },
819
+ "50354": {
820
+ "content": "[unused69]",
821
+ "lstrip": false,
822
+ "normalized": true,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": false
826
+ },
827
+ "50355": {
828
+ "content": "[unused70]",
829
+ "lstrip": false,
830
+ "normalized": true,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": false
834
+ },
835
+ "50356": {
836
+ "content": "[unused71]",
837
+ "lstrip": false,
838
+ "normalized": true,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": false
842
+ },
843
+ "50357": {
844
+ "content": "[unused72]",
845
+ "lstrip": false,
846
+ "normalized": true,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": false
850
+ },
851
+ "50358": {
852
+ "content": "[unused73]",
853
+ "lstrip": false,
854
+ "normalized": true,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": false
858
+ },
859
+ "50359": {
860
+ "content": "[unused74]",
861
+ "lstrip": false,
862
+ "normalized": true,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": false
866
+ },
867
+ "50360": {
868
+ "content": "[unused75]",
869
+ "lstrip": false,
870
+ "normalized": true,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": false
874
+ },
875
+ "50361": {
876
+ "content": "[unused76]",
877
+ "lstrip": false,
878
+ "normalized": true,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": false
882
+ },
883
+ "50362": {
884
+ "content": "[unused77]",
885
+ "lstrip": false,
886
+ "normalized": true,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": false
890
+ },
891
+ "50363": {
892
+ "content": "[unused78]",
893
+ "lstrip": false,
894
+ "normalized": true,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": false
898
+ },
899
+ "50364": {
900
+ "content": "[unused79]",
901
+ "lstrip": false,
902
+ "normalized": true,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": false
906
+ },
907
+ "50365": {
908
+ "content": "[unused80]",
909
+ "lstrip": false,
910
+ "normalized": true,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": false
914
+ },
915
+ "50366": {
916
+ "content": "[unused81]",
917
+ "lstrip": false,
918
+ "normalized": true,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": false
922
+ },
923
+ "50367": {
924
+ "content": "[unused82]",
925
+ "lstrip": false,
926
+ "normalized": true,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": false
930
+ }
931
+ },
932
+ "clean_up_tokenization_spaces": true,
933
+ "cls_token": "[CLS]",
934
+ "extra_special_tokens": {},
935
+ "mask_token": "[MASK]",
936
+ "model_input_names": [
937
+ "input_ids",
938
+ "attention_mask"
939
+ ],
940
+ "model_max_length": 8192,
941
+ "pad_token": "[PAD]",
942
+ "sep_token": "[SEP]",
943
+ "tokenizer_class": "PreTrainedTokenizerFast",
944
+ "unk_token": "[UNK]"
945
+ }
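
The added tokens above are mostly `[unusedN]` slots, whitespace tokens, and PII placeholders carried over from the base ModernBERT tokenizer; the BERT-style `[CLS]` / `[SEP]` / `[PAD]` / `[MASK]` tokens are what the cross-encoder actually relies on. A small sketch, assuming the usual `[CLS] query [SEP] passage [SEP]` pair template of this tokenizer:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("tomaarsen/reranker-ModernBERT-base-nq-bce-static-retriever")

encoded = tokenizer(
    "who played the little girl on mrs doubtfire",  # query
    "Mara Wilson Mara Elizabeth Wilson ...",        # passage (truncated here)
)
print(tokenizer.decode(encoded["input_ids"]))  # expected: [CLS] query [SEP] passage [SEP]
print(tokenizer.model_max_length)              # 8192
```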