tomaarsen committed on
Commit 877391d · verified · 1 Parent(s): b7886f7

Add new SentenceTransformer model
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+     "word_embedding_dimension": 768,
+     "pooling_mode_cls_token": false,
+     "pooling_mode_mean_tokens": true,
+     "pooling_mode_max_tokens": false,
+     "pooling_mode_mean_sqrt_len_tokens": false,
+     "pooling_mode_weightedmean_tokens": false,
+     "pooling_mode_lasttoken": false,
+     "include_prompt": true
+ }
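This pooling config enables mean pooling only (`pooling_mode_mean_tokens: true`): the transformer's token embeddings are averaged into a single sentence vector, with padding positions excluded via the attention mask. A minimal NumPy sketch of that computation, using toy shapes rather than the real 768-dimensional embeddings:

```python
import numpy as np

# Toy batch: 2 sequences, 4 token positions, 3-dim embeddings (real dim is 768)
token_embeddings = np.arange(24, dtype=np.float64).reshape(2, 4, 3)
# 1 = real token, 0 = padding; the second sequence has 2 padded positions
attention_mask = np.array([[1, 1, 1, 1],
                           [1, 1, 0, 0]])

# Mean pooling: sum the unmasked token embeddings, divide by the token count
mask = attention_mask[:, :, None]          # (2, 4, 1), broadcasts over the embedding dim
summed = (token_embeddings * mask).sum(axis=1)
counts = mask.sum(axis=1)                  # number of real tokens per sequence
sentence_embeddings = summed / counts      # (2, 3)

print(sentence_embeddings)
# [[ 4.5  5.5  6.5]
#  [13.5 14.5 15.5]]
```

Note how the padded positions of the second sequence do not dilute its average; this is what `include_prompt` and the mask handling control in the real Pooling module.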
README.md ADDED
@@ -0,0 +1,938 @@
+ ---
+ language:
+ - en
+ license: apache-2.0
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - dense
+ - generated_from_trainer
+ - dataset_size:10000
+ - loss:MarginMSELoss
+ widget:
+ - source_sentence: driving distance miami to fort lauderdale
+   sentences:
+   - "This Distance calculator provides both the by air, and by road distance between\
+     \ cities in both miles and kms, along with a map and driving directions â\x80\x93\
+     \ please scroll down to see it, and a little further for the city to city driving\
+     \ directions. This is the by air, as the crow flies distance between the two cities."
+   - Driving distance from Fort Lauderdale, FL to Miami, FL. The total driving distance
+     from Fort Lauderdale, FL to Miami, FL is 27 miles or 43 kilometers. Your trip
+     begins in Fort Lauderdale, Florida. It ends in Miami, Florida. If you are planning
+     a road trip, you might also want to calculate the total driving time from Fort
+     Lauderdale, FL to Miami, FL so you can see when you'll arrive at your destination.
+   - The total driving distance from Fort Lauderdale to Miami Beach, FL is 32 miles.
+     It can take 1 hour to 1.5 hours depending on how bad the traffic is or in rush
+     hour. Or 11.30 at night it may be 1 hour, but will be less 1 hour.
+   - Distance Between Cordova Tennessee United States and Memphis International Airport
+     (MEM) Tennessee United States, flight and driving distances and airport information.
+     Distance calculator.
+   - Distance Map Info. Optimal route map between New Smyrna Beach, FL and Melbourne,
+     FL. This route will be about 75 Miles. The driving route information(distance,
+     estimated time, directions), flight route, traffic information and print the map
+     features are placed on the top right corner of the map.
+   - Driving distance from Fort Lauderdale, FL to Miami, FL. The total driving distance
+     from Fort Lauderdale, FL to Miami, FL is 27 miles or 43 kilometers.Your trip begins
+     in Fort Lauderdale, Florida. It ends in Miami, Florida.ou can also calculate the
+     cost of driving from Fort Lauderdale, FL to Miami, FL based on current local fuel
+     prices and an estimate of your car's best gas mileage.
+   - There are 25.15 miles from Miami to Fort Lauderdale in north direction and 29.11
+     miles (46.85 kilometers) by car, following the I-95 route. Miami and Fort Lauderdale
+     are 31 minutes far apart, if you drive non-stop. This is the fastest route from
+     Miami, FL to Fort Lauderdale, FL. The halfway point is Aventura, FL.
+   - Need to know towns or cities within a specific radius of other places try this
+     Towns within a Radius of Treasure Island tool. The distance between Treasure Island
+     and St Augustine in a straight line is 172 miles or 276.75 Kilometers.
+   - Distance calculator helps you to find the distance between cities and calculate
+     the flying and driving distance in both kilometers and miles. Distance Calculator
+     Distance calculator is a tool for calculating distance between cities or places
+     on map.
+ - source_sentence: datsun 1200 starter motor diagram
+   sentences:
+   - Here you will find scans of the original wiring diagram for the 1971, 1972, and
+     1973 Datsun 1200. 1971 Datsun 1200 Wiring Diagram. 1972 Datsun 1200 Wiring Diagram.
+     1973 Datsun 1200 Wiring Diagram.
+   - Free Universal VIN decoder to check vehicle data and history. This is a universal
+     VIN decoder. Every car has a unique identifier code called a VIN. This number
+     contains vital information about the car, such as its manufacturer, year of production,
+     the plant it was produced in, type of engine, model and more.
+   - 'This review is from: Powerhorse Portable Generator with Electric Start - 9000
+     Surge Watts, 7250 Rated Watts (Misc.) Update Jan. 4, 2012... I found out the motor
+     for this generator is manufactured by a company called Ducar. I found this out
+     by speaking with a customer Service rep at Northern Tool.'
+   - Starter Motor. The engine starter is an electrical motor that is bolted to the
+     engine crankcase. The starter turns the engine flywheel teeth with the teeth on
+     the starter motor plunger which starts the riding lawnmower engine. Once the battery,
+     electrical connectors, wiring and starter solenoid have been tested and are functioning
+     properly, the problem likely lies with the engine starter motor.
+   - "Datsun is a Japanese car manufacturer that produced a great number of cars, starting\
+     \ with the first Datsun model in 1931 and on until the brand was discontinued\
+     \ in 1983. The Datsun brand became particularly famous because of one of their\
+     \ sports cars, the Fairlady model. Headquartered in Japan, Datsun was absorbed\
+     \ by another Japanese carmaker, Nissan, and the Datsun models were rebadged as\
+     \ Nissanâ\x80\x99s from 1983 onwards."
+   - Datsun is an automobile brand owned by Nissan. Datsun's original production run
+     began in 1931. From 1958 to 1986, only vehicles exported by Nissan were identified
+     as Datsun. By 1986 Nissan had phased out the Datsun name, but re-launched it in
+     2013 as the brand for low-cost vehicles manufactured for emerging markets.
+   - The dial reading should be 12 volts or more. Work the starter switch, and the
+     reading should fall, but not below 10.5 volts. If the reading does not fall, there
+     is a fault in the ignition-switch circuit or in the solenoid.
+   - echo dot. Controls lights, fans, switches, thermostats, garage doors, sprinklers,
+     and more with compatible connected devices. Connects to speakers or headphones
+     through Bluetooth or 3.5 mm stereo cable to play music.
+   - Acura TSX vs Honda Accord. Compare price, expert/user reviews, mpg, engines, safety,
+     cargo capacity and other specs at a glance.
+ - source_sentence: what primary plant tissue is involved in transport of minerals
+   sentences:
+   - The ground tissue comprises the bulk of the primary plant body. Parenchyma, collenchyma,
+     and sclerenchyma cells are common in the ground tissue. Vascular tissue transports
+     food, water, hormones and minerals within the plant. Vascular tissue includes
+     xylem, phloem, parenchyma, and cambium cells. Two views of the structure of the
+     root and root meristem.
+   - · just now. Report Abuse. 1.Water transport-in xylem from the roots up the plant.
+     A passive process = transpiration 2.Sugar transport-sucrose is the main sugar
+     transported in phloem from leaves up and down the plant. An active process-needs
+     ATP.Process called translocation Look up transpiration and translocation Discuss
+     the cells used in each, the direction and the mechanism. Peter S · 6 years ago.ugar
+     i.e. glucose is produced in leaves (green parts of the plant) and it is transported
+     to other parts of the plants such as reproductive organs, fruits and seeds, roots
+     and stem (growing tips etc) through phloem tissue. Xylems are made up of dead
+     cells while phloem tissue is made up of living cells.
+   - 1 of 6. Plants have tissues to transport water, nutrients and minerals. Xylem
+     transports water and mineral salts from the roots up to other parts of the plant,
+     while phloem transports sucrose and amino acids between the leaves and other parts
+     of the plant. Xylem and phloem in the centre of the plant root.
+   - Xylem is one of two types of vascular tissues found in plant (the other being
+     Phloem). The word Xylem is derived from the Greek for wood. The xylem is responsible
+     for the transport (translocation) of water and soluble mineral nutrients from
+     the roots throughout the plant, this facilitates the replacement of water lost
+     during transpiration and photosynthesis.
+   - Various vascular tissues in the root allow for transportation of water and nutrients
+     to the rest of theplant.Plant cells have a cell wall to provide support, a large
+     vacuole for storage of minerals, food, andchloroplasts where photosynthesis takes
+     place.
+   - The function of the phloem tissue is to transport food nutrients such as glucose
+     and amino acids from the leaves and to all other cells of the plant, this is called
+     translocation.
+   - Plants have tissues to transport water, nutrients and minerals. Xylem transports
+     water and mineral salts from the roots up to other parts of the plant, while phloem
+     transports sucrose and amino acids between the leaves and other parts of the plant.
+     Xylem and phloem in the centre of the plant root
+   - Seed plants contain 2 types of vascular tissue (xylem & phloem) to help transport
+     water, minerals, & food throughout the root & shoot systems.Plant cells have several
+     specialized structures including a central vacuole for storage, plastids for storage
+     of pigments, and a thick cell wall of cellulose.lants have 3 tissue systems ---
+     ground, dermal, and vascular tissues. Plant tissues make up the main organs of
+     a plant --- root, stem, leaf, & flower.
+   - 'This article is about vascular tissue in plants. For transport in animals, see
+     Circulatory system. Vascular tissue is a complex conducting tissue, formed of
+     more than one cell type, found in vascular plants. The primary components of vascular
+     tissue are the xylem and phloem. These two tissues transport fluid and nutrients
+     internally. There are also two meristems associated with vascular tissue: the
+     vascular cambium and the cork cambium. All the vascular tissues within a particular
+     plant together constitute the vascular tissue system of that plant. The cells
+     in vascular tissue are typically long and slender. Since the xylem and phloem
+     function in the conduction of water, minerals, and nutrients throughout the plant,
+     it is not surprising that their form should be similar to pipes.'
+ - source_sentence: Julia name meaning
+   sentences:
+   - 'In American the meaning of the name Julie is: Young. French Meaning: The name
+     Julie is a French baby name. In French the meaning of the name Julie is: Downy.
+     French form of Julia. Also can be a feminine form of Julian: Youthful.'
+   - Jelisa Name Meaning. You are honest, benevolent, brilliant and often inventive,
+     full of high inspirations. You are courageous, honest, determined, original and
+     creative. You are a leader, especially for a cause.
+   - The name Julius is of Latin and Greek origin. The meaning of Julius is downy-bearded,
+     soft haired, implying youthful. Julius is generally used as a boy's name. It consists
+     of 6 letters and 3 syllables and is pronounced Ju-li-us.
+   - 'The name Julia is a Greek baby name. In Greek the meaning of the name Julia is:
+     Downy. Hairy. Derived from the clan name of Roman dictator Gaius Julius Caesar.
+     Latin Meaning: The name Julia is a Latin baby name.'
+   - 'The name Julia is an American baby name. In American the meaning of the name
+     Julia is: Youthful. Swedish Meaning: The name Julia is a Swedish baby name. In
+     Swedish the meaning of the name Julia is: Youth.Greek Meaning: The name Julia
+     is a Greek baby name. In Greek the meaning of the name Julia is: Downy. Hairy.
+     Derived from the clan name of Roman dictator Gaius Julius Caesar.Latin Meaning:
+     The name Julia is a Latin baby name.In Latin the meaning of the name Julia is:
+     Young. The feminine form of Julius. A character in Shakespeare''s play ''Two Gentlemen
+     of Verona''. Shakespearean Meaning: The name Julia is a Shakespearean baby name.he
+     name Julia is a Latin baby name. In Latin the meaning of the name Julia is: Young.
+     The feminine form of Julius. A character in Shakespeare''s play ''Two Gentlemen
+     of Verona''.'
+   - The meaning of Julius is Downy-bearded youth. Its origin is Variant of the Roman
+     name Iulius. Julius is a form of Iulius and is generally pronounced like JOO lee
+     es. This name is mostly being used as a boys name. Last year it ranked 312th in
+     the U.S. Social Security Administration list of most popular baby boy names. Show
+     popularity chart.
+   - "What does Julianna mean? Julianna [ju-lian-na] as a name for girls is a Latin\
+     \ name, and Julianna means youthful; Jove's child. Julianna is a version of Juliana\
+     \ (Latin): feminine of Julius. Associated with: youthful. Juliannaâ\x96² has 1\
+     \ variant: Julieanna."
+   - 'The name Jelena is a Russian baby name. In Russian the meaning of the name Jelena
+     is: Shining light. SoulUrge Number: 11. Expression Number: 2. People with this
+     name have a deep inner desire to inspire others in a higher cause, and to share
+     their own strongly held views on spiritual matters.'
+   - 'The name Julia is a Latin baby name. In Latin the meaning of the name Julia is:
+     Young. The feminine form of Julius. A character in Shakespeare''s play ''Two Gentlemen
+     of Verona''.he name Julia is a Latin baby name. In Latin the meaning of the name
+     Julia is: Young. The feminine form of Julius. A character in Shakespeare''s play
+     ''Two Gentlemen of Verona''.'
+ - source_sentence: what does ly mean in a blood test
+   sentences:
+   - According to the Hormone-Refractory Prostate Cancer Association, LY on a blood
+     test stands for lymphocytes. The number in the results represents the percentage
+     of lymphocytes in the white blood count. Lymphocytes should count for 15 to 46.8
+     percent of white blood cells. Continue Reading.
+   - FROM OUR EXPERTS. Trace lysed blood refers to a finding that is usually reported
+     from a urinary dip stick analysis. It implies that there is a small quantity of
+     red cells in the urine that have broken open. The developer on the dip stick reacts
+     with the hemoglobin that is released when the red cells are lysed.
+   - Does anybody know what LM and TFT in blood testing is please? Does anybody know
+     what LM and TFT in blood testing is please? TFT is Thyroid Function Tests , usually
+     done as a series of tests for TSH (Thyroid-stimulating hormone) and for the T3
+     (Triiodothyronine) and T4 ( Thyroxine), which are the hormones produced by the
+     thyroid gland.
+   - Uses. A blood lipid test measures the levels of HDL (high-density lipoprotein)
+     cholesterol and LDL (low-density lipoprotein) cholesterol in the blood as well
+     as triglycerides. These results can help doctors determine patients' health status.
+   - It is found in almost all body tissues and can be measured by a simple blood test.
+     Normal LDH levels are generally low and range between 140 IU/liter to 333 IU/
+     liter. Low LDH levels are usually no cause for concern. However, elevated LDH
+     levels may indicate cell damage.
+   - Usually, an LFT blood test measures the amount of bilirubin in the blood. Bilirubin
+     is released when red blood cells breakdown, and it is the liver that detoxifies
+     the bilirubin and helps to eliminate it from the body. Bilirubin is a part of
+     the digestive juice, bile, which the liver produces.
+   - 'Medical Definition of LYE. 1. : a strong alkaline liquor rich in potassium carbonate
+     leached from wood ashes and used especially in making soap and washing; broadly:
+     a strong alkaline solution (as of sodium hydroxide or potassium hydroxide). 2.:
+     a solid caustic (as sodium hydroxide).edical Definition of LYE. 1. : a strong
+     alkaline liquor rich in potassium carbonate leached from wood ashes and used especially
+     in making soap and washing; broadly: a strong alkaline solution (as of sodium
+     hydroxide or potassium hydroxide). 2. : a solid caustic (as sodium hydroxide).'
+   - FROM OUR COMMUNITY. Hi Terry, The LY (Lymphocytes) in your blood test is; the
+     type of white blood cell found in the blood and lymph systems; part of the immune
+     system. BUN/CREAT - Bun and Creatinine are tests done to monitor kidney function.
+     I'm sorry, but I've never heard of the other 2.
+   - The Lactic Acid Plasma Test, or Lactate Test, measures the amount of lactate in
+     the blood to determine if a patient has lactic acidosis.he test may be ordered
+     at prescribed intervals to monitor lactate levels. The Lactate Test is also known
+     as Lactic Acid, Plasma Lactate, and L-Lactate. Patients will be instructed to
+     fast prior to this blood test and be in a resting state.
+ datasets:
+ - tomaarsen/msmarco-Qwen3-Reranker-0.6B
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ metrics:
+ - cosine_accuracy@1
+ - cosine_accuracy@3
+ - cosine_accuracy@5
+ - cosine_accuracy@10
+ - cosine_precision@1
+ - cosine_precision@3
+ - cosine_precision@5
+ - cosine_precision@10
+ - cosine_recall@1
+ - cosine_recall@3
+ - cosine_recall@5
+ - cosine_recall@10
+ - cosine_ndcg@10
+ - cosine_mrr@10
+ - cosine_map@100
+ model-index:
+ - name: ModernBERT-base-uncased finetuned on MSMARCO via distillation
+   results:
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: msmarco eval 1kq 1kd
+       type: msmarco-eval-1kq-1kd
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.72
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.844
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.887
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.928
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.72
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.2813333333333333
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.1774
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.09280000000000001
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.72
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.844
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.887
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.928
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.8242736490605469
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.790978968253969
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.7935881895028645
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoMSMARCO
+       type: NanoMSMARCO
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.1
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.3
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.36
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.52
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.1
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.1
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.07200000000000001
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.05200000000000001
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.1
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.3
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.36
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.52
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.2824159565255077
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.20972222222222223
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.22982047430129676
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoNFCorpus
+       type: NanoNFCorpus
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.02
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.14
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.24
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.4
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.02
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.05999999999999999
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.068
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.08199999999999999
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.0003636363636363636
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.0031788855979215523
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.008116504313017697
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.024507502383325896
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.07247990678185146
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.1187936507936508
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.021084185720588976
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoNQ
+       type: NanoNQ
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.08
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.1
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.14
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.2
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.08
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.04
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.032
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.022000000000000002
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.07
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.1
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.13
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.19
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.12278869017927456
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.10635714285714286
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.11393704035020408
+       name: Cosine Map@100
+   - task:
+       type: nano-beir
+       name: Nano BEIR
+     dataset:
+       name: NanoBEIR mean
+       type: NanoBEIR_mean
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.06666666666666667
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.18000000000000002
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.24666666666666667
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.37333333333333335
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.06666666666666667
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.06666666666666667
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.05733333333333334
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.052
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.0567878787878788
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.13439296186597385
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.1660388347710059
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.2448358341277753
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.15922818449554457
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.14495767195767198
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.12161390012402994
+       name: Cosine Map@100
+ ---
+ 
+ # ModernBERT-base-uncased finetuned on MSMARCO via distillation
+ 
+ This is a [sentence-transformers](https://www.SBERT.net) model trained on the [msmarco-Qwen3-Reranker-0.6B](https://huggingface.co/datasets/tomaarsen/msmarco-Qwen3-Reranker-0.6B) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+ 
+ ## Model Details
+ 
+ ### Model Description
+ - **Model Type:** Sentence Transformer
+ <!-- - **Base model:** [Unknown](https://huggingface.co/unknown) -->
+ - **Maximum Sequence Length:** 8192 tokens
+ - **Output Dimensionality:** 768 dimensions
+ - **Similarity Function:** Cosine Similarity
+ - **Training Dataset:**
+     - [msmarco-Qwen3-Reranker-0.6B](https://huggingface.co/datasets/tomaarsen/msmarco-Qwen3-Reranker-0.6B)
+ - **Language:** en
+ - **License:** apache-2.0
+ 
+ ### Model Sources
+ 
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
+ 
+ ### Full Model Architecture
+ 
+ ```
+ SentenceTransformer(
+   (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
+   (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+ )
+ ```
+ 
+ ## Usage
+ 
+ ### Direct Usage (Sentence Transformers)
+ 
+ First install the Sentence Transformers library:
+ 
+ ```bash
+ pip install -U sentence-transformers
+ ```
+ 
+ Then you can load this model and run inference.
+ ```python
+ from sentence_transformers import SentenceTransformer
+ 
+ # Download from the 🤗 Hub
+ model = SentenceTransformer("tomaarsen/ModernBERT-base-uncased-msmarco-margin-mse")
+ # Run inference
+ queries = [
+     "what does ly mean in a blood test",
+ ]
+ documents = [
+     'According to the Hormone-Refractory Prostate Cancer Association, LY on a blood test stands for lymphocytes. The number in the results represents the percentage of lymphocytes in the white blood count. Lymphocytes should count for 15 to 46.8 percent of white blood cells. Continue Reading.',
+     "FROM OUR COMMUNITY. Hi Terry, The LY (Lymphocytes) in your blood test is; the type of white blood cell found in the blood and lymph systems; part of the immune system. BUN/CREAT - Bun and Creatinine are tests done to monitor kidney function. I'm sorry, but I've never heard of the other 2.",
+     'FROM OUR EXPERTS. Trace lysed blood refers to a finding that is usually reported from a urinary dip stick analysis. It implies that there is a small quantity of red cells in the urine that have broken open. The developer on the dip stick reacts with the hemoglobin that is released when the red cells are lysed.',
+ ]
+ query_embeddings = model.encode_query(queries)
+ document_embeddings = model.encode_document(documents)
+ print(query_embeddings.shape, document_embeddings.shape)
+ # [1, 768] [3, 768]
+ 
+ # Get the similarity scores for the embeddings
+ similarities = model.similarity(query_embeddings, document_embeddings)
+ print(similarities)
+ # tensor([[0.9295, 0.9320, 0.9230]])
+ ```
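The `model.similarity` call in the snippet above defaults to cosine similarity, matching the card's stated Similarity Function. For intuition, here is an equivalent stand-alone computation on toy 2-dimensional vectors; the `cosine_similarity` helper below is illustrative, not part of the library:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity: L2-normalize the rows, then take dot products."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

# Toy stand-ins for the 768-dim query/document embeddings
query_embeddings = np.array([[1.0, 0.0]])
document_embeddings = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])

similarities = cosine_similarity(query_embeddings, document_embeddings)
print(similarities)
# [[1.         0.         0.70710678]]
```

Because cosine similarity only depends on direction, identical vectors score 1.0, orthogonal vectors 0.0, and a 45° offset 1/√2.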
+ 
+ <!--
+ ### Direct Usage (Transformers)
+ 
+ <details><summary>Click to see the direct usage in Transformers</summary>
+ 
+ </details>
+ -->
+ 
+ <!--
+ ### Downstream Usage (Sentence Transformers)
+ 
+ You can finetune this model on your own dataset.
+ 
+ <details><summary>Click to expand</summary>
+ 
+ </details>
+ -->
+ 
+ <!--
+ ### Out-of-Scope Use
+ 
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+ 
+ ## Evaluation
+ 
+ ### Metrics
+ 
+ #### Information Retrieval
+ 
+ * Datasets: `msmarco-eval-1kq-1kd`, `NanoMSMARCO`, `NanoNFCorpus` and `NanoNQ`
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
+ 
+ | Metric | msmarco-eval-1kq-1kd | NanoMSMARCO | NanoNFCorpus | NanoNQ |
+ |:--------------------|:---------------------|:------------|:-------------|:-----------|
+ | cosine_accuracy@1 | 0.72 | 0.1 | 0.02 | 0.08 |
+ | cosine_accuracy@3 | 0.844 | 0.3 | 0.14 | 0.1 |
+ | cosine_accuracy@5 | 0.887 | 0.36 | 0.24 | 0.14 |
+ | cosine_accuracy@10 | 0.928 | 0.52 | 0.4 | 0.2 |
+ | cosine_precision@1 | 0.72 | 0.1 | 0.02 | 0.08 |
+ | cosine_precision@3 | 0.2813 | 0.1 | 0.06 | 0.04 |
+ | cosine_precision@5 | 0.1774 | 0.072 | 0.068 | 0.032 |
+ | cosine_precision@10 | 0.0928 | 0.052 | 0.082 | 0.022 |
+ | cosine_recall@1 | 0.72 | 0.1 | 0.0004 | 0.07 |
+ | cosine_recall@3 | 0.844 | 0.3 | 0.0032 | 0.1 |
+ | cosine_recall@5 | 0.887 | 0.36 | 0.0081 | 0.13 |
+ | cosine_recall@10 | 0.928 | 0.52 | 0.0245 | 0.19 |
+ | **cosine_ndcg@10** | **0.8243** | **0.2824** | **0.0725** | **0.1228** |
+ | cosine_mrr@10 | 0.791 | 0.2097 | 0.1188 | 0.1064 |
+ | cosine_map@100 | 0.7936 | 0.2298 | 0.0211 | 0.1139 |
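The `cosine_ndcg@10` and `cosine_mrr@10` rows follow the standard top-k ranking definitions. As a minimal, self-contained sketch (hypothetical binary relevance judgments, not the actual evaluation data):

```python
import math

def dcg_at_k(relevances, k):
    # Discounted cumulative gain: rel_i / log2(rank + 1), ranks starting at 1
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k):
    # Normalize by the DCG of the ideal (best possible) ordering
    idcg = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / idcg if idcg > 0 else 0.0

def mrr_at_k(relevances, k):
    # Reciprocal rank of the first relevant result within the top k
    for i, rel in enumerate(relevances[:k]):
        if rel > 0:
            return 1.0 / (i + 1)
    return 0.0

# Hypothetical ranking where the only relevant document sits at rank 3
ranking = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]
print(ndcg_at_k(ranking, 10))  # 0.5
print(mrr_at_k(ranking, 10))   # 0.3333...
```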
+
+ #### Nano BEIR
+
+ * Dataset: `NanoBEIR_mean`
+ * Evaluated with [<code>NanoBEIREvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.NanoBEIREvaluator) with these parameters:
+ ```json
+ {
+     "dataset_names": [
+         "msmarco",
+         "nfcorpus",
+         "nq"
+     ]
+ }
+ ```
+
+ | Metric | Value |
+ |:--------------------|:-----------|
+ | cosine_accuracy@1 | 0.0667 |
+ | cosine_accuracy@3 | 0.18 |
+ | cosine_accuracy@5 | 0.2467 |
+ | cosine_accuracy@10 | 0.3733 |
+ | cosine_precision@1 | 0.0667 |
+ | cosine_precision@3 | 0.0667 |
+ | cosine_precision@5 | 0.0573 |
+ | cosine_precision@10 | 0.052 |
+ | cosine_recall@1 | 0.0568 |
+ | cosine_recall@3 | 0.1344 |
+ | cosine_recall@5 | 0.166 |
+ | cosine_recall@10 | 0.2448 |
+ | **cosine_ndcg@10** | **0.1592** |
+ | cosine_mrr@10 | 0.145 |
+ | cosine_map@100 | 0.1216 |
+
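The `NanoBEIR_mean` values are the per-metric mean over the three Nano datasets reported in the Information Retrieval table; for instance, the `cosine_ndcg@10` entry can be reproduced from those columns:

```python
# Mean of the per-dataset cosine_ndcg@10 values reported above
ndcg_at_10 = {"NanoMSMARCO": 0.2824, "NanoNFCorpus": 0.0725, "NanoNQ": 0.1228}
mean_ndcg = sum(ndcg_at_10.values()) / len(ndcg_at_10)
print(round(mean_ndcg, 4))  # 0.1592, matching the table
```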
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Dataset
+
+ #### msmarco-qwen3-reranker-0.6_b
+
+ * Dataset: [msmarco-qwen3-reranker-0.6_b](https://huggingface.co/datasets/tomaarsen/msmarco-Qwen3-Reranker-0.6B) at [20c25c8](https://huggingface.co/datasets/tomaarsen/msmarco-Qwen3-Reranker-0.6B/tree/20c25c858f80ba96bdb58f1558746e077001303a)
+ * Size: 10,000 training samples
+ * Columns: <code>query</code>, <code>positive</code>, <code>negative_1</code>, <code>negative_2</code>, <code>negative_3</code>, <code>negative_4</code>, <code>negative_5</code>, <code>negative_6</code>, <code>negative_7</code>, <code>negative_8</code>, and <code>score</code>
+ * Approximate statistics based on the first 1000 samples:
+ | | query | positive | negative_1 | negative_2 | negative_3 | negative_4 | negative_5 | negative_6 | negative_7 | negative_8 | score |
+ |:--------|:---------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:-----------------------------------|
+ | type | string | string | string | string | string | string | string | string | string | string | list |
+ | details | <ul><li>min: 5 tokens</li><li>mean: 9.32 tokens</li><li>max: 40 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 80.5 tokens</li><li>max: 292 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 68.48 tokens</li><li>max: 210 tokens</li></ul> | <ul><li>min: 12 tokens</li><li>mean: 70.21 tokens</li><li>max: 204 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 70.05 tokens</li><li>max: 250 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 71.19 tokens</li><li>max: 190 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 72.01 tokens</li><li>max: 203 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 71.88 tokens</li><li>max: 199 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 73.13 tokens</li><li>max: 201 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 70.83 tokens</li><li>max: 200 tokens</li></ul> | <ul><li>size: 9 elements</li></ul> |
+ * Samples:
+ | query | positive | negative_1 | negative_2 | negative_3 | negative_4 | negative_5 | negative_6 | negative_7 | negative_8 | score |
+ |:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
+ | <code>what is clomiphene</code> | <code>Uses of This Medicine. Clomiphene is used as a fertility medicine in some women who are unable to become pregnant. Clomiphene probably works by changing the hormone balance of the body. In women, this causes ovulation to occur and prepares the body for pregnancy.ses of This Medicine. Clomiphene is used as a fertility medicine in some women who are unable to become pregnant. Clomiphene probably works by changing the hormone balance of the body. In women, this causes ovulation to occur and prepares the body for pregnancy.</code> | <code>Clomiphene citrate, a synthetic hormone commonly used to induce or regulate ovulation, is the most often prescribed fertility pill. Brand names for clomiphene citrate include Clomid and Serophene. Clomiphene works indirectly to stimulate ovulation.</code> | <code>Occasionally, clomiphene can stimulate the ovaries too much, causing multiple eggs to be released, which can result in multiple births, such as twins or triplets (see Clomid and Twins) . Clomiphene is one of the least expensive and easiest-to-use fertility drugs. However, it will not work for all types of infertility. Your healthcare provider needs to try to find your cause of infertility before you try clomiphene.</code> | <code>Clomiphene Citrate offers two benefits to the performance enhancing athlete with one being primary. Most commonly, this SERM is used for post cycle recovery purposes; specifically to stimulate natural testosterone production that has been suppressed due to the use of anabolic steroids.</code> | <code>PCOS and ovulation problems and Clomid treatment. Clomid (clomiphene citrate or Serophene) is an oral medication that is commonly used for the treatment of infertility. 
It is often given to try to induce ovulation in women that do not develop and release an egg (ovulate) on their own.</code> | <code>Indication: Clomid (clomiphene citrate) is often the first choice for treating infertility, because it's effective and been used for more than 40 years.</code> | <code>Clomid Description. 1 Clomid (clomiphene citrate tablets USP) is an orally administered, nonsteroidal, ovulatory stimulant designated chemically as 2-[p-(2-chloro-1,2-diphenylvinyl)phenoxy] triethylamine citrate (1:1). It has the molecular formula of C26H28ClNO • C6H8O7 and a molecular weight of 598.09.</code> | <code>PCOS and ovulation problems and Clomid treatment. Clomid (clomiphene citrate or Serophene) is an oral medication that is commonly used for the treatment of infertility. 1 It is often given to try to induce ovulation in women that do not develop and release an egg (ovulate) on their own. Clomid is started early in the menstrual cycle and is taken for five days either from cycle days 3 through 7, or from day 5 through 9. 2 Clomid is usually started at a dose of one tablet (50mg) daily-taken any time of day.</code> | <code>Clomid is taken as a pill. This is unlike the stronger fertility drugs, which require injection. Clomid is also very effective, stimulating ovulation 80 percent of the time. Clomid may also be marketed under the name Serophene, or you may see it sold under its generic name, clomiphene citrate. Note: Clomid can also be used as a treatment for male infertility. This article focuses on Clomid treatment in women.</code> | <code>[4.75390625, 6.9375, 3.92578125, 1.0400390625, 5.61328125, ...]</code> |
+ | <code>typical accountant cost for it contractor</code> | <code>In the current market, we’ve seen rates as low as £50 +VAT, and as high as £180 +VAT for dedicated contractor accountants. Interestingly, the average cost of contractor accounting has not risen in line with inflation over the past decade.</code> | <code>So, how much does a contractor cost, anywhere from 5% to 25% of the total project cost, with the average ranging 10-15%.ypically the contractor' s crew will be general carpentry trades people, some who may have more specialized skills. Exactly how a general contractor charges for a project depends on the type of contract you agree to. There are three common types of cost contracts, fixed price, time & materials and cost plus a fee.</code> | <code>1 Accountants charge $150-$400 or more an hour, depending on the type of work, the size of the firm and its location. 2 You'll pay lower rates for routine work done by a less-experienced associate or lesser-trained employee, such as $30-$50 for bookkeeping services. 3 An accountant's total fee depends on the project. For a simple start-up, expect a minimum of 0.5-1.5 hours of consultation ($75-$600) to go over your business structure and basic tax issues.</code> | <code>So, how much does a contractor cost, anywhere from 5% to 25% of the total project cost, with the average ranging 10-15%.xactly how a general contractor charges for a project depends on the type of contract you agree to. There are three common types of cost contracts, fixed price, time & materials and cost plus a fee. Each contract type has pros and cons for both the consumer and for the contractor.</code> | <code>1 Accountants charge $150-$400 or more an hour, depending on the type of work, the size of the firm and its location. 2 You'll pay lower rates for routine work done by a less-experienced associate or lesser-trained employee, such as $30-$50 for bookkeeping services. 
3 An accountant's total fee depends on the project.</code> | <code>average data entry keystrokes per hour salaries the average salary for data entry keystrokes per hour jobs is $ 20000</code> | <code>Accounting services are typically $250 to $400 per month, or $350 to $500 per quarter. Sales tax and bank recs included. We do all the processing, filing and tax deposits. 5 employees, bi-weekly payroll, direct deposit, $135 per month.</code> | <code>The less that is outsourced, the cheaper it will be for you. A bookkeeper should be paid between $15 and $18 per hour. An accountant with a undergraduate degree (4-years) should be paid somewhere around $20/hour but that still depends on what you're having them do. An accountant with a graduate degree (masters) should be paid between $25 and $30 per hour.</code> | <code>Pay by Experience Level for Intelligence Analyst. Median of all compensation (including tips, bonus, and overtime) by years of experience. Intelligence Analysts with a lot of experience tend to enjoy higher earnings.</code> | <code>[7.44921875, 3.271484375, 5.859375, 3.234375, 5.421875, ...]</code> |
+ | <code>what is mch on a blood test</code> | <code>What High Levels Mean. MCH levels in blood tests are considered high if they are 35 or higher. A normal hemoglobin level is considered to be in the range between 26 and 33 picograms per red blood cell. High MCH levels can indicate macrocytic anemia, which can be caused by insufficient vitamin B12.acrocytic RBCs are large so tend to have a higher MCH, while microcytic red cells would have a lower value.”. MCH is one of three red blood cell indices (MCHC and MCV are the other two). The measurements are done by machine and can help with diagnosis of medical problems.</code> | <code>MCH stands for mean corpuscular hemoglobin. It estimates the average amount of hemoglobin in each red blood cell, measured in picograms (a trillionth of a gram). Automated cell counters calculate the MCH, which is reported as part of a complete blood count (CBC) test. MCH may be low in iron-deficiency anemia, and may be high in anemia due to vitamin B12 or folate deficiency. Other forms of anemia can also cause MCH to be abnormal. Doctors only use the MCH as supporting information, not to make a diagnosis.</code> | <code>A. MCH stands for mean corpuscular hemoglobin. It estimates the average amount of hemoglobin in each red blood cell, measured in picograms (a trillionth of a gram). Automated cell counters calculate the MCH, which is reported as part of a complete blood count (CBC) test. MCH may be low in iron-deficiency anemia, and may be high in anemia due to vitamin B12 or folate deficiency. Other forms of anemia can also cause MCH to be abnormal.</code> | <code>The test used to determine the quantity of hemoglobin in the blood is known as the MCH blood test. The full form of MCH is Mean Corpuscular Hemoglobin. This test is therefore used to determine the average amount of hemoglobin per red blood cell in the body. 
The results of the MCH blood test are therefore reported in picograms, a tiny measure of weight.</code> | <code>MCH blood test high indicates that there is a poor supply of oxygen to the blood where as MCH blood test low mean that hemoglobin is too little in the cells indicating a lack of iron. It is important that iron is maintained at a certain level as too much or too little iron can be dangerous to your body.</code> | <code>slide 1 of 7. What Is MCH? MCH is the initialism for Mean Corpuscular Hemoglobin. Taken from Latin, the term refers to the average amount of hemoglobin found in red blood cells. A CBC (complete blood count) blood test can be used to monitor MCH levels in the blood. Lab Tests Online explains that the MCH aspect of a CBC test “is a measurement of the average amount of oxygen-carrying hemoglobin inside a red blood cell. Macrocytic RBCs are large so tend to have a higher MCH, while microcytic red cells would have a lower value..</code> | <code>The test used to determine the quantity of hemoglobin in the blood is known as the MCH blood test. The full form of MCH is Mean Corpuscular Hemoglobin. This test is therefore used to determine the average amount of hemoglobin per red blood cell in the body. The results of the MCH blood test are therefore reported in picograms, a tiny measure of weight. The normal range of the MCH blood test is between 26 and 33 pg per cell.</code> | <code>A MCHC test is a test that is carried out to test a person for anemia. The MCHC in a MCHC test stands for Mean Corpuscular Hemoglobin Concentration. MCHC is the calculation of the average hemoglobin inside a red blood cell. A MCHC test can be performed along with a MCV test (Mean Corpuscular Volume).Both levels are used to test people for anemia.The MCHC test is also known as the MCH blood test which tests the levels of hemoglobin in the blood. The MCHC test can be ordered as part of a complete blood count (CBC) test.CHC is measured in grams per deciliter. 
Normal readings for MCHC are 31 grams per deciliter to 35 grams per deciliter. A MCHC blood test may be ordered when a person is showing signs of fatigue or weakness, when there is an infection, is bleeding or bruising easily or when there is an inflammation.</code> | <code>The test looks at the average amount of hemoglobin per red cell. So MCHC = the amount of hemoglobin present in each red blood cell. A MCHC blood test could be ordered for someone who has signs of fatigue or weakness, when there is an infection, is bleeding or bruising easily or when there is noticeable inflammation.</code> | <code>[6.44921875, 7.05078125, 7.2109375, 8.40625, 6.53515625, ...]</code> |
+ * Loss: [<code>MarginMSELoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#marginmseloss)
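`MarginMSELoss` distills the teacher reranker (here, the Qwen3-Reranker-0.6B scores in the `score` column) into the student embedding model: the student is penalized when the gap between its query–positive and query–negative similarities deviates from the teacher's score gap. A minimal sketch of the core idea (plain Python with illustrative numbers; the actual loss operates on batches and typically dot-product similarities):

```python
def margin_mse(student_pos, student_neg, teacher_pos, teacher_neg):
    # Penalize the squared difference between the student's margin
    # (sim(q, pos) - sim(q, neg)) and the teacher's score margin.
    student_margin = student_pos - student_neg
    teacher_margin = teacher_pos - teacher_neg
    return (student_margin - teacher_margin) ** 2

# Illustrative values: student similarities vs. teacher (reranker) scores
loss = margin_mse(student_pos=0.93, student_neg=0.80,
                  teacher_pos=6.94, teacher_neg=3.93)
print(loss)  # (0.13 - 3.01)^2 = 8.2944
```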
+
+ ### Evaluation Dataset
+
+ #### msmarco-qwen3-reranker-0.6_b
+
+ * Dataset: [msmarco-qwen3-reranker-0.6_b](https://huggingface.co/datasets/tomaarsen/msmarco-Qwen3-Reranker-0.6B) at [20c25c8](https://huggingface.co/datasets/tomaarsen/msmarco-Qwen3-Reranker-0.6B/tree/20c25c858f80ba96bdb58f1558746e077001303a)
+ * Size: 1,000 evaluation samples
+ * Columns: <code>query</code>, <code>positive</code>, <code>negative_1</code>, <code>negative_2</code>, <code>negative_3</code>, <code>negative_4</code>, <code>negative_5</code>, <code>negative_6</code>, <code>negative_7</code>, <code>negative_8</code>, and <code>score</code>
+ * Approximate statistics based on the first 1000 samples:
+ | | query | positive | negative_1 | negative_2 | negative_3 | negative_4 | negative_5 | negative_6 | negative_7 | negative_8 | score |
+ |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:-----------------------------------|
+ | type | string | string | string | string | string | string | string | string | string | string | list |
+ | details | <ul><li>min: 4 tokens</li><li>mean: 9.26 tokens</li><li>max: 24 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 81.57 tokens</li><li>max: 231 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 69.28 tokens</li><li>max: 211 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 69.02 tokens</li><li>max: 190 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 70.8 tokens</li><li>max: 230 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 70.69 tokens</li><li>max: 229 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 72.37 tokens</li><li>max: 214 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 68.96 tokens</li><li>max: 208 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 71.83 tokens</li><li>max: 237 tokens</li></ul> | <ul><li>min: 13 tokens</li><li>mean: 70.17 tokens</li><li>max: 210 tokens</li></ul> | <ul><li>size: 9 elements</li></ul> |
+ * Samples:
+ | query | positive | negative_1 | negative_2 | negative_3 | negative_4 | negative_5 | negative_6 | negative_7 | negative_8 | score |
+ |:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
+ | <code>how many people employed by shell</code> | <code>Shell worldwide. Royal Dutch Shell was formed in 1907, although our history dates back to the early 19th century, to a small shop in London where the Samuel family sold sea shells. Today, Shell is one of the world’s major energy companies, employing an average of 93,000 people and operating in more than 70 countries. Our headquarters are in The Hague, the Netherlands, and our Chief Executive Officer is Ben van Beurden.</code> | <code>Show sources information. This statistic shows the number of employees at SeaWorld Entertainment, Inc. in the United States, by type. As of December 2016, SeaWorld employed 5,000 full-time employees and counted approximately 13,000 seasonal employees during their peak operating season.</code> | <code>Jobs, companies, people, and articles for LinkedIn’s Payroll Specialist - Addus Homecare, Inc. members. Insights about Payroll Specialist - Addus Homecare, Inc. members on LinkedIn. Median salary $31,300.</code> | <code>As of July 2014, there are 139 million people employed in the United States. This number is up by 209,000 employees from June and by 1.47 million from the beginning of 2014.</code> | <code>average data entry keystrokes per hour salaries the average salary for data entry keystrokes per hour jobs is $ 20000</code> | <code>Research and review Plano Synergy jobs. Learn more about a career with Plano Synergy including all recent jobs, hiring trends, salaries, work environment and more. Find Jobs Company Reviews Find Salaries Find Resumes Employers / Post Job Upload your resume Sign in</code> | <code>From millions of real job salary data. 13 Customer Support Specialist salary data. Average Customer Support Specialist salary is $59,032 Detailed Customer Support Specialist starting salary, median salary, pay scale, bonus data report Register & Know how much $ you can earn \| Sign In</code> | <code>From millions of real job salary data. 1 Ceo Ally salary data. 
Average Ceo Ally salary is $55,000 Detailed Ceo Ally starting salary, median salary, pay scale, bonus data report</code> | <code>HelpSystems benefits and perks, including insurance benefits, retirement benefits, and vacation policy. Reported anonymously by HelpSystems employees. Glassdoor uses cookies to improve your site experience.</code> | <code>[6.265625, -1.3671875, -6.91796875, 1.111328125, -7.96875, ...]</code> |
+ | <code>what is a lcsw</code> | <code>LCSW is an acronym for licensed clinical social worker, and people with this title are skilled professionals who meet certain requirements and work in a variety of fields. The term social worker is not always synonymous with licensed clinical social worker.</code> | <code>LISW means the person is a Licensed Independent Social Worker. LCSW means the person is a Licensed Clinical Social Worker. Source(s): Introduction to Social Work 101 at University of Nevada, Las Vega (UNLV) Dorothy K. · 1 decade ago.</code> | <code>An LCSW is a licensed clinical social worker. A LMHC is the newest addition to the field of mental health. They are highly similar and can do most of the same things with few exceptions. One thing to keep in mind is that because the LMHC lincense is so new, there are fewer in number in the field.n LCSW is a licensed clinical social worker. A LMHC is the newest addition to the field of mental health. They are highly similar and can do most of the same things with few exceptions. One thing to keep in mind is that because the LMHC lincense is so new, there are fewer in number in the field.</code> | <code>The Licensed Clinical Social Worker or LCSW, is a sub-sector within the field of Social Work. They work with clients in order to help them deal with issues involving their mental and emotional health. This could be related to substance abuse, past trauma or mental illness.</code> | <code>Licensed Clinical Social Worker \| LCSW. The Licensed Clinical Social Worker or LCSW, is a sub-sector within the field of Social Work. LCSW's work with clients in order to help deal with issues involving mental and emotional health. 
There are a wide variety of specializations the Licensed Clinical Social Worker can focus on.</code> | <code>The LMSW exam is a computer-based test containing 170 multiple-choice questions designed to measure minimum competencies in four categories of social work practice: Human development, diversity, and behavior in the environment. Assessment and intervention planning.</code> | <code>The Licensed Clinical Social Worker, also known as the LCSW, is a branch of social work that specializes in mental health therapy in a counseling format. Becoming an LCSW requires a significant degree of training, including having earned a Master of Social Work (MSW) degree from a Council on Social Work Education (CSWE) accredited program.</code> | <code>a. The examination requirements for licensure as an LCSW include passing the Clinical Examination of the ASWB or the Clinical Social Workers Examination of the State of California. Scope of practice-Limitations. a.To the extent they are prepared through education and training, an LCSW can engage in all acts and practices defined as the practice of clinical social work. Certified Social Work (CSW): CSW means a licensed certified social worker. A CSW must have a master s degree.</code> | <code>The LTCM Client is a way for companies to stay in touch with you, their customers, in a way that is unobtrusive and completely under the users' control. It's an application that runs quietly on the computer. Users can and should customize the client to match their desired preferences.</code> | <code>[7.34375, 6.046875, 7.09765625, 6.46484375, 7.28515625, ...]</code> |
711
+ | <code>does oolong tea have much caffeine?</code> | <code>At a given weight, tea contains more caffeine than coffee, but this doesn’t mean that a usual portion of tea contains more caffeine than coffee because tea is usually brewed in a weak way. Some kinds of tea, such as oolong and black tea, contain higher level of caffeine than most other teas. Among six basic teas (green, black, yellow, white, oolong, dark), green tea contains less caffeine than black tea and white tea contains less than green tea. But many studies found that the caffeine content varies more among individual teas than it does among broad categories.</code> | <code>Actually, oolong tea has less caffeine than coffee and black tea. A cup of oolong tea only has about 1/3 of caffeine of a cup of coffee. According to a research conducted by HICKS M.B, the caffeine decreases whenever the tea leaves go through the process of brewing.</code> | <code>Oolong tea contains caffeine. Caffeine works by stimulating the central nervous system (CNS), heart, and muscles. Oolong tea also contains theophylline and theobromine, which are chemicals similar to caffeine. Too much oolong tea, more than five cups per day, can cause side effects because of the caffeine.</code> | <code>Oolong tea, made from more mature leaves, usually have less caffeine than green tea. On the flip side, mature leaves contain less theanine, a sweet, natural relaxant that makes a tea much less caffeinated than it actually is. That is the theory, anyway.</code> | <code>Oolong tea is a product made from the leaves, buds, and stems of the Camellia sinensis plant. This is the same plant that is also used to make black tea and green tea. The difference is in the processing.Oolong tea is partially fermented, black tea is fully fermented, and green tea is unfermented. Oolong tea is used to sharpen thinking skills and improve mental alertness. 
It is also used to prevent cancer, tooth decay, osteoporosis, and heart disease.owever, do not drink more than 2 cups a day of oolong tea. That amount of tea contains about 200 mg of caffeine. Too much caffeine during pregnancy might cause premature delivery, low birth weight, and harm to the baby.</code> | <code>A Department of Nutritional Services report provides the following ranges of caffeine content for a cup of tea made with loose leaves: 1 Black Tea: 23 - 110 mg. 2 Oolong Tea: 12 - 55 mg. Green Tea: 8 - 36 mg.</code> | <code>Oolong tea is a product made from the leaves, buds, and stems of the Camellia sinensis plant. This is the same plant that is also used to make black tea and green tea. The difference is in the processing. Oolong tea is partially fermented, black tea is fully fermented, and green tea is unfermented. Oolong tea is used to sharpen thinking skills and improve mental alertness. It is also used to prevent cancer, tooth decay, osteoporosis, and heart disease.</code> | <code>Health Effects of Tea – Caffeine. In dry form, a kilogram of black tea has twice the caffeine as a kilogram of coffee…. But one kilogram of black tea makes about 450 cups of tea and one kilogram of coffee makes about 100 cups of coffee, so…. There is less caffeine in a cup of tea than in a cup of coffee. Green teas have less caffeine than black teas, and white teas have even less caffeine than green teas. Oolong teas fall between black and green teas. Herbal tea, because it is not made from the same tea plant, is caffeine-free, naturally. Here is a graphical representation of their respective caffeine content.</code> | <code>The average 8-ounce serving of brewed black tea contains 14 to 70 mg of caffeine. This compares to 24 to 45 mg of caffeine found in green tea. An 8-ounce glass of instant iced tea prepared with water contains 11 to 47 mg of caffeine. Most ready-to-drink bottled teas contain 5 to 40 mg of caffeine. 
Just as with coffee, decaffeinated tea still contains 5 to 10 mg of caffeine per cup.</code> | <code>[7.60546875, 8.78125, 9.109375, 8.609375, 7.984375, ...]</code> |
712
+ * Loss: [<code>MarginMSELoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#marginmseloss)
713
+
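MarginMSELoss trains the model to reproduce a teacher's score *margin* between a positive and a negative passage, rather than the absolute scores. A minimal sketch of the objective in plain Python (the function name and toy scores are illustrative, not the library's implementation; the teacher margins are assumed to come from precomputed cross-encoder scores such as the score column in the samples above):

```python
def margin_mse(student_pos: float, student_neg: float, teacher_margin: float) -> float:
    """Squared error between the student's score margin and the teacher's margin.

    student_pos / student_neg: the student's similarity scores for the
    (query, positive) and (query, negative) pairs.
    teacher_margin: teacher_pos_score - teacher_neg_score, precomputed by a
    teacher model (e.g. a cross-encoder).
    """
    student_margin = student_pos - student_neg
    return (student_margin - teacher_margin) ** 2

# A student that reproduces the teacher's margin exactly incurs zero loss:
print(margin_mse(7.5, 5.0, 2.5))  # -> 0.0
# A student with the right ordering but the wrong margin is still penalized:
print(margin_mse(7.5, 7.0, 2.5))  # -> 4.0
```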
714
+ ### Training Hyperparameters
715
+ #### Non-Default Hyperparameters
716
+
717
+ - `eval_strategy`: steps
718
+ - `per_device_train_batch_size`: 16
719
+ - `per_device_eval_batch_size`: 16
720
+ - `learning_rate`: 4e-05
721
+ - `num_train_epochs`: 1
722
+ - `warmup_ratio`: 0.1
723
+ - `bf16`: True
724
+
725
+ #### All Hyperparameters
726
+ <details><summary>Click to expand</summary>
727
+
728
+ - `overwrite_output_dir`: False
729
+ - `do_predict`: False
730
+ - `eval_strategy`: steps
731
+ - `prediction_loss_only`: True
732
+ - `per_device_train_batch_size`: 16
733
+ - `per_device_eval_batch_size`: 16
734
+ - `per_gpu_train_batch_size`: None
735
+ - `per_gpu_eval_batch_size`: None
736
+ - `gradient_accumulation_steps`: 1
737
+ - `eval_accumulation_steps`: None
738
+ - `torch_empty_cache_steps`: None
739
+ - `learning_rate`: 4e-05
740
+ - `weight_decay`: 0.0
741
+ - `adam_beta1`: 0.9
742
+ - `adam_beta2`: 0.999
743
+ - `adam_epsilon`: 1e-08
744
+ - `max_grad_norm`: 1.0
745
+ - `num_train_epochs`: 1
746
+ - `max_steps`: -1
747
+ - `lr_scheduler_type`: linear
748
+ - `lr_scheduler_kwargs`: {}
749
+ - `warmup_ratio`: 0.1
750
+ - `warmup_steps`: 0
751
+ - `log_level`: passive
752
+ - `log_level_replica`: warning
753
+ - `log_on_each_node`: True
754
+ - `logging_nan_inf_filter`: True
755
+ - `save_safetensors`: True
756
+ - `save_on_each_node`: False
757
+ - `save_only_model`: False
758
+ - `restore_callback_states_from_checkpoint`: False
759
+ - `no_cuda`: False
760
+ - `use_cpu`: False
761
+ - `use_mps_device`: False
762
+ - `seed`: 42
763
+ - `data_seed`: None
764
+ - `jit_mode_eval`: False
765
+ - `use_ipex`: False
766
+ - `bf16`: True
767
+ - `fp16`: False
768
+ - `fp16_opt_level`: O1
769
+ - `half_precision_backend`: auto
770
+ - `bf16_full_eval`: False
771
+ - `fp16_full_eval`: False
772
+ - `tf32`: None
773
+ - `local_rank`: 0
774
+ - `ddp_backend`: None
775
+ - `tpu_num_cores`: None
776
+ - `tpu_metrics_debug`: False
777
+ - `debug`: []
778
+ - `dataloader_drop_last`: False
779
+ - `dataloader_num_workers`: 0
780
+ - `dataloader_prefetch_factor`: None
781
+ - `past_index`: -1
782
+ - `disable_tqdm`: False
783
+ - `remove_unused_columns`: True
784
+ - `label_names`: None
785
+ - `load_best_model_at_end`: False
786
+ - `ignore_data_skip`: False
787
+ - `fsdp`: []
788
+ - `fsdp_min_num_params`: 0
789
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
790
+ - `tp_size`: 0
791
+ - `fsdp_transformer_layer_cls_to_wrap`: None
792
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
793
+ - `deepspeed`: None
794
+ - `label_smoothing_factor`: 0.0
795
+ - `optim`: adamw_torch
796
+ - `optim_args`: None
797
+ - `adafactor`: False
798
+ - `group_by_length`: False
799
+ - `length_column_name`: length
800
+ - `ddp_find_unused_parameters`: None
801
+ - `ddp_bucket_cap_mb`: None
802
+ - `ddp_broadcast_buffers`: False
803
+ - `dataloader_pin_memory`: True
804
+ - `dataloader_persistent_workers`: False
805
+ - `skip_memory_metrics`: True
806
+ - `use_legacy_prediction_loop`: False
807
+ - `push_to_hub`: False
808
+ - `resume_from_checkpoint`: None
809
+ - `hub_model_id`: None
810
+ - `hub_strategy`: every_save
811
+ - `hub_private_repo`: None
812
+ - `hub_always_push`: False
813
+ - `gradient_checkpointing`: False
814
+ - `gradient_checkpointing_kwargs`: None
815
+ - `include_inputs_for_metrics`: False
816
+ - `include_for_metrics`: []
817
+ - `eval_do_concat_batches`: True
818
+ - `fp16_backend`: auto
819
+ - `push_to_hub_model_id`: None
820
+ - `push_to_hub_organization`: None
821
+ - `mp_parameters`:
822
+ - `auto_find_batch_size`: False
823
+ - `full_determinism`: False
824
+ - `torchdynamo`: None
825
+ - `ray_scope`: last
826
+ - `ddp_timeout`: 1800
827
+ - `torch_compile`: False
828
+ - `torch_compile_backend`: None
829
+ - `torch_compile_mode`: None
830
+ - `include_tokens_per_second`: False
831
+ - `include_num_input_tokens_seen`: False
832
+ - `neftune_noise_alpha`: None
833
+ - `optim_target_modules`: None
834
+ - `batch_eval_metrics`: False
835
+ - `eval_on_start`: False
836
+ - `use_liger_kernel`: False
837
+ - `eval_use_gather_object`: False
838
+ - `average_tokens_across_devices`: False
839
+ - `prompts`: None
840
+ - `batch_sampler`: batch_sampler
841
+ - `multi_dataset_batch_sampler`: proportional
842
+ - `router_mapping`: {}
843
+ - `learning_rate_mapping`: {}
844
+
845
+ </details>
846
+
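With `warmup_ratio: 0.1` and a `linear` scheduler, the learning rate ramps up over the first 10% of training steps and then decays linearly toward zero. A small sketch of the step arithmetic (the total of roughly 625 steps per epoch is inferred from the training log, where step 620 corresponds to epoch 0.992):

```python
def warmup_steps(total_steps: int, warmup_ratio: float) -> int:
    # Number of linear-warmup steps before the linear decay phase begins.
    return int(total_steps * warmup_ratio)

# One epoch at batch size 16 ran for roughly 625 optimizer steps in this run,
# so warmup covered about the first 62 steps.
print(warmup_steps(625, 0.1))  # -> 62
```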
847
+ ### Training Logs
848
+ | Epoch | Step | Training Loss | Validation Loss | msmarco-eval-1kq-1kd_cosine_ndcg@10 | NanoMSMARCO_cosine_ndcg@10 | NanoNFCorpus_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
849
+ |:-----:|:----:|:-------------:|:---------------:|:-----------------------------------:|:--------------------------:|:---------------------------:|:---------------------:|:----------------------------:|
850
+ | 0.032 | 20 | 527.7782 | - | - | - | - | - | - |
851
+ | 0.064 | 40 | 135.2216 | - | - | - | - | - | - |
852
+ | 0.096 | 60 | 77.5416 | - | - | - | - | - | - |
853
+ | 0.128 | 80 | 73.6011 | - | - | - | - | - | - |
854
+ | 0.16 | 100 | 50.552 | 40.1823 | 0.0575 | 0.0 | 0.0187 | 0.0071 | 0.0086 |
855
+ | 0.192 | 120 | 33.1184 | - | - | - | - | - | - |
856
+ | 0.224 | 140 | 29.2259 | - | - | - | - | - | - |
857
+ | 0.256 | 160 | 28.9969 | - | - | - | - | - | - |
858
+ | 0.288 | 180 | 27.3632 | - | - | - | - | - | - |
859
+ | 0.32 | 200 | 25.5289 | 28.1320 | 0.1823 | 0.0426 | 0.0185 | 0.0235 | 0.0282 |
860
+ | 0.352 | 220 | 27.6791 | - | - | - | - | - | - |
861
+ | 0.384 | 240 | 26.2424 | - | - | - | - | - | - |
862
+ | 0.416 | 260 | 24.3815 | - | - | - | - | - | - |
863
+ | 0.448 | 280 | 22.2716 | - | - | - | - | - | - |
864
+ | 0.48 | 300 | 22.4707 | 23.6126 | 0.5543 | 0.1939 | 0.0416 | 0.0876 | 0.1077 |
865
+ | 0.512 | 320 | 24.0737 | - | - | - | - | - | - |
866
+ | 0.544 | 340 | 23.0891 | - | - | - | - | - | - |
867
+ | 0.576 | 360 | 21.7429 | - | - | - | - | - | - |
868
+ | 0.608 | 380 | 18.1801 | - | - | - | - | - | - |
869
+ | 0.64 | 400 | 19.0422 | 19.1492 | 0.7010 | 0.2028 | 0.0390 | 0.0445 | 0.0954 |
870
+ | 0.672 | 420 | 17.5942 | - | - | - | - | - | - |
871
+ | 0.704 | 440 | 16.2377 | - | - | - | - | - | - |
872
+ | 0.736 | 460 | 17.1898 | - | - | - | - | - | - |
873
+ | 0.768 | 480 | 18.1398 | - | - | - | - | - | - |
874
+ | 0.8 | 500 | 16.7881 | 17.4740 | 0.8166 | 0.2706 | 0.0576 | 0.1025 | 0.1435 |
875
+ | 0.832 | 520 | 16.3059 | - | - | - | - | - | - |
876
+ | 0.864 | 540 | 16.6533 | - | - | - | - | - | - |
877
+ | 0.896 | 560 | 16.0106 | - | - | - | - | - | - |
878
+ | 0.928 | 580 | 15.2358 | - | - | - | - | - | - |
879
+ | 0.96 | 600 | 16.9444 | 16.6125 | 0.8235 | 0.2751 | 0.0745 | 0.1049 | 0.1515 |
880
+ | 0.992 | 620 | 14.4947 | - | - | - | - | - | - |
881
+ | -1 | -1 | - | - | 0.8243 | 0.2824 | 0.0725 | 0.1228 | 0.1592 |
882
+
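The `cosine_ndcg@10` columns report normalized discounted cumulative gain over the top 10 retrieved documents. A self-contained sketch of the metric as it is standardly defined (not necessarily the evaluator's exact code):

```python
import math

def dcg(relevances):
    # Discounted cumulative gain: relevance at later ranks contributes less.
    return sum(rel / math.log2(rank + 2) for rank, rel in enumerate(relevances))

def ndcg_at_k(ranked_relevances, k=10):
    # Normalize by the DCG of the ideal (best possible) ordering.
    ideal = sorted(ranked_relevances, reverse=True)
    ideal_dcg = dcg(ideal[:k])
    return dcg(ranked_relevances[:k]) / ideal_dcg if ideal_dcg > 0 else 0.0

# Placing the single relevant document first is a perfect ranking:
print(ndcg_at_k([1, 0, 0]))  # -> 1.0
# Placing it second is discounted by 1/log2(3) ~ 0.63:
print(round(ndcg_at_k([0, 1, 0]), 2))  # -> 0.63
```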
883
+
884
+ ### Framework Versions
885
+ - Python: 3.11.10
886
+ - Sentence Transformers: 5.1.0.dev0
887
+ - Transformers: 4.51.2
888
+ - PyTorch: 2.5.1+cu124
889
+ - Accelerate: 1.5.2
890
+ - Datasets: 3.5.0
891
+ - Tokenizers: 0.21.0
892
+
893
+ ## Citation
894
+
895
+ ### BibTeX
896
+
897
+ #### Sentence Transformers
898
+ ```bibtex
899
+ @inproceedings{reimers-2019-sentence-bert,
900
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
901
+ author = "Reimers, Nils and Gurevych, Iryna",
902
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
903
+ month = "11",
904
+ year = "2019",
905
+ publisher = "Association for Computational Linguistics",
906
+ url = "https://arxiv.org/abs/1908.10084",
907
+ }
908
+ ```
909
+
910
+ #### MarginMSELoss
911
+ ```bibtex
912
+ @misc{hofstätter2021improving,
913
+ title={Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation},
914
+ author={Sebastian Hofstätter and Sophia Althammer and Michael Schröder and Mete Sertkan and Allan Hanbury},
915
+ year={2021},
916
+ eprint={2010.02666},
917
+ archivePrefix={arXiv},
918
+ primaryClass={cs.IR}
919
+ }
920
+ ```
921
+
922
+ <!--
923
+ ## Glossary
924
+
925
+ *Clearly define terms in order to be accessible across audiences.*
926
+ -->
927
+
928
+ <!--
929
+ ## Model Card Authors
930
+
931
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
932
+ -->
933
+
934
+ <!--
935
+ ## Model Card Contact
936
+
937
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
938
+ -->
config.json ADDED
@@ -0,0 +1,45 @@
1
+ {
2
+ "architectures": [
3
+ "ModernBertModel"
4
+ ],
5
+ "attention_bias": false,
6
+ "attention_dropout": 0.0,
7
+ "bos_token_id": 34533,
8
+ "classifier_activation": "gelu",
9
+ "classifier_bias": false,
10
+ "classifier_dropout": 0.0,
11
+ "classifier_pooling": "mean",
12
+ "cls_token_id": 34533,
13
+ "decoder_bias": true,
14
+ "deterministic_flash_attn": false,
15
+ "embedding_dropout": 0.0,
16
+ "eos_token_id": 34534,
17
+ "global_attn_every_n_layers": 3,
18
+ "global_rope_theta": 160000.0,
19
+ "gradient_checkpointing": false,
20
+ "hidden_activation": "gelu",
21
+ "hidden_size": 768,
22
+ "initializer_cutoff_factor": 2.0,
23
+ "initializer_range": 0.02,
24
+ "intermediate_size": 1152,
25
+ "layer_norm_eps": 1e-05,
26
+ "local_attention": 128,
27
+ "local_rope_theta": 10000.0,
28
+ "max_position_embeddings": 8192,
29
+ "mlp_bias": false,
30
+ "mlp_dropout": 0.0,
31
+ "model_type": "modernbert",
32
+ "norm_bias": false,
33
+ "norm_eps": 1e-05,
34
+ "num_attention_heads": 12,
35
+ "num_hidden_layers": 22,
36
+ "pad_token_id": 34535,
37
+ "position_embedding_type": "absolute",
38
+ "repad_logits_with_grad": false,
39
+ "sep_token_id": 34534,
40
+ "sparse_pred_ignore_index": -100,
41
+ "sparse_prediction": false,
42
+ "torch_dtype": "float32",
43
+ "transformers_version": "4.51.2",
44
+ "vocab_size": 34623
45
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,14 @@
1
+ {
2
+ "model_type": "SentenceTransformer",
3
+ "__version__": {
4
+ "sentence_transformers": "5.1.0.dev0",
5
+ "transformers": "4.51.2",
6
+ "pytorch": "2.5.1+cu124"
7
+ },
8
+ "prompts": {
9
+ "query": "",
10
+ "document": ""
11
+ },
12
+ "default_prompt_name": null,
13
+ "similarity_fn_name": "cosine"
14
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d6cdffee365affea65a00180e330def7be8b08b83ae1c14f2d6665c5bc18c5e7
3
+ size 547701496
modules.json ADDED
@@ -0,0 +1,14 @@
1
+ [
2
+ {
3
+ "idx": 0,
4
+ "name": "0",
5
+ "path": "",
6
+ "type": "sentence_transformers.models.Transformer"
7
+ },
8
+ {
9
+ "idx": 1,
10
+ "name": "1",
11
+ "path": "1_Pooling",
12
+ "type": "sentence_transformers.models.Pooling"
13
+ }
14
+ ]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
1
+ {
2
+ "max_seq_length": 8192,
3
+ "do_lower_case": false
4
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
1
+ {
2
+ "cls_token": {
3
+ "content": "[CLS]",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "mask_token": {
10
+ "content": "[MASK]",
11
+ "lstrip": true,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": {
17
+ "content": "[PAD]",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "sep_token": {
24
+ "content": "[SEP]",
25
+ "lstrip": false,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ },
30
+ "unk_token": {
31
+ "content": "[UNK]",
32
+ "lstrip": false,
33
+ "normalized": false,
34
+ "rstrip": false,
35
+ "single_word": false
36
+ }
37
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,945 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "<|padding|>",
5
+ "lstrip": false,
6
+ "normalized": false,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "34508": {
12
+ "content": " ",
13
+ "lstrip": false,
14
+ "normalized": true,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": false
18
+ },
19
+ "34509": {
20
+ "content": " ",
21
+ "lstrip": false,
22
+ "normalized": true,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": false
26
+ },
27
+ "34510": {
28
+ "content": " ",
29
+ "lstrip": false,
30
+ "normalized": true,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": false
34
+ },
35
+ "34511": {
36
+ "content": " ",
37
+ "lstrip": false,
38
+ "normalized": true,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": false
42
+ },
43
+ "34512": {
44
+ "content": " ",
45
+ "lstrip": false,
46
+ "normalized": true,
47
+ "rstrip": false,
48
+ "single_word": false,
49
+ "special": false
50
+ },
51
+ "34513": {
52
+ "content": " ",
53
+ "lstrip": false,
54
+ "normalized": true,
55
+ "rstrip": false,
56
+ "single_word": false,
57
+ "special": false
58
+ },
59
+ "34514": {
60
+ "content": " ",
61
+ "lstrip": false,
62
+ "normalized": true,
63
+ "rstrip": false,
64
+ "single_word": false,
65
+ "special": false
66
+ },
67
+ "34515": {
68
+ "content": " ",
69
+ "lstrip": false,
70
+ "normalized": true,
71
+ "rstrip": false,
72
+ "single_word": false,
73
+ "special": false
74
+ },
75
+ "34516": {
76
+ "content": " ",
77
+ "lstrip": false,
78
+ "normalized": true,
79
+ "rstrip": false,
80
+ "single_word": false,
81
+ "special": false
82
+ },
83
+ "34517": {
84
+ "content": " ",
85
+ "lstrip": false,
86
+ "normalized": true,
87
+ "rstrip": false,
88
+ "single_word": false,
89
+ "special": false
90
+ },
91
+ "34518": {
92
+ "content": " ",
93
+ "lstrip": false,
94
+ "normalized": true,
95
+ "rstrip": false,
96
+ "single_word": false,
97
+ "special": false
98
+ },
99
+ "34519": {
100
+ "content": " ",
101
+ "lstrip": false,
102
+ "normalized": true,
103
+ "rstrip": false,
104
+ "single_word": false,
105
+ "special": false
106
+ },
107
+ "34520": {
108
+ "content": " ",
109
+ "lstrip": false,
110
+ "normalized": true,
111
+ "rstrip": false,
112
+ "single_word": false,
113
+ "special": false
114
+ },
115
+ "34521": {
116
+ "content": " ",
117
+ "lstrip": false,
118
+ "normalized": true,
119
+ "rstrip": false,
120
+ "single_word": false,
121
+ "special": false
122
+ },
123
+ "34522": {
124
+ "content": " ",
125
+ "lstrip": false,
126
+ "normalized": true,
127
+ "rstrip": false,
128
+ "single_word": false,
129
+ "special": false
130
+ },
131
+ "34523": {
132
+ "content": " ",
133
+ "lstrip": false,
134
+ "normalized": true,
135
+ "rstrip": false,
136
+ "single_word": false,
137
+ "special": false
138
+ },
139
+ "34524": {
140
+ "content": " ",
141
+ "lstrip": false,
142
+ "normalized": true,
143
+ "rstrip": false,
144
+ "single_word": false,
145
+ "special": false
146
+ },
147
+ "34525": {
148
+ "content": " ",
149
+ "lstrip": false,
150
+ "normalized": true,
151
+ "rstrip": false,
152
+ "single_word": false,
153
+ "special": false
154
+ },
155
+ "34526": {
156
+ "content": " ",
157
+ "lstrip": false,
158
+ "normalized": true,
159
+ "rstrip": false,
160
+ "single_word": false,
161
+ "special": false
162
+ },
163
+ "34527": {
164
+ "content": " ",
165
+ "lstrip": false,
166
+ "normalized": true,
167
+ "rstrip": false,
168
+ "single_word": false,
169
+ "special": false
170
+ },
171
+ "34528": {
172
+ "content": " ",
173
+ "lstrip": false,
174
+ "normalized": true,
175
+ "rstrip": false,
176
+ "single_word": false,
177
+ "special": false
178
+ },
179
+ "34529": {
180
+ "content": " ",
181
+ "lstrip": false,
182
+ "normalized": true,
183
+ "rstrip": false,
184
+ "single_word": false,
185
+ "special": false
186
+ },
187
+ "34530": {
188
+ "content": " ",
189
+ "lstrip": false,
190
+ "normalized": true,
191
+ "rstrip": false,
192
+ "single_word": false,
193
+ "special": false
194
+ },
195
+ "34531": {
196
+ "content": "<|endoftext|>",
197
+ "lstrip": false,
198
+ "normalized": false,
199
+ "rstrip": false,
200
+ "single_word": false,
201
+ "special": true
202
+ },
203
+ "34532": {
204
+ "content": "[UNK]",
205
+ "lstrip": false,
206
+ "normalized": false,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": true
210
+ },
211
+ "34533": {
212
+ "content": "[CLS]",
213
+ "lstrip": false,
214
+ "normalized": false,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": true
218
+ },
219
+ "34534": {
220
+ "content": "[SEP]",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "34535": {
228
+ "content": "[PAD]",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "34536": {
236
+ "content": "[MASK]",
237
+ "lstrip": true,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "34537": {
244
+ "content": "[unused0]",
245
+ "lstrip": false,
246
+ "normalized": true,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": false
250
+ },
251
+ "34538": {
252
+ "content": "[unused1]",
253
+ "lstrip": false,
254
+ "normalized": true,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": false
258
+ },
259
+ "34539": {
260
+ "content": "[unused2]",
261
+ "lstrip": false,
262
+ "normalized": true,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": false
266
+ },
267
+ "34540": {
268
+ "content": "[unused3]",
269
+ "lstrip": false,
270
+ "normalized": true,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": false
274
+ },
275
+ "34541": {
276
+ "content": "[unused4]",
277
+ "lstrip": false,
278
+ "normalized": true,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": false
282
+ },
283
+ "34542": {
284
+ "content": "[unused5]",
285
+ "lstrip": false,
286
+ "normalized": true,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": false
290
+ },
291
+ "34543": {
292
+ "content": "[unused6]",
293
+ "lstrip": false,
294
+ "normalized": true,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": false
298
+ },
299
+ "34544": {
300
+ "content": "[unused7]",
301
+ "lstrip": false,
302
+ "normalized": true,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": false
306
+ },
307
+ "34545": {
308
+ "content": "[unused8]",
309
+ "lstrip": false,
310
+ "normalized": true,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": false
314
+ },
315
+ "34546": {
316
+ "content": "[unused9]",
317
+ "lstrip": false,
318
+ "normalized": true,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": false
322
+ },
323
+ "34547": {
324
+ "content": "[unused10]",
325
+ "lstrip": false,
326
+ "normalized": true,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": false
330
+ },
331
+ "34548": {
332
+ "content": "[unused11]",
333
+ "lstrip": false,
334
+ "normalized": true,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": false
338
+ },
339
+ "34549": {
340
+ "content": "[unused12]",
341
+ "lstrip": false,
342
+ "normalized": true,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": false
346
+ },
347
+ "34550": {
348
+ "content": "[unused13]",
349
+ "lstrip": false,
350
+ "normalized": true,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": false
354
+ },
355
+ "34551": {
356
+ "content": "[unused14]",
357
+ "lstrip": false,
358
+ "normalized": true,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": false
362
+ },
363
+ "34552": {
364
+ "content": "[unused15]",
365
+ "lstrip": false,
366
+ "normalized": true,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": false
370
+ },
371
+ "34553": {
372
+ "content": "[unused16]",
373
+ "lstrip": false,
374
+ "normalized": true,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": false
378
+ },
379
+ "34554": {
380
+ "content": "[unused17]",
381
+ "lstrip": false,
382
+ "normalized": true,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": false
386
+ },
387
+ "34555": {
388
+ "content": "[unused18]",
389
+ "lstrip": false,
390
+ "normalized": true,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": false
394
+ },
395
+ "34556": {
396
+ "content": "[unused19]",
397
+ "lstrip": false,
398
+ "normalized": true,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": false
402
+ },
403
+ "34557": {
404
+ "content": "[unused20]",
405
+ "lstrip": false,
406
+ "normalized": true,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": false
410
+ },
411
+ "34558": {
412
+ "content": "[unused21]",
413
+ "lstrip": false,
414
+ "normalized": true,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": false
418
+ },
419
+ "34559": {
420
+ "content": "[unused22]",
421
+ "lstrip": false,
422
+ "normalized": true,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": false
426
+ },
427
+ "34560": {
428
+ "content": "[unused23]",
429
+ "lstrip": false,
430
+ "normalized": true,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": false
434
+ },
435
+ "34561": {
436
+ "content": "[unused24]",
437
+ "lstrip": false,
438
+ "normalized": true,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": false
442
+ },
443
+ "34562": {
444
+ "content": "[unused25]",
445
+ "lstrip": false,
446
+ "normalized": true,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": false
450
+ },
451
+ "34563": {
452
+ "content": "[unused26]",
453
+ "lstrip": false,
454
+ "normalized": true,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": false
458
+ },
459
+ "34564": {
460
+ "content": "[unused27]",
461
+ "lstrip": false,
462
+ "normalized": true,
463
+ "rstrip": false,
464
+ "single_word": false,
+ "special": false
+ },
+ "34565": {
+ "content": "[unused28]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34566": {
+ "content": "[unused29]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34567": {
+ "content": "[unused30]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34568": {
+ "content": "[unused31]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34569": {
+ "content": "[unused32]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34570": {
+ "content": "[unused33]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34571": {
+ "content": "[unused34]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34572": {
+ "content": "[unused35]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34573": {
+ "content": "[unused36]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34574": {
+ "content": "[unused37]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34575": {
+ "content": "[unused38]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34576": {
+ "content": "[unused39]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34577": {
+ "content": "[unused40]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34578": {
+ "content": "[unused41]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34579": {
+ "content": "[unused42]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34580": {
+ "content": "[unused43]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34581": {
+ "content": "[unused44]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34582": {
+ "content": "[unused45]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34583": {
+ "content": "[unused46]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34584": {
+ "content": "[unused47]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34585": {
+ "content": "[unused48]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34586": {
+ "content": "[unused49]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34587": {
+ "content": "[unused50]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34588": {
+ "content": "[unused51]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34589": {
+ "content": "[unused52]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34590": {
+ "content": "[unused53]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34591": {
+ "content": "[unused54]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34592": {
+ "content": "[unused55]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34593": {
+ "content": "[unused56]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34594": {
+ "content": "[unused57]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34595": {
+ "content": "[unused58]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34596": {
+ "content": "[unused59]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34597": {
+ "content": "[unused60]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34598": {
+ "content": "[unused61]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34599": {
+ "content": "[unused62]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34600": {
+ "content": "[unused63]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34601": {
+ "content": "[unused64]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34602": {
+ "content": "[unused65]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34603": {
+ "content": "[unused66]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34604": {
+ "content": "[unused67]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34605": {
+ "content": "[unused68]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34606": {
+ "content": "[unused69]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34607": {
+ "content": "[unused70]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34608": {
+ "content": "[unused71]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34609": {
+ "content": "[unused72]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34610": {
+ "content": "[unused73]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34611": {
+ "content": "[unused74]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34612": {
+ "content": "[unused75]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34613": {
+ "content": "[unused76]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34614": {
+ "content": "[unused77]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34615": {
+ "content": "[unused78]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34616": {
+ "content": "[unused79]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34617": {
+ "content": "[unused80]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34618": {
+ "content": "[unused81]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34619": {
+ "content": "[unused82]",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34620": {
+ "content": "|||IP_ADDRESS|||",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34621": {
+ "content": "|||EMAIL_ADDRESS|||",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "34622": {
+ "content": "|||PHONE_NUMBER|||",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ }
+ },
+ "clean_up_tokenization_spaces": true,
+ "cls_token": "[CLS]",
+ "extra_special_tokens": {},
+ "mask_token": "[MASK]",
+ "model_input_names": [
+ "input_ids",
+ "attention_mask"
+ ],
+ "model_max_length": 8192,
+ "pad_token": "[PAD]",
+ "sep_token": "[SEP]",
+ "tokenizer_class": "PreTrainedTokenizer",
+ "unk_token": "[UNK]"
+ }