HassanCS committed on
Commit e0c2eb6 · verified · 1 Parent(s): b3599b9

Add new SentenceTransformer model
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "word_embedding_dimension": 384,
+   "pooling_mode_cls_token": false,
+   "pooling_mode_mean_tokens": true,
+   "pooling_mode_max_tokens": false,
+   "pooling_mode_mean_sqrt_len_tokens": false,
+   "pooling_mode_weightedmean_tokens": false,
+   "pooling_mode_lasttoken": false,
+   "include_prompt": true
+ }
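This pooling config enables mean pooling only: token embeddings are averaged over non-padding positions to produce one 384-dimensional vector. A minimal sketch of what that computes (function and variable names are illustrative, not from the repo):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings over non-padding positions.

    token_embeddings: (seq_len, dim) array, dim = word_embedding_dimension (384 here)
    attention_mask:   (seq_len,) array of 0/1 padding flags
    """
    mask = attention_mask[:, None].astype(float)           # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)         # sum over real tokens only
    count = max(float(mask.sum()), 1e-9)                   # avoid divide-by-zero
    return summed / count

# Toy example: 3 tokens, last one is padding, dim = 4
emb = np.array([[1.0, 2, 3, 4], [3, 2, 1, 0], [9, 9, 9, 9]])
mask = np.array([1, 1, 0])
print(mean_pool(emb, mask))  # [2. 2. 2. 2.]
```

The padded third token is masked out, so the result is the plain average of the first two rows.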
README.md ADDED
@@ -0,0 +1,457 @@
+ ---
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - generated_from_trainer
+ - dataset_size:35520
+ - loss:MultipleNegativesRankingLoss
+ base_model: DeepChem/ChemBERTa-77M-MLM
+ widget:
+ - source_sentence: C[NH+]1CCC(CN2c3ccccc3Sc3ccccc32)C1
+   sentences:
+   - CC(C)CN(CC(O)C(Cc1ccccc1)NC(=O)OC1COC2OCCC12)S(=O)(=O)c1ccc(N)cc1
+   - COC(=O)NC(C(=O)NC(Cc1ccccc1)C(O)CN(Cc1ccc(-c2ccccn2)cc1)NC(=O)C(NC(=O)OC)C(C)(C)C)C(C)(C)C
+   - C=C1c2cccc([O-])c2C(=O)C2=C([O-])C3(O)C(=O)C(C(N)=O)=C([O-])C([NH+](C)C)C3C(O)C12
+ - source_sentence: CC(C)(C)[NH2+]CC(O)COc1ccccc1C1CCCC1
+   sentences:
+   - C[NH2+]C1C(OC2C(OC3C(O)C(O)C(NC(N)=[NH2+])C(O)C3NC(N)=[NH2+])OC(C)C2(O)C=O)OC(CO)C(O)C1O
+   - CC(C)CNCc1ccc(-c2ccccc2S(=O)(=O)N2CCCC2)cc1
+   - CC1C[NH+](CC(Cc2ccccc2)C(=O)NCC(=O)[O-])CCC1(C)c1cccc(O)c1
+ - source_sentence: CC1CC2C3CCC4=CC(=O)C=CC4(C)C3(F)C(O)CC2(C)C1(OC(=O)c1ccccc1)C(=O)CO
+   sentences:
+   - CC1CC=CC=CC=CC=CC(OC2OC(C)C(O)C([NH3+])C2O)CC2OC(O)(CC(O)CC3OC3C=CC(=O)O1)CC(O)C2C(=O)[O-]
+   - C=CC1(C)CC(OC(=O)CSC2CC3CCC(C2)[NH+]3C)C2(C)C(C)CCC3(CCC(=O)C32)C(C)C1O
+   - CC(C)C(CN1CCC(C)(c2cccc(O)c2)C(C)C1)NC(=O)C1Cc2ccc(O)cc2CN1
+ - source_sentence: CC(C)[NH2+]CC1CCc2cc(CO)c([N+](=O)[O-])cc2N1
+   sentences:
+   - CC(Cc1cc2c(c(C(N)=O)c1)N(CCCO)CC2)[NH2+]CCOc1ccccc1OCC(F)(F)F
+   - COC(=O)NC(C(=O)NC(Cc1ccccc1)C(O)CN(Cc1ccc(-c2ccccn2)cc1)NC(=O)C(NC(=O)OC)C(C)(C)C)C(C)(C)C
+   - COc1ccccc1Oc1c([N-]S(=O)(=O)c2ccc(C(C)(C)C)cc2)nc(-c2ncccn2)nc1OCCO
+ - source_sentence: COc1ccc(C(=O)CC(=O)c2ccc(C(C)(C)C)cc2)cc1
+   sentences:
+   - C[N+]1(C)CCC(=C(c2ccccc2)c2ccccc2)CC1
+   - CC#CCC(C)C(O)C=CC1C(O)CC2CC(=CCCCC(=O)[O-])CC21
+   - C=C1CC2CCC34CC5OC6C(OC7CCC(CC(=O)CC8C(CC9OC(CCC1O2)CC(C)C9=C)OC(CC(O)CN)C8OC)OC7C6O3)C5O4
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ metrics:
+ - cosine_accuracy
+ model-index:
+ - name: SentenceTransformer based on DeepChem/ChemBERTa-77M-MLM
+   results:
+   - task:
+       type: triplet
+       name: Triplet
+     dataset:
+       name: all dev
+       type: all-dev
+     metrics:
+     - type: cosine_accuracy
+       value: 0.7844594594594595
+       name: Cosine Accuracy
+ ---
+
+ # SentenceTransformer based on DeepChem/ChemBERTa-77M-MLM
+
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [DeepChem/ChemBERTa-77M-MLM](https://huggingface.co/DeepChem/ChemBERTa-77M-MLM). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** Sentence Transformer
+ - **Base model:** [DeepChem/ChemBERTa-77M-MLM](https://huggingface.co/DeepChem/ChemBERTa-77M-MLM) <!-- at revision ed8a5374f2024ec8da53760af91a33fb8f6a15ff -->
+ - **Maximum Sequence Length:** 512 tokens
+ - **Output Dimensionality:** 384 dimensions
+ - **Similarity Function:** Cosine Similarity
+ <!-- - **Training Dataset:** Unknown -->
+ <!-- - **Language:** Unknown -->
+ <!-- - **License:** Unknown -->
+
+ ### Model Sources
+
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
+
+ ### Full Model Architecture
+
+ ```
+ SentenceTransformer(
+   (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: RobertaModel
+   (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+ )
+ ```
+
+ ## Usage
+
+ ### Direct Usage (Sentence Transformers)
+
+ First install the Sentence Transformers library:
+
+ ```bash
+ pip install -U sentence-transformers
+ ```
+
+ Then you can load this model and run inference.
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ # Download from the 🤗 Hub
+ model = SentenceTransformer("HassanCS/chemBERTa-tuned-on-ClinTox-4")
+ # Run inference
+ sentences = [
+     'COc1ccc(C(=O)CC(=O)c2ccc(C(C)(C)C)cc2)cc1',
+     'C[N+]1(C)CCC(=C(c2ccccc2)c2ccccc2)CC1',
+     'C=C1CC2CCC34CC5OC6C(OC7CCC(CC(=O)CC8C(CC9OC(CCC1O2)CC(C)C9=C)OC(CC(O)CN)C8OC)OC7C6O3)C5O4',
+ ]
+ embeddings = model.encode(sentences)
+ print(embeddings.shape)
+ # [3, 384]
+
+ # Get the similarity scores for the embeddings
+ similarities = model.similarity(embeddings, embeddings)
+ print(similarities.shape)
+ # [3, 3]
+ ```
+
+ <!--
+ ### Direct Usage (Transformers)
+
+ <details><summary>Click to see the direct usage in Transformers</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Downstream Usage (Sentence Transformers)
+
+ You can finetune this model on your own dataset.
+
+ <details><summary>Click to expand</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Out-of-Scope Use
+
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+
+ ## Evaluation
+
+ ### Metrics
+
+ #### Triplet
+
+ * Dataset: `all-dev`
+ * Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)
+
+ | Metric              | Value      |
+ |:--------------------|:-----------|
+ | **cosine_accuracy** | **0.7845** |
+
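Cosine accuracy here is the fraction of triplets for which the anchor is closer (by cosine similarity) to its positive than to its negative. A toy-vector sketch of that metric follows; it illustrates the definition and is not the `TripletEvaluator` source:

```python
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def triplet_cosine_accuracy(anchors, positives, negatives) -> float:
    """Fraction of triplets with cos(anchor, positive) > cos(anchor, negative)."""
    hits = sum(
        cosine(a, p) > cosine(a, n)
        for a, p, n in zip(anchors, positives, negatives)
    )
    return hits / len(anchors)

# Toy embeddings: the first triplet is ranked correctly, the second is not
anchors   = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
positives = [np.array([0.9, 0.1]), np.array([1.0, 0.0])]
negatives = [np.array([0.0, 1.0]), np.array([0.1, 0.9])]
print(triplet_cosine_accuracy(anchors, positives, negatives))  # 0.5
```

The 0.7845 above means roughly 78% of the 1,480 held-out triplets satisfy this ordering.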
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Dataset
+
+ #### Unnamed Dataset
+
+ * Size: 35,520 training samples
+ * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
+ * Approximate statistics based on the first 1000 samples:
+   |         | anchor | positive | negative |
+   |:--------|:-------|:---------|:---------|
+   | type    | string | string   | string   |
+   | details | <ul><li>min: 14 tokens</li><li>mean: 29.75 tokens</li><li>max: 68 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 47.08 tokens</li><li>max: 221 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 53.95 tokens</li><li>max: 189 tokens</li></ul> |
+ * Samples:
+   | anchor | positive | negative |
+   |:-------|:---------|:---------|
+   | <code>CC(C)CC(NC(=O)CNC(=O)c1cc(Cl)ccc1Cl)B(O)O</code> | <code>CC(=O)OC1CCC2(C)C(=CCC3C2CCC2(C)C(c4cccnc4)=CCC32)C1</code> | <code>CCOC(=O)c1ncn2c1CN(C)C(=O)c1cc(F)ccc1-2</code> |
+   | <code>CC(C)CC(NC(=O)CNC(=O)c1cc(Cl)ccc1Cl)B(O)O</code> | <code>COc1ccc(C(CN(C)C)C2(O)CCCCC2)cc1</code> | <code>C[NH2+]C1(C)C2CCC(C2)C1(C)C</code> |
+   | <code>CC(C)CC(NC(=O)CNC(=O)c1cc(Cl)ccc1Cl)B(O)O</code> | <code>CNC(=O)c1cc(Oc2ccc(NC(=O)Nc3ccc(Cl)c(C(F)(F)F)c3)cc2)ccn1.Cc1ccc(S(=O)(=O)O)cc1</code> | <code>Nc1ncnc2c1ncn2C1OC(CO)C(O)C1O</code> |
+ * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
+   ```json
+   {
+       "scale": 20.0,
+       "similarity_fct": "cos_sim"
+   }
+   ```
+
+ ### Evaluation Dataset
+
+ #### Unnamed Dataset
+
+ * Size: 1,480 evaluation samples
+ * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
+ * Approximate statistics based on the first 1000 samples:
+   |         | anchor | positive | negative |
+   |:--------|:-------|:---------|:---------|
+   | type    | string | string   | string   |
+   | details | <ul><li>min: 18 tokens</li><li>mean: 54.07 tokens</li><li>max: 169 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 58.71 tokens</li><li>max: 244 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 71.06 tokens</li><li>max: 209 tokens</li></ul> |
+ * Samples:
+   | anchor | positive | negative |
+   |:-------|:---------|:---------|
+   | <code>CC(C)OC(=O)CCCC=CCC1C(O)CC(O)C1C=CC(O)COc1cccc(C(F)(F)F)c1</code> | <code>C#CC1(O)CCC2C3CCC4=C(CCC(=O)C4)C3CCC21C</code> | <code>CC(C)CC(NC(=O)C(CCc1ccccc1)NC(=O)CN1CCOCC1)C(=O)NC(Cc1ccccc1)C(=O)NC(CC(C)C)C(=O)C1(C)CO1</code> |
+   | <code>CC(C)OC(=O)CCCC=CCC1C(O)CC(O)C1C=CC(O)COc1cccc(C(F)(F)F)c1</code> | <code>C=CC1(C)CC(OC(=O)CSC2CC3CCC(C2)[NH+]3C)C2(C)C(C)CCC3(CCC(=O)C32)C(C)C1O</code> | <code>COC(=O)NC(C(=O)NC(Cc1ccccc1)C(O)CN(Cc1ccc(-c2ccccn2)cc1)NC(=O)C(NC(=O)OC)C(C)(C)C)C(C)(C)C</code> |
+   | <code>CC(C)OC(=O)CCCC=CCC1C(O)CC(O)C1C=CC(O)COc1cccc(C(F)(F)F)c1</code> | <code>CC(Cc1cc2c(c(C(N)=O)c1)N(CCCO)CC2)[NH2+]CCOc1ccccc1OCC(F)(F)F</code> | <code>CC(C)C1(C(=O)NC2CC(=O)OC2(O)CF)CC(c2nccc3ccccc23)=NO1</code> |
+ * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
+   ```json
+   {
+       "scale": 20.0,
+       "similarity_fct": "cos_sim"
+   }
+   ```
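MultipleNegativesRankingLoss treats each anchor's positive as the correct "class" among all positives in the batch, so every other in-batch positive acts as a negative. A NumPy sketch of the loss for one batch with the parameters above (scale=20, cosine similarity); this illustrates the objective, not the sentence-transformers implementation:

```python
import numpy as np

def mnr_loss(anchors: np.ndarray, positives: np.ndarray, scale: float = 20.0) -> float:
    """In-batch softmax cross-entropy: anchor i should match positive i
    against every other positive in the batch (in-batch negatives)."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = scale * (a @ p.T)                       # (batch, batch) scaled cosine sims
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))       # target is the diagonal

rng = np.random.default_rng(0)
anchors = rng.normal(size=(4, 8))
loss_matched = mnr_loss(anchors, anchors + 0.01 * rng.normal(size=(4, 8)))
loss_random = mnr_loss(anchors, rng.normal(size=(4, 8)))
print(loss_matched, loss_random)  # aligned pairs score a far lower loss than random pairs
```

Because the negatives come for free from the batch, larger batch sizes generally make the ranking task harder and the embeddings sharper.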
+
+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+
+ - `eval_strategy`: steps
+ - `per_device_train_batch_size`: 16
+ - `per_device_eval_batch_size`: 16
+ - `learning_rate`: 2e-05
+ - `num_train_epochs`: 10
+ - `warmup_ratio`: 0.1
+ - `fp16`: True
+ - `batch_sampler`: no_duplicates
+
+ #### All Hyperparameters
+ <details><summary>Click to expand</summary>
+
+ - `overwrite_output_dir`: False
+ - `do_predict`: False
+ - `eval_strategy`: steps
+ - `prediction_loss_only`: True
+ - `per_device_train_batch_size`: 16
+ - `per_device_eval_batch_size`: 16
+ - `per_gpu_train_batch_size`: None
+ - `per_gpu_eval_batch_size`: None
+ - `gradient_accumulation_steps`: 1
+ - `eval_accumulation_steps`: None
+ - `torch_empty_cache_steps`: None
+ - `learning_rate`: 2e-05
+ - `weight_decay`: 0.0
+ - `adam_beta1`: 0.9
+ - `adam_beta2`: 0.999
+ - `adam_epsilon`: 1e-08
+ - `max_grad_norm`: 1.0
+ - `num_train_epochs`: 10
+ - `max_steps`: -1
+ - `lr_scheduler_type`: linear
+ - `lr_scheduler_kwargs`: {}
+ - `warmup_ratio`: 0.1
+ - `warmup_steps`: 0
+ - `log_level`: passive
+ - `log_level_replica`: warning
+ - `log_on_each_node`: True
+ - `logging_nan_inf_filter`: True
+ - `save_safetensors`: True
+ - `save_on_each_node`: False
+ - `save_only_model`: False
+ - `restore_callback_states_from_checkpoint`: False
+ - `no_cuda`: False
+ - `use_cpu`: False
+ - `use_mps_device`: False
+ - `seed`: 42
+ - `data_seed`: None
+ - `jit_mode_eval`: False
+ - `use_ipex`: False
+ - `bf16`: False
+ - `fp16`: True
+ - `fp16_opt_level`: O1
+ - `half_precision_backend`: auto
+ - `bf16_full_eval`: False
+ - `fp16_full_eval`: False
+ - `tf32`: None
+ - `local_rank`: 0
+ - `ddp_backend`: None
+ - `tpu_num_cores`: None
+ - `tpu_metrics_debug`: False
+ - `debug`: []
+ - `dataloader_drop_last`: False
+ - `dataloader_num_workers`: 0
+ - `dataloader_prefetch_factor`: None
+ - `past_index`: -1
+ - `disable_tqdm`: False
+ - `remove_unused_columns`: True
+ - `label_names`: None
+ - `load_best_model_at_end`: False
+ - `ignore_data_skip`: False
+ - `fsdp`: []
+ - `fsdp_min_num_params`: 0
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
+ - `fsdp_transformer_layer_cls_to_wrap`: None
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
+ - `deepspeed`: None
+ - `label_smoothing_factor`: 0.0
+ - `optim`: adamw_torch
+ - `optim_args`: None
+ - `adafactor`: False
+ - `group_by_length`: False
+ - `length_column_name`: length
+ - `ddp_find_unused_parameters`: None
+ - `ddp_bucket_cap_mb`: None
+ - `ddp_broadcast_buffers`: False
+ - `dataloader_pin_memory`: True
+ - `dataloader_persistent_workers`: False
+ - `skip_memory_metrics`: True
+ - `use_legacy_prediction_loop`: False
+ - `push_to_hub`: False
+ - `resume_from_checkpoint`: None
+ - `hub_model_id`: None
+ - `hub_strategy`: every_save
+ - `hub_private_repo`: None
+ - `hub_always_push`: False
+ - `gradient_checkpointing`: False
+ - `gradient_checkpointing_kwargs`: None
+ - `include_inputs_for_metrics`: False
+ - `include_for_metrics`: []
+ - `eval_do_concat_batches`: True
+ - `fp16_backend`: auto
+ - `push_to_hub_model_id`: None
+ - `push_to_hub_organization`: None
+ - `mp_parameters`:
+ - `auto_find_batch_size`: False
+ - `full_determinism`: False
+ - `torchdynamo`: None
+ - `ray_scope`: last
+ - `ddp_timeout`: 1800
+ - `torch_compile`: False
+ - `torch_compile_backend`: None
+ - `torch_compile_mode`: None
+ - `dispatch_batches`: None
+ - `split_batches`: None
+ - `include_tokens_per_second`: False
+ - `include_num_input_tokens_seen`: False
+ - `neftune_noise_alpha`: None
+ - `optim_target_modules`: None
+ - `batch_eval_metrics`: False
+ - `eval_on_start`: False
+ - `use_liger_kernel`: False
+ - `eval_use_gather_object`: False
+ - `average_tokens_across_devices`: False
+ - `prompts`: None
+ - `batch_sampler`: no_duplicates
+ - `multi_dataset_batch_sampler`: proportional
+
+ </details>
+
+ ### Training Logs
+ | Epoch  | Step  | Training Loss | Validation Loss | all-dev_cosine_accuracy |
+ |:------:|:-----:|:-------------:|:---------------:|:-----------------------:|
+ | 0.2252 | 500   | 4.2712        | 3.3651          | 0.45                    |
+ | 0.4505 | 1000  | 3.5714        | 2.5580          | 0.6223                  |
+ | 0.6757 | 1500  | 3.3655        | 2.5956          | 0.6169                  |
+ | 0.9009 | 2000  | 3.2218        | 2.6932          | 0.6493                  |
+ | 1.1257 | 2500  | 3.0911        | 2.7852          | 0.6736                  |
+ | 1.3509 | 3000  | 3.0007        | 2.7838          | 0.6703                  |
+ | 1.5761 | 3500  | 3.0536        | 2.5324          | 0.7311                  |
+ | 1.8014 | 4000  | 3.0286        | 2.6623          | 0.6892                  |
+ | 2.0261 | 4500  | 2.9539        | 2.6397          | 0.7088                  |
+ | 2.2514 | 5000  | 2.9252        | 2.5550          | 0.7419                  |
+ | 2.4766 | 5500  | 2.944         | 2.5391          | 0.7419                  |
+ | 2.7018 | 6000  | 3.028         | 2.6421          | 0.6919                  |
+ | 2.9270 | 6500  | 2.9389        | 2.5931          | 0.7209                  |
+ | 3.1518 | 7000  | 2.9006        | 2.6597          | 0.7365                  |
+ | 3.3770 | 7500  | 2.9107        | 2.4841          | 0.7709                  |
+ | 3.6023 | 8000  | 2.9802        | 2.5128          | 0.7493                  |
+ | 3.8275 | 8500  | 2.9498        | 2.5716          | 0.7439                  |
+ | 4.0523 | 9000  | 2.9004        | 2.4889          | 0.7669                  |
+ | 4.2775 | 9500  | 2.89          | 2.5824          | 0.7453                  |
+ | 4.5027 | 10000 | 2.9343        | 2.4388          | 0.7757                  |
+ | 4.7279 | 10500 | 2.9666        | 2.4759          | 0.7520                  |
+ | 4.9532 | 11000 | 2.9153        | 2.6096          | 0.7399                  |
+ | 5.1779 | 11500 | 2.873         | 2.5489          | 0.7520                  |
+ | 5.4032 | 12000 | 2.8978        | 2.5579          | 0.7527                  |
+ | 5.6284 | 12500 | 2.9576        | 2.5336          | 0.7581                  |
+ | 5.8536 | 13000 | 2.93          | 2.4656          | 0.7730                  |
+ | 6.0784 | 13500 | 2.8825        | 2.4987          | 0.7730                  |
+ | 6.3036 | 14000 | 2.8863        | 2.4866          | 0.7818                  |
+ | 6.5288 | 14500 | 2.9221        | 2.4416          | 0.7818                  |
+ | 6.7541 | 15000 | 2.9544        | 2.4705          | 0.7622                  |
+ | 6.9793 | 15500 | 2.8929        | 2.4991          | 0.7669                  |
+ | 7.2041 | 16000 | 2.8656        | 2.5163          | 0.7689                  |
+ | 7.4293 | 16500 | 2.8866        | 2.5390          | 0.7689                  |
+ | 7.6545 | 17000 | 2.9675        | 2.4476          | 0.7872                  |
+ | 7.8797 | 17500 | 2.9094        | 2.4572          | 0.775                   |
+ | 8.1045 | 18000 | 2.8743        | 2.4677          | 0.7743                  |
+ | 8.3297 | 18500 | 2.8748        | 2.4658          | 0.7872                  |
+ | 8.5550 | 19000 | 2.9201        | 2.4412          | 0.7865                  |
+ | 8.7802 | 19500 | 2.9437        | 2.4620          | 0.7811                  |
+ | 9.0050 | 20000 | 2.881         | 2.4608          | 0.7797                  |
+ | 9.2302 | 20500 | 2.8628        | 2.4801          | 0.7770                  |
+ | 9.4554 | 21000 | 2.884         | 2.4699          | 0.7831                  |
+ | 9.6806 | 21500 | 2.9658        | 2.4519          | 0.7845                  |
+ | 9.9059 | 22000 | 2.8991        | 2.4474          | 0.7845                  |
+
+ ### Framework Versions
+ - Python: 3.11.11
+ - Sentence Transformers: 3.3.1
+ - Transformers: 4.47.1
+ - PyTorch: 2.5.1+cu124
+ - Accelerate: 1.2.1
+ - Datasets: 3.2.0
+ - Tokenizers: 0.21.0
+
+ ## Citation
+
+ ### BibTeX
+
+ #### Sentence Transformers
+ ```bibtex
+ @inproceedings{reimers-2019-sentence-bert,
+     title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
+     author = "Reimers, Nils and Gurevych, Iryna",
+     booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
+     month = "11",
+     year = "2019",
+     publisher = "Association for Computational Linguistics",
+     url = "https://arxiv.org/abs/1908.10084",
+ }
+ ```
+
+ #### MultipleNegativesRankingLoss
+ ```bibtex
+ @misc{henderson2017efficient,
+     title={Efficient Natural Language Response Suggestion for Smart Reply},
+     author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
+     year={2017},
+     eprint={1705.00652},
+     archivePrefix={arXiv},
+     primaryClass={cs.CL}
+ }
+ ```
+
+ <!--
+ ## Glossary
+
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+
+ <!--
+ ## Model Card Authors
+
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+
+ <!--
+ ## Model Card Contact
+
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
added_tokens.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "</s>": 592,
+   "<s>": 591
+ }
config.json ADDED
@@ -0,0 +1,29 @@
+ {
+   "_name_or_path": "DeepChem/ChemBERTa-77M-MLM",
+   "architectures": [
+     "RobertaModel"
+   ],
+   "attention_probs_dropout_prob": 0.109,
+   "bos_token_id": 0,
+   "classifier_dropout": null,
+   "eos_token_id": 2,
+   "gradient_checkpointing": false,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.144,
+   "hidden_size": 384,
+   "initializer_range": 0.02,
+   "intermediate_size": 464,
+   "is_gpu": true,
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 515,
+   "model_type": "roberta",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 3,
+   "pad_token_id": 1,
+   "position_embedding_type": "absolute",
+   "torch_dtype": "float32",
+   "transformers_version": "4.47.1",
+   "type_vocab_size": 1,
+   "use_cache": true,
+   "vocab_size": 600
+ }
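This config describes a very small encoder (3 layers, hidden size 384, vocab 600). A back-of-the-envelope parameter count from those numbers follows; the layer breakdown assumes the standard RoBERTa layout (embeddings, Q/K/V/output projections, two-layer FFN, LayerNorms, pooler), and the float32 total lands close to the ~13.7 MB `model.safetensors` file below:

```python
hidden, inter, layers, vocab, max_pos = 384, 464, 3, 600, 515

# word + position + token-type embeddings, plus the embedding LayerNorm (weight + bias)
embeddings = vocab * hidden + max_pos * hidden + 1 * hidden + 2 * hidden

per_layer = (
    4 * (hidden * hidden + hidden)   # Q, K, V, and attention output projections
    + (hidden * inter + inter)       # FFN up-projection
    + (inter * hidden + hidden)      # FFN down-projection
    + 2 * 2 * hidden                 # two LayerNorms (weight + bias each)
)

pooler = hidden * hidden + hidden    # RobertaModel's pooler head
total = embeddings + layers * per_layer + pooler
print(total, total * 4)  # ~3.4M parameters, ~13.7 MB in float32
```

The small gap between `total * 4` and the safetensors file size is consistent with the format's metadata header.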
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "__version__": {
+     "sentence_transformers": "3.3.1",
+     "transformers": "4.47.1",
+     "pytorch": "2.5.1+cu124"
+   },
+   "prompts": {},
+   "default_prompt_name": null,
+   "similarity_fn_name": "cosine"
+ }
merges.txt ADDED
@@ -0,0 +1 @@
+ #version: 0.2
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f10e6775bfa3a5410625340432c7eade8518eacd0e18fbd1592e90da29f812c3
+ size 13715688
modules.json ADDED
@@ -0,0 +1,14 @@
+ [
+   {
+     "idx": 0,
+     "name": "0",
+     "path": "",
+     "type": "sentence_transformers.models.Transformer"
+   },
+   {
+     "idx": 1,
+     "name": "1",
+     "path": "1_Pooling",
+     "type": "sentence_transformers.models.Pooling"
+   }
+ ]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "max_seq_length": 512,
+   "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
+ {
+   "bos_token": {"content": "<s>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false},
+   "cls_token": {"content": "[CLS]", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
+   "eos_token": {"content": "</s>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false},
+   "mask_token": {"content": "[MASK]", "lstrip": true, "normalized": false, "rstrip": false, "single_word": false},
+   "pad_token": {"content": "[PAD]", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
+   "sep_token": {"content": "[SEP]", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
+   "unk_token": {"content": "[UNK]", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false}
+ }
tokenizer.json ADDED
@@ -0,0 +1,712 @@ (diff truncated; the view shows only the first portion of the vocabulary)
+ {
+   "version": "1.0",
+   "truncation": {
+     "direction": "Right",
+     "max_length": 512,
+     "strategy": "LongestFirst",
+     "stride": 0
+   },
+   "padding": {
+     "strategy": "BatchLongest",
+     "direction": "Right",
+     "pad_to_multiple_of": null,
+     "pad_id": 0,
+     "pad_type_id": 0,
+     "pad_token": "[PAD]"
+   },
+   "added_tokens": [
+     {"id": 0, "content": "[PAD]", "single_word": false, "lstrip": false, "rstrip": false, "normalized": false, "special": true},
+     {"id": 11, "content": "[UNK]", "single_word": false, "lstrip": false, "rstrip": false, "normalized": false, "special": true},
+     {"id": 12, "content": "[CLS]", "single_word": false, "lstrip": false, "rstrip": false, "normalized": false, "special": true},
+     {"id": 13, "content": "[SEP]", "single_word": false, "lstrip": false, "rstrip": false, "normalized": false, "special": true},
+     {"id": 14, "content": "[MASK]", "single_word": false, "lstrip": true, "rstrip": false, "normalized": false, "special": true},
+     {"id": 591, "content": "<s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true, "special": true},
+     {"id": 592, "content": "</s>", "single_word": false, "lstrip": false, "rstrip": false, "normalized": true, "special": true}
+   ],
+   "normalizer": null,
+   "pre_tokenizer": {
+     "type": "ByteLevel",
+     "add_prefix_space": false,
+     "trim_offsets": true,
+     "use_regex": true
+   },
+   "post_processor": {
+     "type": "RobertaProcessing",
+     "sep": ["[SEP]", 13],
+     "cls": ["[CLS]", 12],
+     "trim_offsets": true,
+     "add_prefix_space": false
+   },
+   "decoder": {
+     "type": "ByteLevel",
+     "add_prefix_space": true,
+     "trim_offsets": true,
+     "use_regex": true
+   },
+   "model": {
+     "type": "BPE",
+     "dropout": null,
+     "unk_token": null,
+     "continuing_subword_prefix": "",
+     "end_of_word_suffix": "",
+     "fuse_unk": false,
+     "byte_fallback": false,
+     "ignore_merges": false,
+     "vocab": {
+       "[PAD]": 0, "[unused1]": 1, "[unused2]": 2, "[unused3]": 3, "[unused4]": 4, "[unused5]": 5,
+       "[unused6]": 6, "[unused7]": 7, "[unused8]": 8, "[unused9]": 9, "[unused10]": 10, "[UNK]": 11,
+       "[CLS]": 12, "[SEP]": 13, "[MASK]": 14, "c": 15, "C": 16, "(": 17,
+       ")": 18, "O": 19, "1": 20, "2": 21, "=": 22, "N": 23,
+       ".": 24, "n": 25, "3": 26, "F": 27, "Cl": 28, ">>": 29,
+       "~": 30, "-": 31, "4": 32, "[C@H]": 33, "S": 34, "[C@@H]": 35,
+       "[O-]": 36, "Br": 37, "#": 38, "/": 39, "[nH]": 40, "[N+]": 41,
+       "s": 42, "5": 43, "o": 44, "P": 45, "[Na+]": 46, "[Si]": 47,
+       "I": 48, "[Na]": 49, "[Pd]": 50, "[K+]": 51, "[K]": 52, "[P]": 53,
+       "B": 54, "[C@]": 55, "[C@@]": 56, "[Cl-]": 57, "6": 58, "[OH-]": 59,
+       "\\": 60, "[N-]": 61, "[Li]": 62, "[H]": 63, "[2H]": 64, "[NH4+]": 65,
+       "[c-]": 66, "[P-]": 67, "[Cs+]": 68, "[Li+]": 69, "[Cs]": 70, "[NaH]": 71,
+       "[H-]": 72, "[O+]": 73, "[BH4-]": 74, "[Cu]": 75, "7": 76, "[Mg]": 77,
+       "[Fe+2]": 78, "[n+]": 79, "[Sn]": 80, "[BH-]": 81, "[Pd+2]": 82, "[CH]": 83,
+       "[I-]": 84, "[Br-]": 85, "[C-]": 86, "[Zn]": 87, "[B-]": 88, "[F-]": 89,
+       "[Al]": 90, "[P+]": 91, "[BH3-]": 92, "[Fe]": 93, "[C]": 94, "[AlH4]": 95,
+       "[Ni]": 96, "[SiH]": 97, "8": 98, "[Cu+2]": 99, "[Mn]": 100, "[AlH]": 101,
+       "[nH+]": 102, "[AlH4-]": 103, "[O-2]": 104, "[Cr]": 105, "[Mg+2]": 106, "[NH3+]": 107,
+       "[S@]": 108, "[Pt]": 109, "[Al+3]": 110, "[S@@]": 111, "[S-]": 112, "[Ti]": 113,
+       "[Zn+2]": 114, "[PH]": 115, "[NH2+]": 116, "[Ru]": 117, "[Ag+]": 118, "[S+]": 119,
+       "[I+3]": 120, "[NH+]": 121, "[Ca+2]": 122, "[Ag]": 123, "9": 124, "[Os]": 125,
+       "[Se]": 126, "[SiH2]": 127, "[Ca]": 128, "[Ti+4]": 129, "[Ac]": 130, "[Cu+]": 131,
+       "[S]": 132, "[Rh]": 133, "[Cl+3]": 134, "[cH-]": 135, "[Zn+]": 136, "[O]": 137,
+       "[Cl+]": 138, "[SH]": 139, "[H+]": 140, "[Pd+]": 141, "[se]": 142, "[PH+]": 143,
+       "[I]": 144, "[Pt+2]": 145, "[C+]": 146, "[Mg+]": 147, "[Hg]": 148, "[W]": 149,
+       "[SnH]": 150, "[SiH3]": 151, "[Fe+3]": 152, "[NH]": 153, "[Mo]": 154, "[CH2+]": 155,
+       "%10": 156, "[CH2-]": 157, "[CH2]": 158, "[n-]": 159, "[Ce+4]": 160, "[NH-]": 161,
+       "[Co]": 162, "[I+]": 163, "[PH2]": 164, "[Pt+4]": 165, "[Ce]": 166, "[B]": 167,
+       "[Sn+2]": 168, "[Ba+2]": 169, "%11": 170, "[Fe-3]": 171, "[18F]": 172, "[SH-]": 173,
+       "[Pb+2]": 174, "[Os-2]": 175, "[Zr+4]": 176, "[N]": 177, "[Ir]": 178, "[Bi]": 179,
+       "[Ni+2]": 180, "[P@]": 181, "[Co+2]": 182, "[s+]": 183, "[As]": 184, "[P+3]": 185,
+       "[Hg+2]": 186, "[Yb+3]": 187, "[CH-]": 188, "[Zr+2]": 189, "[Mn+2]": 190, "[CH+]": 191,
+       "[In]": 192, "[KH]": 193, "[Ce+3]": 194, "[Zr]": 195, "[AlH2-]": 196, "[OH2+]": 197,
+       "[Ti+3]": 198, "[Rh+2]": 199, "[Sb]": 200, "[S-2]": 201, "%12": 202, "[P@@]": 203,
+       "[Si@H]": 204, "[Mn+4]": 205, "p": 206, "[Ba]": 207, "[NH2-]": 208, "[Ge]": 209,
+       "[Pb+4]": 210, "[Cr+3]": 211, "[Au]": 212, "[LiH]": 213, "[Sc+3]": 214, "[o+]": 215,
+       "[Rh-3]": 216, "%13": 217, "[Br]": 218, "[Sb-]": 219, "[S@+]": 220, "[I+2]": 221,
+       "[Ar]": 222, "[V]": 223, "[Cu-]": 224, "[Al-]": 225, "[Te]": 226, "[13c]": 227,
+       "[13C]": 228, "[Cl]": 229, "[PH4+]": 230, "[SiH4]": 231, "[te]": 232, "[CH3-]": 233,
+       "[S@@+]": 234, "[Rh+3]": 235, "[SH+]": 236, "[Bi+3]": 237, "[Br+2]": 238, "[La]": 239,
+       "[La+3]": 240, "[Pt-2]": 241, "[N@@]": 242, "[PH3+]": 243, "[N@]": 244, "[Si+4]": 245,
+       "[Sr+2]": 246, "[Al+]": 247, "[Pb]": 248, "[SeH]": 249, "[Si-]": 250, "[V+5]": 251,
+       "[Y+3]": 252, "[Re]": 253, "[Ru+]": 254, "[Sm]": 255, "*": 256, "[3H]": 257,
+       "[NH2]": 258, "[Ag-]": 259, "[13CH3]": 260, "[OH+]": 261, "[Ru+3]": 262, "[OH]": 263,
+       "[Gd+3]": 264, "[13CH2]": 265, "[In+3]": 266, "[Si@@]": 267, "[Si@]": 268, "[Ti+2]": 269,
+       "[Sn+]": 270, "[Cl+2]": 271, "[AlH-]": 272, "[Pd-2]": 273, "[SnH3]": 274, "[B+3]": 275,
+       "[Cu-2]": 276, "[Nd+3]": 277, "[Pb+3]": 278, "[13cH]": 279, "[Fe-4]": 280, "[Ga]": 281,
+       "[Sn+4]": 282, "[Hg+]": 283, "[11CH3]": 284, "[Hf]": 285, "[Pr]": 286, "[Y]": 287,
+       "[S+2]": 288, "[Cd]": 289, "[Cr+6]": 290, "[Zr+3]": 291, "[Rh+]": 292, "[CH3]": 293,
+       "[N-3]": 294, "[Hf+2]": 295, "[Th]": 296, "[Sb+3]": 297, "%14": 298, "[Cr+2]": 299,
+       "[Ru+2]": 300, "[Hf+4]": 301, "[14C]": 302, "[Ta]": 303, "[Tl+]": 304, "[B+]": 305,
+       "[Os+4]": 306, "[PdH2]": 307, "[Pd-]": 308, "[Cd+2]": 309, "[Co+3]": 310, "[S+4]": 311,
+       "[Nb+5]": 312,
431
+ "[123I]": 313,
432
+ "[c+]": 314,
433
+ "[Rb+]": 315,
434
+ "[V+2]": 316,
435
+ "[CH3+]": 317,
436
+ "[Ag+2]": 318,
437
+ "[cH+]": 319,
438
+ "[Mn+3]": 320,
439
+ "[Se-]": 321,
440
+ "[As-]": 322,
441
+ "[Eu+3]": 323,
442
+ "[SH2]": 324,
443
+ "[Sm+3]": 325,
444
+ "[IH+]": 326,
445
+ "%15": 327,
446
+ "[OH3+]": 328,
447
+ "[PH3]": 329,
448
+ "[IH2+]": 330,
449
+ "[SH2+]": 331,
450
+ "[Ir+3]": 332,
451
+ "[AlH3]": 333,
452
+ "[Sc]": 334,
453
+ "[Yb]": 335,
454
+ "[15NH2]": 336,
455
+ "[Lu]": 337,
456
+ "[sH+]": 338,
457
+ "[Gd]": 339,
458
+ "[18F-]": 340,
459
+ "[SH3+]": 341,
460
+ "[SnH4]": 342,
461
+ "[TeH]": 343,
462
+ "[Si@@H]": 344,
463
+ "[Ga+3]": 345,
464
+ "[CaH2]": 346,
465
+ "[Tl]": 347,
466
+ "[Ta+5]": 348,
467
+ "[GeH]": 349,
468
+ "[Br+]": 350,
469
+ "[Sr]": 351,
470
+ "[Tl+3]": 352,
471
+ "[Sm+2]": 353,
472
+ "[PH5]": 354,
473
+ "%16": 355,
474
+ "[N@@+]": 356,
475
+ "[Au+3]": 357,
476
+ "[C-4]": 358,
477
+ "[Nd]": 359,
478
+ "[Ti+]": 360,
479
+ "[IH]": 361,
480
+ "[N@+]": 362,
481
+ "[125I]": 363,
482
+ "[Eu]": 364,
483
+ "[Sn+3]": 365,
484
+ "[Nb]": 366,
485
+ "[Er+3]": 367,
486
+ "[123I-]": 368,
487
+ "[14c]": 369,
488
+ "%17": 370,
489
+ "[SnH2]": 371,
490
+ "[YH]": 372,
491
+ "[Sb+5]": 373,
492
+ "[Pr+3]": 374,
493
+ "[Ir+]": 375,
494
+ "[N+3]": 376,
495
+ "[AlH2]": 377,
496
+ "[19F]": 378,
497
+ "%18": 379,
498
+ "[Tb]": 380,
499
+ "[14CH]": 381,
500
+ "[Mo+4]": 382,
501
+ "[Si+]": 383,
502
+ "[BH]": 384,
503
+ "[Be]": 385,
504
+ "[Rb]": 386,
505
+ "[pH]": 387,
506
+ "%19": 388,
507
+ "%20": 389,
508
+ "[Xe]": 390,
509
+ "[Ir-]": 391,
510
+ "[Be+2]": 392,
511
+ "[C+4]": 393,
512
+ "[RuH2]": 394,
513
+ "[15NH]": 395,
514
+ "[U+2]": 396,
515
+ "[Au-]": 397,
516
+ "%21": 398,
517
+ "%22": 399,
518
+ "[Au+]": 400,
519
+ "[15n]": 401,
520
+ "[Al+2]": 402,
521
+ "[Tb+3]": 403,
522
+ "[15N]": 404,
523
+ "[V+3]": 405,
524
+ "[W+6]": 406,
525
+ "[14CH3]": 407,
526
+ "[Cr+4]": 408,
527
+ "[ClH+]": 409,
528
+ "b": 410,
529
+ "[Ti+6]": 411,
530
+ "[Nd+]": 412,
531
+ "[Zr+]": 413,
532
+ "[PH2+]": 414,
533
+ "[Fm]": 415,
534
+ "[N@H+]": 416,
535
+ "[RuH]": 417,
536
+ "[Dy+3]": 418,
537
+ "%23": 419,
538
+ "[Hf+3]": 420,
539
+ "[W+4]": 421,
540
+ "[11C]": 422,
541
+ "[13CH]": 423,
542
+ "[Er]": 424,
543
+ "[124I]": 425,
544
+ "[LaH]": 426,
545
+ "[F]": 427,
546
+ "[siH]": 428,
547
+ "[Ga+]": 429,
548
+ "[Cm]": 430,
549
+ "[GeH3]": 431,
550
+ "[IH-]": 432,
551
+ "[U+6]": 433,
552
+ "[SeH+]": 434,
553
+ "[32P]": 435,
554
+ "[SeH-]": 436,
555
+ "[Pt-]": 437,
556
+ "[Ir+2]": 438,
557
+ "[se+]": 439,
558
+ "[U]": 440,
559
+ "[F+]": 441,
560
+ "[BH2]": 442,
561
+ "[As+]": 443,
562
+ "[Cf]": 444,
563
+ "[ClH2+]": 445,
564
+ "[Ni+]": 446,
565
+ "[TeH3]": 447,
566
+ "[SbH2]": 448,
567
+ "[Ag+3]": 449,
568
+ "%24": 450,
569
+ "[18O]": 451,
570
+ "[PH4]": 452,
571
+ "[Os+2]": 453,
572
+ "[Na-]": 454,
573
+ "[Sb+2]": 455,
574
+ "[V+4]": 456,
575
+ "[Ho+3]": 457,
576
+ "[68Ga]": 458,
577
+ "[PH-]": 459,
578
+ "[Bi+2]": 460,
579
+ "[Ce+2]": 461,
580
+ "[Pd+3]": 462,
581
+ "[99Tc]": 463,
582
+ "[13C@@H]": 464,
583
+ "[Fe+6]": 465,
584
+ "[c]": 466,
585
+ "[GeH2]": 467,
586
+ "[10B]": 468,
587
+ "[Cu+3]": 469,
588
+ "[Mo+2]": 470,
589
+ "[Cr+]": 471,
590
+ "[Pd+4]": 472,
591
+ "[Dy]": 473,
592
+ "[AsH]": 474,
593
+ "[Ba+]": 475,
594
+ "[SeH2]": 476,
595
+ "[In+]": 477,
596
+ "[TeH2]": 478,
597
+ "[BrH+]": 479,
598
+ "[14cH]": 480,
599
+ "[W+]": 481,
600
+ "[13C@H]": 482,
601
+ "[AsH2]": 483,
602
+ "[In+2]": 484,
603
+ "[N+2]": 485,
604
+ "[N@@H+]": 486,
605
+ "[SbH]": 487,
606
+ "[60Co]": 488,
607
+ "[AsH4+]": 489,
608
+ "[AsH3]": 490,
609
+ "[18OH]": 491,
610
+ "[Ru-2]": 492,
611
+ "[Na-2]": 493,
612
+ "[CuH2]": 494,
613
+ "[31P]": 495,
614
+ "[Ti+5]": 496,
615
+ "[35S]": 497,
616
+ "[P@@H]": 498,
617
+ "[ArH]": 499,
618
+ "[Co+]": 500,
619
+ "[Zr-2]": 501,
620
+ "[BH2-]": 502,
621
+ "[131I]": 503,
622
+ "[SH5]": 504,
623
+ "[VH]": 505,
624
+ "[B+2]": 506,
625
+ "[Yb+2]": 507,
626
+ "[14C@H]": 508,
627
+ "[211At]": 509,
628
+ "[NH3+2]": 510,
629
+ "[IrH]": 511,
630
+ "[IrH2]": 512,
631
+ "[Rh-]": 513,
632
+ "[Cr-]": 514,
633
+ "[Sb+]": 515,
634
+ "[Ni+3]": 516,
635
+ "[TaH3]": 517,
636
+ "[Tl+2]": 518,
637
+ "[64Cu]": 519,
638
+ "[Tc]": 520,
639
+ "[Cd+]": 521,
640
+ "[1H]": 522,
641
+ "[15nH]": 523,
642
+ "[AlH2+]": 524,
643
+ "[FH+2]": 525,
644
+ "[BiH3]": 526,
645
+ "[Ru-]": 527,
646
+ "[Mo+6]": 528,
647
+ "[AsH+]": 529,
648
+ "[BaH2]": 530,
649
+ "[BaH]": 531,
650
+ "[Fe+4]": 532,
651
+ "[229Th]": 533,
652
+ "[Th+4]": 534,
653
+ "[As+3]": 535,
654
+ "[NH+3]": 536,
655
+ "[P@H]": 537,
656
+ "[Li-]": 538,
657
+ "[7NaH]": 539,
658
+ "[Bi+]": 540,
659
+ "[PtH+2]": 541,
660
+ "[p-]": 542,
661
+ "[Re+5]": 543,
662
+ "[NiH]": 544,
663
+ "[Ni-]": 545,
664
+ "[Xe+]": 546,
665
+ "[Ca+]": 547,
666
+ "[11c]": 548,
667
+ "[Rh+4]": 549,
668
+ "[AcH]": 550,
669
+ "[HeH]": 551,
670
+ "[Sc+2]": 552,
671
+ "[Mn+]": 553,
672
+ "[UH]": 554,
673
+ "[14CH2]": 555,
674
+ "[SiH4+]": 556,
675
+ "[18OH2]": 557,
676
+ "[Ac-]": 558,
677
+ "[Re+4]": 559,
678
+ "[118Sn]": 560,
679
+ "[153Sm]": 561,
680
+ "[P+2]": 562,
681
+ "[9CH]": 563,
682
+ "[9CH3]": 564,
683
+ "[Y-]": 565,
684
+ "[NiH2]": 566,
685
+ "[Si+2]": 567,
686
+ "[Mn+6]": 568,
687
+ "[ZrH2]": 569,
688
+ "[C-2]": 570,
689
+ "[Bi+5]": 571,
690
+ "[24NaH]": 572,
691
+ "[Fr]": 573,
692
+ "[15CH]": 574,
693
+ "[Se+]": 575,
694
+ "[At]": 576,
695
+ "[P-3]": 577,
696
+ "[124I-]": 578,
697
+ "[CuH2-]": 579,
698
+ "[Nb+4]": 580,
699
+ "[Nb+3]": 581,
700
+ "[MgH]": 582,
701
+ "[Ir+4]": 583,
702
+ "[67Ga+3]": 584,
703
+ "[67Ga]": 585,
704
+ "[13N]": 586,
705
+ "[15OH2]": 587,
706
+ "[2NH]": 588,
707
+ "[Ho]": 589,
708
+ "[Cn]": 590
709
+ },
710
+ "merges": []
711
+ }
712
+ }
tokenizer_config.json ADDED
@@ -0,0 +1,76 @@
+ {
+ "add_prefix_space": false,
+ "added_tokens_decoder": {
+ "0": {
+ "content": "[PAD]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "11": {
+ "content": "[UNK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "12": {
+ "content": "[CLS]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "13": {
+ "content": "[SEP]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "14": {
+ "content": "[MASK]",
+ "lstrip": true,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "591": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "592": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<s>",
+ "clean_up_tokenization_spaces": false,
+ "cls_token": "[CLS]",
+ "eos_token": "</s>",
+ "errors": "replace",
+ "extra_special_tokens": {},
+ "full_tokenizer_file": null,
+ "mask_token": "[MASK]",
+ "max_len": 512,
+ "model_max_length": 512,
+ "pad_token": "[PAD]",
+ "sep_token": "[SEP]",
+ "tokenizer_class": "RobertaTokenizer",
+ "trim_offsets": true,
+ "unk_token": "[UNK]"
+ }
vocab.json ADDED
@@ -0,0 +1 @@
+ {"[PAD]":0,"[unused1]":1,"[unused2]":2,"[unused3]":3,"[unused4]":4,"[unused5]":5,"[unused6]":6,"[unused7]":7,"[unused8]":8,"[unused9]":9,"[unused10]":10,"[UNK]":11,"[CLS]":12,"[SEP]":13,"[MASK]":14,"c":15,"C":16,"(":17,")":18,"O":19,"1":20,"2":21,"=":22,"N":23,".":24,"n":25,"3":26,"F":27,"Cl":28,">>":29,"~":30,"-":31,"4":32,"[C@H]":33,"S":34,"[C@@H]":35,"[O-]":36,"Br":37,"#":38,"/":39,"[nH]":40,"[N+]":41,"s":42,"5":43,"o":44,"P":45,"[Na+]":46,"[Si]":47,"I":48,"[Na]":49,"[Pd]":50,"[K+]":51,"[K]":52,"[P]":53,"B":54,"[C@]":55,"[C@@]":56,"[Cl-]":57,"6":58,"[OH-]":59,"\\":60,"[N-]":61,"[Li]":62,"[H]":63,"[2H]":64,"[NH4+]":65,"[c-]":66,"[P-]":67,"[Cs+]":68,"[Li+]":69,"[Cs]":70,"[NaH]":71,"[H-]":72,"[O+]":73,"[BH4-]":74,"[Cu]":75,"7":76,"[Mg]":77,"[Fe+2]":78,"[n+]":79,"[Sn]":80,"[BH-]":81,"[Pd+2]":82,"[CH]":83,"[I-]":84,"[Br-]":85,"[C-]":86,"[Zn]":87,"[B-]":88,"[F-]":89,"[Al]":90,"[P+]":91,"[BH3-]":92,"[Fe]":93,"[C]":94,"[AlH4]":95,"[Ni]":96,"[SiH]":97,"8":98,"[Cu+2]":99,"[Mn]":100,"[AlH]":101,"[nH+]":102,"[AlH4-]":103,"[O-2]":104,"[Cr]":105,"[Mg+2]":106,"[NH3+]":107,"[S@]":108,"[Pt]":109,"[Al+3]":110,"[S@@]":111,"[S-]":112,"[Ti]":113,"[Zn+2]":114,"[PH]":115,"[NH2+]":116,"[Ru]":117,"[Ag+]":118,"[S+]":119,"[I+3]":120,"[NH+]":121,"[Ca+2]":122,"[Ag]":123,"9":124,"[Os]":125,"[Se]":126,"[SiH2]":127,"[Ca]":128,"[Ti+4]":129,"[Ac]":130,"[Cu+]":131,"[S]":132,"[Rh]":133,"[Cl+3]":134,"[cH-]":135,"[Zn+]":136,"[O]":137,"[Cl+]":138,"[SH]":139,"[H+]":140,"[Pd+]":141,"[se]":142,"[PH+]":143,"[I]":144,"[Pt+2]":145,"[C+]":146,"[Mg+]":147,"[Hg]":148,"[W]":149,"[SnH]":150,"[SiH3]":151,"[Fe+3]":152,"[NH]":153,"[Mo]":154,"[CH2+]":155,"%10":156,"[CH2-]":157,"[CH2]":158,"[n-]":159,"[Ce+4]":160,"[NH-]":161,"[Co]":162,"[I+]":163,"[PH2]":164,"[Pt+4]":165,"[Ce]":166,"[B]":167,"[Sn+2]":168,"[Ba+2]":169,"%11":170,"[Fe-3]":171,"[18F]":172,"[SH-]":173,"[Pb+2]":174,"[Os-2]":175,"[Zr+4]":176,"[N]":177,"[Ir]":178,"[Bi]":179,"[Ni+2]":180,"[P@]":181,"[Co+2]":182,"[s+]":183,"[As]":184,"[P+3]":185,"[Hg+2]":186,"[Yb+3]":187,"[CH-]":188,"[Zr+2]":189,"[Mn+2]":190,"[CH+]":191,"[In]":192,"[KH]":193,"[Ce+3]":194,"[Zr]":195,"[AlH2-]":196,"[OH2+]":197,"[Ti+3]":198,"[Rh+2]":199,"[Sb]":200,"[S-2]":201,"%12":202,"[P@@]":203,"[Si@H]":204,"[Mn+4]":205,"p":206,"[Ba]":207,"[NH2-]":208,"[Ge]":209,"[Pb+4]":210,"[Cr+3]":211,"[Au]":212,"[LiH]":213,"[Sc+3]":214,"[o+]":215,"[Rh-3]":216,"%13":217,"[Br]":218,"[Sb-]":219,"[S@+]":220,"[I+2]":221,"[Ar]":222,"[V]":223,"[Cu-]":224,"[Al-]":225,"[Te]":226,"[13c]":227,"[13C]":228,"[Cl]":229,"[PH4+]":230,"[SiH4]":231,"[te]":232,"[CH3-]":233,"[S@@+]":234,"[Rh+3]":235,"[SH+]":236,"[Bi+3]":237,"[Br+2]":238,"[La]":239,"[La+3]":240,"[Pt-2]":241,"[N@@]":242,"[PH3+]":243,"[N@]":244,"[Si+4]":245,"[Sr+2]":246,"[Al+]":247,"[Pb]":248,"[SeH]":249,"[Si-]":250,"[V+5]":251,"[Y+3]":252,"[Re]":253,"[Ru+]":254,"[Sm]":255,"*":256,"[3H]":257,"[NH2]":258,"[Ag-]":259,"[13CH3]":260,"[OH+]":261,"[Ru+3]":262,"[OH]":263,"[Gd+3]":264,"[13CH2]":265,"[In+3]":266,"[Si@@]":267,"[Si@]":268,"[Ti+2]":269,"[Sn+]":270,"[Cl+2]":271,"[AlH-]":272,"[Pd-2]":273,"[SnH3]":274,"[B+3]":275,"[Cu-2]":276,"[Nd+3]":277,"[Pb+3]":278,"[13cH]":279,"[Fe-4]":280,"[Ga]":281,"[Sn+4]":282,"[Hg+]":283,"[11CH3]":284,"[Hf]":285,"[Pr]":286,"[Y]":287,"[S+2]":288,"[Cd]":289,"[Cr+6]":290,"[Zr+3]":291,"[Rh+]":292,"[CH3]":293,"[N-3]":294,"[Hf+2]":295,"[Th]":296,"[Sb+3]":297,"%14":298,"[Cr+2]":299,"[Ru+2]":300,"[Hf+4]":301,"[14C]":302,"[Ta]":303,"[Tl+]":304,"[B+]":305,"[Os+4]":306,"[PdH2]":307,"[Pd-]":308,"[Cd+2]":309,"[Co+3]":310,"[S+4]":311,"[Nb+5]":312,"[123I]":313,"[c+]":314,"[Rb+]":315,"[V+2]":316,"[CH3+]":317,"[Ag+2]":318,"[cH+]":319,"[Mn+3]":320,"[Se-]":321,"[As-]":322,"[Eu+3]":323,"[SH2]":324,"[Sm+3]":325,"[IH+]":326,"%15":327,"[OH3+]":328,"[PH3]":329,"[IH2+]":330,"[SH2+]":331,"[Ir+3]":332,"[AlH3]":333,"[Sc]":334,"[Yb]":335,"[15NH2]":336,"[Lu]":337,"[sH+]":338,"[Gd]":339,"[18F-]":340,"[SH3+]":341,"[SnH4]":342,"[TeH]":343,"[Si@@H]":344,"[Ga+3]":345,"[CaH2]":346,"[Tl]":347,"[Ta+5]":348,"[GeH]":349,"[Br+]":350,"[Sr]":351,"[Tl+3]":352,"[Sm+2]":353,"[PH5]":354,"%16":355,"[N@@+]":356,"[Au+3]":357,"[C-4]":358,"[Nd]":359,"[Ti+]":360,"[IH]":361,"[N@+]":362,"[125I]":363,"[Eu]":364,"[Sn+3]":365,"[Nb]":366,"[Er+3]":367,"[123I-]":368,"[14c]":369,"%17":370,"[SnH2]":371,"[YH]":372,"[Sb+5]":373,"[Pr+3]":374,"[Ir+]":375,"[N+3]":376,"[AlH2]":377,"[19F]":378,"%18":379,"[Tb]":380,"[14CH]":381,"[Mo+4]":382,"[Si+]":383,"[BH]":384,"[Be]":385,"[Rb]":386,"[pH]":387,"%19":388,"%20":389,"[Xe]":390,"[Ir-]":391,"[Be+2]":392,"[C+4]":393,"[RuH2]":394,"[15NH]":395,"[U+2]":396,"[Au-]":397,"%21":398,"%22":399,"[Au+]":400,"[15n]":401,"[Al+2]":402,"[Tb+3]":403,"[15N]":404,"[V+3]":405,"[W+6]":406,"[14CH3]":407,"[Cr+4]":408,"[ClH+]":409,"b":410,"[Ti+6]":411,"[Nd+]":412,"[Zr+]":413,"[PH2+]":414,"[Fm]":415,"[N@H+]":416,"[RuH]":417,"[Dy+3]":418,"%23":419,"[Hf+3]":420,"[W+4]":421,"[11C]":422,"[13CH]":423,"[Er]":424,"[124I]":425,"[LaH]":426,"[F]":427,"[siH]":428,"[Ga+]":429,"[Cm]":430,"[GeH3]":431,"[IH-]":432,"[U+6]":433,"[SeH+]":434,"[32P]":435,"[SeH-]":436,"[Pt-]":437,"[Ir+2]":438,"[se+]":439,"[U]":440,"[F+]":441,"[BH2]":442,"[As+]":443,"[Cf]":444,"[ClH2+]":445,"[Ni+]":446,"[TeH3]":447,"[SbH2]":448,"[Ag+3]":449,"%24":450,"[18O]":451,"[PH4]":452,"[Os+2]":453,"[Na-]":454,"[Sb+2]":455,"[V+4]":456,"[Ho+3]":457,"[68Ga]":458,"[PH-]":459,"[Bi+2]":460,"[Ce+2]":461,"[Pd+3]":462,"[99Tc]":463,"[13C@@H]":464,"[Fe+6]":465,"[c]":466,"[GeH2]":467,"[10B]":468,"[Cu+3]":469,"[Mo+2]":470,"[Cr+]":471,"[Pd+4]":472,"[Dy]":473,"[AsH]":474,"[Ba+]":475,"[SeH2]":476,"[In+]":477,"[TeH2]":478,"[BrH+]":479,"[14cH]":480,"[W+]":481,"[13C@H]":482,"[AsH2]":483,"[In+2]":484,"[N+2]":485,"[N@@H+]":486,"[SbH]":487,"[60Co]":488,"[AsH4+]":489,"[AsH3]":490,"[18OH]":491,"[Ru-2]":492,"[Na-2]":493,"[CuH2]":494,"[31P]":495,"[Ti+5]":496,"[35S]":497,"[P@@H]":498,"[ArH]":499,"[Co+]":500,"[Zr-2]":501,"[BH2-]":502,"[131I]":503,"[SH5]":504,"[VH]":505,"[B+2]":506,"[Yb+2]":507,"[14C@H]":508,"[211At]":509,"[NH3+2]":510,"[IrH]":511,"[IrH2]":512,"[Rh-]":513,"[Cr-]":514,"[Sb+]":515,"[Ni+3]":516,"[TaH3]":517,"[Tl+2]":518,"[64Cu]":519,"[Tc]":520,"[Cd+]":521,"[1H]":522,"[15nH]":523,"[AlH2+]":524,"[FH+2]":525,"[BiH3]":526,"[Ru-]":527,"[Mo+6]":528,"[AsH+]":529,"[BaH2]":530,"[BaH]":531,"[Fe+4]":532,"[229Th]":533,"[Th+4]":534,"[As+3]":535,"[NH+3]":536,"[P@H]":537,"[Li-]":538,"[7NaH]":539,"[Bi+]":540,"[PtH+2]":541,"[p-]":542,"[Re+5]":543,"[NiH]":544,"[Ni-]":545,"[Xe+]":546,"[Ca+]":547,"[11c]":548,"[Rh+4]":549,"[AcH]":550,"[HeH]":551,"[Sc+2]":552,"[Mn+]":553,"[UH]":554,"[14CH2]":555,"[SiH4+]":556,"[18OH2]":557,"[Ac-]":558,"[Re+4]":559,"[118Sn]":560,"[153Sm]":561,"[P+2]":562,"[9CH]":563,"[9CH3]":564,"[Y-]":565,"[NiH2]":566,"[Si+2]":567,"[Mn+6]":568,"[ZrH2]":569,"[C-2]":570,"[Bi+5]":571,"[24NaH]":572,"[Fr]":573,"[15CH]":574,"[Se+]":575,"[At]":576,"[P-3]":577,"[124I-]":578,"[CuH2-]":579,"[Nb+4]":580,"[Nb+3]":581,"[MgH]":582,"[Ir+4]":583,"[67Ga+3]":584,"[67Ga]":585,"[13N]":586,"[15OH2]":587,"[2NH]":588,"[Ho]":589,"[Cn]":590}
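The vocab.json added in this commit maps each SMILES token directly to an integer id, with no BPE merges (the `"merges"` list in tokenizer.json is empty). A minimal sketch of that lookup, using a handful of entries copied from the vocabulary above; the `TOKEN_RE` regex is a hypothetical illustration of SMILES splitting, not the repo's actual tokenizer (the repo loads these files through `RobertaTokenizer`):

```python
import re

# A few entries copied from the vocab.json added above.
vocab = {
    "[PAD]": 0, "[UNK]": 11, "[CLS]": 12, "[SEP]": 13, "[MASK]": 14,
    "c": 15, "C": 16, "(": 17, ")": 18, "O": 19, "=": 22, "N": 23,
    "Cl": 28, "[C@H]": 33, "Br": 37, "[NH+]": 121,
}

# Hypothetical SMILES token pattern for illustration only: bracket atoms,
# two-letter halogens, the ">>" reaction arrow, and %nn ring closures are
# matched before falling back to single characters.
TOKEN_RE = re.compile(r"\[[^\]]+\]|Br|Cl|>>|%\d{2}|.")

def encode(smiles: str) -> list[int]:
    # Unknown tokens fall back to [UNK] (id 11), matching vocab.json.
    return [vocab.get(tok, vocab["[UNK]"]) for tok in TOKEN_RE.findall(smiles)]

print(encode("CC(=O)Cl"))  # -> [16, 16, 17, 22, 19, 18, 28]
```

Ordering the alternation this way matters: `Cl` must be tried before the single-character fallback, otherwise chlorine would split into the carbon token `C` plus an unknown `l`.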