HassanCS committed
Commit 23bc246 · verified · 1 Parent(s): a65a93f

Add new SentenceTransformer model
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
{
  "word_embedding_dimension": 384,
  "pooling_mode_cls_token": false,
  "pooling_mode_mean_tokens": true,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false,
  "pooling_mode_weightedmean_tokens": false,
  "pooling_mode_lasttoken": false,
  "include_prompt": true
}
README.md ADDED
@@ -0,0 +1,453 @@
---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:10000
- loss:ContrastiveLoss
base_model: DeepChem/ChemBERTa-77M-MLM
widget:
- source_sentence: CC(C)N=c1cc2n(-c3ccc(Cl)cc3)c3ccccc3nc-2cc1Nc1ccc(Cl)cc1
  sentences:
  - C[NH+]1CCC(=C2c3ccccc3CCn3c(C=O)c[nH+]c32)CC1
  - COc1ccc(C(=O)CC(=O)c2ccc(C(C)(C)C)cc2)cc1
  - CC1CNc2c(cccc2S(=O)(=O)NC(CCC[NH+]=C(N)N)C(=O)N2CCC(C)CC2C(=O)[O-])C1
- source_sentence: CC(C)c1ccc2oc3nc(N)c(C(=O)[O-])cc3c(=O)c2c1
  sentences:
  - COC1=CC(=O)CC(C)C12Oc1c(Cl)c(OC)cc(OC)c1C2=O
  - CON=C(C(=O)NC1C(=O)N2C(C(=O)[O-])=C(C[N+]3(C)CCCC3)CSC12)c1csc(N)n1
  - CC1C=CC=CC=CC=CC=CC=CC=CC(OC2OC(C)C(O)C([NH3+])C2O)CC2OC(O)(CC(O)CC(O)C(O)CCC(O)CC(O)CC(=O)OC(C)C(C)C1O)CC(O)C2C(=O)[O-]
- source_sentence: C[NH2+]C1CCc2[nH]c3ccc(C(N)=O)cc3c2C1
  sentences:
  - CC(OC(=O)c1ccccc1)C1=CCC23OCC[NH+](C)CC12CC(O)C12OC4(O)CCC1(C)C(CC=C32)C4
  - CC(=O)NC(Cc1ccc2ccccc2c1)C(=O)NC(Cc1ccc(Cl)cc1)C(=O)NC(Cc1cccnc1)C(=O)NC(CO)C(=O)NC(Cc1ccc(NC(=O)C2CC(=O)NC(=O)N2)cc1)C(=O)NC(Cc1ccc(NC(N)=O)cc1)C(=O)NC(CC(C)C)C(=O)NC(CCCC[NH2+]C(C)C)C(=O)N1CCCC1C(=O)NC(C)C(N)=O
  - C[NH+](C)CCOC(=O)C(c1ccccc1)C1(O)CCCC1
- source_sentence: CC(C)n1c(C=CC(O)CC(O)CC(=O)[O-])c(-c2ccc(F)cc2)c2ccccc21
  sentences:
  - C#CC1(O)CCC2C3CCC4=C(CCC(=O)C4)C3CCC21C
  - CC(C=CC(C)C(C)(C)O)C1CCC2C(=CC=C3CC(O)CC(O)C3)CCCC21C
  - CC(C)CNCc1ccc(-c2ccccc2S(=O)(=O)N2CCCC2)cc1
- source_sentence: CC#CCn1c(N2CCCC([NH3+])C2)nc2c1c(=O)n(Cc1nc(C)c3ccccc3n1)c(=O)n2C
  sentences:
  - C[N+]1(C)CCCC(OC(=O)C(O)(c2ccccc2)c2ccccc2)C1
  - CC(Cc1ccc(O)c(O)c1)C(C)Cc1ccc(O)c(O)c1
  - CC12CCC(=O)C=C1CCC1C2C(O)CC2(C)C1CCC2(O)C(=O)COC(=O)CCC1CCCC1
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy
- cosine_accuracy_threshold
- cosine_f1
- cosine_f1_threshold
- cosine_precision
- cosine_recall
- cosine_ap
model-index:
- name: SentenceTransformer based on DeepChem/ChemBERTa-77M-MLM
  results:
  - task:
      type: binary-classification
      name: Binary Classification
    dataset:
      name: all dev
      type: all-dev
    metrics:
    - type: cosine_accuracy
      value: 0.9066
      name: Cosine Accuracy
    - type: cosine_accuracy_threshold
      value: 0.5664876699447632
      name: Cosine Accuracy Threshold
    - type: cosine_f1
      value: 0.9510122731564041
      name: Cosine F1
    - type: cosine_f1_threshold
      value: 0.5664876699447632
      name: Cosine F1 Threshold
    - type: cosine_precision
      value: 0.9067813562712542
      name: Cosine Precision
    - type: cosine_recall
      value: 0.9997794441993825
      name: Cosine Recall
    - type: cosine_ap
      value: 0.9523113003188102
      name: Cosine Ap
---

# SentenceTransformer based on DeepChem/ChemBERTa-77M-MLM

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [DeepChem/ChemBERTa-77M-MLM](https://huggingface.co/DeepChem/ChemBERTa-77M-MLM). It maps inputs (here, SMILES strings rather than natural-language sentences) to a 384-dimensional dense vector space and can be used for semantic similarity, semantic search, paraphrase mining, classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [DeepChem/ChemBERTa-77M-MLM](https://huggingface.co/DeepChem/ChemBERTa-77M-MLM) <!-- at revision ed8a5374f2024ec8da53760af91a33fb8f6a15ff -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: RobertaModel
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("HassanCS/chemBERTa-tuned-on-ClinTox-3")
# Run inference
sentences = [
    'CC#CCn1c(N2CCCC([NH3+])C2)nc2c1c(=O)n(Cc1nc(C)c3ccccc3n1)c(=O)n2C',
    'CC12CCC(=O)C=C1CCC1C2C(O)CC2(C)C1CCC2(O)C(=O)COC(=O)CCC1CCCC1',
    'CC(Cc1ccc(O)c(O)c1)C(C)Cc1ccc(O)c(O)c1',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

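Since the model is evaluated as a pair classifier, two inputs count as "similar" when the cosine similarity of their embeddings clears a tuned threshold. A minimal, self-contained sketch of that decision rule, using toy low-dimensional vectors in place of the model's real 384-d embeddings (0.5665 is the evaluator's reported `cosine_accuracy_threshold`):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 4-d "embeddings" standing in for the model's 384-d output.
emb_a = [0.1, 0.9, 0.2, 0.4]
emb_b = [0.1, 0.8, 0.3, 0.5]    # close to emb_a
emb_c = [-0.7, 0.1, -0.5, 0.2]  # far from emb_a

THRESHOLD = 0.5665  # cosine_accuracy_threshold reported on all-dev
same_ab = cosine(emb_a, emb_b) >= THRESHOLD
same_ac = cosine(emb_a, emb_c) >= THRESHOLD
print(same_ab, same_ac)  # True False
```

The threshold itself is chosen by the evaluator to maximize accuracy on the dev set, so it is a property of the evaluation data, not of the model weights.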

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Binary Classification

* Dataset: `all-dev`
* Evaluated with [<code>BinaryClassificationEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator)

| Metric                    | Value      |
|:--------------------------|:-----------|
| cosine_accuracy           | 0.9066     |
| cosine_accuracy_threshold | 0.5665     |
| cosine_f1                 | 0.951      |
| cosine_f1_threshold       | 0.5665     |
| cosine_precision          | 0.9068     |
| cosine_recall             | 0.9998     |
| **cosine_ap**             | **0.9523** |

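The threshold-based metrics in the table follow from the usual confusion-matrix definitions once pair similarities are binarized. The sketch below uses toy scores and labels, not the model's actual dev-set outputs:

```python
# Toy gold pair labels and cosine-similarity scores; illustration only.
labels = [1, 1, 1, 0, 0, 1]
scores = [0.9, 0.7, 0.6, 0.4, 0.8, 0.3]
threshold = 0.5665  # the reported accuracy/F1 threshold

preds = [1 if s >= threshold else 0 for s in scores]
tp = sum(p == 1 and y == 1 for p, y in zip(preds, labels))
fp = sum(p == 1 and y == 0 for p, y in zip(preds, labels))
fn = sum(p == 0 and y == 1 for p, y in zip(preds, labels))
tn = sum(p == 0 and y == 0 for p, y in zip(preds, labels))

accuracy  = (tp + tn) / len(labels)
precision = tp / (tp + fp)
recall    = tp / (tp + fn)
f1        = 2 * precision * recall / (precision + recall)
print(accuracy, precision, recall, f1)
```

Average precision (the `cosine_ap` row) is threshold-free: it integrates precision over all recall levels as the threshold sweeps the score range.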

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 10,000 training samples
* Columns: <code>smiles1</code>, <code>smiles2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | smiles1                                                                            | smiles2                                                                            | label                                           |
  |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:------------------------------------------------|
  | type    | string                                                                              | string                                                                              | int                                             |
  | details | <ul><li>min: 3 tokens</li><li>mean: 40.69 tokens</li><li>max: 221 tokens</li></ul>  | <ul><li>min: 4 tokens</li><li>mean: 51.43 tokens</li><li>max: 221 tokens</li></ul>  | <ul><li>0: ~14.90%</li><li>1: ~85.10%</li></ul> |
* Samples:
  | smiles1                                 | smiles2                                                       | label          |
  |:----------------------------------------|:--------------------------------------------------------------|:---------------|
  | <code>Cn1c(=O)c2c(ncn2C)n(C)c1=O</code> | <code>Cc1cc2c(s1)=Nc1ccccc1NC=2N1CC[NH+](C)CC1</code>         | <code>1</code> |
  | <code>Oc1ccc(OCc2ccccc2)cc1</code>      | <code>Oc1ccc(CCCC[NH2+]CC(O)c2ccc(O)c(O)c2)cc1</code>         | <code>1</code> |
  | <code>OCC(S)CS</code>                   | <code>CC12CCC(=O)C=C1CCC1C2C(O)CC2(C)C1CCC2(O)C(=O)CO</code>  | <code>0</code> |
* Loss: [<code>ContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#contrastiveloss) with these parameters:
  ```json
  {
      "distance_metric": "SiameseDistanceMetric.COSINE_DISTANCE",
      "margin": 0.5,
      "size_average": true
  }
  ```

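ContrastiveLoss (Hadsell, Chopra & LeCun, cited below) penalizes a positive pair by its squared distance and a negative pair only while it sits inside the margin; with `SiameseDistanceMetric.COSINE_DISTANCE` the distance is d = 1 − cos(u, v). A plain per-pair re-implementation for illustration, ignoring the library's internal scaling and batch averaging:

```python
def contrastive_loss(cos_sim, label, margin=0.5):
    """Per-pair contrastive loss with cosine distance d = 1 - cos_sim.

    label == 1: similar pair, penalized by d**2 (pulled together).
    label == 0: dissimilar pair, penalized by max(0, margin - d)**2,
                i.e. only while the pair is closer than the margin.
    """
    d = 1.0 - cos_sim
    if label == 1:
        return d ** 2
    return max(0.0, margin - d) ** 2

# A well-separated positive and negative pair both incur ~zero loss:
print(contrastive_loss(0.95, 1))  # ~0.0025 (distance ~0.05)
print(contrastive_loss(0.10, 0))  # 0.0 (distance 0.9 exceeds margin 0.5)
```

With `margin: 0.5`, negatives stop contributing gradient once their cosine similarity drops below 0.5, which matches the learned decision threshold of roughly 0.57 observed in evaluation.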
### Evaluation Dataset

#### Unnamed Dataset

* Size: 5,000 evaluation samples
* Columns: <code>smiles1</code>, <code>smiles2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | smiles1                                                                             | smiles2                                                                             | label                                           |
  |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:------------------------------------------------|
  | type    | string                                                                                | string                                                                                | int                                             |
  | details | <ul><li>min: 18 tokens</li><li>mean: 56.96 tokens</li><li>max: 209 tokens</li></ul>   | <ul><li>min: 18 tokens</li><li>mean: 61.21 tokens</li><li>max: 244 tokens</li></ul>   | <ul><li>0: ~10.00%</li><li>1: ~90.00%</li></ul> |
* Samples:
  | smiles1                                                                                                   | smiles2                                                             | label          |
  |:----------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------|:---------------|
  | <code>CC(=CC(=O)OCCCCCCCCC(=O)[O-])CC1OCC(CC2OC2C(C)C(C)O)C(O)C1O</code>                                   | <code>CC(C=CC(C)C(C)(C)O)C1CCC2C(=CC=C3CC(O)CC(O)C3)CCCC21C</code>   | <code>1</code> |
  | <code>C=C1c2cccc([O-])c2C(=O)C2=C([O-])C3(O)C(=O)C(C(N)=O)=C([O-])C([NH+](C)C)C3C(O)C12</code>             | <code>CC(c1ncncc1F)C(O)(Cn1cncn1)c1ccc(F)cc1F</code>                 | <code>1</code> |
  | <code>CC(C)CC1C(=O)N2CCCC2C2(O)OC(NC(=O)C3C=C4c5cccc6[nH]c(Br)c(c56)CC4[NH+](C)C3)(C(C)C)C(=O)N12</code>   | <code>C[NH+](C)CCC=C1c2ccccc2Sc2ccc(Cl)cc21</code>                   | <code>1</code> |
* Loss: [<code>ContrastiveLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#contrastiveloss) with these parameters:
  ```json
  {
      "distance_metric": "SiameseDistanceMetric.COSINE_DISTANCE",
      "margin": 0.5,
      "size_average": true
  }
  ```

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 5
- `warmup_ratio`: 0.1
- `fp16`: True
- `batch_sampler`: no_duplicates

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs
| Epoch | Step | Training Loss | Validation Loss | all-dev_cosine_ap |
|:-----:|:----:|:-------------:|:---------------:|:-----------------:|
| 0.8   | 500  | 0.0264        | 0.0112          | 0.9213            |
| 1.6   | 1000 | 0.0152        | 0.0122          | 0.9362            |
| 2.4   | 1500 | 0.0134        | 0.0128          | 0.9463            |
| 3.2   | 2000 | 0.0112        | 0.0134          | 0.9502            |
| 4.0   | 2500 | 0.0100        | 0.0125          | 0.9513            |
| 4.8   | 3000 | 0.0097        | 0.0132          | 0.9523            |

### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.3.1
- Transformers: 4.47.1
- PyTorch: 2.5.1+cu124
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### ContrastiveLoss
```bibtex
@inproceedings{hadsell2006dimensionality,
    author={Hadsell, R. and Chopra, S. and LeCun, Y.},
    booktitle={2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06)},
    title={Dimensionality Reduction by Learning an Invariant Mapping},
    year={2006},
    volume={2},
    number={},
    pages={1735-1742},
    doi={10.1109/CVPR.2006.100}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->
added_tokens.json ADDED
@@ -0,0 +1,4 @@
{
  "</s>": 592,
  "<s>": 591
}
config.json ADDED
@@ -0,0 +1,29 @@
{
  "_name_or_path": "DeepChem/ChemBERTa-77M-MLM",
  "architectures": [
    "RobertaModel"
  ],
  "attention_probs_dropout_prob": 0.109,
  "bos_token_id": 0,
  "classifier_dropout": null,
  "eos_token_id": 2,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.144,
  "hidden_size": 384,
  "initializer_range": 0.02,
  "intermediate_size": 464,
  "is_gpu": true,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 515,
  "model_type": "roberta",
  "num_attention_heads": 12,
  "num_hidden_layers": 3,
  "pad_token_id": 1,
  "position_embedding_type": "absolute",
  "torch_dtype": "float32",
  "transformers_version": "4.47.1",
  "type_vocab_size": 1,
  "use_cache": true,
  "vocab_size": 600
}
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
{
  "__version__": {
    "sentence_transformers": "3.3.1",
    "transformers": "4.47.1",
    "pytorch": "2.5.1+cu124"
  },
  "prompts": {},
  "default_prompt_name": null,
  "similarity_fn_name": "cosine"
}
merges.txt ADDED
@@ -0,0 +1 @@
#version: 0.2
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:7716c13627ab5706fd378e04e6cafe71328c03f53347fa5ecab7f3c33756767b
size 13715688
modules.json ADDED
@@ -0,0 +1,14 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  }
]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
{
  "max_seq_length": 512,
  "do_lower_case": false
}
special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
{
  "bos_token":  {"content": "<s>",    "lstrip": false, "normalized": true,  "rstrip": false, "single_word": false},
  "cls_token":  {"content": "[CLS]",  "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
  "eos_token":  {"content": "</s>",   "lstrip": false, "normalized": true,  "rstrip": false, "single_word": false},
  "mask_token": {"content": "[MASK]", "lstrip": true,  "normalized": false, "rstrip": false, "single_word": false},
  "pad_token":  {"content": "[PAD]",  "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
  "sep_token":  {"content": "[SEP]",  "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
  "unk_token":  {"content": "[UNK]",  "lstrip": false, "normalized": false, "rstrip": false, "single_word": false}
}
tokenizer.json ADDED
@@ -0,0 +1,712 @@
1
+ {
2
+ "version": "1.0",
3
+ "truncation": {
4
+ "direction": "Right",
5
+ "max_length": 512,
6
+ "strategy": "LongestFirst",
7
+ "stride": 0
8
+ },
9
+ "padding": {
10
+ "strategy": "BatchLongest",
11
+ "direction": "Right",
12
+ "pad_to_multiple_of": null,
13
+ "pad_id": 0,
14
+ "pad_type_id": 0,
15
+ "pad_token": "[PAD]"
16
+ },
17
+ "added_tokens": [
18
+ {
19
+ "id": 0,
20
+ "content": "[PAD]",
21
+ "single_word": false,
22
+ "lstrip": false,
23
+ "rstrip": false,
24
+ "normalized": false,
25
+ "special": true
26
+ },
27
+ {
28
+ "id": 11,
29
+ "content": "[UNK]",
30
+ "single_word": false,
31
+ "lstrip": false,
32
+ "rstrip": false,
33
+ "normalized": false,
34
+ "special": true
35
+ },
36
+ {
37
+ "id": 12,
38
+ "content": "[CLS]",
39
+ "single_word": false,
40
+ "lstrip": false,
41
+ "rstrip": false,
42
+ "normalized": false,
43
+ "special": true
44
+ },
45
+ {
46
+ "id": 13,
47
+ "content": "[SEP]",
48
+ "single_word": false,
49
+ "lstrip": false,
50
+ "rstrip": false,
51
+ "normalized": false,
52
+ "special": true
53
+ },
54
+ {
55
+ "id": 14,
56
+ "content": "[MASK]",
57
+ "single_word": false,
58
+ "lstrip": true,
59
+ "rstrip": false,
60
+ "normalized": false,
61
+ "special": true
62
+ },
63
+ {
64
+ "id": 591,
65
+ "content": "<s>",
66
+ "single_word": false,
67
+ "lstrip": false,
68
+ "rstrip": false,
69
+ "normalized": true,
70
+ "special": true
71
+ },
72
+ {
73
+ "id": 592,
74
+ "content": "</s>",
75
+ "single_word": false,
76
+ "lstrip": false,
77
+ "rstrip": false,
78
+ "normalized": true,
79
+ "special": true
80
+ }
81
+ ],
82
+ "normalizer": null,
83
+ "pre_tokenizer": {
84
+ "type": "ByteLevel",
85
+ "add_prefix_space": false,
86
+ "trim_offsets": true,
87
+ "use_regex": true
88
+ },
89
+ "post_processor": {
90
+ "type": "RobertaProcessing",
91
+ "sep": [
92
+ "[SEP]",
93
+ 13
94
+ ],
95
+ "cls": [
96
+ "[CLS]",
97
+ 12
98
+ ],
99
+ "trim_offsets": true,
100
+ "add_prefix_space": false
101
+ },
102
+ "decoder": {
103
+ "type": "ByteLevel",
104
+ "add_prefix_space": true,
105
+ "trim_offsets": true,
106
+ "use_regex": true
107
+ },
108
+ "model": {
109
+ "type": "BPE",
110
+ "dropout": null,
111
+ "unk_token": null,
112
+ "continuing_subword_prefix": "",
113
+ "end_of_word_suffix": "",
114
+ "fuse_unk": false,
115
+ "byte_fallback": false,
116
+ "ignore_merges": false,
117
+ "vocab": {
118
+ "[PAD]": 0,
119
+ "[unused1]": 1,
120
+ "[unused2]": 2,
121
+ "[unused3]": 3,
122
+ "[unused4]": 4,
123
+ "[unused5]": 5,
124
+ "[unused6]": 6,
125
+ "[unused7]": 7,
126
+ "[unused8]": 8,
127
+ "[unused9]": 9,
128
+ "[unused10]": 10,
129
+ "[UNK]": 11,
130
+ "[CLS]": 12,
131
+ "[SEP]": 13,
132
+ "[MASK]": 14,
133
+ "c": 15,
134
+ "C": 16,
135
+ "(": 17,
136
+ ")": 18,
137
+ "O": 19,
138
+ "1": 20,
139
+ "2": 21,
140
+ "=": 22,
141
+ "N": 23,
142
+ ".": 24,
143
+ "n": 25,
144
+ "3": 26,
145
+ "F": 27,
146
+ "Cl": 28,
147
+ ">>": 29,
148
+ "~": 30,
149
+ "-": 31,
150
+ "4": 32,
151
+ "[C@H]": 33,
152
+ "S": 34,
153
+ "[C@@H]": 35,
154
+ "[O-]": 36,
155
+ "Br": 37,
156
+ "#": 38,
157
+ "/": 39,
158
+ "[nH]": 40,
159
+ "[N+]": 41,
160
+ "s": 42,
161
+ "5": 43,
162
+ "o": 44,
163
+ "P": 45,
164
+ "[Na+]": 46,
165
+ "[Si]": 47,
166
+ "I": 48,
167
+ "[Na]": 49,
168
+ "[Pd]": 50,
169
+ "[K+]": 51,
170
+ "[K]": 52,
171
+ "[P]": 53,
172
+ "B": 54,
173
+ "[C@]": 55,
174
+ "[C@@]": 56,
175
+ "[Cl-]": 57,
176
+ "6": 58,
177
+ "[OH-]": 59,
178
+ "\\": 60,
179
+ "[N-]": 61,
180
+ "[Li]": 62,
181
+ "[H]": 63,
182
+ "[2H]": 64,
183
+ "[NH4+]": 65,
184
+ "[c-]": 66,
185
+ "[P-]": 67,
186
+ "[Cs+]": 68,
187
+ "[Li+]": 69,
188
+ "[Cs]": 70,
189
+ "[NaH]": 71,
190
+ "[H-]": 72,
191
+ "[O+]": 73,
192
+ "[BH4-]": 74,
193
+ "[Cu]": 75,
194
+ "7": 76,
195
+ "[Mg]": 77,
196
+ "[Fe+2]": 78,
197
+ "[n+]": 79,
198
+ "[Sn]": 80,
199
+ "[BH-]": 81,
200
+ "[Pd+2]": 82,
201
+ "[CH]": 83,
202
+ "[I-]": 84,
203
+ "[Br-]": 85,
204
+ "[C-]": 86,
205
+ "[Zn]": 87,
206
+ "[B-]": 88,
207
+ "[F-]": 89,
208
+ "[Al]": 90,
209
+ "[P+]": 91,
210
+ "[BH3-]": 92,
211
+ "[Fe]": 93,
212
+ "[C]": 94,
213
+ "[AlH4]": 95,
214
+ "[Ni]": 96,
215
+ "[SiH]": 97,
216
+ "8": 98,
217
+ "[Cu+2]": 99,
218
+ "[Mn]": 100,
219
+ "[AlH]": 101,
220
+ "[nH+]": 102,
221
+ "[AlH4-]": 103,
222
+ "[O-2]": 104,
223
+ "[Cr]": 105,
224
+ "[Mg+2]": 106,
225
+ "[NH3+]": 107,
226
+ "[S@]": 108,
227
+ "[Pt]": 109,
228
+ "[Al+3]": 110,
229
+ "[S@@]": 111,
230
+ "[S-]": 112,
231
+ "[Ti]": 113,
232
+ "[Zn+2]": 114,
233
+ "[PH]": 115,
234
+ "[NH2+]": 116,
235
+ "[Ru]": 117,
236
+ "[Ag+]": 118,
237
+ "[S+]": 119,
238
+ "[I+3]": 120,
239
+ "[NH+]": 121,
240
+ "[Ca+2]": 122,
241
+ "[Ag]": 123,
242
+ "9": 124,
243
+ "[Os]": 125,
244
+ "[Se]": 126,
245
+ "[SiH2]": 127,
246
+ "[Ca]": 128,
247
+ "[Ti+4]": 129,
248
+ "[Ac]": 130,
249
+ "[Cu+]": 131,
250
+ "[S]": 132,
251
+ "[Rh]": 133,
252
+ "[Cl+3]": 134,
253
+ "[cH-]": 135,
254
+ "[Zn+]": 136,
255
+ "[O]": 137,
256
+ "[Cl+]": 138,
257
+ "[SH]": 139,
258
+ "[H+]": 140,
259
+ "[Pd+]": 141,
260
+ "[se]": 142,
261
+ "[PH+]": 143,
262
+ "[I]": 144,
263
+ "[Pt+2]": 145,
264
+ "[C+]": 146,
265
+ "[Mg+]": 147,
266
+ "[Hg]": 148,
267
+ "[W]": 149,
268
+ "[SnH]": 150,
269
+ "[SiH3]": 151,
270
+ "[Fe+3]": 152,
271
+ "[NH]": 153,
272
+ "[Mo]": 154,
273
+ "[CH2+]": 155,
274
+ "%10": 156,
275
+ "[CH2-]": 157,
276
+ "[CH2]": 158,
277
+ "[n-]": 159,
278
+ "[Ce+4]": 160,
279
+ "[NH-]": 161,
280
+ "[Co]": 162,
281
+ "[I+]": 163,
282
+ "[PH2]": 164,
283
+ "[Pt+4]": 165,
284
+ "[Ce]": 166,
285
+ "[B]": 167,
286
+ "[Sn+2]": 168,
287
+ "[Ba+2]": 169,
288
+ "%11": 170,
289
+ "[Fe-3]": 171,
290
+ "[18F]": 172,
291
+ "[SH-]": 173,
292
+ "[Pb+2]": 174,
293
+ "[Os-2]": 175,
294
+ "[Zr+4]": 176,
295
+ "[N]": 177,
296
+ "[Ir]": 178,
297
+ "[Bi]": 179,
298
+ "[Ni+2]": 180,
299
+ "[P@]": 181,
300
+ "[Co+2]": 182,
301
+ "[s+]": 183,
302
+ "[As]": 184,
303
+ "[P+3]": 185,
304
+ "[Hg+2]": 186,
305
+ "[Yb+3]": 187,
306
+ "[CH-]": 188,
307
+ "[Zr+2]": 189,
308
+ "[Mn+2]": 190,
309
+ "[CH+]": 191,
310
+ "[In]": 192,
311
+ "[KH]": 193,
312
+ "[Ce+3]": 194,
313
+ "[Zr]": 195,
314
+ "[AlH2-]": 196,
315
+ "[OH2+]": 197,
316
+ "[Ti+3]": 198,
317
+ "[Rh+2]": 199,
318
+ "[Sb]": 200,
+ "[S-2]": 201,
+ "%12": 202,
+ "[P@@]": 203,
+ "[Si@H]": 204,
+ "[Mn+4]": 205,
+ "p": 206,
+ "[Ba]": 207,
+ "[NH2-]": 208,
+ "[Ge]": 209,
+ "[Pb+4]": 210,
+ "[Cr+3]": 211,
+ "[Au]": 212,
+ "[LiH]": 213,
+ "[Sc+3]": 214,
+ "[o+]": 215,
+ "[Rh-3]": 216,
+ "%13": 217,
+ "[Br]": 218,
+ "[Sb-]": 219,
+ "[S@+]": 220,
+ "[I+2]": 221,
+ "[Ar]": 222,
+ "[V]": 223,
+ "[Cu-]": 224,
+ "[Al-]": 225,
+ "[Te]": 226,
+ "[13c]": 227,
+ "[13C]": 228,
+ "[Cl]": 229,
+ "[PH4+]": 230,
+ "[SiH4]": 231,
+ "[te]": 232,
+ "[CH3-]": 233,
+ "[S@@+]": 234,
+ "[Rh+3]": 235,
+ "[SH+]": 236,
+ "[Bi+3]": 237,
+ "[Br+2]": 238,
+ "[La]": 239,
+ "[La+3]": 240,
+ "[Pt-2]": 241,
+ "[N@@]": 242,
+ "[PH3+]": 243,
+ "[N@]": 244,
+ "[Si+4]": 245,
+ "[Sr+2]": 246,
+ "[Al+]": 247,
+ "[Pb]": 248,
+ "[SeH]": 249,
+ "[Si-]": 250,
+ "[V+5]": 251,
+ "[Y+3]": 252,
+ "[Re]": 253,
+ "[Ru+]": 254,
+ "[Sm]": 255,
+ "*": 256,
+ "[3H]": 257,
+ "[NH2]": 258,
+ "[Ag-]": 259,
+ "[13CH3]": 260,
+ "[OH+]": 261,
+ "[Ru+3]": 262,
+ "[OH]": 263,
+ "[Gd+3]": 264,
+ "[13CH2]": 265,
+ "[In+3]": 266,
+ "[Si@@]": 267,
+ "[Si@]": 268,
+ "[Ti+2]": 269,
+ "[Sn+]": 270,
+ "[Cl+2]": 271,
+ "[AlH-]": 272,
+ "[Pd-2]": 273,
+ "[SnH3]": 274,
+ "[B+3]": 275,
+ "[Cu-2]": 276,
+ "[Nd+3]": 277,
+ "[Pb+3]": 278,
+ "[13cH]": 279,
+ "[Fe-4]": 280,
+ "[Ga]": 281,
+ "[Sn+4]": 282,
+ "[Hg+]": 283,
+ "[11CH3]": 284,
+ "[Hf]": 285,
+ "[Pr]": 286,
+ "[Y]": 287,
+ "[S+2]": 288,
+ "[Cd]": 289,
+ "[Cr+6]": 290,
+ "[Zr+3]": 291,
+ "[Rh+]": 292,
+ "[CH3]": 293,
+ "[N-3]": 294,
+ "[Hf+2]": 295,
+ "[Th]": 296,
+ "[Sb+3]": 297,
+ "%14": 298,
+ "[Cr+2]": 299,
+ "[Ru+2]": 300,
+ "[Hf+4]": 301,
+ "[14C]": 302,
+ "[Ta]": 303,
+ "[Tl+]": 304,
+ "[B+]": 305,
+ "[Os+4]": 306,
+ "[PdH2]": 307,
+ "[Pd-]": 308,
+ "[Cd+2]": 309,
+ "[Co+3]": 310,
+ "[S+4]": 311,
+ "[Nb+5]": 312,
+ "[123I]": 313,
+ "[c+]": 314,
+ "[Rb+]": 315,
+ "[V+2]": 316,
+ "[CH3+]": 317,
+ "[Ag+2]": 318,
+ "[cH+]": 319,
+ "[Mn+3]": 320,
+ "[Se-]": 321,
+ "[As-]": 322,
+ "[Eu+3]": 323,
+ "[SH2]": 324,
+ "[Sm+3]": 325,
+ "[IH+]": 326,
+ "%15": 327,
+ "[OH3+]": 328,
+ "[PH3]": 329,
+ "[IH2+]": 330,
+ "[SH2+]": 331,
+ "[Ir+3]": 332,
+ "[AlH3]": 333,
+ "[Sc]": 334,
+ "[Yb]": 335,
+ "[15NH2]": 336,
+ "[Lu]": 337,
+ "[sH+]": 338,
+ "[Gd]": 339,
+ "[18F-]": 340,
+ "[SH3+]": 341,
+ "[SnH4]": 342,
+ "[TeH]": 343,
+ "[Si@@H]": 344,
+ "[Ga+3]": 345,
+ "[CaH2]": 346,
+ "[Tl]": 347,
+ "[Ta+5]": 348,
+ "[GeH]": 349,
+ "[Br+]": 350,
+ "[Sr]": 351,
+ "[Tl+3]": 352,
+ "[Sm+2]": 353,
+ "[PH5]": 354,
+ "%16": 355,
+ "[N@@+]": 356,
+ "[Au+3]": 357,
+ "[C-4]": 358,
+ "[Nd]": 359,
+ "[Ti+]": 360,
+ "[IH]": 361,
+ "[N@+]": 362,
+ "[125I]": 363,
+ "[Eu]": 364,
+ "[Sn+3]": 365,
+ "[Nb]": 366,
+ "[Er+3]": 367,
+ "[123I-]": 368,
+ "[14c]": 369,
+ "%17": 370,
+ "[SnH2]": 371,
+ "[YH]": 372,
+ "[Sb+5]": 373,
+ "[Pr+3]": 374,
+ "[Ir+]": 375,
+ "[N+3]": 376,
+ "[AlH2]": 377,
+ "[19F]": 378,
+ "%18": 379,
+ "[Tb]": 380,
+ "[14CH]": 381,
+ "[Mo+4]": 382,
+ "[Si+]": 383,
+ "[BH]": 384,
+ "[Be]": 385,
+ "[Rb]": 386,
+ "[pH]": 387,
+ "%19": 388,
+ "%20": 389,
+ "[Xe]": 390,
+ "[Ir-]": 391,
+ "[Be+2]": 392,
+ "[C+4]": 393,
+ "[RuH2]": 394,
+ "[15NH]": 395,
+ "[U+2]": 396,
+ "[Au-]": 397,
+ "%21": 398,
+ "%22": 399,
+ "[Au+]": 400,
+ "[15n]": 401,
+ "[Al+2]": 402,
+ "[Tb+3]": 403,
+ "[15N]": 404,
+ "[V+3]": 405,
+ "[W+6]": 406,
+ "[14CH3]": 407,
+ "[Cr+4]": 408,
+ "[ClH+]": 409,
+ "b": 410,
+ "[Ti+6]": 411,
+ "[Nd+]": 412,
+ "[Zr+]": 413,
+ "[PH2+]": 414,
+ "[Fm]": 415,
+ "[N@H+]": 416,
+ "[RuH]": 417,
+ "[Dy+3]": 418,
+ "%23": 419,
+ "[Hf+3]": 420,
+ "[W+4]": 421,
+ "[11C]": 422,
+ "[13CH]": 423,
+ "[Er]": 424,
+ "[124I]": 425,
+ "[LaH]": 426,
+ "[F]": 427,
+ "[siH]": 428,
+ "[Ga+]": 429,
+ "[Cm]": 430,
+ "[GeH3]": 431,
+ "[IH-]": 432,
+ "[U+6]": 433,
+ "[SeH+]": 434,
+ "[32P]": 435,
+ "[SeH-]": 436,
+ "[Pt-]": 437,
+ "[Ir+2]": 438,
+ "[se+]": 439,
+ "[U]": 440,
+ "[F+]": 441,
+ "[BH2]": 442,
+ "[As+]": 443,
+ "[Cf]": 444,
+ "[ClH2+]": 445,
+ "[Ni+]": 446,
+ "[TeH3]": 447,
+ "[SbH2]": 448,
+ "[Ag+3]": 449,
+ "%24": 450,
+ "[18O]": 451,
+ "[PH4]": 452,
+ "[Os+2]": 453,
+ "[Na-]": 454,
+ "[Sb+2]": 455,
+ "[V+4]": 456,
+ "[Ho+3]": 457,
+ "[68Ga]": 458,
+ "[PH-]": 459,
+ "[Bi+2]": 460,
+ "[Ce+2]": 461,
+ "[Pd+3]": 462,
+ "[99Tc]": 463,
+ "[13C@@H]": 464,
+ "[Fe+6]": 465,
+ "[c]": 466,
+ "[GeH2]": 467,
+ "[10B]": 468,
+ "[Cu+3]": 469,
+ "[Mo+2]": 470,
+ "[Cr+]": 471,
+ "[Pd+4]": 472,
+ "[Dy]": 473,
+ "[AsH]": 474,
+ "[Ba+]": 475,
+ "[SeH2]": 476,
+ "[In+]": 477,
+ "[TeH2]": 478,
+ "[BrH+]": 479,
+ "[14cH]": 480,
+ "[W+]": 481,
+ "[13C@H]": 482,
+ "[AsH2]": 483,
+ "[In+2]": 484,
+ "[N+2]": 485,
+ "[N@@H+]": 486,
+ "[SbH]": 487,
+ "[60Co]": 488,
+ "[AsH4+]": 489,
+ "[AsH3]": 490,
+ "[18OH]": 491,
+ "[Ru-2]": 492,
+ "[Na-2]": 493,
+ "[CuH2]": 494,
+ "[31P]": 495,
+ "[Ti+5]": 496,
+ "[35S]": 497,
+ "[P@@H]": 498,
+ "[ArH]": 499,
+ "[Co+]": 500,
+ "[Zr-2]": 501,
+ "[BH2-]": 502,
+ "[131I]": 503,
+ "[SH5]": 504,
+ "[VH]": 505,
+ "[B+2]": 506,
+ "[Yb+2]": 507,
+ "[14C@H]": 508,
+ "[211At]": 509,
+ "[NH3+2]": 510,
+ "[IrH]": 511,
+ "[IrH2]": 512,
+ "[Rh-]": 513,
+ "[Cr-]": 514,
+ "[Sb+]": 515,
+ "[Ni+3]": 516,
+ "[TaH3]": 517,
+ "[Tl+2]": 518,
+ "[64Cu]": 519,
+ "[Tc]": 520,
+ "[Cd+]": 521,
+ "[1H]": 522,
+ "[15nH]": 523,
+ "[AlH2+]": 524,
+ "[FH+2]": 525,
+ "[BiH3]": 526,
+ "[Ru-]": 527,
+ "[Mo+6]": 528,
+ "[AsH+]": 529,
+ "[BaH2]": 530,
+ "[BaH]": 531,
+ "[Fe+4]": 532,
+ "[229Th]": 533,
+ "[Th+4]": 534,
+ "[As+3]": 535,
+ "[NH+3]": 536,
+ "[P@H]": 537,
+ "[Li-]": 538,
+ "[7NaH]": 539,
+ "[Bi+]": 540,
+ "[PtH+2]": 541,
+ "[p-]": 542,
+ "[Re+5]": 543,
+ "[NiH]": 544,
+ "[Ni-]": 545,
+ "[Xe+]": 546,
+ "[Ca+]": 547,
+ "[11c]": 548,
+ "[Rh+4]": 549,
+ "[AcH]": 550,
+ "[HeH]": 551,
+ "[Sc+2]": 552,
+ "[Mn+]": 553,
+ "[UH]": 554,
+ "[14CH2]": 555,
+ "[SiH4+]": 556,
+ "[18OH2]": 557,
+ "[Ac-]": 558,
+ "[Re+4]": 559,
+ "[118Sn]": 560,
+ "[153Sm]": 561,
+ "[P+2]": 562,
+ "[9CH]": 563,
+ "[9CH3]": 564,
+ "[Y-]": 565,
+ "[NiH2]": 566,
+ "[Si+2]": 567,
+ "[Mn+6]": 568,
+ "[ZrH2]": 569,
+ "[C-2]": 570,
+ "[Bi+5]": 571,
+ "[24NaH]": 572,
+ "[Fr]": 573,
+ "[15CH]": 574,
+ "[Se+]": 575,
+ "[At]": 576,
+ "[P-3]": 577,
+ "[124I-]": 578,
+ "[CuH2-]": 579,
+ "[Nb+4]": 580,
+ "[Nb+3]": 581,
+ "[MgH]": 582,
+ "[Ir+4]": 583,
+ "[67Ga+3]": 584,
+ "[67Ga]": 585,
+ "[13N]": 586,
+ "[15OH2]": 587,
+ "[2NH]": 588,
+ "[Ho]": 589,
+ "[Cn]": 590
+ },
+ "merges": []
+ }
+ }
tokenizer_config.json ADDED
@@ -0,0 +1,76 @@
+ {
+ "add_prefix_space": false,
+ "added_tokens_decoder": {
+ "0": {
+ "content": "[PAD]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "11": {
+ "content": "[UNK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "12": {
+ "content": "[CLS]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "13": {
+ "content": "[SEP]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "14": {
+ "content": "[MASK]",
+ "lstrip": true,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "591": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "592": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<s>",
+ "clean_up_tokenization_spaces": false,
+ "cls_token": "[CLS]",
+ "eos_token": "</s>",
+ "errors": "replace",
+ "extra_special_tokens": {},
+ "full_tokenizer_file": null,
+ "mask_token": "[MASK]",
+ "max_len": 512,
+ "model_max_length": 512,
+ "pad_token": "[PAD]",
+ "sep_token": "[SEP]",
+ "tokenizer_class": "RobertaTokenizer",
+ "trim_offsets": true,
+ "unk_token": "[UNK]"
+ }
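The tokenizer_config.json above pins the special tokens to fixed vocab ids ([PAD]=0, [UNK]=11, [CLS]=12, [SEP]=13, [MASK]=14, &lt;s&gt;=591, &lt;/s&gt;=592). A minimal sketch of how the `added_tokens_decoder` mapping is parsed (the snippet reproduces only the relevant fields, not the full config):

```python
import json

# Reconstruct the id -> special-token mapping declared in
# added_tokens_decoder (vocab ids are stored as JSON string keys).
config_snippet = """
{
  "added_tokens_decoder": {
    "0":   {"content": "[PAD]",  "special": true},
    "11":  {"content": "[UNK]",  "special": true},
    "12":  {"content": "[CLS]",  "special": true},
    "13":  {"content": "[SEP]",  "special": true},
    "14":  {"content": "[MASK]", "special": true},
    "591": {"content": "<s>",    "special": true},
    "592": {"content": "</s>",   "special": true}
  }
}
"""
decoder = {
    int(idx): entry["content"]
    for idx, entry in json.loads(config_snippet)["added_tokens_decoder"].items()
}
print(decoder[12], decoder[13])  # [CLS] [SEP]
```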
vocab.json ADDED
@@ -0,0 +1 @@
+ {"[PAD]":0,"[unused1]":1,"[unused2]":2,"[unused3]":3,"[unused4]":4,"[unused5]":5,"[unused6]":6,"[unused7]":7,"[unused8]":8,"[unused9]":9,"[unused10]":10,"[UNK]":11,"[CLS]":12,"[SEP]":13,"[MASK]":14,"c":15,"C":16,"(":17,")":18,"O":19,"1":20,"2":21,"=":22,"N":23,".":24,"n":25,"3":26,"F":27,"Cl":28,">>":29,"~":30,"-":31,"4":32,"[C@H]":33,"S":34,"[C@@H]":35,"[O-]":36,"Br":37,"#":38,"/":39,"[nH]":40,"[N+]":41,"s":42,"5":43,"o":44,"P":45,"[Na+]":46,"[Si]":47,"I":48,"[Na]":49,"[Pd]":50,"[K+]":51,"[K]":52,"[P]":53,"B":54,"[C@]":55,"[C@@]":56,"[Cl-]":57,"6":58,"[OH-]":59,"\\":60,"[N-]":61,"[Li]":62,"[H]":63,"[2H]":64,"[NH4+]":65,"[c-]":66,"[P-]":67,"[Cs+]":68,"[Li+]":69,"[Cs]":70,"[NaH]":71,"[H-]":72,"[O+]":73,"[BH4-]":74,"[Cu]":75,"7":76,"[Mg]":77,"[Fe+2]":78,"[n+]":79,"[Sn]":80,"[BH-]":81,"[Pd+2]":82,"[CH]":83,"[I-]":84,"[Br-]":85,"[C-]":86,"[Zn]":87,"[B-]":88,"[F-]":89,"[Al]":90,"[P+]":91,"[BH3-]":92,"[Fe]":93,"[C]":94,"[AlH4]":95,"[Ni]":96,"[SiH]":97,"8":98,"[Cu+2]":99,"[Mn]":100,"[AlH]":101,"[nH+]":102,"[AlH4-]":103,"[O-2]":104,"[Cr]":105,"[Mg+2]":106,"[NH3+]":107,"[S@]":108,"[Pt]":109,"[Al+3]":110,"[S@@]":111,"[S-]":112,"[Ti]":113,"[Zn+2]":114,"[PH]":115,"[NH2+]":116,"[Ru]":117,"[Ag+]":118,"[S+]":119,"[I+3]":120,"[NH+]":121,"[Ca+2]":122,"[Ag]":123,"9":124,"[Os]":125,"[Se]":126,"[SiH2]":127,"[Ca]":128,"[Ti+4]":129,"[Ac]":130,"[Cu+]":131,"[S]":132,"[Rh]":133,"[Cl+3]":134,"[cH-]":135,"[Zn+]":136,"[O]":137,"[Cl+]":138,"[SH]":139,"[H+]":140,"[Pd+]":141,"[se]":142,"[PH+]":143,"[I]":144,"[Pt+2]":145,"[C+]":146,"[Mg+]":147,"[Hg]":148,"[W]":149,"[SnH]":150,"[SiH3]":151,"[Fe+3]":152,"[NH]":153,"[Mo]":154,"[CH2+]":155,"%10":156,"[CH2-]":157,"[CH2]":158,"[n-]":159,"[Ce+4]":160,"[NH-]":161,"[Co]":162,"[I+]":163,"[PH2]":164,"[Pt+4]":165,"[Ce]":166,"[B]":167,"[Sn+2]":168,"[Ba+2]":169,"%11":170,"[Fe-3]":171,"[18F]":172,"[SH-]":173,"[Pb+2]":174,"[Os-2]":175,"[Zr+4]":176,"[N]":177,"[Ir]":178,"[Bi]":179,"[Ni+2]":180,"[P@]":181,"[Co+2]":182,"[s+]":183,"[As]":184,"[P+3]":185,"[Hg+2]":186
,"[Yb+3]":187,"[CH-]":188,"[Zr+2]":189,"[Mn+2]":190,"[CH+]":191,"[In]":192,"[KH]":193,"[Ce+3]":194,"[Zr]":195,"[AlH2-]":196,"[OH2+]":197,"[Ti+3]":198,"[Rh+2]":199,"[Sb]":200,"[S-2]":201,"%12":202,"[P@@]":203,"[Si@H]":204,"[Mn+4]":205,"p":206,"[Ba]":207,"[NH2-]":208,"[Ge]":209,"[Pb+4]":210,"[Cr+3]":211,"[Au]":212,"[LiH]":213,"[Sc+3]":214,"[o+]":215,"[Rh-3]":216,"%13":217,"[Br]":218,"[Sb-]":219,"[S@+]":220,"[I+2]":221,"[Ar]":222,"[V]":223,"[Cu-]":224,"[Al-]":225,"[Te]":226,"[13c]":227,"[13C]":228,"[Cl]":229,"[PH4+]":230,"[SiH4]":231,"[te]":232,"[CH3-]":233,"[S@@+]":234,"[Rh+3]":235,"[SH+]":236,"[Bi+3]":237,"[Br+2]":238,"[La]":239,"[La+3]":240,"[Pt-2]":241,"[N@@]":242,"[PH3+]":243,"[N@]":244,"[Si+4]":245,"[Sr+2]":246,"[Al+]":247,"[Pb]":248,"[SeH]":249,"[Si-]":250,"[V+5]":251,"[Y+3]":252,"[Re]":253,"[Ru+]":254,"[Sm]":255,"*":256,"[3H]":257,"[NH2]":258,"[Ag-]":259,"[13CH3]":260,"[OH+]":261,"[Ru+3]":262,"[OH]":263,"[Gd+3]":264,"[13CH2]":265,"[In+3]":266,"[Si@@]":267,"[Si@]":268,"[Ti+2]":269,"[Sn+]":270,"[Cl+2]":271,"[AlH-]":272,"[Pd-2]":273,"[SnH3]":274,"[B+3]":275,"[Cu-2]":276,"[Nd+3]":277,"[Pb+3]":278,"[13cH]":279,"[Fe-4]":280,"[Ga]":281,"[Sn+4]":282,"[Hg+]":283,"[11CH3]":284,"[Hf]":285,"[Pr]":286,"[Y]":287,"[S+2]":288,"[Cd]":289,"[Cr+6]":290,"[Zr+3]":291,"[Rh+]":292,"[CH3]":293,"[N-3]":294,"[Hf+2]":295,"[Th]":296,"[Sb+3]":297,"%14":298,"[Cr+2]":299,"[Ru+2]":300,"[Hf+4]":301,"[14C]":302,"[Ta]":303,"[Tl+]":304,"[B+]":305,"[Os+4]":306,"[PdH2]":307,"[Pd-]":308,"[Cd+2]":309,"[Co+3]":310,"[S+4]":311,"[Nb+5]":312,"[123I]":313,"[c+]":314,"[Rb+]":315,"[V+2]":316,"[CH3+]":317,"[Ag+2]":318,"[cH+]":319,"[Mn+3]":320,"[Se-]":321,"[As-]":322,"[Eu+3]":323,"[SH2]":324,"[Sm+3]":325,"[IH+]":326,"%15":327,"[OH3+]":328,"[PH3]":329,"[IH2+]":330,"[SH2+]":331,"[Ir+3]":332,"[AlH3]":333,"[Sc]":334,"[Yb]":335,"[15NH2]":336,"[Lu]":337,"[sH+]":338,"[Gd]":339,"[18F-]":340,"[SH3+]":341,"[SnH4]":342,"[TeH]":343,"[Si@@H]":344,"[Ga+3]":345,"[CaH2]":346,"[Tl]":347,"[Ta+5]":348,"[GeH]":349,"[Br+]":350,"[Sr]":351,"[Tl+3]":352,"[Sm+2]":353,"[PH5]":354,"%16":355,"[N@@+]":356,"[Au+3]":357,"[C-4]":358,"[Nd]":359,"[Ti+]":360,"[IH]":361,"[N@+]":362,"[125I]":363,"[Eu]":364,"[Sn+3]":365,"[Nb]":366,"[Er+3]":367,"[123I-]":368,"[14c]":369,"%17":370,"[SnH2]":371,"[YH]":372,"[Sb+5]":373,"[Pr+3]":374,"[Ir+]":375,"[N+3]":376,"[AlH2]":377,"[19F]":378,"%18":379,"[Tb]":380,"[14CH]":381,"[Mo+4]":382,"[Si+]":383,"[BH]":384,"[Be]":385,"[Rb]":386,"[pH]":387,"%19":388,"%20":389,"[Xe]":390,"[Ir-]":391,"[Be+2]":392,"[C+4]":393,"[RuH2]":394,"[15NH]":395,"[U+2]":396,"[Au-]":397,"%21":398,"%22":399,"[Au+]":400,"[15n]":401,"[Al+2]":402,"[Tb+3]":403,"[15N]":404,"[V+3]":405,"[W+6]":406,"[14CH3]":407,"[Cr+4]":408,"[ClH+]":409,"b":410,"[Ti+6]":411,"[Nd+]":412,"[Zr+]":413,"[PH2+]":414,"[Fm]":415,"[N@H+]":416,"[RuH]":417,"[Dy+3]":418,"%23":419,"[Hf+3]":420,"[W+4]":421,"[11C]":422,"[13CH]":423,"[Er]":424,"[124I]":425,"[LaH]":426,"[F]":427,"[siH]":428,"[Ga+]":429,"[Cm]":430,"[GeH3]":431,"[IH-]":432,"[U+6]":433,"[SeH+]":434,"[32P]":435,"[SeH-]":436,"[Pt-]":437,"[Ir+2]":438,"[se+]":439,"[U]":440,"[F+]":441,"[BH2]":442,"[As+]":443,"[Cf]":444,"[ClH2+]":445,"[Ni+]":446,"[TeH3]":447,"[SbH2]":448,"[Ag+3]":449,"%24":450,"[18O]":451,"[PH4]":452,"[Os+2]":453,"[Na-]":454,"[Sb+2]":455,"[V+4]":456,"[Ho+3]":457,"[68Ga]":458,"[PH-]":459,"[Bi+2]":460,"[Ce+2]":461,"[Pd+3]":462,"[99Tc]":463,"[13C@@H]":464,"[Fe+6]":465,"[c]":466,"[GeH2]":467,"[10B]":468,"[Cu+3]":469,"[Mo+2]":470,"[Cr+]":471,"[Pd+4]":472,"[Dy]":473,"[AsH]":474,"[Ba+]":475,"[SeH2]":476,"[In+]":477,"[TeH2]":478,"[BrH+]":479,"[14cH]":480,"[W+]":481,"[13C@H]":482,"[AsH2]":483,"[In+2]":484,"[N+2]":485,"[N@@H+]":486,"[SbH]":487,"[60Co]":488,"[AsH4+]":489,"[AsH3]":490,"[18OH]":491,"[Ru-2]":492,"[Na-2]":493,"[CuH2]":494,"[31P]":495,"[Ti+5]":496,"[35S]":497,"[P@@H]":498,"[ArH]":499,"[Co+]":500,"[Zr-2]":501,"[BH2-]":502,"[131I]":503,"[SH5]":504,"[VH]":505,"[B+2]":506,"[Yb+2]":507,"[14C@H]":508,"[211At]":509,"[NH3+2]":510,"[IrH]":511,"[IrH2]":512,"[Rh-]":513,"[Cr-]":514,"[Sb+]":515,"[Ni+3]":516,"[TaH3]":517,"[Tl+2]":518,"[64Cu]":519,"[Tc]":520,"[Cd+]":521,"[1H]":522,"[15nH]":523,"[AlH2+]":524,"[FH+2]":525,"[BiH3]":526,"[Ru-]":527,"[Mo+6]":528,"[AsH+]":529,"[BaH2]":530,"[BaH]":531,"[Fe+4]":532,"[229Th]":533,"[Th+4]":534,"[As+3]":535,"[NH+3]":536,"[P@H]":537,"[Li-]":538,"[7NaH]":539,"[Bi+]":540,"[PtH+2]":541,"[p-]":542,"[Re+5]":543,"[NiH]":544,"[Ni-]":545,"[Xe+]":546,"[Ca+]":547,"[11c]":548,"[Rh+4]":549,"[AcH]":550,"[HeH]":551,"[Sc+2]":552,"[Mn+]":553,"[UH]":554,"[14CH2]":555,"[SiH4+]":556,"[18OH2]":557,"[Ac-]":558,"[Re+4]":559,"[118Sn]":560,"[153Sm]":561,"[P+2]":562,"[9CH]":563,"[9CH3]":564,"[Y-]":565,"[NiH2]":566,"[Si+2]":567,"[Mn+6]":568,"[ZrH2]":569,"[C-2]":570,"[Bi+5]":571,"[24NaH]":572,"[Fr]":573,"[15CH]":574,"[Se+]":575,"[At]":576,"[P-3]":577,"[124I-]":578,"[CuH2-]":579,"[Nb+4]":580,"[Nb+3]":581,"[MgH]":582,"[Ir+4]":583,"[67Ga+3]":584,"[67Ga]":585,"[13N]":586,"[15OH2]":587,"[2NH]":588,"[Ho]":589,"[Cn]":590}
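The vocab.json above maps SMILES tokens (single atoms like `C`, two-letter elements like `Cl`, and bracketed atoms like `[C@H]`) to integer ids. A toy illustration of that mapping using a small subset of the vocab and a greedy longest-match scan; this is only a sketch, not the model's actual tokenization (the repo uses a RobertaTokenizer with its own pre-tokenization rules, which may segment differently):

```python
# Subset of the vocab above, copied verbatim (token -> id).
vocab = {
    "c": 15, "C": 16, "(": 17, ")": 18, "O": 19, "1": 20, "=": 22,
    "N": 23, "Cl": 28, "[C@H]": 33, "[O-]": 36, "[nH]": 40,
}

def greedy_tokenize(smiles, vocab, unk_id=11):
    """Greedy longest-match over the vocab; illustrative only."""
    ids, i = [], 0
    while i < len(smiles):
        # Try the longest vocab entry that matches at position i.
        for length in range(min(8, len(smiles) - i), 0, -1):
            piece = smiles[i:i + length]
            if piece in vocab:
                ids.append(vocab[piece])
                i += length
                break
        else:
            ids.append(unk_id)  # [UNK] has id 11 in this vocab
            i += 1
    return ids

print(greedy_tokenize("ClC(=O)N", vocab))  # [28, 16, 17, 22, 19, 18, 23]
```

Note how `Cl` is matched as one token (id 28) rather than `C` followed by an unknown `l`, which is why longest-match order matters for SMILES vocabularies.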