HassanCS committed · Commit 6135a03 · verified · 1 parent: 3118a9c

Add new SentenceTransformer model
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
{
  "word_embedding_dimension": 384,
  "pooling_mode_cls_token": false,
  "pooling_mode_mean_tokens": true,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false,
  "pooling_mode_weightedmean_tokens": false,
  "pooling_mode_lasttoken": false,
  "include_prompt": true
}
README.md ADDED
@@ -0,0 +1,485 @@
---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:118400
- loss:TripletLoss
base_model: DeepChem/ChemBERTa-77M-MLM
widget:
- source_sentence: CC(C)C1CCC(C(=O)NC(Cc2ccccc2)C(=O)[O-])CC1
  sentences:
  - C[NH2+]CCCC1c2ccccc2C=Cc2ccccc21
  - COC(=O)NC(C(=O)NC(Cc1ccccc1)C(O)CN(Cc1ccc(-c2ccccn2)cc1)NC(=O)C(NC(=O)OC)C(C)(C)C)C(C)(C)C
  - CC1C=CC=CCCC=CC=CC=CC=CC(OC2OC(C)C(O)C([NH3+])C2O)CC(O)C(C(=O)[O-])C(O)CC(=O)CC(O)C(O)CCC(O)CC(O)CC(O)CC(=O)OC(C)C(C)C1O
- source_sentence: C[NH+]1CCCC1Cc1c[nH]c2ccc(CCS(=O)(=O)c3ccccc3)cc12
  sentences:
  - CC(C)CC([NH+](C)C)C1(c2ccc(Cl)cc2)CCC1
  - CC(C)CNCc1ccc(-c2ccccc2S(=O)(=O)N2CCCC2)cc1
  - CC(Oc1cc(-c2cnn(C3CC[NH2+]CC3)c2)cnc1N)c1c(Cl)ccc(F)c1Cl
- source_sentence: C[NH+]1C2CCC1CC(OC(c1ccccc1)c1ccccc1)C2
  sentences:
  - C[NH2+]C1C(O)C([NH2+]C)C2OC3(O)C(=O)CC(C)OC3OC2C1O
  - C=CC1(C)CC(OC(=O)CSC2CC3CCC(C2)[NH+]3C)C2(C)C(C)CCC3(CCC(=O)C32)C(C)C1O
  - CC(C)CC(NC(=O)C(CCc1ccccc1)NC(=O)CN1CCOCC1)C(=O)NC(Cc1ccccc1)C(=O)NC(CC(C)C)C(=O)C1(C)CO1
- source_sentence: CC(C)CC(NC(=O)C(Cc1ccc2ccccc2c1)NC(=O)C(Cc1ccc(O)cc1)NC(=O)C(CO)NC(=O)C(Cc1c[nH]c2ccccc12)NC(=O)C(Cc1c[nH]cn1)NC(=O)C1CCC(=O)N1)C(=O)NC(CCCNC(N)=[NH2+])C(=O)N1CCCC1C(=O)NCC(N)=O
  sentences:
  - C[NH2+]C1CCC(c2ccc(Cl)c(Cl)c2)c2ccccc21
  - C=C1CC2CCC34CC5OC6C(OC7CCC(CC(=O)CC8C(CC9OC(CCC1O2)CC(C)C9=C)OC(CC(O)CN)C8OC)OC7C6O3)C5O4
  - C[N+]1(C)CCC(=C(c2ccccc2)c2ccccc2)CC1
- source_sentence: CON=C(C(=O)NC1C(=O)N2C(C(=O)[O-])=C(C[N+]3(C)CCCC3)CSC12)c1csc(N)n1
  sentences:
  - CC1CNc2c(cccc2S(=O)(=O)NC(CCC[NH+]=C(N)N)C(=O)N2CCC(C)CC2C(=O)[O-])C1
  - CC1C=CC=CCCC=CC=CC=CC=CC(OC2OC(C)C(O)C([NH3+])C2O)CC(O)C(C(=O)[O-])C(O)CC(=O)CC(O)C(O)CCC(O)CC(O)CC(O)CC(=O)OC(C)C(C)C1O
  - CC(C)C1(C(=O)NC2CC(=O)OC2(O)CF)CC(c2nccc3ccccc23)=NO1
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy
model-index:
- name: SentenceTransformer based on DeepChem/ChemBERTa-77M-MLM
  results:
  - task:
      type: triplet
      name: Triplet
    dataset:
      name: all dev
      type: all-dev
    metrics:
    - type: cosine_accuracy
      value: 0.7135134935379028
      name: Cosine Accuracy
---

# SentenceTransformer based on DeepChem/ChemBERTa-77M-MLM

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [DeepChem/ChemBERTa-77M-MLM](https://huggingface.co/DeepChem/ChemBERTa-77M-MLM) on triplets of SMILES strings. It maps SMILES strings to a 384-dimensional dense vector space and can be used for molecular similarity search, clustering, and related tasks.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [DeepChem/ChemBERTa-77M-MLM](https://huggingface.co/DeepChem/ChemBERTa-77M-MLM) <!-- at revision ed8a5374f2024ec8da53760af91a33fb8f6a15ff -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: RobertaModel
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
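The Pooling module is configured for mean pooling (`pooling_mode_mean_tokens: True`): token embeddings are averaged over the non-padding positions indicated by the attention mask. A minimal NumPy sketch of that operation (illustrative names, not the library's internals):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings over real (non-padding) positions.

    token_embeddings: (batch, seq_len, dim)
    attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask[..., None].astype(token_embeddings.dtype)  # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=1)                   # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                   # guard against all-padding rows
    return summed / counts
```

The padding positions contribute nothing to the sum and are excluded from the count, so a padded batch and an unpadded one yield the same embedding per sequence.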
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference:

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("HassanCS/chemBERTa-tuned-on-ClinTox-using-triplet-loss")

# Run inference on SMILES strings
sentences = [
    'CON=C(C(=O)NC1C(=O)N2C(C(=O)[O-])=C(C[N+]3(C)CCCC3)CSC12)c1csc(N)n1',
    'CC1CNc2c(cccc2S(=O)(=O)NC(CCC[NH+]=C(N)N)C(=O)N2CCC(C)CC2C(=O)[O-])C1',
    'CC(C)C1(C(=O)NC2CC(=O)OC2(O)CF)CC(c2nccc3ccccc23)=NO1',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
## Evaluation

### Metrics

#### Triplet

* Dataset: `all-dev`
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| **cosine_accuracy** | **0.7135** |
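Cosine accuracy is the fraction of evaluation triplets in which the anchor is more cosine-similar to its positive than to its negative. A small self-contained sketch of the metric (illustrative, not the `TripletEvaluator` source):

```python
import numpy as np

def cosine_accuracy(anchors: np.ndarray, positives: np.ndarray, negatives: np.ndarray) -> float:
    """Fraction of triplets where cos(anchor, positive) > cos(anchor, negative).

    All arguments are (n_triplets, dim) embedding matrices.
    """
    def row_cosine(a, b):
        a = a / np.linalg.norm(a, axis=1, keepdims=True)
        b = b / np.linalg.norm(b, axis=1, keepdims=True)
        return (a * b).sum(axis=1)

    return float(np.mean(row_cosine(anchors, positives) > row_cosine(anchors, negatives)))
```

A value of 0.7135 therefore means that for about 71% of held-out triplets, the finetuned embeddings rank the positive molecule above the negative one.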
## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 118,400 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 20 tokens</li><li>mean: 33.0 tokens</li><li>max: 60 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 47.34 tokens</li><li>max: 212 tokens</li></ul> | <ul><li>min: 3 tokens</li><li>mean: 53.88 tokens</li><li>max: 212 tokens</li></ul> |

* Samples:

  | anchor | positive | negative |
  |:-------|:---------|:---------|
  | <code>CC(C)CC(NC(=O)CNC(=O)c1cc(Cl)ccc1Cl)B(O)O</code> | <code>CC(=O)OC1CCC2(C)C(=CCC3C2CCC2(C)C(c4cccnc4)=CCC32)C1</code> | <code>CCOC(=O)c1ncn2c1CN(C)C(=O)c1cc(F)ccc1-2</code> |
  | <code>CC(C)CC(NC(=O)CNC(=O)c1cc(Cl)ccc1Cl)B(O)O</code> | <code>COc1ccc(C(CN(C)C)C2(O)CCCCC2)cc1</code> | <code>C[NH2+]C1(C)C2CCC(C2)C1(C)C</code> |
  | <code>CC(C)CC(NC(=O)CNC(=O)c1cc(Cl)ccc1Cl)B(O)O</code> | <code>CNC(=O)c1cc(Oc2ccc(NC(=O)Nc3ccc(Cl)c(C(F)(F)F)c3)cc2)ccn1.Cc1ccc(S(=O)(=O)O)cc1</code> | <code>Nc1ncnc2c1ncn2C1OC(CO)C(O)C1O</code> |

* Loss: [<code>TripletLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#tripletloss) with these parameters:

  ```json
  {
      "distance_metric": "TripletDistanceMetric.EUCLIDEAN",
      "triplet_margin": 5
  }
  ```

### Evaluation Dataset

#### Unnamed Dataset

* Size: 1,480 evaluation samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:

  |         | anchor | positive | negative |
  |:--------|:-------|:---------|:---------|
  | type    | string | string   | string   |
  | details | <ul><li>min: 18 tokens</li><li>mean: 54.07 tokens</li><li>max: 169 tokens</li></ul> | <ul><li>min: 18 tokens</li><li>mean: 60.4 tokens</li><li>max: 244 tokens</li></ul> | <ul><li>min: 30 tokens</li><li>mean: 71.25 tokens</li><li>max: 141 tokens</li></ul> |

* Samples:

  | anchor | positive | negative |
  |:-------|:---------|:---------|
  | <code>CC(C)OC(=O)CCCC=CCC1C(O)CC(O)C1C=CC(O)COc1cccc(C(F)(F)F)c1</code> | <code>CC12CCCCCC(Cc3ccc(O)cc31)C2[NH3+]</code> | <code>CC(C)C(CN1CCC(C)(c2cccc(O)c2)C(C)C1)NC(=O)C1Cc2ccc(O)cc2CN1</code> |
  | <code>CC(C)OC(=O)CCCC=CCC1C(O)CC(O)C1C=CC(O)COc1cccc(C(F)(F)F)c1</code> | <code>COc1cc2c(cc1OC)C1CC(=O)C(CC(C)C)C[NH+]1CC2</code> | <code>CC(C)C(CN1CCC(C)(c2cccc(O)c2)C(C)C1)NC(=O)C1Cc2ccc(O)cc2CN1</code> |
  | <code>CC(C)OC(=O)CCCC=CCC1C(O)CC(O)C1C=CC(O)COc1cccc(C(F)(F)F)c1</code> | <code>C[NH+](C)CCC=C1c2ccccc2COc2ccc(CC(=O)[O-])cc21</code> | <code>CC(C)C1(C(=O)NC2CC(=O)OC2(O)CF)CC(c2nccc3ccccc23)=NO1</code> |

* Loss: [<code>TripletLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#tripletloss) with these parameters:

  ```json
  {
      "distance_metric": "TripletDistanceMetric.EUCLIDEAN",
      "triplet_margin": 5
  }
  ```
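With `TripletDistanceMetric.EUCLIDEAN` and `triplet_margin: 5`, a triplet is penalized unless the anchor–positive distance is at least 5 units smaller than the anchor–negative distance. A minimal NumPy sketch of the loss (illustrative, not the library implementation):

```python
import numpy as np

def triplet_loss(anchor: np.ndarray, positive: np.ndarray, negative: np.ndarray,
                 margin: float = 5.0) -> float:
    """Euclidean triplet loss averaged over a batch of (n, dim) embeddings:
    max(0, d(a, p) - d(a, n) + margin)."""
    d_pos = np.linalg.norm(anchor - positive, axis=-1)
    d_neg = np.linalg.norm(anchor - negative, axis=-1)
    return float(np.maximum(d_pos - d_neg + margin, 0.0).mean())
```

The loss is zero only once the negative is pushed at least `margin` further from the anchor than the positive, which is what drives the cosine-accuracy gains in the training logs below.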
### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 5
- `warmup_ratio`: 0.1
- `fp16`: True
- `batch_sampler`: no_duplicates

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
### Training Logs

| Epoch  | Step  | Training Loss | Validation Loss | all-dev_cosine_accuracy |
|:------:|:-----:|:-------------:|:---------------:|:-----------------------:|
| 0.0676 | 500   | 5.0821        | 5.1737          | 0.4047                  |
| 0.1351 | 1000  | 4.9869        | 5.1766          | 0.4230                  |
| 0.2027 | 1500  | 4.5562        | 4.9102          | 0.5345                  |
| 0.2703 | 2000  | 3.2364        | 4.3712          | 0.6534                  |
| 0.3378 | 2500  | 2.0738        | 4.0704          | 0.6736                  |
| 0.4054 | 3000  | 1.4239        | 4.0200          | 0.6635                  |
| 0.4730 | 3500  | 1.1578        | 3.7202          | 0.6791                  |
| 0.5405 | 4000  | 0.9669        | 3.7197          | 0.6831                  |
| 0.6081 | 4500  | 0.714         | 3.8818          | 0.6547                  |
| 0.6757 | 5000  | 0.5359        | 4.0987          | 0.6243                  |
| 0.7432 | 5500  | 0.5663        | 3.8127          | 0.6500                  |
| 0.8108 | 6000  | 0.4827        | 3.8346          | 0.6676                  |
| 0.8784 | 6500  | 0.4758        | 3.8333          | 0.6507                  |
| 0.9459 | 7000  | 0.4759        | 3.6872          | 0.6912                  |
| 1.0135 | 7500  | 0.4651        | 3.7229          | 0.6831                  |
| 1.0811 | 8000  | 0.4739        | 3.8041          | 0.6662                  |
| 1.1486 | 8500  | 0.4458        | 3.8235          | 0.6703                  |
| 1.2162 | 9000  | 0.4189        | 3.7957          | 0.6716                  |
| 1.2838 | 9500  | 0.4504        | 3.7422          | 0.6784                  |
| 1.3514 | 10000 | 0.413         | 3.7588          | 0.6770                  |
| 1.4189 | 10500 | 0.3808        | 3.9750          | 0.6615                  |
| 1.4865 | 11000 | 0.3853        | 3.7417          | 0.6953                  |
| 1.5541 | 11500 | 0.379         | 3.7319          | 0.6993                  |
| 1.6216 | 12000 | 0.429         | 3.5620          | 0.7209                  |
| 1.6892 | 12500 | 0.3735        | 3.6900          | 0.7020                  |
| 1.7568 | 13000 | 0.3908        | 3.8182          | 0.6932                  |
| 1.8243 | 13500 | 0.3848        | 3.7228          | 0.7101                  |
| 1.8919 | 14000 | 0.3777        | 3.6604          | 0.7149                  |
| 1.9595 | 14500 | 0.3912        | 3.7849          | 0.6946                  |
| 2.0269 | 15000 | 0.3282        | 3.8607          | 0.7014                  |
| 2.0945 | 15500 | 0.3324        | 3.8573          | 0.6953                  |
| 2.1620 | 16000 | 0.3852        | 3.9420          | 0.7000                  |
| 2.2296 | 16500 | 0.3633        | 3.7928          | 0.7189                  |
| 2.2972 | 17000 | 0.3493        | 3.8217          | 0.7216                  |
| 2.3647 | 17500 | 0.3554        | 3.8546          | 0.6993                  |
| 2.4323 | 18000 | 0.3363        | 3.7764          | 0.6993                  |
| 2.4999 | 18500 | 0.377         | 3.8224          | 0.6959                  |
| 2.5674 | 19000 | 0.3569        | 3.8376          | 0.7155                  |
| 2.635  | 19500 | 0.3414        | 4.0017          | 0.7034                  |
| 2.7026 | 20000 | 0.3567        | 3.7405          | 0.7135                  |
| 2.7701 | 20500 | 0.3524        | 3.9446          | 0.7189                  |
| 2.8377 | 21000 | 0.3347        | 3.8140          | 0.7169                  |
| 2.9053 | 21500 | 0.3458        | 4.0700          | 0.7088                  |
| 2.9728 | 22000 | 0.3632        | 3.7930          | 0.7081                  |
| 3.0404 | 22500 | 0.3496        | 3.9884          | 0.7236                  |
| 3.1080 | 23000 | 0.3426        | 3.7102          | 0.7155                  |
| 3.1755 | 23500 | 0.3579        | 3.9201          | 0.7135                  |
| 3.2431 | 24000 | 0.3553        | 4.2237          | 0.7270                  |
| 3.3107 | 24500 | 0.345         | 3.8090          | 0.7189                  |
| 3.3782 | 25000 | 0.3475        | 3.7802          | 0.7284                  |
| 3.4458 | 25500 | 0.3326        | 3.7549          | 0.7250                  |
| 3.5134 | 26000 | 0.3228        | 3.6717          | 0.7216                  |
| 3.5809 | 26500 | 0.3311        | 3.8241          | 0.7155                  |
| 3.6485 | 27000 | 0.3215        | 3.8151          | 0.7142                  |
| 3.7161 | 27500 | 0.3534        | 3.8639          | 0.7149                  |
| 3.7836 | 28000 | 0.3369        | 4.0947          | 0.7101                  |
| 3.8512 | 28500 | 0.3229        | 4.0495          | 0.7101                  |
| 3.9188 | 29000 | 0.3442        | 4.0408          | 0.7169                  |
| 3.9864 | 29500 | 0.3059        | 3.9493          | 0.6959                  |
| 4.0538 | 30000 | 0.3349        | 4.0431          | 0.7108                  |
| 4.1214 | 30500 | 0.3266        | 4.0224          | 0.7189                  |
| 4.1889 | 31000 | 0.3501        | 3.9502          | 0.7169                  |
| 4.2565 | 31500 | 0.3676        | 3.8903          | 0.7196                  |
| 4.3241 | 32000 | 0.3191        | 3.7994          | 0.7162                  |
| 4.3916 | 32500 | 0.3317        | 3.7889          | 0.7182                  |
| 4.4592 | 33000 | 0.3304        | 3.8661          | 0.7108                  |
| 4.5268 | 33500 | 0.3332        | 3.8822          | 0.7115                  |
| 4.5943 | 34000 | 0.3435        | 3.7945          | 0.7088                  |
| 4.6619 | 34500 | 0.317         | 3.8721          | 0.7243                  |
| 4.7295 | 35000 | 0.3038        | 3.8615          | 0.7209                  |
| 4.7970 | 35500 | 0.3093        | 3.8360          | 0.7162                  |
| 4.8646 | 36000 | 0.3309        | 3.8277          | 0.7155                  |
| 4.9322 | 36500 | 0.3378        | 3.7988          | 0.7128                  |
| 4.9997 | 37000 | 0.311         | 3.8015          | 0.7135                  |

### Framework Versions
- Python: 3.11.11
- Sentence Transformers: 3.4.1
- Transformers: 4.47.1
- PyTorch: 2.5.1+cu124
- Accelerate: 1.2.1
- Datasets: 3.2.0
- Tokenizers: 0.21.0

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### TripletLoss
```bibtex
@misc{hermans2017defense,
    title={In Defense of the Triplet Loss for Person Re-Identification},
    author={Alexander Hermans and Lucas Beyer and Bastian Leibe},
    year={2017},
    eprint={1703.07737},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}
```
added_tokens.json ADDED
@@ -0,0 +1,4 @@
{
  "</s>": 592,
  "<s>": 591
}
config.json ADDED
@@ -0,0 +1,29 @@
{
  "_name_or_path": "DeepChem/ChemBERTa-77M-MLM",
  "architectures": [
    "RobertaModel"
  ],
  "attention_probs_dropout_prob": 0.109,
  "bos_token_id": 0,
  "classifier_dropout": null,
  "eos_token_id": 2,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.144,
  "hidden_size": 384,
  "initializer_range": 0.02,
  "intermediate_size": 464,
  "is_gpu": true,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 515,
  "model_type": "roberta",
  "num_attention_heads": 12,
  "num_hidden_layers": 3,
  "pad_token_id": 1,
  "position_embedding_type": "absolute",
  "torch_dtype": "float32",
  "transformers_version": "4.47.1",
  "type_vocab_size": 1,
  "use_cache": true,
  "vocab_size": 600
}
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
{
  "__version__": {
    "sentence_transformers": "3.4.1",
    "transformers": "4.47.1",
    "pytorch": "2.5.1+cu124"
  },
  "prompts": {},
  "default_prompt_name": null,
  "similarity_fn_name": "cosine"
}
merges.txt ADDED
@@ -0,0 +1 @@
#version: 0.2
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:db59b3c1992909fd850d296a0ed49b40978771e008e8f437a74d46bde7cafac2
size 13715688
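The `model.safetensors` entry above is not the weights themselves but a git-lfs pointer file: three `key value` lines giving the pointer spec version, the SHA-256 of the actual blob, and its size in bytes (about 13.7 MB here). A small sketch of parsing such a pointer (`parse_lfs_pointer` is a hypothetical helper, not part of git-lfs):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a git-lfs pointer file into a {key: value} dict.

    Each non-empty line has the form "<key> <value>", e.g.
    "oid sha256:..." or "size 13715688".
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields
```

This is why the diff for the weights file is only three lines long: Git stores the pointer, while the binary blob lives in LFS storage addressed by the `oid` hash.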
modules.json ADDED
@@ -0,0 +1,14 @@
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  }
]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
{
  "max_seq_length": 512,
  "do_lower_case": false
}
special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
{
  "bos_token": {
    "content": "<s>",
    "lstrip": false,
    "normalized": true,
    "rstrip": false,
    "single_word": false
  },
  "cls_token": {
    "content": "[CLS]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "eos_token": {
    "content": "</s>",
    "lstrip": false,
    "normalized": true,
    "rstrip": false,
    "single_word": false
  },
  "mask_token": {
    "content": "[MASK]",
    "lstrip": true,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "[PAD]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "sep_token": {
    "content": "[SEP]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "[UNK]",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
tokenizer.json ADDED
@@ -0,0 +1,712 @@
{
  "version": "1.0",
  "truncation": {
    "direction": "Right",
    "max_length": 512,
    "strategy": "LongestFirst",
    "stride": 0
  },
  "padding": {
    "strategy": "BatchLongest",
    "direction": "Right",
    "pad_to_multiple_of": null,
    "pad_id": 0,
    "pad_type_id": 0,
    "pad_token": "[PAD]"
  },
  "added_tokens": [
    {
      "id": 0,
      "content": "[PAD]",
      "single_word": false,
      "lstrip": false,
      "rstrip": false,
      "normalized": false,
      "special": true
    },
    {
      "id": 11,
      "content": "[UNK]",
      "single_word": false,
      "lstrip": false,
      "rstrip": false,
      "normalized": false,
      "special": true
    },
    {
      "id": 12,
      "content": "[CLS]",
      "single_word": false,
      "lstrip": false,
      "rstrip": false,
      "normalized": false,
      "special": true
    },
    {
      "id": 13,
      "content": "[SEP]",
      "single_word": false,
      "lstrip": false,
      "rstrip": false,
      "normalized": false,
      "special": true
    },
    {
      "id": 14,
      "content": "[MASK]",
      "single_word": false,
      "lstrip": true,
      "rstrip": false,
      "normalized": false,
      "special": true
    },
    {
      "id": 591,
      "content": "<s>",
      "single_word": false,
      "lstrip": false,
      "rstrip": false,
      "normalized": true,
      "special": true
    },
    {
      "id": 592,
      "content": "</s>",
      "single_word": false,
      "lstrip": false,
      "rstrip": false,
      "normalized": true,
      "special": true
    }
  ],
  "normalizer": null,
  "pre_tokenizer": {
    "type": "ByteLevel",
    "add_prefix_space": false,
    "trim_offsets": true,
    "use_regex": true
  },
  "post_processor": {
    "type": "RobertaProcessing",
    "sep": [
      "[SEP]",
      13
    ],
    "cls": [
      "[CLS]",
      12
    ],
    "trim_offsets": true,
    "add_prefix_space": false
  },
  "decoder": {
    "type": "ByteLevel",
    "add_prefix_space": true,
    "trim_offsets": true,
    "use_regex": true
  },
  "model": {
    "type": "BPE",
    "dropout": null,
    "unk_token": null,
    "continuing_subword_prefix": "",
    "end_of_word_suffix": "",
    "fuse_unk": false,
    "byte_fallback": false,
    "ignore_merges": false,
    "vocab": {
      "[PAD]": 0,
      "[unused1]": 1,
      "[unused2]": 2,
      "[unused3]": 3,
      "[unused4]": 4,
      "[unused5]": 5,
      "[unused6]": 6,
      "[unused7]": 7,
      "[unused8]": 8,
      "[unused9]": 9,
      "[unused10]": 10,
      "[UNK]": 11,
      "[CLS]": 12,
      "[SEP]": 13,
      "[MASK]": 14,
      "c": 15,
      "C": 16,
      "(": 17,
      ")": 18,
      "O": 19,
      "1": 20,
      "2": 21,
      "=": 22,
      "N": 23,
      ".": 24,
      "n": 25,
      "3": 26,
      "F": 27,
      "Cl": 28,
      ">>": 29,
      "~": 30,
      "-": 31,
      "4": 32,
      "[C@H]": 33,
      "S": 34,
      "[C@@H]": 35,
      "[O-]": 36,
      "Br": 37,
      "#": 38,
      "/": 39,
      "[nH]": 40,
      "[N+]": 41,
      "s": 42,
      "5": 43,
      "o": 44,
      "P": 45,
      "[Na+]": 46,
      "[Si]": 47,
      "I": 48,
      "[Na]": 49,
      "[Pd]": 50,
      "[K+]": 51,
      "[K]": 52,
      "[P]": 53,
      "B": 54,
      "[C@]": 55,
      "[C@@]": 56,
      "[Cl-]": 57,
      "6": 58,
      "[OH-]": 59,
      "\\": 60,
      "[N-]": 61,
      "[Li]": 62,
      "[H]": 63,
      "[2H]": 64,
      "[NH4+]": 65,
      "[c-]": 66,
      "[P-]": 67,
      "[Cs+]": 68,
      "[Li+]": 69,
      "[Cs]": 70,
      "[NaH]": 71,
      "[H-]": 72,
      "[O+]": 73,
      "[BH4-]": 74,
      "[Cu]": 75,
      "7": 76,
      "[Mg]": 77,
      "[Fe+2]": 78,
      "[n+]": 79,
      "[Sn]": 80,
      "[BH-]": 81,
      "[Pd+2]": 82,
      "[CH]": 83,
      "[I-]": 84,
      "[Br-]": 85,
      "[C-]": 86,
      "[Zn]": 87,
      "[B-]": 88,
      "[F-]": 89,
      "[Al]": 90,
      "[P+]": 91,
      "[BH3-]": 92,
      "[Fe]": 93,
      "[C]": 94,
      "[AlH4]": 95,
      "[Ni]": 96,
      "[SiH]": 97,
      "8": 98,
      "[Cu+2]": 99,
      "[Mn]": 100,
      "[AlH]": 101,
      "[nH+]": 102,
      "[AlH4-]": 103,
      "[O-2]": 104,
      "[Cr]": 105,
      "[Mg+2]": 106,
      "[NH3+]": 107,
      "[S@]": 108,
      "[Pt]": 109,
      "[Al+3]": 110,
      "[S@@]": 111,
      "[S-]": 112,
      "[Ti]": 113,
      "[Zn+2]": 114,
      "[PH]": 115,
      "[NH2+]": 116,
      "[Ru]": 117,
      "[Ag+]": 118,
      "[S+]": 119,
      "[I+3]": 120,
      "[NH+]": 121,
      "[Ca+2]": 122,
      "[Ag]": 123,
      "9": 124,
      "[Os]": 125,
      "[Se]": 126,
      "[SiH2]": 127,
      "[Ca]": 128,
      "[Ti+4]": 129,
      "[Ac]": 130,
      "[Cu+]": 131,
      "[S]": 132,
      "[Rh]": 133,
      "[Cl+3]": 134,
      "[cH-]": 135,
      "[Zn+]": 136,
      "[O]": 137,
      "[Cl+]": 138,
      "[SH]": 139,
      "[H+]": 140,
      "[Pd+]": 141,
      "[se]": 142,
      "[PH+]": 143,
      "[I]": 144,
      "[Pt+2]": 145,
      "[C+]": 146,
      "[Mg+]": 147,
      "[Hg]": 148,
      "[W]": 149,
      "[SnH]": 150,
      "[SiH3]": 151,
      "[Fe+3]": 152,
      "[NH]": 153,
      "[Mo]": 154,
      "[CH2+]": 155,
      "%10": 156,
      "[CH2-]": 157,
      "[CH2]": 158,
      "[n-]": 159,
      "[Ce+4]": 160,
      "[NH-]": 161,
      "[Co]": 162,
      "[I+]": 163,
      "[PH2]": 164,
      "[Pt+4]": 165,
      "[Ce]": 166,
      "[B]": 167,
      "[Sn+2]": 168,
      "[Ba+2]": 169,
      "%11": 170,
      "[Fe-3]": 171,
      "[18F]": 172,
      "[SH-]": 173,
      "[Pb+2]": 174,
      "[Os-2]": 175,
      "[Zr+4]": 176,
      "[N]": 177,
      "[Ir]": 178,
      "[Bi]": 179,
      "[Ni+2]": 180,
      "[P@]": 181,
      "[Co+2]": 182,
      "[s+]": 183,
      "[As]": 184,
      "[P+3]": 185,
      "[Hg+2]": 186,
      "[Yb+3]": 187,
      "[CH-]": 188,
      "[Zr+2]": 189,
      "[Mn+2]": 190,
      "[CH+]": 191,
      "[In]": 192,
      "[KH]": 193,
      "[Ce+3]": 194,
      "[Zr]": 195,
      "[AlH2-]": 196,
      "[OH2+]": 197,
      "[Ti+3]": 198,
      "[Rh+2]": 199,
      "[Sb]": 200,
      "[S-2]": 201,
      "%12": 202,
      "[P@@]": 203,
      "[Si@H]": 204,
      "[Mn+4]": 205,
      "p": 206,
      "[Ba]": 207,
      "[NH2-]": 208,
      "[Ge]": 209,
      "[Pb+4]": 210,
      "[Cr+3]": 211,
      "[Au]": 212,
      "[LiH]": 213,
      "[Sc+3]": 214,
      "[o+]": 215,
      "[Rh-3]": 216,
      "%13": 217,
      "[Br]": 218,
      "[Sb-]": 219,
      "[S@+]": 220,
      "[I+2]": 221,
      "[Ar]": 222,
      "[V]": 223,
      "[Cu-]": 224,
      "[Al-]": 225,
      "[Te]": 226,
      "[13c]": 227,
      "[13C]": 228,
      "[Cl]": 229,
      "[PH4+]": 230,
      "[SiH4]": 231,
      "[te]": 232,
      "[CH3-]": 233,
      "[S@@+]": 234,
      "[Rh+3]": 235,
      "[SH+]": 236,
      "[Bi+3]": 237,
      "[Br+2]": 238,
      "[La]": 239,
      "[La+3]": 240,
      "[Pt-2]": 241,
360
+ "[N@@]": 242,
361
+ "[PH3+]": 243,
362
+ "[N@]": 244,
363
+ "[Si+4]": 245,
364
+ "[Sr+2]": 246,
365
+ "[Al+]": 247,
366
+ "[Pb]": 248,
367
+ "[SeH]": 249,
368
+ "[Si-]": 250,
369
+ "[V+5]": 251,
370
+ "[Y+3]": 252,
371
+ "[Re]": 253,
372
+ "[Ru+]": 254,
373
+ "[Sm]": 255,
374
+ "*": 256,
375
+ "[3H]": 257,
376
+ "[NH2]": 258,
377
+ "[Ag-]": 259,
378
+ "[13CH3]": 260,
379
+ "[OH+]": 261,
380
+ "[Ru+3]": 262,
381
+ "[OH]": 263,
382
+ "[Gd+3]": 264,
383
+ "[13CH2]": 265,
384
+ "[In+3]": 266,
385
+ "[Si@@]": 267,
386
+ "[Si@]": 268,
387
+ "[Ti+2]": 269,
388
+ "[Sn+]": 270,
389
+ "[Cl+2]": 271,
390
+ "[AlH-]": 272,
391
+ "[Pd-2]": 273,
392
+ "[SnH3]": 274,
393
+ "[B+3]": 275,
394
+ "[Cu-2]": 276,
395
+ "[Nd+3]": 277,
396
+ "[Pb+3]": 278,
397
+ "[13cH]": 279,
398
+ "[Fe-4]": 280,
399
+ "[Ga]": 281,
400
+ "[Sn+4]": 282,
401
+ "[Hg+]": 283,
402
+ "[11CH3]": 284,
403
+ "[Hf]": 285,
404
+ "[Pr]": 286,
405
+ "[Y]": 287,
406
+ "[S+2]": 288,
407
+ "[Cd]": 289,
408
+ "[Cr+6]": 290,
409
+ "[Zr+3]": 291,
410
+ "[Rh+]": 292,
411
+ "[CH3]": 293,
412
+ "[N-3]": 294,
413
+ "[Hf+2]": 295,
414
+ "[Th]": 296,
415
+ "[Sb+3]": 297,
416
+ "%14": 298,
417
+ "[Cr+2]": 299,
418
+ "[Ru+2]": 300,
419
+ "[Hf+4]": 301,
420
+ "[14C]": 302,
421
+ "[Ta]": 303,
422
+ "[Tl+]": 304,
423
+ "[B+]": 305,
424
+ "[Os+4]": 306,
425
+ "[PdH2]": 307,
426
+ "[Pd-]": 308,
427
+ "[Cd+2]": 309,
428
+ "[Co+3]": 310,
429
+ "[S+4]": 311,
430
+ "[Nb+5]": 312,
431
+ "[123I]": 313,
432
+ "[c+]": 314,
433
+ "[Rb+]": 315,
434
+ "[V+2]": 316,
435
+ "[CH3+]": 317,
436
+ "[Ag+2]": 318,
437
+ "[cH+]": 319,
438
+ "[Mn+3]": 320,
439
+ "[Se-]": 321,
440
+ "[As-]": 322,
441
+ "[Eu+3]": 323,
442
+ "[SH2]": 324,
443
+ "[Sm+3]": 325,
444
+ "[IH+]": 326,
445
+ "%15": 327,
446
+ "[OH3+]": 328,
447
+ "[PH3]": 329,
448
+ "[IH2+]": 330,
449
+ "[SH2+]": 331,
450
+ "[Ir+3]": 332,
451
+ "[AlH3]": 333,
452
+ "[Sc]": 334,
453
+ "[Yb]": 335,
454
+ "[15NH2]": 336,
455
+ "[Lu]": 337,
456
+ "[sH+]": 338,
457
+ "[Gd]": 339,
458
+ "[18F-]": 340,
459
+ "[SH3+]": 341,
460
+ "[SnH4]": 342,
461
+ "[TeH]": 343,
462
+ "[Si@@H]": 344,
463
+ "[Ga+3]": 345,
464
+ "[CaH2]": 346,
465
+ "[Tl]": 347,
466
+ "[Ta+5]": 348,
467
+ "[GeH]": 349,
468
+ "[Br+]": 350,
469
+ "[Sr]": 351,
470
+ "[Tl+3]": 352,
471
+ "[Sm+2]": 353,
472
+ "[PH5]": 354,
473
+ "%16": 355,
474
+ "[N@@+]": 356,
475
+ "[Au+3]": 357,
476
+ "[C-4]": 358,
477
+ "[Nd]": 359,
478
+ "[Ti+]": 360,
479
+ "[IH]": 361,
480
+ "[N@+]": 362,
481
+ "[125I]": 363,
482
+ "[Eu]": 364,
483
+ "[Sn+3]": 365,
484
+ "[Nb]": 366,
485
+ "[Er+3]": 367,
486
+ "[123I-]": 368,
487
+ "[14c]": 369,
488
+ "%17": 370,
489
+ "[SnH2]": 371,
490
+ "[YH]": 372,
491
+ "[Sb+5]": 373,
492
+ "[Pr+3]": 374,
493
+ "[Ir+]": 375,
494
+ "[N+3]": 376,
495
+ "[AlH2]": 377,
496
+ "[19F]": 378,
497
+ "%18": 379,
498
+ "[Tb]": 380,
499
+ "[14CH]": 381,
500
+ "[Mo+4]": 382,
501
+ "[Si+]": 383,
502
+ "[BH]": 384,
503
+ "[Be]": 385,
504
+ "[Rb]": 386,
505
+ "[pH]": 387,
506
+ "%19": 388,
507
+ "%20": 389,
508
+ "[Xe]": 390,
509
+ "[Ir-]": 391,
510
+ "[Be+2]": 392,
511
+ "[C+4]": 393,
512
+ "[RuH2]": 394,
513
+ "[15NH]": 395,
514
+ "[U+2]": 396,
515
+ "[Au-]": 397,
516
+ "%21": 398,
517
+ "%22": 399,
518
+ "[Au+]": 400,
519
+ "[15n]": 401,
520
+ "[Al+2]": 402,
521
+ "[Tb+3]": 403,
522
+ "[15N]": 404,
523
+ "[V+3]": 405,
524
+ "[W+6]": 406,
525
+ "[14CH3]": 407,
526
+ "[Cr+4]": 408,
527
+ "[ClH+]": 409,
528
+ "b": 410,
529
+ "[Ti+6]": 411,
530
+ "[Nd+]": 412,
531
+ "[Zr+]": 413,
532
+ "[PH2+]": 414,
533
+ "[Fm]": 415,
534
+ "[N@H+]": 416,
535
+ "[RuH]": 417,
536
+ "[Dy+3]": 418,
537
+ "%23": 419,
538
+ "[Hf+3]": 420,
539
+ "[W+4]": 421,
540
+ "[11C]": 422,
541
+ "[13CH]": 423,
542
+ "[Er]": 424,
543
+ "[124I]": 425,
544
+ "[LaH]": 426,
545
+ "[F]": 427,
546
+ "[siH]": 428,
547
+ "[Ga+]": 429,
548
+ "[Cm]": 430,
549
+ "[GeH3]": 431,
550
+ "[IH-]": 432,
551
+ "[U+6]": 433,
552
+ "[SeH+]": 434,
553
+ "[32P]": 435,
554
+ "[SeH-]": 436,
555
+ "[Pt-]": 437,
556
+ "[Ir+2]": 438,
557
+ "[se+]": 439,
558
+ "[U]": 440,
559
+ "[F+]": 441,
560
+ "[BH2]": 442,
561
+ "[As+]": 443,
562
+ "[Cf]": 444,
563
+ "[ClH2+]": 445,
564
+ "[Ni+]": 446,
565
+ "[TeH3]": 447,
566
+ "[SbH2]": 448,
567
+ "[Ag+3]": 449,
568
+ "%24": 450,
569
+ "[18O]": 451,
570
+ "[PH4]": 452,
571
+ "[Os+2]": 453,
572
+ "[Na-]": 454,
573
+ "[Sb+2]": 455,
574
+ "[V+4]": 456,
575
+ "[Ho+3]": 457,
576
+ "[68Ga]": 458,
577
+ "[PH-]": 459,
578
+ "[Bi+2]": 460,
579
+ "[Ce+2]": 461,
580
+ "[Pd+3]": 462,
581
+ "[99Tc]": 463,
582
+ "[13C@@H]": 464,
583
+ "[Fe+6]": 465,
584
+ "[c]": 466,
585
+ "[GeH2]": 467,
586
+ "[10B]": 468,
587
+ "[Cu+3]": 469,
588
+ "[Mo+2]": 470,
589
+ "[Cr+]": 471,
590
+ "[Pd+4]": 472,
591
+ "[Dy]": 473,
592
+ "[AsH]": 474,
593
+ "[Ba+]": 475,
594
+ "[SeH2]": 476,
595
+ "[In+]": 477,
596
+ "[TeH2]": 478,
597
+ "[BrH+]": 479,
598
+ "[14cH]": 480,
599
+ "[W+]": 481,
600
+ "[13C@H]": 482,
601
+ "[AsH2]": 483,
602
+ "[In+2]": 484,
603
+ "[N+2]": 485,
604
+ "[N@@H+]": 486,
605
+ "[SbH]": 487,
606
+ "[60Co]": 488,
607
+ "[AsH4+]": 489,
608
+ "[AsH3]": 490,
609
+ "[18OH]": 491,
610
+ "[Ru-2]": 492,
611
+ "[Na-2]": 493,
612
+ "[CuH2]": 494,
613
+ "[31P]": 495,
614
+ "[Ti+5]": 496,
615
+ "[35S]": 497,
616
+ "[P@@H]": 498,
617
+ "[ArH]": 499,
618
+ "[Co+]": 500,
619
+ "[Zr-2]": 501,
620
+ "[BH2-]": 502,
621
+ "[131I]": 503,
622
+ "[SH5]": 504,
623
+ "[VH]": 505,
624
+ "[B+2]": 506,
625
+ "[Yb+2]": 507,
626
+ "[14C@H]": 508,
627
+ "[211At]": 509,
628
+ "[NH3+2]": 510,
629
+ "[IrH]": 511,
630
+ "[IrH2]": 512,
631
+ "[Rh-]": 513,
632
+ "[Cr-]": 514,
633
+ "[Sb+]": 515,
634
+ "[Ni+3]": 516,
635
+ "[TaH3]": 517,
636
+ "[Tl+2]": 518,
637
+ "[64Cu]": 519,
638
+ "[Tc]": 520,
639
+ "[Cd+]": 521,
640
+ "[1H]": 522,
641
+ "[15nH]": 523,
642
+ "[AlH2+]": 524,
643
+ "[FH+2]": 525,
644
+ "[BiH3]": 526,
645
+ "[Ru-]": 527,
646
+ "[Mo+6]": 528,
647
+ "[AsH+]": 529,
648
+ "[BaH2]": 530,
649
+ "[BaH]": 531,
650
+ "[Fe+4]": 532,
651
+ "[229Th]": 533,
652
+ "[Th+4]": 534,
653
+ "[As+3]": 535,
654
+ "[NH+3]": 536,
655
+ "[P@H]": 537,
656
+ "[Li-]": 538,
657
+ "[7NaH]": 539,
658
+ "[Bi+]": 540,
659
+ "[PtH+2]": 541,
660
+ "[p-]": 542,
661
+ "[Re+5]": 543,
662
+ "[NiH]": 544,
663
+ "[Ni-]": 545,
664
+ "[Xe+]": 546,
665
+ "[Ca+]": 547,
666
+ "[11c]": 548,
667
+ "[Rh+4]": 549,
668
+ "[AcH]": 550,
669
+ "[HeH]": 551,
670
+ "[Sc+2]": 552,
671
+ "[Mn+]": 553,
672
+ "[UH]": 554,
673
+ "[14CH2]": 555,
674
+ "[SiH4+]": 556,
675
+ "[18OH2]": 557,
676
+ "[Ac-]": 558,
677
+ "[Re+4]": 559,
678
+ "[118Sn]": 560,
679
+ "[153Sm]": 561,
680
+ "[P+2]": 562,
681
+ "[9CH]": 563,
682
+ "[9CH3]": 564,
683
+ "[Y-]": 565,
684
+ "[NiH2]": 566,
685
+ "[Si+2]": 567,
686
+ "[Mn+6]": 568,
687
+ "[ZrH2]": 569,
688
+ "[C-2]": 570,
689
+ "[Bi+5]": 571,
690
+ "[24NaH]": 572,
691
+ "[Fr]": 573,
692
+ "[15CH]": 574,
693
+ "[Se+]": 575,
694
+ "[At]": 576,
695
+ "[P-3]": 577,
696
+ "[124I-]": 578,
697
+ "[CuH2-]": 579,
698
+ "[Nb+4]": 580,
699
+ "[Nb+3]": 581,
700
+ "[MgH]": 582,
701
+ "[Ir+4]": 583,
702
+ "[67Ga+3]": 584,
703
+ "[67Ga]": 585,
704
+ "[13N]": 586,
705
+ "[15OH2]": 587,
706
+ "[2NH]": 588,
707
+ "[Ho]": 589,
708
+ "[Cn]": 590
709
+ },
710
+ "merges": []
711
+ }
712
+ }
tokenizer_config.json ADDED
@@ -0,0 +1,76 @@
+ {
+ "add_prefix_space": false,
+ "added_tokens_decoder": {
+ "0": {
+ "content": "[PAD]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "11": {
+ "content": "[UNK]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "12": {
+ "content": "[CLS]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "13": {
+ "content": "[SEP]",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "14": {
+ "content": "[MASK]",
+ "lstrip": true,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "591": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "592": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<s>",
+ "clean_up_tokenization_spaces": false,
+ "cls_token": "[CLS]",
+ "eos_token": "</s>",
+ "errors": "replace",
+ "extra_special_tokens": {},
+ "full_tokenizer_file": null,
+ "mask_token": "[MASK]",
+ "max_len": 512,
+ "model_max_length": 512,
+ "pad_token": "[PAD]",
+ "sep_token": "[SEP]",
+ "tokenizer_class": "RobertaTokenizer",
+ "trim_offsets": true,
+ "unk_token": "[UNK]"
+ }
vocab.json ADDED
@@ -0,0 +1 @@
+ {"[PAD]":0,"[unused1]":1,"[unused2]":2,"[unused3]":3,"[unused4]":4,"[unused5]":5,"[unused6]":6,"[unused7]":7,"[unused8]":8,"[unused9]":9,"[unused10]":10,"[UNK]":11,"[CLS]":12,"[SEP]":13,"[MASK]":14,"c":15,"C":16,"(":17,")":18,"O":19,"1":20,"2":21,"=":22,"N":23,".":24,"n":25,"3":26,"F":27,"Cl":28,">>":29,"~":30,"-":31,"4":32,"[C@H]":33,"S":34,"[C@@H]":35,"[O-]":36,"Br":37,"#":38,"/":39,"[nH]":40,"[N+]":41,"s":42,"5":43,"o":44,"P":45,"[Na+]":46,"[Si]":47,"I":48,"[Na]":49,"[Pd]":50,"[K+]":51,"[K]":52,"[P]":53,"B":54,"[C@]":55,"[C@@]":56,"[Cl-]":57,"6":58,"[OH-]":59,"\\":60,"[N-]":61,"[Li]":62,"[H]":63,"[2H]":64,"[NH4+]":65,"[c-]":66,"[P-]":67,"[Cs+]":68,"[Li+]":69,"[Cs]":70,"[NaH]":71,"[H-]":72,"[O+]":73,"[BH4-]":74,"[Cu]":75,"7":76,"[Mg]":77,"[Fe+2]":78,"[n+]":79,"[Sn]":80,"[BH-]":81,"[Pd+2]":82,"[CH]":83,"[I-]":84,"[Br-]":85,"[C-]":86,"[Zn]":87,"[B-]":88,"[F-]":89,"[Al]":90,"[P+]":91,"[BH3-]":92,"[Fe]":93,"[C]":94,"[AlH4]":95,"[Ni]":96,"[SiH]":97,"8":98,"[Cu+2]":99,"[Mn]":100,"[AlH]":101,"[nH+]":102,"[AlH4-]":103,"[O-2]":104,"[Cr]":105,"[Mg+2]":106,"[NH3+]":107,"[S@]":108,"[Pt]":109,"[Al+3]":110,"[S@@]":111,"[S-]":112,"[Ti]":113,"[Zn+2]":114,"[PH]":115,"[NH2+]":116,"[Ru]":117,"[Ag+]":118,"[S+]":119,"[I+3]":120,"[NH+]":121,"[Ca+2]":122,"[Ag]":123,"9":124,"[Os]":125,"[Se]":126,"[SiH2]":127,"[Ca]":128,"[Ti+4]":129,"[Ac]":130,"[Cu+]":131,"[S]":132,"[Rh]":133,"[Cl+3]":134,"[cH-]":135,"[Zn+]":136,"[O]":137,"[Cl+]":138,"[SH]":139,"[H+]":140,"[Pd+]":141,"[se]":142,"[PH+]":143,"[I]":144,"[Pt+2]":145,"[C+]":146,"[Mg+]":147,"[Hg]":148,"[W]":149,"[SnH]":150,"[SiH3]":151,"[Fe+3]":152,"[NH]":153,"[Mo]":154,"[CH2+]":155,"%10":156,"[CH2-]":157,"[CH2]":158,"[n-]":159,"[Ce+4]":160,"[NH-]":161,"[Co]":162,"[I+]":163,"[PH2]":164,"[Pt+4]":165,"[Ce]":166,"[B]":167,"[Sn+2]":168,"[Ba+2]":169,"%11":170,"[Fe-3]":171,"[18F]":172,"[SH-]":173,"[Pb+2]":174,"[Os-2]":175,"[Zr+4]":176,"[N]":177,"[Ir]":178,"[Bi]":179,"[Ni+2]":180,"[P@]":181,"[Co+2]":182,"[s+]":183,"[As]":184,"[P+3]":185,"[Hg+2]":186,"[Yb+3]":187,"[CH-]":188,"[Zr+2]":189,"[Mn+2]":190,"[CH+]":191,"[In]":192,"[KH]":193,"[Ce+3]":194,"[Zr]":195,"[AlH2-]":196,"[OH2+]":197,"[Ti+3]":198,"[Rh+2]":199,"[Sb]":200,"[S-2]":201,"%12":202,"[P@@]":203,"[Si@H]":204,"[Mn+4]":205,"p":206,"[Ba]":207,"[NH2-]":208,"[Ge]":209,"[Pb+4]":210,"[Cr+3]":211,"[Au]":212,"[LiH]":213,"[Sc+3]":214,"[o+]":215,"[Rh-3]":216,"%13":217,"[Br]":218,"[Sb-]":219,"[S@+]":220,"[I+2]":221,"[Ar]":222,"[V]":223,"[Cu-]":224,"[Al-]":225,"[Te]":226,"[13c]":227,"[13C]":228,"[Cl]":229,"[PH4+]":230,"[SiH4]":231,"[te]":232,"[CH3-]":233,"[S@@+]":234,"[Rh+3]":235,"[SH+]":236,"[Bi+3]":237,"[Br+2]":238,"[La]":239,"[La+3]":240,"[Pt-2]":241,"[N@@]":242,"[PH3+]":243,"[N@]":244,"[Si+4]":245,"[Sr+2]":246,"[Al+]":247,"[Pb]":248,"[SeH]":249,"[Si-]":250,"[V+5]":251,"[Y+3]":252,"[Re]":253,"[Ru+]":254,"[Sm]":255,"*":256,"[3H]":257,"[NH2]":258,"[Ag-]":259,"[13CH3]":260,"[OH+]":261,"[Ru+3]":262,"[OH]":263,"[Gd+3]":264,"[13CH2]":265,"[In+3]":266,"[Si@@]":267,"[Si@]":268,"[Ti+2]":269,"[Sn+]":270,"[Cl+2]":271,"[AlH-]":272,"[Pd-2]":273,"[SnH3]":274,"[B+3]":275,"[Cu-2]":276,"[Nd+3]":277,"[Pb+3]":278,"[13cH]":279,"[Fe-4]":280,"[Ga]":281,"[Sn+4]":282,"[Hg+]":283,"[11CH3]":284,"[Hf]":285,"[Pr]":286,"[Y]":287,"[S+2]":288,"[Cd]":289,"[Cr+6]":290,"[Zr+3]":291,"[Rh+]":292,"[CH3]":293,"[N-3]":294,"[Hf+2]":295,"[Th]":296,"[Sb+3]":297,"%14":298,"[Cr+2]":299,"[Ru+2]":300,"[Hf+4]":301,"[14C]":302,"[Ta]":303,"[Tl+]":304,"[B+]":305,"[Os+4]":306,"[PdH2]":307,"[Pd-]":308,"[Cd+2]":309,"[Co+3]":310,"[S+4]":311,"[Nb+5]":312,"[123I]":313,"[c+]":314,"[Rb+]":315,"[V+2]":316,"[CH3+]":317,"[Ag+2]":318,"[cH+]":319,"[Mn+3]":320,"[Se-]":321,"[As-]":322,"[Eu+3]":323,"[SH2]":324,"[Sm+3]":325,"[IH+]":326,"%15":327,"[OH3+]":328,"[PH3]":329,"[IH2+]":330,"[SH2+]":331,"[Ir+3]":332,"[AlH3]":333,"[Sc]":334,"[Yb]":335,"[15NH2]":336,"[Lu]":337,"[sH+]":338,"[Gd]":339,"[18F-]":340,"[SH3+]":341,"[SnH4]":342,"[TeH]":343,"[Si@@H]":344,"[Ga+3]":345,"[CaH2]":346,"[Tl]":347,"[Ta+5]":348,"[GeH]":349,"[Br+]":350,"[Sr]":351,"[Tl+3]":352,"[Sm+2]":353,"[PH5]":354,"%16":355,"[N@@+]":356,"[Au+3]":357,"[C-4]":358,"[Nd]":359,"[Ti+]":360,"[IH]":361,"[N@+]":362,"[125I]":363,"[Eu]":364,"[Sn+3]":365,"[Nb]":366,"[Er+3]":367,"[123I-]":368,"[14c]":369,"%17":370,"[SnH2]":371,"[YH]":372,"[Sb+5]":373,"[Pr+3]":374,"[Ir+]":375,"[N+3]":376,"[AlH2]":377,"[19F]":378,"%18":379,"[Tb]":380,"[14CH]":381,"[Mo+4]":382,"[Si+]":383,"[BH]":384,"[Be]":385,"[Rb]":386,"[pH]":387,"%19":388,"%20":389,"[Xe]":390,"[Ir-]":391,"[Be+2]":392,"[C+4]":393,"[RuH2]":394,"[15NH]":395,"[U+2]":396,"[Au-]":397,"%21":398,"%22":399,"[Au+]":400,"[15n]":401,"[Al+2]":402,"[Tb+3]":403,"[15N]":404,"[V+3]":405,"[W+6]":406,"[14CH3]":407,"[Cr+4]":408,"[ClH+]":409,"b":410,"[Ti+6]":411,"[Nd+]":412,"[Zr+]":413,"[PH2+]":414,"[Fm]":415,"[N@H+]":416,"[RuH]":417,"[Dy+3]":418,"%23":419,"[Hf+3]":420,"[W+4]":421,"[11C]":422,"[13CH]":423,"[Er]":424,"[124I]":425,"[LaH]":426,"[F]":427,"[siH]":428,"[Ga+]":429,"[Cm]":430,"[GeH3]":431,"[IH-]":432,"[U+6]":433,"[SeH+]":434,"[32P]":435,"[SeH-]":436,"[Pt-]":437,"[Ir+2]":438,"[se+]":439,"[U]":440,"[F+]":441,"[BH2]":442,"[As+]":443,"[Cf]":444,"[ClH2+]":445,"[Ni+]":446,"[TeH3]":447,"[SbH2]":448,"[Ag+3]":449,"%24":450,"[18O]":451,"[PH4]":452,"[Os+2]":453,"[Na-]":454,"[Sb+2]":455,"[V+4]":456,"[Ho+3]":457,"[68Ga]":458,"[PH-]":459,"[Bi+2]":460,"[Ce+2]":461,"[Pd+3]":462,"[99Tc]":463,"[13C@@H]":464,"[Fe+6]":465,"[c]":466,"[GeH2]":467,"[10B]":468,"[Cu+3]":469,"[Mo+2]":470,"[Cr+]":471,"[Pd+4]":472,"[Dy]":473,"[AsH]":474,"[Ba+]":475,"[SeH2]":476,"[In+]":477,"[TeH2]":478,"[BrH+]":479,"[14cH]":480,"[W+]":481,"[13C@H]":482,"[AsH2]":483,"[In+2]":484,"[N+2]":485,"[N@@H+]":486,"[SbH]":487,"[60Co]":488,"[AsH4+]":489,"[AsH3]":490,"[18OH]":491,"[Ru-2]":492,"[Na-2]":493,"[CuH2]":494,"[31P]":495,"[Ti+5]":496,"[35S]":497,"[P@@H]":498,"[ArH]":499,"[Co+]":500,"[Zr-2]":501,"[BH2-]":502,"[131I]":503,"[SH5]":504,"[VH]":505,"[B+2]":506,"[Yb+2]":507,"[14C@H]":508,"[211At]":509,"[NH3+2]":510,"[IrH]":511,"[IrH2]":512,"[Rh-]":513,"[Cr-]":514,"[Sb+]":515,"[Ni+3]":516,"[TaH3]":517,"[Tl+2]":518,"[64Cu]":519,"[Tc]":520,"[Cd+]":521,"[1H]":522,"[15nH]":523,"[AlH2+]":524,"[FH+2]":525,"[BiH3]":526,"[Ru-]":527,"[Mo+6]":528,"[AsH+]":529,"[BaH2]":530,"[BaH]":531,"[Fe+4]":532,"[229Th]":533,"[Th+4]":534,"[As+3]":535,"[NH+3]":536,"[P@H]":537,"[Li-]":538,"[7NaH]":539,"[Bi+]":540,"[PtH+2]":541,"[p-]":542,"[Re+5]":543,"[NiH]":544,"[Ni-]":545,"[Xe+]":546,"[Ca+]":547,"[11c]":548,"[Rh+4]":549,"[AcH]":550,"[HeH]":551,"[Sc+2]":552,"[Mn+]":553,"[UH]":554,"[14CH2]":555,"[SiH4+]":556,"[18OH2]":557,"[Ac-]":558,"[Re+4]":559,"[118Sn]":560,"[153Sm]":561,"[P+2]":562,"[9CH]":563,"[9CH3]":564,"[Y-]":565,"[NiH2]":566,"[Si+2]":567,"[Mn+6]":568,"[ZrH2]":569,"[C-2]":570,"[Bi+5]":571,"[24NaH]":572,"[Fr]":573,"[15CH]":574,"[Se+]":575,"[At]":576,"[P-3]":577,"[124I-]":578,"[CuH2-]":579,"[Nb+4]":580,"[Nb+3]":581,"[MgH]":582,"[Ir+4]":583,"[67Ga+3]":584,"[67Ga]":585,"[13N]":586,"[15OH2]":587,"[2NH]":588,"[Ho]":589,"[Cn]":590}
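The vocabulary above is word-level: every SMILES token (atoms, aromatic atoms, bracket atoms with charge or isotope, bonds, ring-closure digits) maps directly to one id, and `"merges": []` in `tokenizer.json` means no BPE merges are applied. As a rough illustration of how such a vocabulary is used, the sketch below splits a SMILES string with a widely used atom-level regex and looks ids up in a small fragment of this vocab; the repo's actual `tokenizer_class` is `RobertaTokenizer`, so treat this as an approximation of the pipeline, not the exact implementation.

```python
import re

# Fragment of the repo's vocab.json (only the entries needed here).
vocab = {"[PAD]": 0, "[UNK]": 11, "[CLS]": 12, "[SEP]": 13,
         "c": 15, "C": 16, "(": 17, ")": 18, "O": 19, "1": 20, "2": 21,
         "=": 22, "N": 23, "[NH2+]": 116}

# Widely used atom-level SMILES tokenization regex: bracket atoms first,
# then two-letter halogens, then single-character tokens.
SMILES_RE = re.compile(
    r"(\[[^\]]+\]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p"
    r"|\(|\)|\.|=|#|-|\+|\\|/|:|~|@|\?|>>?|\*|\$|%\d{2}|\d)"
)

def encode(smiles: str) -> list[int]:
    """Tokenize a SMILES string and map tokens to ids, wrapping with
    [CLS]/[SEP] as the special-token config declares."""
    tokens = SMILES_RE.findall(smiles)
    ids = [vocab.get(tok, vocab["[UNK]"]) for tok in tokens]
    return [vocab["[CLS]"]] + ids + [vocab["[SEP]"]]

print(encode("C[NH2+]C"))  # [12, 16, 116, 16, 13]
```

Bracket atoms such as `[NH2+]` must be matched before single characters, otherwise `N`, `H`, `2`, and `+` would be split into separate (and partly unknown) tokens; the regex alternation order above encodes that priority.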