---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:800
- loss:MultipleNegativesRankingLoss
base_model: microsoft/mpnet-base
widget:
- source_sentence: What is the department of medicine located at?
  sentences:
  - 'Publisher’s Note: MDPI stays neutral

    with regard to jurisdictional claims in

    published maps and institutional afil-


    iations.


    onon)


    Copyright: © 2021 by the author.

    Licensee MDPI, Basel, Switzerland.

    This article is an open access article

    distributed under the terms and

    conditions of the Creative Commons

    Attribution (CC BY) license (https://

    creativecommons.org/licenses/by/

    4.0/).


    Joan and Sanford I. Weill Department of Medicine, Weill Cornell Medical College,
    525 East 68th Street,

    Room M-522, Box 130, New York, NY 10065, USA; [email protected] or [email protected]'
  - 'Results At the parameters used, the ultrasound did not directly affect bCSC proliferation,
    with no evident changes in

    morphology. In contrast, the ultrasound protocol affected the migration and invasion
    ability of bCSCs, limiting their

    capacity to advance while a major affection was detected on their angiogenic properties.
    LIPUS-treated bCSCs were

    unable to transform into aggressive metastatic cancer cells, by decreasing their
    migration and invasion capacity as

    well as vessel formation. Finally, RNA-seq analysis revealed major changes in
    gene expression, with 676 differentially'
  - 'Tesfaye, M. & Savoldo, B. Adoptive cell therapy in

    treating pediatric solid tumors. Curr. Oncol. Rep. 20,

    73 (2018).


    Marofi, F. et al. CAR T cells in solid tumors: challenges

    and opportunities. Stem Cell Res. Ther. 12, 81 (2021).

    Deng, Q. et al. Characteristics of anti-CD19 CAR T cell

    infusion products associated with efficacy and toxicity


    in patients with large B cell lymphomas. Nat. Med. 26,


    1878-1887 (2020).

    Boulch, M. A cross-talk between CAR T cell subsets

    and the tumor microenvironment is essential for

    sustained cytotoxic activity. Sci. Immunol. 6,

    eabd4344 (2021).'
- source_sentence: What is the result of LIPUS treatment on the formation of new vessels
    and tubes?
  sentences:
  - 'apparatus), and mitochondrial damage, which then leads to eventual cell death
    [112,114].

    Accordingly, alterations that affect the lysosomal-mitochondria relationship and
    their

    metabolic equilibrium generate a defective metabolism, which contributes to disease
    pro-

    gression [115]. Consequently, the identification of regulatory molecular links
    between these

    two organelles will most probably cause the rise of novel targets for the treatment
    of NPC.

    Therefore, we propose that members of the miRNA-17-92 cluster could be relevant
    actors'
  - 'A tube formation assay was conducted on Matrigel to

    study the impact of LIPUS stimulation on bCSCs’ angio-

    genic activity (Fig. 5). After 2 h, both control and LIPUS-

    stimulated cells exhibited signs of angiogenesis (Fig. 5A

    and B). This observation was further confirmed by count-

    ing the number of panel-like structures and vessels in

    both conditions, which were slightly higher in control

    cells (Fig. 5C). Statistical analysis using Student’s t-test

    revealed that LIPUS treatment significantly reduced the

    formation of new vessels and tubes (y=0.0039). These'
  - 'Although a number of preclinical studies, like the ones

    previously described, have shown considerable promise re-

    garding the use of ADSC-therapy, more studies are needed.

    Future studies can continue to work toward determining if

    hADSCs are capable of being used for cell replacement and

    better elucidate the mechanisms by which hADSCs work.


    IV. ADIPOSE TISSUE AS A SOURCE FOR STEM

    CELLS'
- source_sentence: What percentage of cases had malignant lesions?
  sentences:
  - 'Vedolizumab Monoclonal antibody anti «487 integrins, blocks gut homing of T lymphocytes


    “These drugs are used as second line treatments for SR aGvHD, as reviewed by Penack
    et al. (11).

    ’Ruxolitinib has been recently approved by FDA as second line therapy for SR aGVHD.


    TABLE 3 | Major drugs used as second line treatment of cGvHD and their mechanisms.


    Drug* Major mechanisms identified


    Cyclosporin A, tacrolimus Calcineurin inhibitors that block downstrem TCR signalling
    leading to NFAT regulated genes transcription; block T cells

    activation'
  - '--- Page 4 ---

    J. Clin. Med. 2024, 13, 7559


    4 of 13


    lesions were found in 59 cases (70.24%) and malignant lesions in 25 cases (29.76%).
    In DC

    IV, benign lesions were found in 57 cases (81.4%) and malignant lesions in 13
    cases (18.6%).

    There were no statistically significant associations between gender (p = 0.76),
    BMI (p = 0.52),

    and obesity (p = 0.76) and the presence of thyroid malignancy.


    Table 1. Demographic and pathologic features of 521 patients who underwent surgery
    due to


    thyroid nodules.'
  - 'MSCs showed that these exosomes induce angiogenesis in

    endothelial cells via the activation of the NF«B pathway (141).

    However, in another study exosomes derived from hypoxia-

    preconditioned MSCs contributed to the attenuation of the

    injury resulting from an ischemia/reperfusion episode via the

    Wnt signaling pathway (142). Beyond that, hypoxia seems to

    increase exosome secretion in general (141). Also, in a fat

    graft model, co-transplantation of exosomes from hypoxia pre-

    conditioned adipose-derived MSC improved vascularization and

    graft survival (143) (see Table 5).'
- source_sentence: When is routine fine-needle aspiration biopsy (FS) recommended
    during thyroidectomy?
  sentences:
  - 'ing queries about its routine use due to the improved preoperative diagnosis.
    Nowadays, while the use of FS during thyroidectomy

    has decreased, it is still used as an additional method for different purposes
    intraoperatively. FS may not always provide definitive

    results. If FS will alter the surgical plan or extent, it should be applied. Routine
    FS is not recommended for evaluating thyroid nod-

    ules. But in addition to FNAB, if FS results may change the operation plan or
    extent, they can be utilized. FS should not be applied'
  - 'Approximately 15% of FNABs take part in this category.

    After their initial Bethesda | FNAB, the malignancy risk in

    nodules surgically excised, ranges between 5-20%. Repeat

    FNAB is recommended if the initial FNAB result is Bethes-

    da |, and in 60-80% of cases, the repeat FNAB results in a

    diagnostic category.''''?*°! If the second FNAB also yields a

    nondiagnostic result, surgical resection is recommended.

    21] Especially in cases with Bethesda | FNAB and with a sur-

    gical indication, an intraoperative FS can be utilized.® It

    has been reported that FS significantly contributes to the'
  - 'Preconditioning with a myriad of other soluble factors, such

    as growth factors or hormones, seems to also potentiate MSCs

    regenerative capacity, mainly by stimulating angiogenesis and

    inhibiting fibrosis. For example, intracardiac transplantation

    of SDF-1-preconditioned MSCs increased angiogenesis and

    reduced fibrosis in the ischemic area of a post-infarct heart (89).

    The effects observed were attributed to the activation of the Akt

    signaling pathway, similarly to what was described for hypoxia-

    preconditioned MSCs. TGF-a-preconditioned MSCs enhanced'
- source_sentence: What is the number of genes obtained from comparing control and
    LIPUS-stimulated samples?
  sentences:
  - 'Differentially expressed genes (DEGs) were obtained

    between control and LIPUS-stimulated samples using

    an adjusted P<0.05 and|log2FC| > 1 as cutoffs to define

    statistically significant differential expression. 676 genes

    were obtained from which 578 were upregulated when

    stimulated with LIPUS and 98 genes were subregulated

    (Supp. Figure 1). To further understand the functions

    and pathways associated with the differentially expressed

    genes (DEG), Gene Ontology (GO) and Kyoto Encyclo-

    pedia of Genes and Genomes (KEGG) analyses were con-

    ducted using the DAVID database [37, 38].'
  - 'Another advantage of ADSCs is their immune privilege

    status due to a lack of major histocompatibility complex

    II (MHC Il) and costimulatory molecules.42,43,45,.47 Some

    studies have even demonstrated a higher immunosuppres-

    sion capacity in ADSCs compared to BMSCs as ADSCs ex-

    pressed lower levels of human antigen class I (HLA I) anti-

    gen.47 They also have a unique secretome and can produce

    immunomodulatory, anti-apoptotic, hematopoietic, and

    angiogenic factors that can help with repair of tissues -

    characteristics that may support successful transplanta-'
  - 'independent studies have shown a raising trend in both cancer incidence [2] and
    a high-salt

    dietary lifestyle [7], there is no direct correlation between dietary salt intake
    and breast

    cancer. Interestingly, in the human body, certain organs such as the skin and
    lymph nodes

    have a natural tendency to accumulate salt [8]. Although unknown, the pathophysiological

    significance of this selective accumulation of sodium in certain organs and solid
    tumors is

    an area of intense research.'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy
model-index:
- name: SentenceTransformer based on microsoft/mpnet-base
  results:
  - task:
      type: triplet
      name: Triplet
    dataset:
      name: initial test
      type: initial_test
    metrics:
    - type: cosine_accuracy
      value: 0.9800000190734863
      name: Cosine Accuracy
  - task:
      type: triplet
      name: Triplet
    dataset:
      name: final test
      type: final_test
    metrics:
    - type: cosine_accuracy
      value: 0.9800000190734863
      name: Cosine Accuracy
---

# SentenceTransformer based on microsoft/mpnet-base

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) <!-- at revision 6996ce1e91bd2a9c7d7f61daec37463394f73f09 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - json
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'MPNetModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
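
The finetuned checkpoint loads this stack automatically, but the same two-module architecture can also be assembled by hand from the base model; a minimal sketch using the standard Sentence Transformers `models` API (the mean-pooling settings simply mirror the printout above):

```python
from sentence_transformers import SentenceTransformer, models

# Transformer module: microsoft/mpnet-base truncated at 512 tokens, as above
word_embedding_model = models.Transformer("microsoft/mpnet-base", max_seq_length=512)

# Pooling module: mean pooling over the 768-dimensional token embeddings
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),
    pooling_mode="mean",
)

model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
print(model)  # should mirror the architecture shown above (before finetuning)
```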

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sahithkumar7/final-mpnet-base-fullfinetuned-epoch3")
# Run inference
sentences = [
    'What is the number of genes obtained from comparing control and LIPUS-stimulated samples?',
    'Differentially expressed genes (DEGs) were obtained\nbetween control and LIPUS-stimulated samples using\nan adjusted P<0.05 and|log2FC| > 1 as cutoffs to define\nstatistically significant differential expression. 676 genes\nwere obtained from which 578 were upregulated when\nstimulated with LIPUS and 98 genes were subregulated\n(Supp. Figure 1). To further understand the functions\nand pathways associated with the differentially expressed\ngenes (DEG), Gene Ontology (GO) and Kyoto Encyclo-\npedia of Genes and Genomes (KEGG) analyses were con-\nducted using the DAVID database [37, 38].',
    'independent studies have shown a raising trend in both cancer incidence [2] and a high-salt\ndietary lifestyle [7], there is no direct correlation between dietary salt intake and breast\ncancer. Interestingly, in the human body, certain organs such as the skin and lymph nodes\nhave a natural tendency to accumulate salt [8]. Although unknown, the pathophysiological\nsignificance of this selective accumulation of sodium in certain organs and solid tumors is\nan area of intense research.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[ 1.0000,  0.6291, -0.0130],
#         [ 0.6291,  1.0000, -0.0026],
#         [-0.0130, -0.0026,  1.0000]])
```

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Triplet

* Datasets: `initial_test` and `final_test`
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)

| Metric              | initial_test | final_test |
|:--------------------|:-------------|:-----------|
| **cosine_accuracy** | **0.98**     | **0.98**   |
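
Both scores come from `TripletEvaluator`, which counts how often each anchor embedding is closer to its positive than to its negative. A hedged sketch of running the same kind of evaluation on your own triplets (the texts below are illustrative placeholders, not the actual `initial_test`/`final_test` data):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer("sahithkumar7/final-mpnet-base-fullfinetuned-epoch3")

# Placeholder triplets; substitute your own held-out anchor/positive/negative columns
evaluator = TripletEvaluator(
    anchors=["What percentage of cases had malignant lesions?"],
    positives=["Malignant lesions were found in 25 cases (29.76%)."],
    negatives=["Exosomes derived from hypoxia-preconditioned MSCs attenuated ischemia/reperfusion injury."],
    name="my_test",
)
print(evaluator(model))  # e.g. {'my_test_cosine_accuracy': ...}
```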

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### json

* Dataset: json
* Size: 800 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 800 samples:
  |         | anchor                                                                            | positive                                                                             | negative                                                                             |
  |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                               | string                                                                               |
  | details | <ul><li>min: 7 tokens</li><li>mean: 16.79 tokens</li><li>max: 39 tokens</li></ul> | <ul><li>min: 39 tokens</li><li>mean: 117.74 tokens</li><li>max: 265 tokens</li></ul> | <ul><li>min: 40 tokens</li><li>mean: 116.14 tokens</li><li>max: 356 tokens</li></ul> |
* Samples:
  | anchor                                                                               | positive                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                          | negative                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             |
  |:-------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>What is the limitation of FBG-based sensors in tactile feedback?</code>        | <code>Furthermore, FBG-based 3-axis tactile sensors have been<br>proposed for a more comprehensive haptic perception tool<br>in surgeries (Figure 1D) (16). Five optical fibers merged<br>with FBG sensors are suspended in a deformable medium<br>and measure the compression or tension of the tissue as the<br>sensors are pressed against it, returning a _ surface<br>reaction map. While FBG-based sensors are small, flexible, and<br>sensitive, there are several challenges that need to be<br>addressed for optimal performance for tactile feedback. These<br>sensors are temperature sensitive, requiring temperature</code>                                                                                                          | <code>141]. Therefore, it is not known to what extent spared<br>axons are remyelinated by transplanted Schwann cells,<br>nor is the contribution of this myelin to functional im-<br>provements proven. Transplantation of Schwann cells<br>incapable of producing myelin, such as cells derived<br>from trembler (Pmp22Tr) mutant mice, may be useful<br>in establishing a causal relationship between myelin re-<br>generation and functional improvements. Several MSC<br>transplantations demonstrate an increase of myelin re-<br>tention and the number of myelinated axons in the le-<br>sion site during a chronic post-injury period [57]. Thus,</code>                                     |
  | <code>What are the advantages of strain elastography?</code>                         | <code>frontiersin.org<br><br>--- Page 8 ---<br>Kumar et al.<br><br>TABLE 2 Modalities of ultrasound elastography.<br><br>Modality<br>Strain elastography<br><br>Excitation<br>Applied manual compression (38)<br><br>Advantages<br><br>No additional specialized equipment<br>required (40)<br><br>10.3389/fmedt.2023.1238129<br><br>Limitations<br><br>Qualitative measurements (39)<br><br>Internal physiological mechanism (42)<br><br>Simple low-cost design (40)<br><br>Applied compression is operator-dependent (51)<br><br>More commonly used (52)<br><br>High inter-observer variability (51)<br><br>coustic radiation force impulse Acoustic radiation force (43)<br><br>(ARFI) imaging<br><br>Image beyond slip boundaries (45)</code> | <code>Publisher’s Note: MDPI stays neutral<br>with regard to jurisdictional claims in<br>published maps and institutional afil-<br><br>iations.<br><br>onon)<br><br>Copyright: © 2021 by the author.<br>Licensee MDPI, Basel, Switzerland.<br>This article is an open access article<br>distributed under the terms and<br>conditions of the Creative Commons<br>Attribution (CC BY) license (https://<br>creativecommons.org/licenses/by/<br>4.0/).<br><br>Joan and Sanford I. Weill Department of Medicine, Weill Cornell Medical College, 525 East 68th Street,<br>Room M-522, Box 130, New York, NY 10065, USA; [email protected] or [email protected]</code>                     |
  | <code>What is the material used for the substrate in a piezoelectric element?</code> | <code>gain for biomedical applications.<br><br>frontiersin.org<br><br>--- Page 9 ---<br>Kumar et al.<br><br>><br><br>[PMUT ]<br><br>Electrode: Voltage Electrode2<br><br>© piezoelectric elements<br>o<br><br>—: OSi02<br><br>©) silicon substrate<br><br>B [ CMUT ]<br>AC DC<br><br>membrane<br><br>—————<br><br>vacuum<br>insulator<br><br>substrate<br><br>= ground<br><br>FIGURE 3</code>                                                                                                                                                                                                                                                                                                                                                     | <code>Histopatholo<br>Cytology Total, n (%) Benign, n (%) P ey Cancer, n (%)<br>FA 2 (15.4%) FTC 2 (25%)<br>0 GD (7.7%) PTC 6 (75%)<br>I 21 (4.0%) NG 9 (69.2%)<br>Other diagnosis (7.7%)<br>FA 15 (9.9%) FIC 4 (14.3%)<br>FT-UMP (0.7%) MTC 3 (10.7%)<br>GD (0.7%) PTC 21 (75%)<br>Il 180 (34.5%) OA (0.7%)<br>LT (0.7%)<br>NG 130 (85.5%)<br>NIFTP 2 (1.3%)<br>FA 14 (23.7%) FIC 7 (28.0%)<br>FI-UMP 2 (3.4%) OTC 1 (4.0%)<br>OA (1.7%) PTC 17 (68.0%)<br>Il 84 (16.1%) LT 3 (5.1%)<br>NG 35 (59.3%)<br>NIFTP 2 (3.4%)<br>WDT-UMP 2 (3.4%)<br>FA 15 (26.3%) OTC 1 (7.7%)<br>FT-UMP 5 (8.8%) PTC 12 (92.3%)<br>OA 13 (22.8%)<br>IV 70 (13.4%) LT 2 (3.5%)<br>NG 18 (31.6%)<br>NIFTP 2 (3.5%)</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
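
These parameters are also the library defaults for this loss; a short sketch of instantiating it for training, assuming a freshly loaded `microsoft/mpnet-base` as the starting point:

```python
from sentence_transformers import SentenceTransformer, losses
from sentence_transformers.util import cos_sim

model = SentenceTransformer("microsoft/mpnet-base")  # assumed starting checkpoint for finetuning

# In-batch negatives ranking loss with the parameters listed above;
# with (anchor, positive, negative) columns, the explicit negative and all
# other in-batch positives act as negatives for each anchor
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=cos_sim)
```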

### Evaluation Dataset

#### json

* Dataset: json
* Size: 200 evaluation samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 200 samples:
  |         | anchor                                                                            | positive                                                                            | negative                                                                             |
  |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                              | string                                                                               |
  | details | <ul><li>min: 7 tokens</li><li>mean: 17.14 tokens</li><li>max: 35 tokens</li></ul> | <ul><li>min: 40 tokens</li><li>mean: 121.3 tokens</li><li>max: 356 tokens</li></ul> | <ul><li>min: 45 tokens</li><li>mean: 119.75 tokens</li><li>max: 356 tokens</li></ul> |
* Samples:
  | anchor                                                                     | positive                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             | negative                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                 |
  |:---------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>What can differentiate into a very wide variety of tissues?</code>   | <code>lead to decreased rates of graft-versus-host disease. They<br>also can differentiate into a very wide variety of tissues. For<br>example, when compared with bone marrow stem cells or<br>mobilized peripheral blood, umbilical cord blood stem cells<br>have a greater repopulating ability.5° Cord blood derived<br>CD34+ cells have very potent hematopoietic abilities, and<br>this is attributed to the immaturity of the stem cells rela-<br>tive to adult derived cells. Studies have been done that an-<br>alyze long term survival of children with hematologic dis-<br>orders who were transplanted with umbilical cord blood</code> | <code>metabolic regulation may affect the function of more than one organelle. Therefore, if the<br>miR-17-92 regulatory cluster can perturb genes related to mitochondrial metabolic function,<br>it could be also related, in some way, to genes involved in lysosomal metabolic function.<br>Lysosomes are intracellular organelles that, in form of small vesicles, participate in<br>several cellular functions, mainly digestion, but also vesicle trafficking, autophagy, nutrient<br>sensing, cellular growth, signaling [85], and even enzyme secretion. The membrane-bound</code>                              |
  | <code>What are the two most common types of pluripotent stem cells?</code> | <code>III]. AMNIOTIC CELLS AS A SOURCE FOR STEM<br>CELLS<br><br>Historically, the two most common types of pluripotent<br>stem cells include embryonic stem cells (ESCs) and induced<br>pluripotent stem cells (iPSCs).35 However, despite the many<br>research efforts to improve ESC and iPSC technologies,<br>there are still enormous clinical challenges.°> Two signif-<br>icant issues posed by ESC and iPSC technologies include<br>low survival rate of transplanted cells and tumorigenicity.°><br>Recently, researchers have isolated pluripotent stem cells</code>                                                                        | <code>Explanation: criterion 6 indicates a positive diagnosis only within the DC VI group<br>relative to all other categories. Criterion 5 indicates a positive diagnosis within the DCs VI<br>and V relative to all other categories.<br><br>The highest positive predictive value (PPV) confirming malignancy through histopatho-<br>logical examination for criterion 6 was 0.93, and for criterion 5, it was 0.92. For the subsequent<br>criteria, the PPVs were as follows: criterion 4—0.66; criterion 3—0.55; criterion 2—0.40.</code>                                                                            |
  | <code>What percentage of stem cells are present in bone marrow?</code>     | <code>ing 30% in some tissues.43-45 This is a significant difference<br>from the .0001-.0002% stem cells present in bone marrow.43<br>Given this difference in stem cell concentration between<br>the sources, there will be more ADSCs per sample of WAT</code>                                                                                                                                                                                                                                                                                                                                                                                     | <code>migration of bCSCs. This finding raises the possibil-<br>ity that LIPUS may decrease the ability of these cells to<br>invade adjacent tissues and start the process of metasta-<br>ses. These results also suggested that some of the changes<br>induced by LIPUS take longer to be detected in this type<br>of 2D migration model, possible due to changes in gene<br>expression pattern. To further study this hypothesis, we<br>performed a Transwell invasion assay. The data revealed<br>a reduced number of cells crossing the membrane after<br>LIPUS stimulation, indicating that therapeutic LIPUS</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `warmup_ratio`: 0.1
- `fp16`: True
- `batch_sampler`: no_duplicates
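
A sketch of how these non-defaults map onto `SentenceTransformerTrainingArguments` (the `output_dir` is an assumed placeholder; everything not listed above stays at its default):

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="final-mpnet-base-fullfinetuned-epoch3",  # assumed output path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    warmup_ratio=0.1,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no repeated texts within a batch (helps MNRL)
)
```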

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
- `router_mapping`: {}
- `learning_rate_mapping`: {}

</details>

### Training Logs
<details><summary>Click to expand</summary>

| Epoch  | Step | Training Loss | Validation Loss | initial_test_cosine_accuracy | final_test_cosine_accuracy |
|:------:|:----:|:-------------:|:---------------:|:----------------------------:|:--------------------------:|
| -1     | -1   | -             | -               | 0.7800                       | -                          |
| 0.02   | 1    | 3.124         | -               | -                            | -                          |
| 0.04   | 2    | 3.2227        | -               | -                            | -                          |
| 0.06   | 3    | 3.1108        | -               | -                            | -                          |
| 0.08   | 4    | 3.1317        | -               | -                            | -                          |
| 0.1    | 5    | 3.3326        | -               | -                            | -                          |
| 0.12   | 6    | 2.9797        | -               | -                            | -                          |
| 0.14   | 7    | 3.0933        | -               | -                            | -                          |
| 0.16   | 8    | 2.7409        | -               | -                            | -                          |
| 0.18   | 9    | 2.7381        | -               | -                            | -                          |
| 0.2    | 10   | 2.6301        | -               | -                            | -                          |
| 0.22   | 11   | 2.005         | -               | -                            | -                          |
| 0.24   | 12   | 2.1863        | -               | -                            | -                          |
| 0.26   | 13   | 2.8065        | -               | -                            | -                          |
| 0.28   | 14   | 1.6524        | -               | -                            | -                          |
| 0.3    | 15   | 1.7121        | -               | -                            | -                          |
| 0.32   | 16   | 1.9863        | -               | -                            | -                          |
| 0.34   | 17   | 1.4783        | -               | -                            | -                          |
| 0.36   | 18   | 1.0542        | -               | -                            | -                          |
| 0.38   | 19   | 1.1223        | -               | -                            | -                          |
| 0.4    | 20   | 1.0425        | 0.9097          | 0.9000                       | -                          |
| 0.42   | 21   | 1.2517        | -               | -                            | -                          |
| 0.44   | 22   | 1.048         | -               | -                            | -                          |
| 0.46   | 23   | 1.0064        | -               | -                            | -                          |
| 0.48   | 24   | 0.9887        | -               | -                            | -                          |
| 0.5    | 25   | 0.6468        | -               | -                            | -                          |
| 0.52   | 26   | 0.8978        | -               | -                            | -                          |
| 0.54   | 27   | 0.439         | -               | -                            | -                          |
| 0.56   | 28   | 0.8051        | -               | -                            | -                          |
| 0.58   | 29   | 0.7684        | -               | -                            | -                          |
| 0.6    | 30   | 0.573         | -               | -                            | -                          |
| 0.62   | 31   | 0.6101        | -               | -                            | -                          |
| 0.64   | 32   | 0.9438        | -               | -                            | -                          |
| 0.66   | 33   | 0.8656        | -               | -                            | -                          |
| 0.68   | 34   | 0.5758        | -               | -                            | -                          |
| 0.7    | 35   | 0.2412        | -               | -                            | -                          |
| 0.72   | 36   | 0.4738        | -               | -                            | -                          |
| 0.74   | 37   | 0.7844        | -               | -                            | -                          |
| 0.76   | 38   | 0.7517        | -               | -                            | -                          |
| 0.78   | 39   | 0.3222        | -               | -                            | -                          |
| 0.8    | 40   | 0.466         | 0.6199          | 0.9600                       | -                          |
| 0.82   | 41   | 0.5259        | -               | -                            | -                          |
| 0.84   | 42   | 0.3936        | -               | -                            | -                          |
| 0.86   | 43   | 0.23          | -               | -                            | -                          |
| 0.88   | 44   | 0.4184        | -               | -                            | -                          |
| 0.9    | 45   | 0.7641        | -               | -                            | -                          |
| 0.92   | 46   | 0.2579        | -               | -                            | -                          |
| 0.94   | 47   | 1.2493        | -               | -                            | -                          |
| 0.96   | 48   | 0.4205        | -               | -                            | -                          |
| 0.98   | 49   | 0.4778        | -               | -                            | -                          |
| 1.0    | 50   | 0.545         | -               | -                            | -                          |
| 1.02   | 51   | 0.2018        | -               | -                            | -                          |
| 1.04   | 52   | 0.2048        | -               | -                            | -                          |
| 1.06   | 53   | 0.2031        | -               | -                            | -                          |
| 1.08   | 54   | 0.5784        | -               | -                            | -                          |
| 1.1    | 55   | 0.2764        | -               | -                            | -                          |
| 1.12   | 56   | 0.5112        | -               | -                            | -                          |
| 1.1400 | 57   | 0.2482        | -               | -                            | -                          |
| 1.16   | 58   | 0.3772        | -               | -                            | -                          |
| 1.18   | 59   | 0.1247        | -               | -                            | -                          |
| 1.2    | 60   | 0.1832        | 0.5882          | 1.0                          | -                          |
| 1.22   | 61   | 0.1802        | -               | -                            | -                          |
| 1.24   | 62   | 0.3174        | -               | -                            | -                          |
| 1.26   | 63   | 0.0836        | -               | -                            | -                          |
| 1.28   | 64   | 0.2814        | -               | -                            | -                          |
| 1.3    | 65   | 0.0926        | -               | -                            | -                          |
| 1.32   | 66   | 0.3834        | -               | -                            | -                          |
| 1.34   | 67   | 0.2547        | -               | -                            | -                          |
| 1.3600 | 68   | 0.3229        | -               | -                            | -                          |
| 1.38   | 69   | 0.0441        | -               | -                            | -                          |
| 1.4    | 70   | 0.1735        | -               | -                            | -                          |
| 1.42   | 71   | 0.0494        | -               | -                            | -                          |
| 1.44   | 72   | 0.2197        | -               | -                            | -                          |
| 1.46   | 73   | 0.2218        | -               | -                            | -                          |
| 1.48   | 74   | 0.2196        | -               | -                            | -                          |
| 1.5    | 75   | 0.2516        | -               | -                            | -                          |
| 1.52   | 76   | 0.6337        | -               | -                            | -                          |
| 1.54   | 77   | 0.1729        | -               | -                            | -                          |
| 1.56   | 78   | 0.5629        | -               | -                            | -                          |
| 1.58   | 79   | 0.4224        | -               | -                            | -                          |
| 1.6    | 80   | 0.1977        | 0.4683          | 1.0                          | -                          |
| 1.62   | 81   | 0.2117        | -               | -                            | -                          |
| 1.6400 | 82   | 0.2423        | -               | -                            | -                          |
| 1.6600 | 83   | 0.2047        | -               | -                            | -                          |
| 1.6800 | 84   | 0.1741        | -               | -                            | -                          |
| 1.7    | 85   | 0.4539        | -               | -                            | -                          |
| 1.72   | 86   | 0.5744        | -               | -                            | -                          |
| 1.74   | 87   | 0.2697        | -               | -                            | -                          |
| 1.76   | 88   | 0.1926        | -               | -                            | -                          |
| 1.78   | 89   | 0.1882        | -               | -                            | -                          |
| 1.8    | 90   | 0.1527        | -               | -                            | -                          |
| 1.8200 | 91   | 0.2154        | -               | -                            | -                          |
| 1.8400 | 92   | 0.5145        | -               | -                            | -                          |
| 1.8600 | 93   | 0.1294        | -               | -                            | -                          |
| 1.88   | 94   | 0.1499        | -               | -                            | -                          |
| 1.9    | 95   | 0.2143        | -               | -                            | -                          |
| 1.92   | 96   | 0.2039        | -               | -                            | -                          |
| 1.94   | 97   | 0.1662        | -               | -                            | -                          |
| 1.96   | 98   | 0.1414        | -               | -                            | -                          |
| 1.98   | 99   | 0.0743        | -               | -                            | -                          |
| 2.0    | 100  | 0.1603        | 0.4067          | 0.9800                       | -                          |
| 2.02   | 101  | 0.1885        | -               | -                            | -                          |
| 2.04   | 102  | 0.1539        | -               | -                            | -                          |
| 2.06   | 103  | 0.0592        | -               | -                            | -                          |
| 2.08   | 104  | 0.0874        | -               | -                            | -                          |
| 2.1    | 105  | 0.0873        | -               | -                            | -                          |
| 2.12   | 106  | 0.057         | -               | -                            | -                          |
| 2.14   | 107  | 0.0317        | -               | -                            | -                          |
| 2.16   | 108  | 0.0807        | -               | -                            | -                          |
| 2.18   | 109  | 0.0232        | -               | -                            | -                          |
| 2.2    | 110  | 0.0847        | -               | -                            | -                          |
| 2.22   | 111  | 0.0811        | -               | -                            | -                          |
| 2.24   | 112  | 0.0688        | -               | -                            | -                          |
| 2.26   | 113  | 0.1392        | -               | -                            | -                          |
| 2.2800 | 114  | 0.0681        | -               | -                            | -                          |
| 2.3    | 115  | 0.0329        | -               | -                            | -                          |
| 2.32   | 116  | 0.0177        | -               | -                            | -                          |
| 2.34   | 117  | 0.0794        | -               | -                            | -                          |
| 2.36   | 118  | 0.1128        | -               | -                            | -                          |
| 2.38   | 119  | 0.095         | -               | -                            | -                          |
| 2.4    | 120  | 0.0384        | 0.4131          | 0.9800                       | -                          |
| 2.42   | 121  | 0.0791        | -               | -                            | -                          |
| 2.44   | 122  | 0.078         | -               | -                            | -                          |
| 2.46   | 123  | 0.0232        | -               | -                            | -                          |
| 2.48   | 124  | 0.0265        | -               | -                            | -                          |
| 2.5    | 125  | 0.023         | -               | -                            | -                          |
| 2.52   | 126  | 0.1105        | -               | -                            | -                          |
| 2.54   | 127  | 0.0114        | -               | -                            | -                          |
| 2.56   | 128  | 0.1051        | -               | -                            | -                          |
| 2.58   | 129  | 0.0178        | -               | -                            | -                          |
| 2.6    | 130  | 0.0731        | -               | -                            | -                          |
| 2.62   | 131  | 0.051         | -               | -                            | -                          |
| 2.64   | 132  | 0.0589        | -               | -                            | -                          |
| 2.66   | 133  | 0.1714        | -               | -                            | -                          |
| 2.68   | 134  | 0.0452        | -               | -                            | -                          |
| 2.7    | 135  | 0.0491        | -               | -                            | -                          |
| 2.7200 | 136  | 0.0652        | -               | -                            | -                          |
| 2.74   | 137  | 0.0534        | -               | -                            | -                          |
| 2.76   | 138  | 0.0414        | -               | -                            | -                          |
| 2.7800 | 139  | 0.0611        | -               | -                            | -                          |
| 2.8    | 140  | 0.1983        | 0.4193          | 0.9800                       | -                          |
| 2.82   | 141  | 0.0489        | -               | -                            | -                          |
| 2.84   | 142  | 0.0215        | -               | -                            | -                          |
| 2.86   | 143  | 0.0491        | -               | -                            | -                          |
| 2.88   | 144  | 0.0521        | -               | -                            | -                          |
| 2.9    | 145  | 0.1212        | -               | -                            | -                          |
| 2.92   | 146  | 0.0464        | -               | -                            | -                          |
| 2.94   | 147  | 0.0145        | -               | -                            | -                          |
| 2.96   | 148  | 0.0281        | -               | -                            | -                          |
| 2.98   | 149  | 0.1358        | -               | -                            | -                          |
| 3.0    | 150  | 0.0479        | -               | -                            | -                          |
| -1     | -1   | -             | -               | -                            | 0.9800                     |

</details>

### Framework Versions
- Python: 3.11.13
- Sentence Transformers: 5.0.0
- Transformers: 4.52.4
- PyTorch: 2.6.0+cu124
- Accelerate: 1.8.1
- Datasets: 3.6.0
- Tokenizers: 0.21.2

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->