jo-mengr committed
Commit e79964a · verified · 1 parent: a78e2d1

Add new SentenceTransformer model
0_MMContextEncoder/config.json ADDED
@@ -0,0 +1,16 @@
+ {
+ "text_encoder_name": "NeuML/pubmedbert-base-embeddings",
+ "adapter_hidden_dim": null,
+ "adapter_output_dim": null,
+ "freeze_text_encoder": true,
+ "unfreeze_last_n_layers": 0,
+ "registered_data_origin": "unregistered",
+ "registered_input_dim": null,
+ "output_token_embeddings": false,
+ "train_lookup": false,
+ "pooling_mode": "mean",
+ "joint_adapter_hidden_dim": null,
+ "_joint_adapter_was_trained": false,
+ "max_seq_length": 512,
+ "text_model_kwargs": {}
+ }
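The fields above define how the `MMContextEncoder` text tower is built: a frozen `NeuML/pubmedbert-base-embeddings` backbone, mean pooling, a 512-token limit, and no adapter layers. A minimal sketch for inspecting such a config after cloning the repository; the local path is an assumption for illustration only.

```python
import json
from pathlib import Path

# Hypothetical local checkout of the repository; adjust the path as needed.
config_path = Path("mmcontext-pubmedbert-100k_cs128/0_MMContextEncoder/config.json")

with config_path.open() as f:
    cfg = json.load(f)

# Key settings recorded in this commit: frozen PubMedBERT text encoder,
# mean pooling, 512-token sequences, and no adapter layers configured.
print(cfg["text_encoder_name"])    # NeuML/pubmedbert-base-embeddings
print(cfg["freeze_text_encoder"])  # True
print(cfg["pooling_mode"])         # mean
print(cfg["max_seq_length"])       # 512
```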
0_MMContextEncoder/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:83eaa68d2d173cd3b5f7b3175469ddd49d435a9575d9eeab166001b3a5b7cf6e
+ size 437953880
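This is a Git LFS pointer file: the actual weights (437,953,880 bytes) live in LFS storage and are identified by their SHA-256 digest. A small sketch, assuming the real `model.safetensors` has already been pulled locally, for checking that a download matches the recorded digest and size:

```python
import hashlib
from pathlib import Path

# Assumed local path to the downloaded weights (not the pointer file itself).
weights = Path("mmcontext-pubmedbert-100k_cs128/0_MMContextEncoder/model.safetensors")

expected_sha256 = "83eaa68d2d173cd3b5f7b3175469ddd49d435a9575d9eeab166001b3a5b7cf6e"
expected_size = 437953880

# Reads the whole file into memory for simplicity; chunked hashing would be
# preferable for very large files.
digest = hashlib.sha256(weights.read_bytes()).hexdigest()
assert weights.stat().st_size == expected_size, "size mismatch"
assert digest == expected_sha256, "checksum mismatch"
print("model.safetensors matches the LFS pointer")
```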
README.md ADDED
@@ -0,0 +1,588 @@
1
+ ---
2
+ language:
3
+ - code
4
+ tags:
5
+ - sentence-transformers
6
+ - sentence-similarity
7
+ - feature-extraction
8
+ - dense
9
+ - generated_from_trainer
10
+ - dataset_size:81143
11
+ - loss:MultipleNegativesRankingLoss
12
+ base_model: NeuML/pubmedbert-base-embeddings
13
+ widget:
14
+ - source_sentence: EEF1A1 FTL CD74 MALAT1 TPT1 ACTB TMSB10 WARS1 HSPA8 LCP1 EIF1 PTMA
15
+ HSP90AB1 RBM3 FAU TYMP VIM GSTP1 CALR RACK1 TMSB4X HSP90AA1 HSPA5 SYNGR2 STAT1
16
+ FTH1 IRF1 PPDPF BTF3 LAPTM5 HSP90B1 GDI2 WDR1 CORO1A ATP5F1E TMBIM6 HINT1 NACA
17
+ HERPUD1 MYL6 GADD45B PGK1 DDX5 GAPDH MOB1A ACTR3 CSDE1 EIF4B PABPC1 RBX1 ATP5F1B
18
+ ARPC3 PRDX1 NCOA3 PRDX5 RAN ACTR2 SNRPG SNRNP200 ALDOA ATP5F1A YWHAZ PPP1CA TALDO1
19
+ EWSR1 HMGB1 PSMA4 EIF2AK2 RHOA CDC42 ENO1 SRI MEF2C EIF3L ACIN1 MYL12A OAZ1 NOP53
20
+ TAX1BP1 CHCHD2 PDAP1 PFN1 PRKAR1A EIF4G2 PTGES3 TNFAIP3 RBM25 TUBA1B UXT COX7C
21
+ ZFP36 BST2 RBM39 COX4I1 CCT7 DBNL SEC11A SERBP1 TPM3 HNRNPU DBI SAFB GHITM YWHAB
22
+ PDIA3 TRANK1 UBB PA2G4 RESF1 MORF4L1 LITAF HLA-DRB5 SELENOH TMA7 LAP3 CFLAR ARF5
23
+ CD38 RBM6 VRK2 CYBA YBX1 PKM CLEC2D RAB7A XRCC5 CLDND1 BAX
24
+ sentences:
25
+ - This measurement was conducted with 10x 5' v1. Naive B cell from blood of a 26-year
26
+ old male, activated with CD3.
27
+ - EEF1A1 MALAT1 TMSB4X NACA TPT1 PABPC1 FAU PTMA FTL FTH1 NPM1 HSPA5 LDHB COX4I1
28
+ LCP1 SH3BGRL3 EEF2 EIF1 RACK1 GSTP1 SMCHD1 ELOB DDX5 GAPDH GTF3A BTF3 HNRNPU TAGLN2
29
+ RNF149 SSR2 YWHAB HNRNPF AKAP13 CALR OST4 MYCBP2 IL32 VIM TMSB10 GABARAPL2 THRAP3
30
+ ARID4B EIF4B TRAM1 HSP90AA1 ERP29 FXYD5 EZR RAB18 EIF3L MYH9 EIF3E PDCD5 RABAC1
31
+ FKBP8 CHCHD2 DOCK8 HDLBP SRSF7 TMED5 MYL12B TRIR NCOA3 EIF2S2 MMP24OS COX7C ATF4
32
+ LDLR ATP5MC2 USP15 GCC2 ISCU NUMA1 CD53 TPM3 CALM2 ATG3 PNRC1 EIF3H UBC PDE3B
33
+ TMEM123 HNRNPDL EPB41 SNRPD1 ATP5MG VAMP5 HINT1 LIMS1 CFL1 SMARCC1 TUFM GIMAP7
34
+ SSR4 MORF4L1 IKZF1 PPIA CDC42SE1 HMGN2 FIS1 UBA52 NEAT1 EID1 RBM6 NDUFAB1 RALA
35
+ CRY1 CSDE1 ZNF207 FYN SLC30A9 CD74 RABEP1 ARID4A STAU2 YTHDC2 POLR2B CELF2 SYNE2
36
+ NOP58 PUM2 ATP11B SLC2A3 VMP1 YBX1 RHOA KIF2A ERLEC1
37
+ - This measurement was conducted with 10x 5' v1. A 26-year-old male individual's
38
+ blood sample, containing naive thymus-derived CD4-positive, alpha-beta T cells,
39
+ with no activation or treatment, and in G1 phase.
40
+ - source_sentence: MALAT1 FTH1 EEF1A1 TPT1 FTL EIF1 ACTB UBC S100A6 NACA NFKBIA DST
41
+ TMSB10 FAU RACK1 HSP90AA1 SAT1 H3-3B TXN CEBPD MGP CD63 NPM1 PTMA NEAT1 VIM MYL6
42
+ HSP90AB1 JUND EIF4A1 S100A11 TMSB4X GAPDH CALD1 CCN1 ANXA5 TM4SF1 NME2 MYL12A
43
+ ACTA2 MYL12B TPM1 H1-10 GSTP1 CD59 DUSP1 ANXA1 BTF3 ITGB1 EEF2 JUNB CEBPB ZFAS1
44
+ ANXA2 TPM2 UBA52 EEF1G MGST1 YBX1 KLF6 PPP1R15A PFN1 GADD45A DSTN DNAJB1 TMBIM6
45
+ SERF2 SRP14 FBXO32 SLC3A2 UBB JUN PPIA S100A10 SELENOM CD9 YBX3 GADD45B CAV1 CHCHD2
46
+ HSPB1 PRDX1 CCNI ZFP36 LGALS3 GSTO1 CSRP1 TPM4 HINT1 RNASEK TYMP CD44 MYLK LAPTM4A
47
+ PABPC1 ACTN1 ENO1 SLC25A3 GNAS HNRNPC TSC22D1 OAZ1 PDCD5 RHEB EDF1 DDX5 PTGES3
48
+ SRSF3 COX7A2 SKP1 NCL EGR1 HNRNPA2B1 TUBA1B YWHAH RBM39 COX4I1 RAN ITM2B RAC1
49
+ HSPD1 HNRNPDL EIF4A2 DDX21 PPIB GPX4 SPINT2 SRRM2
50
+ sentences:
51
+ - This measurement was conducted with 10x 3' v3. Mural cells from the breast tissue
52
+ of a 23-year-old female, obtained through reduction mammoplasty, in supernatant
53
+ form.
54
+ - MALAT1 FTH1 FTL EEF1A1 H3-3B TPT1 EIF1 VIM TMSB4X S100A6 TIMP1 UBC PTMA NEAT1
55
+ CD63 FAU TXN NACA MYL6 ANXA5 ACTB EIF4A1 IGFBP7 HSP90AA1 CHCHD2 DSTN SAT1 TMSB10
56
+ YBX3 CD59 SERF2 PNRC1 CEBPB RACK1 CEBPD UBA52 LGALS1 JUND ARID5B TPM4 SELENOM
57
+ YBX1 LAPTM4A HSP90AB1 GAPDH HSPE1 DUSP1 SRP14 BTF3 S100A11 NPM1 H1-10 PDK4 CD44
58
+ DDX5 CALD1 SOX4 AKAP12 ANXA1 CTSL JUNB ZFAS1 JUN HMGB1 CALM1 RNASEK EEF1G KLF6
59
+ SLC25A3 PPP1R15A NFKBIA MYL12A OAZ1 SEC61B PFN1 PTGES3 EIF1B RTN4 PRRX1 PRDX1
60
+ SBDS ZFP36 APOE MAP1B RAN SEC61G POMP COL4A2 CCN1 GSTO1 HNRNPDL HNRNPK DDX21 EEF2
61
+ NDUFS5 HINT1 UBB CYCS PPA1 ZFP36L1 COL4A1 MEG3 NME2 H2AJ SNHG8 SLC25A5 PABPC1
62
+ RAB21 GNAS HNRNPH3 NAMPT EDF1 ACTA2 UBE2D3 EIF4G2 CD81 CLEC2B SRSF3 LAMA4 SUB1
63
+ SKP1 ATP6V0E1 SELENOK IGFBP5 GADD45A CNN3 MYL12B UBE2B
64
+ - This measurement was conducted with 10x 3' v3. Myoepithelial cells from the breast
65
+ of a 46-year-old female who underwent prophylactic mastectomy.
66
+ - source_sentence: MALAT1 EEF1A1 TPT1 PTMA ACTB TMSB4X H3-3B FTL FTH1 TMSB10 LGALS1
67
+ VIM CYBA FAU EIF1 NACA RACK1 UBA52 HSP90AA1 CD63 SH3BGRL3 LMO4 HMGB1 S100A4 UBC
68
+ HNRNPU HSP90AB1 DDX5 DUSP1 HNRNPA2B1 SOX4 JUND DBI S100A6 GSTP1 MYL6 PFN1 GAPDH
69
+ SRGN SERF2 TAGLN2 IER2 UBB CFL1 JUN YBX1 PABPC1 OAZ1 ARPC3 CCNI DAD1 BTG1 ATP5MC2
70
+ BTF3 ZFP36L2 TSC22D3 EEF2 FOS IFITM2 PPIA KLF6 GNAS DYNLL1 MYL12A EIF3E NOP53
71
+ ID2 VAMP8 ATP5F1E SAT1 COX4I1 ITM2B SRP14 EMP3 LAPTM5 ARPC2 ZEB2 JUNB H1-10 S100A10
72
+ XIST TYROBP SERPINB1 RHOA CDC42 ENO1 SLC25A3 FUS HNRNPC CIRBP SRSF5 PNN CST3 CDK6
73
+ CHCHD2 PDLIM1 EIF4G2 ATP5F1B SRSF3 SKP1 MACROH2A1 ATP6V0E1 NCL SFPQ PRDX1 MYL12B
74
+ SET TUBA1B COX7C KLF2 RBM39 PNISR ANXA1 TRA2B TXN ATP6V1G1 ACTR2 SEC11A TPM3 PNRC1
75
+ EIF3H MBNL1 UQCRB CCNL1 HNRNPK ATP5MG GPX4 TUBA1A
76
+ sentences:
77
+ - MALAT1 TPT1 HSP90B1 SSR4 SUB1 EEF1A1 SAT1 XBP1 SPCS1 ITM2C PPIB SEC61B TMBIM6
78
+ SEC61G CYBA FAU UBC NACA SELENOS TMSB10 SEC11C UBE2J1 CALR TXNIP HSPA5 ACTB SELENOK
79
+ SPCS2 RRBP1 UBA52 H3-3B SERF2 FTH1 EIF1 SEC62 NUCB2 SSR2 VIM ERLEC1 MYL6 SRGN
80
+ ATP5F1E PTMA NEAT1 TRAM1 GNAS KLF6 LMAN1 MYDGF TMEM59 IFI6 ARPC2 H1-10 CD74 HERPUD1
81
+ HSP90AA1 OAZ1 GAPDH SSR3 CCNI SPCS3 COX4I1 ITM2B TXN HINT1 CFL1 KDELR1 EDF1 ZNF706
82
+ ISCU P4HB OSTC RACK1 OST4 YBX3 FNDC3B FTL SEL1L3 HSP90AB1 ATP6V0B COX7C KLF2 SARAF
83
+ SRP14 RHOB ATP5MG EEF2 NDUFS5 SRP72 IDH2 TMED2 DNAJC3 CORO1A CHCHD2 MAP3K8 ARPC3
84
+ SERP1 SPATS2 COX6B1 DAD1 ATP6V1G1 UBXN4 SRRM2 PPP1R2 TOP1 GSTP1 LGALS1 ERH EIF3A
85
+ COX7A2 DSTN EIF2S2 SDF2L1 RBM39 POMP TCF25 EMP3 TMEM123 ATP5MJ ARF4 ATP5ME UBB
86
+ JUNB NPM1 PTP4A2 HMGB1 DDX3X NDUFAB1
87
+ - This measurement was conducted with 10x 3' v3. Blasts cells derived from the blood
88
+ of a 4-month old male.
89
+ - This measurement was conducted with 10x 3' v3. This is a megakaryocyte-erythroid
90
+ progenitor cell (MEP-like) derived from a 1-month-old female patient with KMT2A-rearranged
91
+ (KMT2A-r) infant acute lymphoblastic leukemia (ALL). The cell exhibits increased
92
+ lineage plasticity, downregulated steroid response pathways, and belongs to a
93
+ hematopoietic stem and progenitor-like (HSPC-like) population that forms an immunosuppressive
94
+ signaling circuit with cytotoxic lymphocytes.
95
+ - source_sentence: MALAT1 NRXN3 ERBB4 CADM2 ROBO2 NRXN1 NPAS3 ADARB2 CCK GALNTL6 PCDH9
96
+ ROBO1 KCNQ5 GRID2 TENM2 HS6ST3 LSAMP ATP1B1 OXR1 IL1RAPL1 TCF4 FRMD4A SYNPR MEG3
97
+ OLFM3 DPP6 SNTG1 CNTNAP5 INPP4B CNR1 EDIL3 ZEB2 KCNMB2 ALCAM CNTN5 OPCML CXCL14
98
+ DAB1 CALM1 GNAS CHRM3 CDH8 ZNF385D GRIP1 GRM7 ASTN2 ANK3 GRIN2A PDE4B HDAC9 MEF2C
99
+ CCNH RTN1 GABRB2 GPM6A RTN4 MAGI1 PPP2R2B MYT1L MIR99AHG ARID1B PLEKHA5 NCOA1
100
+ TNRC6A GAPDH PDE4D SYNE1 MAP1B NFIB AKAP6 LINGO2 CNTN1 MAP2 MAPK10 QKI RTN3 LARGE1
101
+ SEMA6D LRRC4C NCAM1 PTPRD TNIK GRIA1 KCNMA1 CACNA1D RGS12 JMJD1C ARPP21 PLD5 MGAT4C
102
+ GABRG3 ATP6V0C DNM3 DMD TENM3 LIMCH1 RORA HSP90AA1 HSP90AB1 PHACTR1 AHI1 APP CNTN4
103
+ NTRK2 NR3C2 CLSTN2 CLASP2 KCNC2 FRMD5 VWC2L KCND2 LINC00632 MEG8 DLX6-AS1 LINC03051
104
+ GABRA1 CDH10 CCDC85A RUNX1T1 TNRC6B PCSK1N FBXW7 KHDRBS2 CDH9 CCNI SEPTIN7 ATXN1
105
+ CALY
106
+ sentences:
107
+ - This measurement was conducted with 10x 3' v3. Nucleus sample from a 50-year old
108
+ male neuron, specifically an MGE interneuron, located in the inferior temporal
109
+ gyrus region of the cerebral cortex.
110
+ - This measurement was conducted with 10x 3' v3. Nucleus suspension of neurons from
111
+ the inferior temporal gyrus region of the cerebral cortex, taken from a 29-year-old
112
+ male of European descent.
113
+ - MALAT1 DPP10 PTPRD NRXN3 PCDH9 ROBO2 PDE4D CADM2 CLSTN2 NKAIN2 NRXN1 PHACTR1 RALYL
114
+ MEG3 KIAA1217 OPCML CELF2 GRID2 SLC8A1 KCNMA1 FRMD4A RYR2 PDE1A CHRM3 KHDRBS3
115
+ KCNQ3 PCDH7 DMD NRG1 ZFPM2 ANK3 KALRN CDH12 PPP3CA CDH18 GPM6A PLXDC2 PDZRN4 PDE4B
116
+ HS6ST3 ARPP21 MCTP1 SGCZ TMEM132D GABRG3 ASTN2 PCLO MIR99AHG PAK3 SLC35F3 PID1
117
+ DPYD CNTN1 PTK2 RAPGEF4 MMP16 TMEFF2 OXR1 PTPRT MEG8 GRIN2B SYN2 DOCK4 AFF3 TENM2
118
+ NTRK3 IL1RAPL1 PRKG1 MAP2 NCALD POU6F2 SLC1A2 LINC01122 CELF4 GRIA1 SLC24A2 ATRNL1
119
+ SH3GL2 PEX5L MARCHF1 CACNA1D DSCAML1 DTNA AEBP2 SAMD12 CACNA1E RIMS1 PTPRG IMMP2L
120
+ DCLK1 LARGE1 CADPS RUNX1T1 PLEKHA5 SLC4A10 NFIA LINC01250 MLIP GABRB1 KCNH7 RTN1
121
+ SORBS2 HS3ST4 MYT1L PRKAG2 KHDRBS2 NCAM1 AK5 SLC35F1 MIR137HG CACNA2D3 SLIT3 LSAMP
122
+ FOCAD NRCAM PRKCE TTC3 MAPK10 CAMK2D HERC1 ERBB4 HIVEP3 GABRB3 SV2B DNM3 PACRG
123
+ WWOX XKR4
124
+ - source_sentence: MALAT1 EEF1A1 TMSB4X PTMA H3-3B VIM NACA COL1A2 RACK1 ACTB TMSB10
125
+ FAU NPM1 ATP5MC2 TPT1 BTF3 UBB HSP90AA1 MYL6 LGALS1 GAPDH CCNI NAP1L1 HMGN1 UBA52
126
+ FTL HSP90AB1 CIRBP STMN1 COX4I1 SRP14 IFITM3 ARPC2 HNRNPK SDC2 FOS EIF1 JUN ZFP36L1
127
+ YBX1 LAPTM4A NOP53 SRSF3 HSPE1 CCN2 BTG1 COL6A2 ILF2 RBMX HNRNPDL HMGB2 FTH1 EIF3F
128
+ XIST TMA7 PABPC1 ERH TSC22D1 ELOB EIF3E EDF1 DDX5 LDHB HES1 EIF1B SLIRP RBM25
129
+ CALD1 ATP5F1E TMEM258 NREP ITM2C ANP32B TPM1 ZFP36L2 UQCRB IER2 AP2M1 ATP5MG EEF2
130
+ ATP5ME FAM110B MARCKSL1 H1-0 HMGB1 TOMM7 SLC25A5 CD9 CLK1 VCAN PHPT1 MAP3K13 FSCN1
131
+ GSTP1 PEBP1 HNRNPC NDUFB7 DDX17 PSMB5 MYL12A UBE2I PTOV1 HSPB1 LUC7L3 CTSC ATP5F1B
132
+ ARPC3 ALDH2 SNX3 COX7A2 ARRDC3 SUB1 PDGFRB IGFBP5 MYL12B SEPTIN7 TRIR SOX4 PSMB2
133
+ COX7C SEM1 LSM7 RBM39 SEC61G SARAF ARGLU1 PPIG HSPD1
134
+ sentences:
135
+ - MALAT1 EEF1A1 TPT1 FTL RACK1 GAPDH PTMA TMSB10 FTH1 NACA NPM1 TMSB4X S100A10 HSP90AA1
136
+ FAU COX7C HMGB1 BTF3 LGALS3 SERF2 UBA52 ACTB HSPD1 HSP90B1 EEF2 GSTP1 HSP90AB1
137
+ TXN HINT1 PPIA TOMM7 MYL6 CFL1 PFN1 EIF1 ZFAS1 COX4I1 HMGA1 TPI1 TUBA1B ATP5F1E
138
+ NDUFA4 EIF3E LDHB HSPE1 HNRNPA2B1 H3-3B CD63 ATP5F1A ATP5MG UQCRH ANXA2 TMA7 COX7A2
139
+ ATP5MC3 ATP5ME UBB BSG SNHG8 YBX1 CHCHD2 ATP5F1B ATP5MC2 CALR PABPC1 SLC25A3 PRDX1
140
+ PPDPF SRP14 ATP5MJ UQCRB COX6C CKB NAP1L1 CYBA PSMA7 CST3 ELOB TMBIM6 ALDOA UQCRQ
141
+ ALDH1A1 NDUFS5 PRELID1 EIF3F P4HB HMGN1 ATP5PO SLC25A5 ENO1 EIF3L OAZ1 NOP53 HES1
142
+ PARK7 SET SAT1 COX7B CALM2 SPINT2 ATP5MK COX8A HMGN2 HACD3 TIMM13 HSPA8 NPC2 SNRPD2
143
+ COX6B1 ITM2B MGST3 NDUFB11 ATP5PF EIF4A2 S100A11 PPIB CYC1 UQCR10 PCBP2 XIST ATP5MF
144
+ NDUFAB1 MGST1 HNRNPC ATP5F1D EDF1 C1QBP SUB1
145
+ - This measurement was conducted with 10x 3' v2. Sample contains enteric neurons,
146
+ isolated from the ileum tissue of a human fetus at Carnegie stage 17.
147
+ - This measurement was conducted with 10x 3' v2. Colon tissue sample containing
148
+ mesodermal cells, specifically mesenchymal cells, at Carnegie stage 22.
149
+ datasets:
150
+ - jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation
151
+ pipeline_tag: sentence-similarity
152
+ library_name: sentence-transformers
153
+ metrics:
154
+ - cosine_accuracy
155
+ model-index:
156
+ - name: SentenceTransformer based on NeuML/pubmedbert-base-embeddings
157
+ results:
158
+ - task:
159
+ type: triplet
160
+ name: Triplet
161
+ dataset:
162
+ name: cellxgene pseudo bulk 100k multiplets natural language annotation cell
163
+ sentence 2
164
+ type: cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation_cell_sentence_2
165
+ metrics:
166
+ - type: cosine_accuracy
167
+ value: 0.5196981430053711
168
+ name: Cosine Accuracy
169
+ ---
170
+
171
+ # SentenceTransformer based on NeuML/pubmedbert-base-embeddings
172
+
173
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [NeuML/pubmedbert-base-embeddings](https://huggingface.co/NeuML/pubmedbert-base-embeddings) on the [cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation](https://huggingface.co/datasets/jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
174
+
175
+ ## Model Details
176
+
177
+ ### Model Description
178
+ - **Model Type:** Sentence Transformer
179
+ - **Base model:** [NeuML/pubmedbert-base-embeddings](https://huggingface.co/NeuML/pubmedbert-base-embeddings) <!-- at revision d6eaca8254bc229f3ca42749a5510ae287eb3486 -->
180
+ - **Maximum Sequence Length:** 512 tokens
181
+ - **Output Dimensionality:** 768 dimensions
182
+ - **Similarity Function:** Cosine Similarity
183
+ - **Training Dataset:**
184
+ - [cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation](https://huggingface.co/datasets/jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation)
185
+ - **Language:** code
186
+ <!-- - **License:** Unknown -->
187
+
188
+ ### Model Sources
189
+
190
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
191
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
192
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
193
+
194
+ ### Full Model Architecture
195
+
196
+ ```
197
+ SentenceTransformer(
198
+ (0): MMContextEncoder(
199
+ (text_encoder): BertModel(
200
+ (embeddings): BertEmbeddings(
201
+ (word_embeddings): Embedding(30522, 768, padding_idx=0)
202
+ (position_embeddings): Embedding(512, 768)
203
+ (token_type_embeddings): Embedding(2, 768)
204
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
205
+ (dropout): Dropout(p=0.1, inplace=False)
206
+ )
207
+ (encoder): BertEncoder(
208
+ (layer): ModuleList(
209
+ (0-11): 12 x BertLayer(
210
+ (attention): BertAttention(
211
+ (self): BertSdpaSelfAttention(
212
+ (query): Linear(in_features=768, out_features=768, bias=True)
213
+ (key): Linear(in_features=768, out_features=768, bias=True)
214
+ (value): Linear(in_features=768, out_features=768, bias=True)
215
+ (dropout): Dropout(p=0.1, inplace=False)
216
+ )
217
+ (output): BertSelfOutput(
218
+ (dense): Linear(in_features=768, out_features=768, bias=True)
219
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
220
+ (dropout): Dropout(p=0.1, inplace=False)
221
+ )
222
+ )
223
+ (intermediate): BertIntermediate(
224
+ (dense): Linear(in_features=768, out_features=3072, bias=True)
225
+ (intermediate_act_fn): GELUActivation()
226
+ )
227
+ (output): BertOutput(
228
+ (dense): Linear(in_features=3072, out_features=768, bias=True)
229
+ (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
230
+ (dropout): Dropout(p=0.1, inplace=False)
231
+ )
232
+ )
233
+ )
234
+ )
235
+ (pooler): BertPooler(
236
+ (dense): Linear(in_features=768, out_features=768, bias=True)
237
+ (activation): Tanh()
238
+ )
239
+ )
240
+ (pooling): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
241
+ )
242
+ )
243
+ ```
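
The `Pooling` block above averages the token embeddings (attention-mask-aware mean pooling) to produce the 768-dimensional sentence vector. A minimal PyTorch sketch of that operation, shown only to illustrate what `pooling_mode_mean_tokens=True` does; the example inputs are taken from the samples above and the rest of the `MMContextEncoder` wrapper is omitted.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustration only: mask-aware mean pooling over the base encoder's token
# embeddings, matching pooling_mode_mean_tokens=True in the module above.
tokenizer = AutoTokenizer.from_pretrained("NeuML/pubmedbert-base-embeddings")
encoder = AutoModel.from_pretrained("NeuML/pubmedbert-base-embeddings")

batch = tokenizer(
    ["CD74 MALAT1 EEF1A1", "Naive B cell from blood of a 26-year old male."],
    padding=True, truncation=True, max_length=512, return_tensors="pt",
)

with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state       # (batch, seq, 768)

mask = batch["attention_mask"].unsqueeze(-1).float()             # (batch, seq, 1)
sentence_embeddings = (token_embeddings * mask).sum(1) / mask.sum(1).clamp(min=1e-9)
print(sentence_embeddings.shape)                                 # torch.Size([2, 768])
```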
244
+
245
+ ## Usage
246
+
247
+ ### Direct Usage (Sentence Transformers)
248
+
249
+ First install the Sentence Transformers library:
250
+
251
+ ```bash
252
+ pip install -U sentence-transformers
253
+ ```
254
+
255
+ Then you can load this model and run inference.
256
+ ```python
257
+ from sentence_transformers import SentenceTransformer
258
+
259
+ # Download from the 🤗 Hub
260
+ model = SentenceTransformer("jo-mengr/mmcontext-pubmedbert-100k_cs128")
261
+ # Run inference
262
+ sentences = [
263
+ 'MALAT1 EEF1A1 TMSB4X PTMA H3-3B VIM NACA COL1A2 RACK1 ACTB TMSB10 FAU NPM1 ATP5MC2 TPT1 BTF3 UBB HSP90AA1 MYL6 LGALS1 GAPDH CCNI NAP1L1 HMGN1 UBA52 FTL HSP90AB1 CIRBP STMN1 COX4I1 SRP14 IFITM3 ARPC2 HNRNPK SDC2 FOS EIF1 JUN ZFP36L1 YBX1 LAPTM4A NOP53 SRSF3 HSPE1 CCN2 BTG1 COL6A2 ILF2 RBMX HNRNPDL HMGB2 FTH1 EIF3F XIST TMA7 PABPC1 ERH TSC22D1 ELOB EIF3E EDF1 DDX5 LDHB HES1 EIF1B SLIRP RBM25 CALD1 ATP5F1E TMEM258 NREP ITM2C ANP32B TPM1 ZFP36L2 UQCRB IER2 AP2M1 ATP5MG EEF2 ATP5ME FAM110B MARCKSL1 H1-0 HMGB1 TOMM7 SLC25A5 CD9 CLK1 VCAN PHPT1 MAP3K13 FSCN1 GSTP1 PEBP1 HNRNPC NDUFB7 DDX17 PSMB5 MYL12A UBE2I PTOV1 HSPB1 LUC7L3 CTSC ATP5F1B ARPC3 ALDH2 SNX3 COX7A2 ARRDC3 SUB1 PDGFRB IGFBP5 MYL12B SEPTIN7 TRIR SOX4 PSMB2 COX7C SEM1 LSM7 RBM39 SEC61G SARAF ARGLU1 PPIG HSPD1',
264
+ "This measurement was conducted with 10x 3' v2. Colon tissue sample containing mesodermal cells, specifically mesenchymal cells, at Carnegie stage 22.",
265
+ "This measurement was conducted with 10x 3' v2. Sample contains enteric neurons, isolated from the ileum tissue of a human fetus at Carnegie stage 17.",
266
+ ]
267
+ embeddings = model.encode(sentences)
268
+ print(embeddings.shape)
269
+ # [3, 768]
270
+
271
+ # Get the similarity scores for the embeddings
272
+ similarities = model.similarity(embeddings, embeddings)
273
+ print(similarities)
274
+ # tensor([[1.0000, 0.0670, 0.0628],
275
+ # [0.0670, 1.0000, 0.7297],
276
+ # [0.0628, 0.7297, 1.0000]])
277
+ ```
278
+
279
+ <!--
280
+ ### Direct Usage (Transformers)
281
+
282
+ <details><summary>Click to see the direct usage in Transformers</summary>
283
+
284
+ </details>
285
+ -->
286
+
287
+ <!--
288
+ ### Downstream Usage (Sentence Transformers)
289
+
290
+ You can finetune this model on your own dataset.
291
+
292
+ <details><summary>Click to expand</summary>
293
+
294
+ </details>
295
+ -->
296
+
297
+ <!--
298
+ ### Out-of-Scope Use
299
+
300
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
301
+ -->
302
+
303
+ ## Evaluation
304
+
305
+ ### Metrics
306
+
307
+ #### Triplet
308
+
309
+ * Dataset: `cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation_cell_sentence_2`
310
+ * Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)
311
+
312
+ | Metric | Value |
313
+ |:--------------------|:-----------|
314
+ | **cosine_accuracy** | **0.5197** |
315
+
316
+ <!--
317
+ ## Bias, Risks and Limitations
318
+
319
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
320
+ -->
321
+
322
+ <!--
323
+ ### Recommendations
324
+
325
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
326
+ -->
327
+
328
+ ## Training Details
329
+
330
+ ### Training Dataset
331
+
332
+ #### cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation
333
+
334
+ * Dataset: [cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation](https://huggingface.co/datasets/jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation) at [b141493](https://huggingface.co/datasets/jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation/tree/b141493854960a0e33c4583cab3c497379c1f8f0)
335
+ * Size: 81,143 training samples
336
+ * Columns: <code>anchor</code>, <code>positive</code>, <code>negative_1</code>, and <code>negative_2</code>
337
+ * Approximate statistics based on the first 1000 samples:
338
+ | | anchor | positive | negative_1 | negative_2 |
339
+ |:--------|:--------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|
340
+ | type | string | string | string | string |
341
+ | details | <ul><li>min: 739 characters</li><li>mean: 780.34 characters</li><li>max: 851 characters</li></ul> | <ul><li>min: 92 characters</li><li>mean: 216.13 characters</li><li>max: 900 characters</li></ul> | <ul><li>min: 101 characters</li><li>mean: 215.14 characters</li><li>max: 870 characters</li></ul> | <ul><li>min: 726 characters</li><li>mean: 780.32 characters</li><li>max: 848 characters</li></ul> |
342
+ * Samples:
343
+ | anchor | positive | negative_1 | negative_2 |
344
+ |:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
345
+ | <code>TMSB4X TMSB10 ACTB MALAT1 GNLY NKG7 IFITM2 LGALS1 GZMA EEF1A1 PFN1 HMGB2 FTH1 PTMA HSP90AA1 GZMB ARHGDIB HNRNPA2B1 PLAAT4 FAU CMC1 VIM MYL12A CBX3 ATP5F1E HCST IFI44L KLRF1 H3-3A COX6C ARL6IP1 CFL1 ISG15 HMGB1 S100A4 ATP5MF RORA MYL6 CORO1A OAZ1 KLRB1 ID2 HMGN3 CCNI RBM39 CAP1 SERF2 ELOC FCER1G S100A9 IFI16 YWHAZ EIF1 CALR HMGN2 SKAP2 SLC25A5 ZZZ3 YBX1 NUCB2 CDC42 GSTP1 FTL ATP5F1D HNRNPM CHMP4B PGK1 ELOB NOP53 DDX5 ARPC3 SRSF3 SUB1 SMC4 NCL SF3B6 SET ENY2 SEPTIN6 SNRPD2 COX6B1 IFI6 UQCR11 RAC2 COX7B RAN H3-3B POMP EPSTI1 KLRD1 DHX9 LCP1 RTF1 IFI44 UFC1 ANP32E HNRNPDL EIF5B SON ADAR GBP4 CTSS CCNL1 NSA2 HNRNPK FUNDC2 TERF2IP PCBP1 CD7 RSRC1 COX8A COX5A UQCR10 NDUFA6 NACA CD247 NOL7 TMA7 STMP1 M6PR FKBP4 JARID2 CSDE1 UQCRC1 TYROBP MDH1 BID ERP44</code> | <code>This measurement was conducted with 10x 3' v2. A proliferating lymphocyte cell sample, obtained from a 34-year-old female Asian individual, derived from peripheral blood mononuclear cells.</code> | <code>This measurement was conducted with 10x 3' v2. Sample is a 25-year-old female with European ethnicity, having CD8-positive, alpha-beta T cell type. This cell type exhibits elevated expression of type 1 interferon-stimulated genes (ISGs) in monocytes, reduction of naïve CD4+ T cells correlating with monocyte ISG expression, and expansion of repertoire-restricted cytotoxic GZMH+ CD8+ T cells.</code> | <code>MALAT1 TMSB4X EEF1A1 CD74 BTG1 PTMA TMSB10 TPT1 FAU EIF1 FTH1 FTL CXCR4 TSC22D3 DUSP1 UBA52 ACTB CD37 CD52 NACA RACK1 EZR CD69 LAPTM5 H3-3A FOS ISG20 YBX1 CIRBP EIF3E OAZ1 COX7C SAT1 COX4I1 H3-3B SH3BGRL3 UBC UBB JUNB COMMD6 VIM CYBA KLF6 STK17B FUS HNRNPC MYL6 GADD45B LGALS1 EIF3L SRSF5 NFKBIA ANKRD12 CORO1A TLE5 NOP53 CHCHD2 PFN1 DDX5 ARPC3 COX7A2 YPEL5 ARL4A SRGN ATP5F1E PPDPF SNRPB2 COX6B1 ZFP36 JUND RBM39 SARAF ATP5MC2 ITM2B ATP6V1G1 IFI44L SERF2 SRP14 EMP3 TPM3 PLAC8 RNF145 PNRC1 TAGLN2 SON PCSK7 IER2 CTSS S100A9 PPM1K CCNL1 HMGB2 TAF1D TUBA1A EEF2 HINT1 HNRNPA3 RSL1D1 CYCS CFL1 UQCRH ZFAS1 JUN DAZAP2 NAP1L1 ISG15 S100A4 PTPN1 TOMM7 S100A6 CALM1 HMGN2 OST4 SMIM26 FGR CFH CFTR ANKIB1 LAP3 HECW1 MAD1L1 LASP1 TMEM176A M6PR CASP10 CFLAR TFPI RBM5</code> |
346
+ | <code>EEF1A1 MALAT1 FTH1 JUNB TPT1 FOS TMSB10 BTG1 TMSB4X ZFP36L2 NACA PABPC1 ACTB FAU VIM H3-3B EIF1 ZFP36 SARAF PTMA IL7R JUN RACK1 EEF2 UBA52 GAPDH FTL FXYD5 DUSP1 S100A4 CD69 CXCR4 UBC TSC22D3 CFL1 KLF6 ARHGDIB KLF2 BTG2 CITED2 IER2 TUBB4B CD3E EEF1G SLC2A3 NFKBIA PFN1 SRGN SNX9 COX4I1 DNAJB1 SERF2 CD8A PCBP2 IL32 BIRC3 SMAP2 FUS GADD45B MYL12A OAZ1 ATP5F1E TUBA4A PNRC1 LAPTM5 RNF149 TOMM7 OST4 YBX1 PPP2R5C SMCHD1 RGCC ID2 FOSB SNRPD2 COX7C GYPC EMP3 YWHAZ EIF3F ZFP36L1 HMGB1 CALM1 TMA7 MCUB CDC42 HSP90AA1 SUN2 LSR H2AZ2 CHCHD2 TLE4 HSPA8 CDKN1B TPI1 CRYBG1 SRSF7 NR4A1 HCST SH3BGRL3 MCL1 TPM3 NSD3 RGS10 UQCRB ABCG1 UBXN1 SF1 CD52 HINT1 UBB H4C3 S100A6 APRT TXNIP IDS FYN CD74 HERPUD1 CYBA PUM2 RASGRP2 RORA PAG1 FYB1 PABPC4 HNRNPC MYL6</code> | <code>This measurement was conducted with 10x 5' v1. Sample is a cell from the omentum tissue, specifically an effector memory CD4-positive, alpha-beta T cell, from a female in her sixth decade.</code> | <code>This measurement was conducted with 10x 5' v2. Conventional dendritic cell from the jejunal epithelium of a female in her eighth decade.</code> | <code>CD74 MALAT1 EEF1A1 FOS TPT1 TMSB4X TMSB10 ACTB FAU JUN CD37 DUSP1 RACK1 JUNB EIF1 PTMA FTL DNAJB1 H3-3B CD52 NACA BTG1 TSC22D3 FTH1 PABPC1 EEF2 UBA52 EEF1G HSP90AA1 LAPTM5 CYBA PPP1R15A HSP90AB1 CD69 ARHGDIB ZFP36 SERF2 UBC H3-3A PCBP2 HLA-DRB5 KLF6 PFN1 DDX5 HSPA8 ARPC3 CD83 CCNI CXCR4 ATP5F1E SARAF TUBA1A ZFP36L1 TOMM7 HERPUD1 YBX1 RHOA MEF2C FXYD5 MYL6 SRSF5 MYL12A CORO1A OAZ1 NOP53 HNRNPA2B1 HVCN1 FOSB COX7C KLF2 RAC2 COX4I1 ATP5MC2 SH3BGRL3 RHOB AFF3 PNRC1 ZFP36L2 HNRNPDL YWHAZ PCBP1 HINT1 GABARAP UBB CFL1 SELL HMGB1 TMA7 TXNIP SLC25A5 SEC62 CD22 VIM RNASET2 CD44 POU2F2 ADAM28 GDI2 SLC2A3 EIF4B CNN2 SP100 IL4R STX7 DNAJA1 ERP29 HNRNPC EZR CIRBP EIF3L DDX17 ST13 NFKBIA STK4 EIF3E TLE5 CHCHD2 EDF1 SYNGR2 EIF4G2 PTGES3 GAPDH C12orf57 PTPN6 RIPOR2 BACH2 SNX3 SKP1</code> |
347
+ | <code>MALAT1 GRIK1 SYT1 PCDH9 RORA NRG1 CADPS ZFPM2 LRRC4C LINGO2 RALYL PTPRD SPHKAP CNTNAP5 SLC8A1 CCSER1 HDAC9 CELF2 R3HDM1 CNTN4 RBMS3 PCDH7 GALNT13 UNC5D ROBO1 SYNPR SNAP25 GPM6A ANK3 FRMPD4 CHRM2 RYR2 KHDRBS2 CADM1 CACNA1D RGS6 PDE4D DOCK4 UNC13C CDH18 FAT3 MEG3 NR2F2-AS1 HMCN1 GULP1 CAMK2D ZEB1 SYN2 DYNC1I1 OXR1 DPP10 OSBPL6 FRAS1 PPP3CA ZNF385D ZMAT4 PCBP3 HS6ST3 ERC2 PLEKHA5 CDK14 MAP2 NCOA1 ATP8A2 MEIS2 CDH13 TMEM108 FBXL17 PTPRO CEP112 EFNA5 TOX RORB ARID1B TRPM3 AFF3 CDH8 SLC2A13 FRMD4A CDH12 PPM1L HCN1 LRFN5 FBN1 FUT9 PPM1E GRIN2A PRKG1 CPNE4 FRY RASAL2 TNRC6A TNRC6B QKI PLCL1 EPHA4 SGIP1 FOXP2 AHI1 ARHGAP24 BMPR1B RERE CDC42BPA SLC4A10 PTPRG GPR158 MAGI1 DST PRKCA TNIK CACHD1 KCNJ3 NELL1 MLLT3 JMJD1C UBE2E2 PDE4B AGBL4 PCLO RALGAPA2 TAFA2 MEG8 ZFPM2-AS1 REV3L SLC12A2 ERC1 TRIM9 GLS</code> | <code>This measurement was conducted with 10x 3' v3. Neuron cell type from a 29-year-old male, specifically from the thalamic complex, specifically the thalamus (THM) - posterior nuclear complex of thalamus (PoN) - medial geniculate nuclei (MG).</code> | <code>This measurement was conducted with 10x 3' v3. Neuron from the thalamic complex (thalamus, posterior nuclear complex of thalamus, medial geniculate nuclei) of a 42-year-old male, identified as a midbrain-derived inhibitory neuron.</code> | <code>MALAT1 PCDH9 PTPRD NRG1 SYT1 DPP10 ROBO1 TENM2 LRRC4C RBMS3 CNTNAP5 LINGO2 CDH18 SLC8A1 DMD PDE4D RYR2 ATP1B1 RGS6 PTPRT CHRM3 ADGRL2 NOVA1 NTNG1 PCDH7 TAFA2 CCSER1 ANK3 MEG3 MAP2 PLCB4 CACNA2D1 PRKG1 LINC03000 RMST RORA FOXP2 LHFPL3 MEG8 TNRC6A DAB1 KCTD8 RALYL GNAS INPP4B OLFM3 CNTN4 FRMD4A LINC00632 GAPDH ENOX1 AHI1 GPM6A EBF1 LRFN5 PCSK1N SEMA5A KIAA1217 CALY MAP1B SNAP25 GABRB2 CDH8 GRIP1 SNCG MYT1L RTN4 SYNE1 ATP8A2 AFF3 SCN1A SLC4A10 TCF7L2 DST GUCY1A2 ARPP21 KCNH7 HS6ST3 PBX1 AGBL4 PCLO RELN HSP90AA1 RTN3 FBXL17 PCDH15 RABGAP1L BASP1 EFNA5 DCC CALM1 MYCBP2 ARID1B CCDC85A HSP90AB1 PPP3CA TMEM108 UNC5D DGKI JMJD1C NBEA ZBTB20 CADM1 GPC6 KCTD16 LINC01122 THSD7A IDS PAK3 RUNX1T1 TNRC6B STMN2 PHACTR1 ATP8A1 COX4I1 DCLK1 DTNA RTN1 CDC42BPA GALNT13 ZCCHC7 FER ZNF385D SPHKAP CDH12 CA10 KCNMA1 DYNC1I1</code> |
348
+ * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
349
+ ```json
350
+ {
351
+ "scale": 20.0,
352
+ "similarity_fct": "cos_sim"
353
+ }
354
+ ```
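
For reference, a brief sketch of how this loss is typically instantiated in Sentence Transformers with the parameters listed above (in-batch negatives, scale 20, cosine similarity); a fuller training sketch follows the hyperparameter section below.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.util import cos_sim

model = SentenceTransformer("jo-mengr/mmcontext-pubmedbert-100k_cs128")

# scale=20.0 and cosine similarity, matching the parameters shown above.
loss = MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=cos_sim)
```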
355
+
356
+ ### Evaluation Dataset
357
+
358
+ #### cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation
359
+
360
+ * Dataset: [cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation](https://huggingface.co/datasets/jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation) at [b141493](https://huggingface.co/datasets/jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation/tree/b141493854960a0e33c4583cab3c497379c1f8f0)
361
+ * Size: 9,011 evaluation samples
362
+ * Columns: <code>anchor</code>, <code>positive</code>, <code>negative_1</code>, and <code>negative_2</code>
363
+ * Approximate statistics based on the first 1000 samples:
364
+ | | anchor | positive | negative_1 | negative_2 |
365
+ |:--------|:--------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|
366
+ | type | string | string | string | string |
367
+ | details | <ul><li>min: 730 characters</li><li>mean: 783.66 characters</li><li>max: 850 characters</li></ul> | <ul><li>min: 99 characters</li><li>mean: 209.99 characters</li><li>max: 941 characters</li></ul> | <ul><li>min: 102 characters</li><li>mean: 213.87 characters</li><li>max: 981 characters</li></ul> | <ul><li>min: 722 characters</li><li>mean: 782.09 characters</li><li>max: 842 characters</li></ul> |
368
+ * Samples:
369
+ | anchor | positive | negative_1 | negative_2 |
370
+ |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
371
+ | <code>MALAT1 EEF1A1 FTH1 TMSB4X ACTB FTL RTN4 ATP6V0B TPT1 FAU S100A6 NDUFA4 ATP5F1E COX7C ITM2B IGFBP7 EIF1 C12orf75 CD9 COX7B SERF2 ATP1B1 COX8A TXNIP NDUFB2 MYL6 PPDPF COX6B1 UQCR11 APOE COX4I1 CALM2 UQCRB S100A11 UQCRQ COX6C ATP5MG BSG ATP6AP2 UQCR10 PTMA NACA UBL5 UBA52 TMSB10 ADGRF5 HSP90AA1 GSTP1 ATP5F1D CHCHD2 GAPDH COX7A2 SKP1 HSPE1 PRDX1 CYSTM1 LGALS3 CD63 ATP5MJ CKB NDUFS5 ATP5ME UBB MAL ATP5MK PPIA S100A10 RACK1 TMA7 NEAT1 SLC25A5 LAPTM4A SLC25A3 DYNLL1 NDUFB7 DHRS7 ELOB OAZ1 HSPB1 EDF1 CD81 ATP5F1B LDHB ATP6V0E1 ATP6V1A IGFBP5 TMEM59 PRDX6 CTSD MYL12B NDUFA1 PRDX5 ATP6V1F JUND SEC61G ATP5MC2 SRP14 MGST3 NDUFB11 UBC NDUFC2 ATP5MC3 ATP5PF ATP5MC1 GPX4 HINT1 FOS CFL1 MICOS10 UQCRH POLR2L COX5A NDUFB1 NDUFA13 COMMD6 LITAF TOMM7 CEBPD OST4 ATP5MF NDUFAB1 GABARAPL2 HSPA5 NEDD4L HERPUD1 PHPT1 MPC1 HIPK2</code> | <code>This measurement was conducted with 10x 3' v3. Cell sample from the cortex of kidney, taken from a 43-year-old male of European ethnicity with a reported history of kidney cancer. The cell type is identified as a kidney collecting duct intercalated cell.</code> | <code>This measurement was conducted with 10x 3' v3. Kidney collecting duct intercalated cell from a 43-year old European male with kidney cancer, taken from the cortex of kidney and cryopreserved for further analysis.</code> | <code>MALAT1 EEF1A1 CRYAB S100A6 ITM2B ACTB TPT1 PTMA FTL PEBP1 H3-3B GSTP1 ADIRF IGFBP7 S100A10 HIPK2 MYL6 SERF2 TPM1 FAU FTH1 ID4 EIF1 TMSB10 HSP90AA1 SKP1 IGFBP2 IGFBP5 PRDX1 MYL12B CYSTM1 CLU ATP5F1E AHNAK PPDPF DSTN ID1 COX7C JUND SRP14 ATP1B1 HINT1 NDUFA4 PPIA NACA TMA7 NEAT1 CD9 SYNE2 LAPTM4A GNAS CIRBP ATP5F1D DDX17 EDF1 CCND1 LDHB RTN4 TMEM59 NR4A1 KTN1 SAT1 TMBIM6 APP CALM2 UBC UQCRB PDXK S100A11 SPINT2 UBB FOS COX8A ZFP36L1 NDUFA13 HMGB1 PSAP UBL5 RACK1 CEBPD UBA52 EEF1G TXNIP SEC62 CSDE1 YBX1 MYO9A PKM NUCKS1 CDC42 ENO1 COBLL1 APLP2 DYNLL1 NDUFB2 EZR HSP90AB1 SRSF5 JAG1 MYL12A TSC22D1 HCFC1R1 ELOB TLE5 CPVL CHCHD2 HSPB1 SPOCK2 PFN1 RASD1 DDX5 LUC7L3 WSB1 NDUFC1 CD81 GAPDH SERINC1 SRSF3 COX7A2 GMDS SUB1 ATP6V0E1 SPCS1 SF3B6 HSPE1 ODC1 IVNS1ABP CD46</code> |
372
+ | <code>MALAT1 KCND2 NRXN1 CDH18 NRXN3 ZNF385D CADM2 RALYL NKAIN2 CADPS2 RIMS1 FSTL5 GRID2 TRPM3 CHN2 DPP6 JMJD1C RORA PDE1A UNC13C TIAM1 NRG1 SNAP25 ZFPM2 CALN1 LSAMP CNTN1 ABLIM1 SYNE1 ANK3 CA10 NFIA ZBTB20 NTM CADM1 OPCML RELN DNM3 NEBL ERC1 SCN2A PPP3CA CACNA1A GALNT13 LRRC4C GPM6A RABGAP1L RIT2 CAMK4 GRIA4 PTPRD RBFOX3 MCTP1 LHFPL6 PCLO MEG3 PDE10A NOVA1 RTN1 ZNF385B CNTN4 GABRB2 SPOCK1 OXR1 ARPP21 SGCZ WWOX TCF4 RYR2 ETV1 ADAM22 TENM1 RUNX1T1 PLCB4 SH3GL2 MAP1B PATJ DTNA AFF3 CNKSR2 MAGI1 EPB41 KCNJ3 CLASP2 SYNPR TSPAN5 ERBB4 KCTD8 SLIT3 MAML2 CALM1 ZNF521 LINC00632 TLL1 TRIO CDH10 ARID1B LIMA1 FOXN3 KIF1B DIP2B FRY ZNF638 DLG1 MAP2 PTPN4 TNRC6A RAPGEF4 TRIM9 KDM4C MICU1 CPE ARFGEF3 RTN4 OLFM3 MED13L EXOC4 CCNH MAP7 AHI1 KCNK1 LRCH1 ITM2B GLCE SRRM4 CDC42BPA EXOC6B UNC80</code> | <code>This measurement was conducted with 10x 3' v3. Neuron cell type from a 29-year-old male cerebellum, specifically from the Cerebellar Vermis - CBV region, with European self-reported ethnicity, analyzed at the nucleus level.</code> | <code>This measurement was conducted with 10x 3' v3. Endothelial cells derived from the cerebellum (specifically, cerebellar vermis) of a 42-year-old male, classified under the vascular supercluster term.</code> | <code>MALAT1 ATP10A COBLL1 GPCPD1 PTPRG SLC39A10 FLT1 FLI1 TSPAN5 THSD4 RUNDC3B CCNY IGFBP7 ST6GALNAC3 PRKCH ST6GAL1 MECOM ESYT2 TBC1D4 IGF1R TACC1 HERC4 CDH2 TCF4 ABCB1 DOCK9 SORBS2 USP54 CBFA2T2 TSC22D1 QKI EPAS1 APP NFIB AOPEP ELMO1 ZNF704 PTPRM NET1 A2M FGD6 EPHA3 NEBL RAPGEF2 ACVR1 SPTBN1 BBS9 KLF2 MKLN1 EXOC6 LEF1 PPP3CA RBMS3 LRMDA WDFY3 BCL2L1 TTC3 SIPA1L1 CFLAR ADGRF5 MAP4K4 SCARB1 RAPGEF4 ABLIM1 ATP6V0E1 ITIH5 SYNE1 LARGE1 CEMIP2 UBR3 PAM JAK1 BAALC HEG1 PBX1 WWOX TEAD1 TOX NSUN6 NEAT1 VPS13D PLEKHA5 LAMA3 SYNE2 TAB2 SLC12A2 EML1 HNRNPC SORBS1 MYH9 SLC7A5 PON2 ZKSCAN1 MED13L AKAP9 HIP1 ITFG1 XAF1 SBF2 DNAJC1 SLCO2B1 MYOF PRKACB WDPCP IQSEC1 PDGFC MARCHF6 CEP112 PPP1R9A SNED1 SNRK EMCN RIMKLB NAV2 CTNNB1 SLC20A2 STXBP6 MYRIP DCLK2 INSR EGFL7 YES1 LDLRAD3 RAD51B EXT1 LHFPL6 MRTFB UNC13B</code> |
373
+ | <code>EEF1A1 ACTB GAPDH HMGN2 PTMA SERF2 TMSB4X CD74 PABPC1 FTH1 TMSB10 FAU PFN1 HMGN1 OAZ1 HMGB1 TPT1 PPIA NACA BTF3 MALAT1 MYL6 ATP5MG CFL1 RACK1 ODC1 ATP5F1E TMA7 SLC25A5 ELOB ARPC3 NPM1 COX7C ANP32B C4orf3 EIF1 PCBP2 KLF6 LAPTM5 COX8A RHOA HSPA8 H3-3B PTP4A2 UBA52 OST4 CIRBP LGALS1 EIF3L STMN1 PPDPF COX4I1 RAN EIF3F PPP1CC COMMD6 NDUFA4 YBX1 PEBP1 COTL1 COX7A2 HSPE1 CCNI TRIR SNRPD2 MAP1LC3B HSPD1 ARPC2 H2AZ1 YWHAZ EEF2 TRAPPC1 MARCKSL1 CALM1 EEF1G EIF4B SLC25A3 TPD52 FTL H2AZ2 DDX5 UBE2D3 CCT4 ATP5PB ROMO1 BTG1 SNRPG ATP5MC3 TSC22D3 CSTB GABARAP PGAM1 ZFAS1 ZFP36L1 ARF5 DYNLL1 NDUFB2 CORO1A LUC7L3 EIF4G2 RWDD1 SNX3 CCND3 SERP1 HNRNPA2B1 COX6B1 UXT SEC14L1 SH3BGRL YWHAQ ATP5MC2 LCP1 MIEN1 TXNL4A SOD1 SH3BGRL3 PNRC1 EIF3H ATP5MJ UQCRB BTG2 UBXN1 YWHAB HINT1 UQCRH GRB2 S100A10 UBE2J1</code> | <code>This measurement was conducted with 10x 5' v1. Cell sample from the tonsil of a 9-year-old female with recurrent tonsillitis, characterized as a centroblast B cell with IGLC2, IGLV7-43, IGLJ3 immunoglobulin genes expressed.</code> | <code>This measurement was conducted with 10x 5' v1. Centroblast cells derived from a 3-year-old male human tonsil sample, with obstructive sleep apnea and recurrent tonsillitis, undergoing affinity maturation and differentiation into memory or plasma cells.</code> | <code>CD74 MALAT1 EEF1A1 ACTB TMSB4X LAPTM5 PTMA TPT1 TMSB10 CXCR4 FAU BTG1 TXNIP PABPC1 FTH1 NACA FTL IRF1 RBM3 CD83 CCNI SARAF BTF3 HNRNPA3 HLA-DRB5 UBA52 MEF2C CORO1A UBE2D3 ATP5F1E PDIA6 UBC GABARAP CFL1 CALR RACK1 HSPA5 EIF4B RHOA HNRNPC SRSF5 PFN1 HSPA8 CNOT2 IFT57 HNRNPA2B1 COX7C ITM2B SH3BGRL3 PNRC1 PDIA3 EEF2 UBB PARP14 SNX2 LAP3 SLC25A5 POU2F2 ADAM28 ZNF800 CYBA GDI2 STK17B EIF3I PAPOLA MYL6 CIRBP STK4 OAZ1 NOP53 ZC3HAV1 SYNGR2 EIF4G2 CLEC2B RSRC2 ARPC3 SLC38A1 CYTIP STAT1 BCAS2 VAMP8 DUSP1 ZNF706 BST2 PNISR H3-3B EPSTI1 ATP5MC2 SCAF11 GLIPR1 SERF2 SRP14 MARCHF6 TIAL1 TMEM123 MBNL1 HNRNPU EIF4A2 CTSS TGFBR2 CCNL1 YWHAZ HNRNPK GHITM PCBP1 CNBP EIF1AX EIF1 ZFP36L1 IKZF1 PPP1CC MPHOSPH8 PPIA CD47 ANXA6 HMGN2 PRR13 MYCBP2 IFRD1 SYPL1 CSDE1 PTBP1 CD22 VIM RNASET2 ARID4A GABARAPL2 CYFIP2</code> |
374
+ * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
375
+ ```json
376
+ {
377
+ "scale": 20.0,
378
+ "similarity_fct": "cos_sim"
379
+ }
380
+ ```
381
+
382
+ ### Training Hyperparameters
383
+ #### Non-Default Hyperparameters
384
+
385
+ - `eval_strategy`: steps
386
+ - `per_device_train_batch_size`: 256
387
+ - `per_device_eval_batch_size`: 256
388
+ - `learning_rate`: 0.05
389
+ - `num_train_epochs`: 4
390
+ - `warmup_ratio`: 0.1
391
+ - `bf16`: True
392
+ - `gradient_checkpointing`: True
393
+
394
+ #### All Hyperparameters
395
+ <details><summary>Click to expand</summary>
396
+
397
+ - `overwrite_output_dir`: False
398
+ - `do_predict`: False
399
+ - `eval_strategy`: steps
400
+ - `prediction_loss_only`: True
401
+ - `per_device_train_batch_size`: 256
402
+ - `per_device_eval_batch_size`: 256
403
+ - `per_gpu_train_batch_size`: None
404
+ - `per_gpu_eval_batch_size`: None
405
+ - `gradient_accumulation_steps`: 1
406
+ - `eval_accumulation_steps`: None
407
+ - `torch_empty_cache_steps`: None
408
+ - `learning_rate`: 0.05
409
+ - `weight_decay`: 0.0
410
+ - `adam_beta1`: 0.9
411
+ - `adam_beta2`: 0.999
412
+ - `adam_epsilon`: 1e-08
413
+ - `max_grad_norm`: 1.0
414
+ - `num_train_epochs`: 4
415
+ - `max_steps`: -1
416
+ - `lr_scheduler_type`: linear
417
+ - `lr_scheduler_kwargs`: {}
418
+ - `warmup_ratio`: 0.1
419
+ - `warmup_steps`: 0
420
+ - `log_level`: passive
421
+ - `log_level_replica`: warning
422
+ - `log_on_each_node`: True
423
+ - `logging_nan_inf_filter`: True
424
+ - `save_safetensors`: True
425
+ - `save_on_each_node`: False
426
+ - `save_only_model`: False
427
+ - `restore_callback_states_from_checkpoint`: False
428
+ - `no_cuda`: False
429
+ - `use_cpu`: False
430
+ - `use_mps_device`: False
431
+ - `seed`: 42
432
+ - `data_seed`: None
433
+ - `jit_mode_eval`: False
434
+ - `use_ipex`: False
435
+ - `bf16`: True
436
+ - `fp16`: False
437
+ - `fp16_opt_level`: O1
438
+ - `half_precision_backend`: auto
439
+ - `bf16_full_eval`: False
440
+ - `fp16_full_eval`: False
441
+ - `tf32`: None
442
+ - `local_rank`: 0
443
+ - `ddp_backend`: None
444
+ - `tpu_num_cores`: None
445
+ - `tpu_metrics_debug`: False
446
+ - `debug`: []
447
+ - `dataloader_drop_last`: False
448
+ - `dataloader_num_workers`: 0
449
+ - `dataloader_prefetch_factor`: None
450
+ - `past_index`: -1
451
+ - `disable_tqdm`: False
452
+ - `remove_unused_columns`: True
453
+ - `label_names`: None
454
+ - `load_best_model_at_end`: False
455
+ - `ignore_data_skip`: False
456
+ - `fsdp`: []
457
+ - `fsdp_min_num_params`: 0
458
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
459
+ - `fsdp_transformer_layer_cls_to_wrap`: None
460
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
461
+ - `deepspeed`: None
462
+ - `label_smoothing_factor`: 0.0
463
+ - `optim`: adamw_torch
464
+ - `optim_args`: None
465
+ - `adafactor`: False
466
+ - `group_by_length`: False
467
+ - `length_column_name`: length
468
+ - `ddp_find_unused_parameters`: None
469
+ - `ddp_bucket_cap_mb`: None
470
+ - `ddp_broadcast_buffers`: False
471
+ - `dataloader_pin_memory`: True
472
+ - `dataloader_persistent_workers`: False
473
+ - `skip_memory_metrics`: True
474
+ - `use_legacy_prediction_loop`: False
475
+ - `push_to_hub`: False
476
+ - `resume_from_checkpoint`: None
477
+ - `hub_model_id`: None
478
+ - `hub_strategy`: every_save
479
+ - `hub_private_repo`: None
480
+ - `hub_always_push`: False
481
+ - `hub_revision`: None
482
+ - `gradient_checkpointing`: True
483
+ - `gradient_checkpointing_kwargs`: None
484
+ - `include_inputs_for_metrics`: False
485
+ - `include_for_metrics`: []
486
+ - `eval_do_concat_batches`: True
487
+ - `fp16_backend`: auto
488
+ - `push_to_hub_model_id`: None
489
+ - `push_to_hub_organization`: None
490
+ - `mp_parameters`:
491
+ - `auto_find_batch_size`: False
492
+ - `full_determinism`: False
493
+ - `torchdynamo`: None
494
+ - `ray_scope`: last
495
+ - `ddp_timeout`: 1800
496
+ - `torch_compile`: False
497
+ - `torch_compile_backend`: None
498
+ - `torch_compile_mode`: None
499
+ - `include_tokens_per_second`: False
500
+ - `include_num_input_tokens_seen`: False
501
+ - `neftune_noise_alpha`: None
502
+ - `optim_target_modules`: None
503
+ - `batch_eval_metrics`: False
504
+ - `eval_on_start`: False
505
+ - `use_liger_kernel`: False
506
+ - `liger_kernel_config`: None
507
+ - `eval_use_gather_object`: False
508
+ - `average_tokens_across_devices`: False
509
+ - `prompts`: None
510
+ - `batch_sampler`: batch_sampler
511
+ - `multi_dataset_batch_sampler`: proportional
512
+ - `router_mapping`: {}
513
+ - `learning_rate_mapping`: {}
514
+
515
+ </details>
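
Putting the pieces together, the following is a rough sketch of a training run with the non-default hyperparameters listed above, using the standard `SentenceTransformerTrainer` API. It loads the published checkpoint as a stand-in starting point and assumes `train`/`validation` split names; the original run instead constructed the `MMContextEncoder` module around `NeuML/pubmedbert-base-embeddings` via the `mmcontext` package, so this is an approximation rather than the exact script.

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("jo-mengr/mmcontext-pubmedbert-100k_cs128")
dataset = load_dataset(
    "jo-mengr/cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation"
)

args = SentenceTransformerTrainingArguments(
    output_dir="mmcontext-pubmedbert-run",  # assumed output path
    eval_strategy="steps",
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    learning_rate=0.05,
    num_train_epochs=4,
    warmup_ratio=0.1,
    bf16=True,
    gradient_checkpointing=True,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],      # assumed split name
    eval_dataset=dataset["validation"],  # assumed split name
    loss=MultipleNegativesRankingLoss(model, scale=20.0),
)
trainer.train()
```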
516
+
517
+ ### Training Logs
518
+ | Epoch | Step | Training Loss | cellxgene pseudo bulk 100k multiplets natural language annotation loss | cellxgene_pseudo_bulk_100k_multiplets_natural_language_annotation_cell_sentence_2_cosine_accuracy |
519
+ |:------:|:----:|:-------------:|:----------------------------------------------------------------------:|:-------------------------------------------------------------------------------------------------:|
520
+ | 0.3155 | 100 | 21.0864 | 22.2000 | 0.5197 |
521
+ | 0.6309 | 200 | 21.0764 | 22.2000 | 0.5197 |
522
+ | 0.9464 | 300 | 21.0606 | 22.2000 | 0.5197 |
523
+ | 1.2618 | 400 | 21.075 | 22.2000 | 0.5197 |
524
+ | 1.5773 | 500 | 21.0764 | 22.2000 | 0.5197 |
525
+ | 1.8927 | 600 | 21.074 | 22.2000 | 0.5197 |
526
+ | 2.2082 | 700 | 21.0842 | 22.2000 | 0.5197 |
527
+ | 2.5237 | 800 | 21.0592 | 22.2000 | 0.5197 |
528
+ | 2.8391 | 900 | 21.0785 | 22.2000 | 0.5197 |
529
+ | 3.1546 | 1000 | 21.0701 | 22.2000 | 0.5197 |
530
+ | 3.4700 | 1100 | 21.0667 | 22.2000 | 0.5197 |
531
+ | 3.7855 | 1200 | 21.0805 | 22.2000 | 0.5197 |
532
+
533
+
534
+ ### Framework Versions
535
+ - Python: 3.11.6
536
+ - Sentence Transformers: 5.0.0
537
+ - Transformers: 4.55.0.dev0
538
+ - PyTorch: 2.5.1+cu121
539
+ - Accelerate: 1.9.0
540
+ - Datasets: 2.19.1
541
+ - Tokenizers: 0.21.4
542
+
543
+ ## Citation
544
+
545
+ ### BibTeX
546
+
547
+ #### Sentence Transformers
548
+ ```bibtex
549
+ @inproceedings{reimers-2019-sentence-bert,
550
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
551
+ author = "Reimers, Nils and Gurevych, Iryna",
552
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
553
+ month = "11",
554
+ year = "2019",
555
+ publisher = "Association for Computational Linguistics",
556
+ url = "https://arxiv.org/abs/1908.10084",
557
+ }
558
+ ```
559
+
560
+ #### MultipleNegativesRankingLoss
561
+ ```bibtex
562
+ @misc{henderson2017efficient,
563
+ title={Efficient Natural Language Response Suggestion for Smart Reply},
564
+ author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
565
+ year={2017},
566
+ eprint={1705.00652},
567
+ archivePrefix={arXiv},
568
+ primaryClass={cs.CL}
569
+ }
570
+ ```
571
+
572
+ <!--
573
+ ## Glossary
574
+
575
+ *Clearly define terms in order to be accessible across audiences.*
576
+ -->
577
+
578
+ <!--
579
+ ## Model Card Authors
580
+
581
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
582
+ -->
583
+
584
+ <!--
585
+ ## Model Card Contact
586
+
587
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
588
+ -->
config_sentence_transformers.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "model_type": "SentenceTransformer",
+ "__version__": {
+ "sentence_transformers": "5.0.0",
+ "transformers": "4.55.0.dev0",
+ "pytorch": "2.5.1+cu121"
+ },
+ "prompts": {
+ "query": "",
+ "document": ""
+ },
+ "default_prompt_name": null,
+ "similarity_fn_name": "cosine"
+ }
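`"similarity_fn_name": "cosine"` is what makes `model.similarity(...)` in the README compute cosine similarity, and the empty `prompts` mean queries and documents are encoded without any prefix. A small sketch, only to illustrate that equivalence:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

model = SentenceTransformer("jo-mengr/mmcontext-pubmedbert-100k_cs128")
emb = model.encode(
    ["Naive B cell from blood.", "CD74 MALAT1 EEF1A1"], convert_to_tensor=True
)

# With similarity_fn_name set to "cosine", model.similarity is cosine similarity.
assert model.similarity_fn_name == "cosine"
print(model.similarity(emb, emb))
print(cos_sim(emb, emb))  # same values
```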
modules.json ADDED
@@ -0,0 +1,8 @@
+ [
+ {
+ "idx": 0,
+ "name": "0",
+ "path": "0_MMContextEncoder",
+ "type": "mmcontext.models.mmcontextencoder.MMContextEncoder"
+ }
+ ]
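`modules.json` tells Sentence Transformers which Python class to instantiate for module `0`; here the `type` points at `MMContextEncoder` from the external `mmcontext` package, which therefore has to be installed before the model can be loaded. A rough sketch of that lookup (not the library's exact loader, and assumed to run inside the model directory):

```python
import importlib
import json

with open("modules.json") as f:  # assumed: current directory is the model repo
    modules = json.load(f)

for entry in modules:
    # Resolve the dotted "type" string to a class, e.g.
    # mmcontext.models.mmcontextencoder.MMContextEncoder, whose weights and
    # config live in the folder named by entry["path"] ("0_MMContextEncoder").
    module_path, class_name = entry["type"].rsplit(".", 1)
    cls = getattr(importlib.import_module(module_path), class_name)
    print(entry["idx"], cls)
```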