SentenceTransformer based on NeuML/bioclinical-modernbert-base-embeddings
This is a sentence-transformers model finetuned from NeuML/bioclinical-modernbert-base-embeddings on the cellxgene_pseudo_bulk_3_5k_multiplets_natural_language_annotation and gene_description datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: NeuML/bioclinical-modernbert-base-embeddings
- Maximum Sequence Length: not specified (None)
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
- Training Datasets:
- cellxgene_pseudo_bulk_3_5k_multiplets_natural_language_annotation
- gene_description
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
  (0): MMContextEncoder(
    (text_encoder): ModernBertModel(
      (embeddings): ModernBertEmbeddings(
        (tok_embeddings): Embedding(50368, 768, padding_idx=50283)
        (norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
        (drop): Dropout(p=0.0, inplace=False)
      )
      (layers): ModuleList(
        (0): ModernBertEncoderLayer(
          (attn_norm): Identity()
          (attn): ModernBertAttention(
            (Wqkv): Linear(in_features=768, out_features=2304, bias=False)
            (rotary_emb): ModernBertRotaryEmbedding()
            (Wo): Linear(in_features=768, out_features=768, bias=False)
            (out_drop): Identity()
          )
          (mlp_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
          (mlp): ModernBertMLP(
            (Wi): Linear(in_features=768, out_features=2304, bias=False)
            (act): GELUActivation()
            (drop): Dropout(p=0.0, inplace=False)
            (Wo): Linear(in_features=1152, out_features=768, bias=False)
          )
        )
        (1-21): 21 x ModernBertEncoderLayer(
          (attn_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
          (attn): ModernBertAttention(
            (Wqkv): Linear(in_features=768, out_features=2304, bias=False)
            (rotary_emb): ModernBertRotaryEmbedding()
            (Wo): Linear(in_features=768, out_features=768, bias=False)
            (out_drop): Identity()
          )
          (mlp_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
          (mlp): ModernBertMLP(
            (Wi): Linear(in_features=768, out_features=2304, bias=False)
            (act): GELUActivation()
            (drop): Dropout(p=0.0, inplace=False)
            (Wo): Linear(in_features=1152, out_features=768, bias=False)
          )
        )
      )
      (final_norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
    )
    (pooling): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  )
)
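The Pooling block above uses mean pooling (pooling_mode_mean_tokens is True): token embeddings are averaged, with padding positions masked out, to produce one 768-dimensional vector per input. A minimal PyTorch sketch of the equivalent operation (the function name is illustrative):

import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # token_embeddings: (batch, seq_len, 768); attention_mask: (batch, seq_len)
    mask = attention_mask.unsqueeze(-1).to(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(dim=1)   # sum over real tokens only
    counts = mask.sum(dim=1).clamp(min=1e-9)        # avoid division by zero
    return summed / counts                          # (batch, 768)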
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("jo-mengr/mmcontext-cg_3_5k-nla-biomodern-None-text_only_50-feat_cs-v4")
# Run inference
sentences = [
'CD74 MALAT1 RPL13 RPLP2 HMGN2 TMSB10 CD37 HSP90AA1 MS4A1 NCF1 CD79B DNAJB1 HNRNPA1 TSC22D3 MARCKSL1 STMN1 DUSP1 EZR DBI PSIP1 ISG20 HSPA6 FTL OSBPL8 CBX5 ATP5F1D ADNP COTL1 HSPB1 SYNGR2 CD83 HSPE1 DDX39A NCOA3 RPS4Y1 SEC61G LIMD2 IFI16 SMIM14 H2AZ1 HMGB2 STK17A FOS UCP2 TYMS CALR GSTK1 S100A10 RNASEK LINC00926',
"This measurement was conducted with 10x 5' v1. Germinal center B cell derived from a 3-year-old male tonsil, expressing IgM isotype, with genotyped IGHV4-34*01, IGHJ4*02, IGHD6-6*01, IGLV1-40, IGLC2, and IGLJ2 genes, and exhibiting cycling behavior.",
"This measurement was conducted with 10x 5' v1. Naive B cell sample taken from a 5-year-old female with obstructive sleep apnea and recurrent tonsillitis, originating from the tonsil tissue.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.9999, 0.9972],
# [0.9999, 1.0000, 0.9980],
# [0.9972, 0.9980, 1.0000]])
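Beyond pairwise similarity, the same embeddings support semantic search by ranking a corpus against a query. A minimal sketch continuing from the snippet above (the query and corpus strings are illustrative):

# Rank an illustrative corpus against a query by cosine similarity
query_emb = model.encode(["Germinal center B cell derived from tonsil tissue"])
corpus = [
    "Naive B cell sample originating from the tonsil tissue.",
    "Oligodendrocyte cells from the hippocampal formation.",
]
corpus_emb = model.encode(corpus)
scores = model.similarity(query_emb, corpus_emb)  # shape (1, 2)
best = scores.argmax().item()
print(corpus[best], float(scores[0, best]))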
Training Details
Training Datasets
cellxgene_pseudo_bulk_3_5k_multiplets_natural_language_annotation
- Dataset: cellxgene_pseudo_bulk_3_5k_multiplets_natural_language_annotation
- Size: 2,845 training samples
- Columns: anchor, positive, negative_1, and negative_2
- Approximate statistics based on the first 1000 samples:
  - anchor: string, min 277 / mean 302.06 / max 327 characters
  - positive: string, min 99 / mean 220.61 / max 899 characters
  - negative_1: string, min 92 / mean 217.28 / max 899 characters
  - negative_2: string, min 46 / mean 48.27 / max 49 characters
- Samples:
  - Sample 1:
    - anchor: MALAT1 TAGLN MT2A ADIRF ACTA2 MYL9 ACTB GADD45B TPM2 DSTN IGFBP7 TMSB4X MT1M MYL6 MT1X RPL3 RPS6 CCL2 RPS27A RPL32 RPS19 KLF2 TPT1 HSPB1 RPL11 RGS16 RPL10A RPS12 CALD1 TPM1 RHOB RPS3 FAU RPS23 RPS4X SPARCL1 RPL30 PPP1R15A SOD3 SELENOM TSC22D1 CRISPLD2 ZFAND5 RPS11 ZFP36L1 TMSB10 NFKBIA IGFBP5 DUSP1 MYC
    - positive: This measurement was conducted with 10x 3' v2. Colon pericytes from a male individual in his eighth decade, characterized as Pericytes RERGL NTRK2.
    - negative_1: This measurement was conducted with 10x 3' v3. Colon fibroblast cells derived from the lamina propria of mucosa of a male individual in his fourth decade.
    - negative_2: census_6cf3634d-e911-44ad-bf52-c747a9af3c01_213
  - Sample 2:
    - anchor: MALAT1 CTNNA3 PLP1 MBP PDE4B SLC44A1 MAN2A1 PTPRD FRMD5 DLG1 EDIL3 ZEB2 PLCL1 DOCK10 PIP4K2A ELMO1 CLASP2 DPYD TMEM165 TTLL7 TMEM144 TF MAP4K4 FRMD4B ZFYVE16 PDE8A PRUNE2 PTGDS UBE2E2 MVB12B PDE4D CDK18 ENPP2 DOCK5 FTX FAM107B SYNJ2 SSH2 APP GALNT13 PKP4 PDE1C ALCAM UGT8 MYO1D DSCAML1 WWOX GRM3 NEAT1 SLC25A13
    - positive: This measurement was conducted with 10x 3' v3. Oligodendrocyte cells from the hippocampal formation of a 29-year-old male, specifically from the Head of hippocampus (HiH) - Uncal DG-CA4 region, with a region of interest at Human DGU-CA4Upy.
    - negative_1: This measurement was conducted with 10x 3' v3. Central nervous system macrophage (microglia) derived from the hippocampal formation of a 42-year-old male, specifically from the Head of hippocampus (HiH) - Uncal DG-CA4 dissection.
    - negative_2: census_4724c395-0c46-46d2-81f7-60fd271fb488_844
  - Sample 3:
    - anchor: MALAT1 NRG3 SOX5 WWOX SYNE1 RORA NFIA SOX6 CASC15 TRPS1 MAML2 KANK1 GLIS3 PPP1R12B MSI2 ITGA2 PLEKHA7 SDC2 NFIB TCF7L2 LDLRAD4 AKAP13 REV3L TNRC6B RIPOR2 SPTBN1 CLU MYO1E SNTB1 KCNMB2 TMEM178B ADCY2 OPHN1 CCSER2 PHACTR1 FOXP1 SGIP1 MKLN1 CD36 PTPRG SETBP1 CHD7 SLC36A4 RAD51B PBX1 NEAT1 SLC25A13 WWTR1 POLR2B LTBP1
    - positive: This measurement was conducted with 10x 3' v3. Ependymal cell from the thalamic complex (thalamus (THM) - medial nuclear complex of thalamus (MNC) - mediodorsal nucleus of thalamus + reuniens nucleus (medioventral nucleus) of thalamus - MD + Re) of a 50-year-old male with European ethnicity.
    - negative_1: This measurement was conducted with 10x 3' v3. Neuron cell type from a 50-year-old human thalamic complex, specifically from the thalamus (THM) - medial nuclear complex of thalamus (MNC) - mediodorsal nucleus of thalamus + reuniens nucleus (medioventral nucleus) of thalamus - MD + Re region, with European self-reported ethnicity and male sex.
    - negative_2: census_b8681124-aabf-4904-b9c9-66d834bf5bba_383
- Loss: MultipleNegativesRankingLoss with these parameters: {"scale": 20.0, "similarity_fct": "cos_sim"}
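For intuition: MultipleNegativesRankingLoss scores each anchor against its own positive, the explicit negative_1/negative_2 columns, and every other in-batch positive, then applies a scaled cross-entropy so the true positive is ranked first. A minimal sketch of the scoring step, assuming embeddings are already computed (the function name is illustrative):

import torch
import torch.nn.functional as F

def mnrl_loss(anchors: torch.Tensor, candidates: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    # anchors: (batch, dim); candidates: (n_candidates, dim), where the i-th
    # anchor's positive sits at index i and all other rows act as negatives.
    anchors = F.normalize(anchors, dim=-1)
    candidates = F.normalize(candidates, dim=-1)
    scores = scale * anchors @ candidates.T           # scaled cosine similarities
    labels = torch.arange(anchors.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)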
gene_description
- Dataset: gene_description at 338d1c3
- Size: 199 training samples
- Columns: anchor and positive
- Approximate statistics based on the first 199 samples:
  - anchor: string, min 3 / mean 5.77 / max 12 characters
  - positive: string, min 18 / mean 419.45 / max 1375 characters
- Samples:
  - Sample 1:
    - anchor: A1BG
    - positive: The protein encoded by this gene is a plasma glycoprotein of unknown function. The protein shows sequence similarity to the variable regions of some immunoglobulin supergene family member proteins. [provided by RefSeq, Jul 2008]
  - Sample 2:
    - anchor: A1BG-AS1
    - positive: A1BG antisense RNA 1
  - Sample 3:
    - anchor: A1CF
    - positive: Mammalian apolipoprotein B mRNA undergoes site-specific C to U deamination, which is mediated by a multi-component enzyme complex containing a minimal core composed of APOBEC-1 and a complementation factor encoded by this gene. The gene product has three non-identical RNA recognition motifs and belongs to the hnRNP R family of RNA-binding proteins. It has been proposed that this complementation factor functions as an RNA-binding subunit and docks APOBEC-1 to deaminate the upstream cytidine. Studies suggest that the protein may also be involved in other RNA editing or RNA processing events. Several transcript variants encoding a few different isoforms have been found for this gene. [provided by RefSeq, Nov 2010]
- Loss: MultipleNegativesRankingLoss with these parameters: {"scale": 20.0, "similarity_fct": "cos_sim"}
Evaluation Dataset
cellxgene_pseudo_bulk_3_5k_multiplets_natural_language_annotation
- Dataset: cellxgene_pseudo_bulk_3_5k_multiplets_natural_language_annotation
- Size: 306 evaluation samples
- Columns: anchor, positive, negative_1, and negative_2
- Approximate statistics based on the first 306 samples:
  - anchor: string, min 283 / mean 300.67 / max 319 characters
  - positive: string, min 90 / mean 201.29 / max 623 characters
  - negative_1: string, min 101 / mean 198.93 / max 573 characters
  - negative_2: string, min 46 / mean 47.67 / max 49 characters
- Samples:
  - Sample 1:
    - anchor: MALAT1 RORA MEG3 SPARCL1 NFIA CPE MACF1 MAML2 PDE4D CLU APOE CELF2 NEAT1 TRPS1 DST MSI2 MT2A TCF4 KIF1B TTC28 QKI PHF21A NFIB MYCBP2 PTPN4 ARHGAP5 SPTBN1 TMTC2 PDE4B ERC1 DDX17 USP34 ASH1L ARGLU1 ANK3 CLASP2 RBM6 KMT2C LIMCH1 CLASP1 ADNP NFAT5 PTN PHF14 PTGDS LUC7L3 SOX6 SENP6 AAK1 PPM1B
    - positive: This measurement was conducted with 10x 3' v3. Astrocyte cell type from the hippocampal formation of a 29-year-old male, specifically from the Head of hippocampus (HiH) - Uncal DG-CA4 region, with a region of interest at Human DGU-CA4Upy.
    - negative_1: This measurement was conducted with 10x 3' v2. Malignant stem-like, NPC-like cell sample derived from the left temporal lobe of a 61-year-old male.
    - negative_2: census_4724c395-0c46-46d2-81f7-60fd271fb488_1312
  - Sample 2:
    - anchor: MALAT1 MEG3 ATP1B1 AHI1 ANK3 ZNF385D PTPRD TSHZ2 MYCBP2 CDK14 ERC1 RTN4 APP DST CLASP2 ATP6V0C LIMCH1 RORA ATP2B1 HSP90AA1 DDX17 MEST CNOT2 PDE4D USP34 MACF1 ANKRD17 SRPK2 ITM2B PRKCB SETD5 JMJD1C PDE4B PEG10 RBM6 ARID4A TIMP2 MAP4 CELF2 R3HDM1 KIF1B PUM2 CLASP1 PCM1 APLP2 TTC28 TNRC6B RNF24 TSC22D1 NFAT5
    - positive: This measurement was conducted with 10x 3' v3. Neuron from a 29-year-old male, taken from the hypothalamus, specifically the supraoptic region (HTHso) and tuberal region (HTHtub), including anterior hypothalamic nucleus (AHN), ventromedial hypothalamic nucleus (VMH), and dorsomedial hypothalamic nucleus (DMH).
    - negative_1: This measurement was conducted with 10x multiome. Endothelial cell from the sinoatrial node of a male individual in their fifth decade, which has been flushed.
    - negative_2: census_35c8a04c-8639-4d15-8228-765d8d93fc96_183
  - Sample 3:
    - anchor: MALAT1 MEG3 ATP1B1 TCF4 ZNF385D MEF2C ADAMTS9-AS2 ANK3 PDE4D APP PRKCB TMTC2 CELF2 QKI PTPRD CLASP2 ZEB2 TTC28 NFIB JMJD1C TSHZ2 MYCBP2 ERC1 TNRC6B AHI1 DST KIF1B CDK14 LIMCH1 RORA ATP2B1 RTN4 SGK1 AFF3 ZNF292 HSP90AA1 HERC1 ASH1L MACF1 SNRPN SRPK2 CYRIB PRNP ATP6V0C RBM6 TIMP2 FOXN3 GLG1 OSBPL8 ARHGAP5
    - positive: This measurement was conducted with 10x 3' v3. Neuron cell type from a 50-year-old male cerebral cortex, specifically the Middle Temporal Gyrus (MTG), with European ethnicity.
    - negative_1: This measurement was conducted with 10x 3' v3. Neuron cell type from cerebral cortex, specifically from the Granular insular cortex, Short insular gyri, of a 42-year-old male.
    - negative_2: census_e1f595f6-ba2c-495e-9bee-7056f116b1e4_1626
- Loss: MultipleNegativesRankingLoss with these parameters: {"scale": 20.0, "similarity_fct": "cos_sim"}
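Both training datasets share the same loss; in Sentence Transformers they can be passed to the trainer as dictionaries, with batches drawn proportionally to dataset size (see multi_dataset_batch_sampler: proportional below). A minimal sketch under those assumptions; the load_dataset paths are placeholders, and since this model wraps its encoder in a custom MMContextEncoder module, the plain-SentenceTransformer setup here is an approximation:

from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("NeuML/bioclinical-modernbert-base-embeddings")
# Placeholder paths; substitute the actual dataset repositories.
cellxgene = load_dataset("path/to/cellxgene_pseudo_bulk_3_5k_multiplets", split="train")
genes = load_dataset("path/to/gene_description", split="train")

loss = MultipleNegativesRankingLoss(model, scale=20.0)  # cos_sim is the default similarity
trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset={"cellxgene": cellxgene, "gene_description": genes},
    loss={"cellxgene": loss, "gene_description": loss},  # one loss per dataset
)
trainer.train()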
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 128
- learning_rate: 0.05
- num_train_epochs: 1
- warmup_ratio: 0.1
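These values map one-to-one onto the trainer's arguments object; a minimal sketch (output_dir is a placeholder):

from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",               # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=0.05,
    num_train_epochs=1,
    warmup_ratio=0.1,
)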
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 128
- per_device_eval_batch_size: 128
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 0.05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
- router_mapping: {}
- learning_rate_mapping: {}
Framework Versions
- Python: 3.12.9
- Sentence Transformers: 5.0.0
- Transformers: 4.52.3
- PyTorch: 2.7.0
- Accelerate: 1.7.0
- Datasets: 3.6.0
- Tokenizers: 0.21.1
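To reproduce this environment, the same versions can be pinned (a suggestion, not a strict requirement):

pip install sentence-transformers==5.0.0 transformers==4.52.3 torch==2.7.0 accelerate==1.7.0 datasets==3.6.0 tokenizers==0.21.1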
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}