IrisCHEN04/Sentence-Transformer_Roberta_Reddit_ed
Tags: Sentence Similarity · sentence-transformers · Safetensors · roberta · feature-extraction · dense · Generated from Trainer · dataset_size:6000 · loss:ContrastiveLoss · Eval Results · text-embeddings-inference
Paper: arXiv:1908.10084
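
Since this is a sentence-transformers checkpoint tagged for sentence similarity, it can be loaded with the standard SentenceTransformer API. The snippet below is a minimal sketch: the repo id is taken from this page, the example sentences are invented, and the `model.similarity` call assumes sentence-transformers v3 or later.

```python
from sentence_transformers import SentenceTransformer

# Load directly from the Hub; repo id taken from this page.
model = SentenceTransformer("IrisCHEN04/Sentence-Transformer_Roberta_Reddit_ed")

# Invented example sentences for illustration.
sentences = [
    "How do I get started with machine learning?",
    "What is a good way to begin learning ML?",
]

embeddings = model.encode(sentences)               # one dense vector per sentence
scores = model.similarity(embeddings, embeddings)  # pairwise similarity matrix (v3+ API)
print(scores)
```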
Files and versions (branch: main)
1 contributor · History: 2 commits
Latest commit: 11526af (verified) by IrisCHEN04, "Upload folder using huggingface_hub", about 2 months ago
File                               Size       Last commit message                   Last updated
.ipynb_checkpoints/                           Upload folder using huggingface_hub   about 2 months ago
1_Pooling/                                    Upload folder using huggingface_hub   about 2 months ago
.gitattributes                     1.52 kB    initial commit                        about 2 months ago
README.md                          24 kB      Upload folder using huggingface_hub   about 2 months ago
config.json                        717 Bytes  Upload folder using huggingface_hub   about 2 months ago
config_sentence_transformers.json  283 Bytes  Upload folder using huggingface_hub   about 2 months ago
merges.txt                         456 kB     Upload folder using huggingface_hub   about 2 months ago
model.safetensors                  1.42 GB    Upload folder using huggingface_hub   about 2 months ago
modules.json                       349 Bytes  Upload folder using huggingface_hub   about 2 months ago
sentence_bert_config.json          57 Bytes   Upload folder using huggingface_hub   about 2 months ago
special_tokens_map.json            964 Bytes  Upload folder using huggingface_hub   about 2 months ago
tokenizer.json                     3.56 MB    Upload folder using huggingface_hub   about 2 months ago
tokenizer_config.json              1.44 kB    Upload folder using huggingface_hub   about 2 months ago
vocab.json                         798 kB     Upload folder using huggingface_hub   about 2 months ago
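
The modules.json and 1_Pooling/ entries indicate the usual sentence-transformers layout of a Transformer module followed by a Pooling module. The sketch below reproduces a sentence embedding with plain transformers and mean pooling; the mean-pooling assumption is illustrative only, since the actual pooling mode is recorded in 1_Pooling/config.json, and the input text is made up.

```python
import torch
from transformers import AutoModel, AutoTokenizer

repo_id = "IrisCHEN04/Sentence-Transformer_Roberta_Reddit_ed"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

# Tokenize a made-up input and run the RoBERTa encoder.
batch = tokenizer(["example reddit post"], padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (batch, seq_len, hidden)

# Mean pooling over non-padding tokens approximates the Pooling module's output.
mask = batch["attention_mask"].unsqueeze(-1).float()
sentence_embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_embedding.shape)
```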