Update README.md
README.md CHANGED

@@ -1,5 +1,4 @@
 ---
-pipeline_tag: sentence-similarity
 tags:
 - sentence-transformers
 - feature-extraction
@@ -10,6 +9,9 @@ datasets:
 license: cc-by-sa-4.0
 language:
 - ja
+metrics:
+- spearmanr
+library_name: sentence-transformers
 ---


@@ -78,6 +80,24 @@ SentenceTransformer(
 )
 ```

+## Model Summary
+
+- Fine-tuning method: Unsupervised SimCSE
+- Base model: [cl-tohoku/bert-base-japanese-v3](https://huggingface.co/cl-tohoku/bert-base-japanese-v3)
+- Training dataset: [Wiki40B](https://huggingface.co/datasets/wiki40b)
+- Pooling strategy: cls (with an extra MLP layer only during training)
+- Hidden size: 768
+- Learning rate: 5e-5
+- Batch size: 64
+- Temperature: 0.05
+- Max sequence length: 64
+- Number of training examples: 2^20
+- Validation interval (steps): 2^6
+- Warmup ratio: 0.1
+- Dtype: BFloat16
+
+See the [GitHub repository](https://github.com/hppRC/simple-simcse-ja) for a detailed experimental setup.
+
 ## Citing & Authors

 ```
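The Model Summary added in the last hunk lists the unsupervised SimCSE recipe (CLS pooling with a train-only MLP head, in-batch negatives, temperature 0.05). As a rough illustration of what those hyperparameters control, here is a minimal sketch of one training step; the class and variable names are hypothetical and not taken from the hppRC/simple-simcse-ja repository.

```python
# Minimal sketch of one unsupervised SimCSE training step (illustrative only;
# names are hypothetical, not from hppRC/simple-simcse-ja).
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer  # tokenizer needs fugashi/unidic-lite installed


class UnsupSimCSE(nn.Module):
    def __init__(self, name="cl-tohoku/bert-base-japanese-v3", temperature=0.05):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(name)
        # Extra MLP over the CLS vector, used only during training.
        self.mlp = nn.Sequential(nn.Linear(768, 768), nn.Tanh())
        self.temperature = temperature

    def forward(self, batch):
        # Encode the same sentences twice; different dropout masks yield the
        # positive pair for each sentence (the SimCSE "augmentation").
        z1 = self.mlp(self.encoder(**batch).last_hidden_state[:, 0])
        z2 = self.mlp(self.encoder(**batch).last_hidden_state[:, 0])
        # In-batch negatives: cosine similarity matrix scaled by temperature,
        # with the diagonal entries as the correct (positive) targets.
        sim = F.cosine_similarity(z1.unsqueeze(1), z2.unsqueeze(0), dim=-1)
        labels = torch.arange(sim.size(0), device=sim.device)
        return F.cross_entropy(sim / self.temperature, labels)


tokenizer = AutoTokenizer.from_pretrained("cl-tohoku/bert-base-japanese-v3")
model = UnsupSimCSE()
batch = tokenizer(
    ["吾輩は猫である。", "今日はいい天気だ。"],
    padding=True, truncation=True, max_length=64, return_tensors="pt",
)
loss = model(batch)
loss.backward()
```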
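The `metrics: spearmanr` entry added to the front matter refers to Spearman's rank correlation between model cosine similarities and human similarity ratings on an STS-style benchmark. A minimal evaluation sketch follows; the model id and the two toy sentence pairs are placeholders, not actual data or results.

```python
# Sketch of STS-style evaluation with Spearman's rank correlation.
# The model id and the toy pairs/ratings below are placeholders.
import numpy as np
from scipy.stats import spearmanr
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("path/to/this-model")  # placeholder id

pairs = [("犬が走っている", "犬が駆けている"),
         ("犬が走っている", "株価が下落した")]
gold = [4.5, 0.5]  # human similarity ratings (toy values)

emb1 = model.encode([a for a, _ in pairs], convert_to_numpy=True)
emb2 = model.encode([b for _, b in pairs], convert_to_numpy=True)

# Cosine similarity per sentence pair.
cos = np.sum(emb1 * emb2, axis=1) / (
    np.linalg.norm(emb1, axis=1) * np.linalg.norm(emb2, axis=1))

rho, _ = spearmanr(cos, gold)
print(f"Spearman's rho: {rho:.4f}")
```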