Update README.md
README.md
CHANGED
@@ -4,7 +4,7 @@ license_name: open-aleph-license
license_link: LICENSE
---

-# Model Card for
+# Model Card for Pharia-1-Embedding-4608-control

This model card provides an overview of Pharia-1-Embedding-4608-control, an embedding model
developed by Aleph Alpha Research*. Pharia-1-Embedding-4608-control has been built on top of Pharia-1-LLM-7B-control.
@@ -16,23 +16,29 @@ data in compliance with applicable EU and national regulations, including copyri
Furthermore it shows good cross-lingual performance allowing for prompting and text to be embedded written
in different languages. The finetuning was always performed using English instructions.

-This modelcard aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1).
-## Model
-<!-- Provide a longer summary of what this model is. -->
-- **Funded by [optional]:** [More Information Needed]
-- **Shared by [optional]:** [More Information Needed]
-- **Model type:** [More Information Needed]
-- **Language(s) (NLP):** [More Information Needed]
-- **License:** [More Information Needed]
-- **Finetuned from model [optional]:** [More Information Needed]
+## Model Overview
+
+- **Developed by:** Aleph Alpha Research
+<!--- **Funded by [optional]:** [More Information Needed]-->
+<!--- **Shared by [optional]:** [More Information Needed]-->
+- **Model type:** Embedding adapter on top of Pharia-1-LLM-7B-control trained with representational
+  instruction-tuning (inspired by the approach of GritLM).
+- **Language(s) (NLP):** Trained on English, German, French, Spanish.
+<!--- **License:** [More Information Needed]-->
+<!--- **Finetuned from model [optional]:** [More Information Needed]-->
+
+### Model Description
+
+| Model | Embedding Size | Description |
+| --- | --- | --- |
+| Pharia-1-Embedding-4608-control | 4608 | Pharia-1-Embedding-4608-control is an Embedding model optimized for German, French and Spanish and designed for customizable embeddings at runtime via instructions (prompts). |
+
+<!-- Provide a longer summary of what this model is. -->

### Model Sources [optional]
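
The added card text describes GritLM-style, instruction-conditioned embeddings with a 4608-dimensional output. As a rough sketch of what "customizable embeddings at runtime via instructions (prompts)" can look like in practice, the snippet below embeds a query and a document under different instructions. The repository id, loading through the generic transformers AutoModel API, the instruction wording, and the mean pooling are all illustrative assumptions, not the interface documented in this diff.

```python
# Hypothetical usage sketch: repo id, prompt format, and pooling strategy are
# assumptions, not the documented interface of Pharia-1-Embedding-4608-control.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "Aleph-Alpha/Pharia-1-Embedding-4608-control"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_ID, trust_remote_code=True).eval()


def embed(text: str, instruction: str) -> torch.Tensor:
    """Embed `text` conditioned on a natural-language instruction (prompt)."""
    # The instruction is simply prepended; an instruction-tuned embedder lets the
    # same text map to different task-specific vectors depending on this prompt.
    inputs = tokenizer(f"{instruction}\n{text}", return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_size)
    # Mean pooling over non-padding tokens (assumed; GritLM-style models often
    # pool only over the document tokens and exclude the instruction).
    mask = inputs["attention_mask"].unsqueeze(-1).to(hidden.dtype)
    pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
    return torch.nn.functional.normalize(pooled, dim=-1)


query = embed("Wie hoch ist die Zugspitze?",
              "Represent the question for retrieving supporting documents")
doc = embed("Die Zugspitze ist mit 2962 m der höchste Berg Deutschlands.",
            "Represent the document for retrieval")

print(query.shape)             # expected torch.Size([1, 4608]) per the model card
print((query @ doc.T).item())  # cosine similarity, since both vectors are normalized
```

The point of the instruction is that the same text yields different, task-adapted vectors depending on the prompt; retrieval then reduces to cosine similarity between the resulting 4608-dimensional embeddings.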