Update README.md
README.md
@@ -6876,10 +6876,16 @@ def get_detailed_instruct(task_description: str, query: str) -> str:
 
 # Each query must come with a one-sentence instruction that describes the task
 task = 'Given a web search query, retrieve relevant passages that answer the query'
-
-
-
-
+queries = [
+    get_detailed_instruct(task, 'how much protein should a female eat'),
+    get_detailed_instruct(task, 'summit define')
+]
+# No need to add instruction for retrieval documents
+documents = [
+    "As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
+    "Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
+]
+input_texts = queries + documents
 
 tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-mistral-7b-instruct')
 model = AutoModel.from_pretrained('intfloat/e5-mistral-7b-instruct')
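The hunk above stops at loading the tokenizer and model; the pooling and scoring steps of the README's example are unchanged by this commit and therefore not shown. For context, here is a rough, self-contained sketch of how those remaining steps typically look for this model, picking up from the `input_texts` built in the hunk. The `last_token_pool` helper, the `max_length`, and the padding behaviour below are assumptions for illustration, not lines quoted from the diff:

```python
import torch
import torch.nn.functional as F
from torch import Tensor
from transformers import AutoTokenizer, AutoModel

def last_token_pool(last_hidden_states: Tensor, attention_mask: Tensor) -> Tensor:
    # Pool by taking the hidden state of the last non-padding token of each sequence.
    left_padding = (attention_mask[:, -1].sum() == attention_mask.shape[0])
    if left_padding:
        return last_hidden_states[:, -1]
    sequence_lengths = attention_mask.sum(dim=1) - 1
    batch_size = last_hidden_states.shape[0]
    return last_hidden_states[torch.arange(batch_size, device=last_hidden_states.device), sequence_lengths]

tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-mistral-7b-instruct')
model = AutoModel.from_pretrained('intfloat/e5-mistral-7b-instruct')

# input_texts = queries + documents, exactly as built in the hunk above
batch = tokenizer(input_texts, max_length=4096, padding=True, truncation=True, return_tensors='pt')
with torch.no_grad():
    outputs = model(**batch)

embeddings = last_token_pool(outputs.last_hidden_state, batch['attention_mask'])
embeddings = F.normalize(embeddings, p=2, dim=1)

# Similarity between the two queries (rows) and the two documents (columns)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```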
@@ -6921,7 +6927,7 @@ Yes, this is how the model is trained, otherwise you will see a performance degr
 The task definition should be a one-sentence instruction that describes the task.
 This is a way to customize text embeddings for different scenarios through natural language instructions.
 
-Please check out [unilm/e5/utils.py](https://github.com/microsoft/unilm/blob/
+Please check out [unilm/e5/utils.py](https://github.com/microsoft/unilm/blob/9c0f1ff7ca53431fe47d2637dfe253643d94185b/e5/utils.py#L106) for instructions we used for evaluation.
 
 On the other hand, there is no need to add instructions to the document side.
 
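Since `get_detailed_instruct` (whose signature appears in the first hunk header) is only applied to queries, the document side stays plain text. As a small illustration of the instruction template, assuming the `Instruct:` / `Query:` format used elsewhere in this README; the alternative task wording below is a made-up example, not one of the evaluation instructions from `unilm/e5/utils.py`:

```python
def get_detailed_instruct(task_description: str, query: str) -> str:
    # Prepend the one-sentence task instruction to the query; documents are embedded as-is.
    return f'Instruct: {task_description}\nQuery: {query}'

# A different one-sentence instruction adapts the same model to another retrieval task
# (illustrative wording only).
task = 'Given a question, retrieve questions that are duplicates of the given question'
print(get_detailed_instruct(task, 'how do I reset my password'))
# Instruct: Given a question, retrieve questions that are duplicates of the given question
# Query: how do I reset my password
```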
@@ -6929,6 +6935,10 @@ On the other hand, there is no need to add instructions to the document side.
 
 Different versions of `transformers` and `pytorch` could cause negligible but non-zero performance differences.
 
+**3. Where are the LoRA-only weights?**
+
+You can find the LoRA-only weights at [https://huggingface.co/intfloat/e5-mistral-7b-instruct/tree/main/lora](https://huggingface.co/intfloat/e5-mistral-7b-instruct/tree/main/lora).
+
 ## Citation
 
 If you find our paper or models helpful, please consider cite as follows:
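Regarding the new FAQ entry above: the LoRA-only weights are adapter weights meant to be applied on top of the base Mistral model. A minimal sketch of loading them with `peft`, assuming the `mistralai/Mistral-7B-v0.1` base checkpoint and the `lora` subfolder layout implied by the linked directory; this is an illustration, not a procedure quoted from the README:

```python
import torch
from transformers import AutoModel, AutoTokenizer
from peft import PeftModel

# Assumption: e5-mistral-7b-instruct is fine-tuned from the Mistral-7B base model.
base = AutoModel.from_pretrained('mistralai/Mistral-7B-v0.1', torch_dtype=torch.float16)

# Assumption: the adapter lives in the 'lora' subfolder of the model repo,
# matching https://huggingface.co/intfloat/e5-mistral-7b-instruct/tree/main/lora
model = PeftModel.from_pretrained(base, 'intfloat/e5-mistral-7b-instruct', subfolder='lora')

# Optionally merge the adapter into the base weights for plain transformers inference.
model = model.merge_and_unload()

tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-mistral-7b-instruct')
```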