srinivasanAI committed
Commit 0563c59 · verified · 1 Parent(s): 7fae551
Files changed (1)
  1. README.md +10 -4
README.md CHANGED
@@ -169,12 +169,17 @@ SentenceTransformer(
)
```

- ## Usage
+ # Fine-Tuned BGE-Small Model for Q&A

- ### Direct Usage (Sentence Transformers)
+ This is a `BAAI/bge-small-en-v1.5` model that has been fine-tuned for a specific Question & Answering task using the `MultipleNegativesRankingLoss` in the `sentence-transformers` library.

- First install the Sentence Transformers library:
+ It has been trained on a private dataset of 100,000+ question-answer pairs. Its primary purpose is to be the retriever model in a Retrieval-Augmented Generation (RAG) system. It excels at mapping questions to the passages that contain their answers.

+ ## How to Use (Practical Inference Example)
+
+ The primary use case is to find the most relevant passage for a given query.
+
+ ```python
from sentence_transformers import SentenceTransformer, util

# Load the fine-tuned model from the Hub
@@ -201,11 +206,12 @@ passage_embeddings = model.encode(passages)
similarities = util.cos_sim(query_embedding, passage_embeddings)

# 4. Print the results
- print(f"Query: {query.replace(instruction, '')}\n")
+ print(f"Query: {query.replace(instruction, '')}\\n")
for i, passage in enumerate(passages):
    print(f"Similarity: {similarities[0][i]:.4f} | Passage: {passage}")
```

+
<!--
### Direct Usage (Transformers)
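The hunks above only show the edges of the README's Python example; the middle of the snippet (loading the model, building the query, and encoding the passages) falls outside the diff context. Below is a minimal end-to-end sketch of what that snippet implies. The Hub repo id, the BGE-style query instruction, and the sample passages are placeholders, not values taken from this commit.

```python
from sentence_transformers import SentenceTransformer, util

# Load the fine-tuned model from the Hub (placeholder repo id; substitute the real one)
model = SentenceTransformer("srinivasanAI/bge-small-qa-finetuned")

# 1. Build the query; BGE v1.5 models are commonly queried with this instruction prefix (assumed here)
instruction = "Represent this sentence for searching relevant passages: "
query = instruction + "How do I reset my password?"

# 2. Candidate passages to rank (illustrative examples, not from the private dataset)
passages = [
    "To reset your password, open Settings and choose 'Forgot password'.",
    "Our office is open Monday to Friday, 9am to 5pm.",
]

# 3. Encode the query and passages, then score with cosine similarity
query_embedding = model.encode(query)
passage_embeddings = model.encode(passages)
similarities = util.cos_sim(query_embedding, passage_embeddings)

# 4. Print the results, stripping the instruction prefix from the displayed query
print(f"Query: {query.replace(instruction, '')}\n")
for i, passage in enumerate(passages):
    print(f"Similarity: {similarities[0][i]:.4f} | Passage: {passage}")
```

For a retriever fine-tuned this way, the passage with the highest cosine similarity is the one handed to the generator in the RAG pipeline.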
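The new README text states the model was fine-tuned with `MultipleNegativesRankingLoss` on 100,000+ question-answer pairs. That dataset is private and the commit contains no training script, so the following is only an illustrative sketch of how such a setup typically looks in `sentence-transformers`; the example pairs and hyperparameters are placeholders.

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

# Start from the base checkpoint named in the README
model = SentenceTransformer("BAAI/bge-small-en-v1.5")

# Each example is a (question, answer passage) pair; the other answers in a
# batch serve as in-batch negatives for the ranking loss.
train_examples = [
    InputExample(texts=["How do I reset my password?",
                        "To reset your password, open Settings and choose 'Forgot password'."]),
    InputExample(texts=["What are your opening hours?",
                        "Our office is open Monday to Friday, 9am to 5pm."]),
]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=2)
train_loss = losses.MultipleNegativesRankingLoss(model)

# Placeholder schedule; the real run would iterate over the full private dataset.
model.fit(train_objectives=[(train_dataloader, train_loss)], epochs=1, warmup_steps=10)
```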