---
language:
  - en
pipeline_tag: text-generation
---

# Model Card for qa-expert-7B-V1.0

This model handles multi-hop question answering by splitting a multi-hop question into a sequence of single-hop questions, answering each single-hop question, and then summarizing the collected information to produce the final answer.
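The decomposition-then-summarize flow described above can be sketched as follows. This is only an illustrative outline, not the library's actual implementation; all function names here are hypothetical.

```python
# Illustrative sketch of multi-hop QA by decomposition (hypothetical
# helper functions, not the qa_expert API).

def answer_multi_hop(question, decompose, answer_single, summarize):
    """Split a multi-hop question into single-hop questions, answer
    each one, then summarize the partial answers into a final answer."""
    single_questions = decompose(question)          # e.g. model proposes sub-questions
    partial_answers = [answer_single(q) for q in single_questions]
    return summarize(question, partial_answers)     # combine into the final answer
```

In the fine-tuned model, the decomposition, single-hop answering, and summarization steps are all performed by the model itself, with retrieved context supplied for each sub-question.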

## Model Details

This model is a fine-tuned version of mistralai/Mistral-7B-v0.1 on the dataset: khaimaitien/qa-expert-multi-hop-qa-V1.0

You can get more information about how to use/train the model from this repo: https://github.com/khaimt/qa_expert


## How to Get Started with the Model

First, you need to clone the repo: https://github.com/khaimt/qa_expert

Then install the requirements:

```shell
pip install -r requirements.txt
```

Here is example code:

```python
from qa_expert import get_inference_model, InferenceType

def retrieve(query: str) -> str:
    # You need to implement this retrieval function: the input is a query
    # and the output is a context string. It plays the same role as a
    # function to call in OpenAI function calling.
    context = ...  # look up passages relevant to the query
    return context

model_inference = get_inference_model(InferenceType.hf, "khaimaitien/qa-expert-7B-V1.0")
question = "your multi-hop question here"
answer, messages = model_inference.generate_answer(question, retrieve)
```
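For experimentation, `retrieve` can be any function that maps a query to a context string. A minimal toy version over an in-memory corpus, scoring passages by word overlap, might look like this (a sketch only; in practice you would use a vector store or search engine, and the corpus below is made up for illustration):

```python
# Toy retrieval over a tiny in-memory corpus using word-overlap scoring.
# Illustrative only -- replace with a real retriever in practice.

CORPUS = [
    "Mistral-7B is a 7-billion-parameter language model.",
    "Multi-hop question answering requires combining several facts.",
]

def retrieve(query: str) -> str:
    """Return the corpus passage sharing the most words with the query."""
    query_words = set(query.lower().split())
    return max(CORPUS, key=lambda p: len(query_words & set(p.lower().split())))
```

Any callable with this signature can be passed as the second argument of `generate_answer`; the model will invoke it once per sub-question.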