Update the version name
README.md
CHANGED
@@ -11,9 +11,9 @@ tags:
 pipeline_tag: text-generation
 ---
 
-# Bangla-Llama-3.2-3B-Instruct-QA-
+# Bangla-Llama-3.2-3B-Instruct-QA-v2
 
-<b>Bengali Question-Answering Model</b> | Fine-tuned on Llama-3 Architecture | Version
+<b>Bengali Question-Answering Model</b> | Fine-tuned on Llama-3 Architecture | Version 2
 
 ## Model Description
 This model is optimized for question-answering in the Bengali language. It is fine-tuned using **Llama-3-3B** architecture with Unsloth. The model is trained on a **context-aware instruct dataset**, designed to generate accurate and relevant responses.
@@ -30,7 +30,7 @@ pip install transformers torch accelerate
 from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline
 import torch
 
-model_name = "Kowshik24/Bangla-llama-3.2-3B-Instruct-QA-
+model_name = "Kowshik24/Bangla-llama-3.2-3B-Instruct-QA-v2"
 
 # Load model and tokenizer
 tokenizer = AutoTokenizer.from_pretrained(model_name)
@@ -107,10 +107,10 @@ If this model helps you in your work, please cite it as follows:
 ```bibtex
 @software{BanglaLlama3QA,
   author = {Kowshik},
-  title = {Bangla-Llama-3.2-3B-Instruct-QA-
+  title = {Bangla-Llama-3.2-3B-Instruct-QA-v2},
   year = {2024},
   publisher = {Hugging Face},
-  url = {https://huggingface.co/Kowshik24/Bangla-llama-3.2-3B-Instruct-QA-
+  url = {https://huggingface.co/Kowshik24/Bangla-llama-3.2-3B-Instruct-QA-v2}
 }
 ```
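The diff only renames the checkpoint; the loading code is otherwise unchanged. Since the README describes a **context-aware instruct dataset**, callers need to fold a context passage and a question into a single prompt. The helper below is a hypothetical sketch of such a template — it is not the model's actual chat format, and where the tokenizer ships a chat template, `tokenizer.apply_chat_template` should be preferred:

```python
def build_prompt(context: str, question: str) -> str:
    """Assemble a context-aware QA prompt.

    NOTE: the section headers here are an assumed instruct-style layout,
    not the template this model was trained with.
    """
    return (
        "### Context:\n" + context.strip() + "\n\n"
        "### Question:\n" + question.strip() + "\n\n"
        "### Answer:\n"
    )


# Example: pass the assembled string to the pipeline/model from the README.
prompt = build_prompt(
    "Dhaka is the capital of Bangladesh.",
    "What is the capital of Bangladesh?",
)
```

Keeping prompt assembly in one function makes it easy to swap in the tokenizer's real chat template later without touching the generation code.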