Add link #1
by SayBitekhan - opened

README.md CHANGED
```diff
@@ -24,7 +24,7 @@ The KazRoBERTa-Large KazQAD model is an optimized variant of the RoBERTa model,
 
 The model is designed to perform efficiently on question-answering tasks in Kazakh, demonstrating substantial improvements in metrics after fine-tuning and adaptation using LoRA.
 
-- **Developed by:** Tleubayeva Arailym, Saparbek Makhambet, Bassanova Nurgul, Shomanov Aday, Sabitkhanov Askhat
+- **Developed by:** Tleubayeva Arailym, Saparbek Makhambet, Bassanova Nurgul, Shomanov Aday, [Sabitkhanov Askhat](https://huggingface.co/SayBitekhan)
 - **Model type:** Transformer-based (RoBERTa)
 - **Language(s) (NLP):** Kazakh (kk)
 - **License:** apache-2.0
@@ -111,6 +111,6 @@ Saparbek Makhambet
 
 Bassanova Nurgul
 
-Sabitkhanov Askhat
+[Sabitkhanov Askhat](https://huggingface.co/SayBitekhan)
 
 Shomanov Aday
```
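For context, the README excerpt above describes a RoBERTa-based extractive question-answering model for Kazakh, fine-tuned with LoRA. A minimal usage sketch with the Hugging Face `transformers` question-answering pipeline follows; the repo ID is a placeholder assumption (not taken from this PR), so substitute the actual ID from the model card.

```python
# Minimal sketch, not part of the PR: load the fine-tuned QA model from the Hub
# and ask a question in Kazakh. The repo ID below is a hypothetical placeholder.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="<namespace>/kazroberta-large-kazqad",  # placeholder repo ID, replace with the real one
)

result = qa(
    question="Қазақстанның астанасы қай қала?",  # "Which city is the capital of Kazakhstan?"
    context="Астана — Қазақстан Республикасының астанасы және ірі қалаларының бірі.",
)
print(result["answer"], result["score"])
```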