Update README.md
README.md CHANGED
@@ -130,6 +130,22 @@ When fine-tuned on downstream tasks, this model achieves the following results:
 For more details on the evaluation, please visit our [GitHub repository](https://github.com/alex-shvets/emopillars).
 
 
+## Citation information
+
+If you use this model, please cite our [paper](https://arxiv.org/abs/2504.16856):
+```bibtex
+@misc{shvets2025emopillarsknowledgedistillation,
+  title={Emo Pillars: Knowledge Distillation to Support Fine-Grained Context-Aware and Context-Less Emotion Classification},
+  author={Alexander Shvets},
+  year={2025},
+  eprint={2504.16856},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL},
+  url={https://arxiv.org/abs/2504.16856}
+}
+```
+
+
 ## Disclaimer
 
 <details>