# MultiMed: Multilingual Medical Speech Recognition via Attention Encoder Decoder
## Description:
Multilingual automatic speech recognition (ASR) in the medical domain serves as a foundational task for various downstream applications such as speech translation, spoken language understanding, and voice-activated assistants.
This technology enhances patient care by enabling efficient communication across language barriers, alleviating specialized workforce shortages, and facilitating improved diagnosis and treatment, particularly during pandemics.
In this work, we introduce *MultiMed*, a collection of small-to-large end-to-end ASR models for the medical domain, spanning five languages: Vietnamese, English, German, French, and Mandarin Chinese, together with the corresponding real-world ASR dataset.
To the best of our knowledge, *MultiMed* stands as **the largest and the first multilingual medical ASR dataset** in terms of total duration, number of speakers, diversity of diseases, recording conditions, speaker roles, unique medical terms, accents, and ICD-10 codes.

Please cite this paper: [https://arxiv.org/abs/2409.14074](https://arxiv.org/abs/2409.14074)
```
@article{le2024multimed,
    title={MultiMed: Multilingual Medical Speech Recognition via Attention Encoder Decoder},
    author={Le-Duc, Khai and Phan, Phuc and Pham, Tan-Hanh and Tat, Bach Phan and Ngo, Minh-Huong and Hy, Truong-Son},
    journal={arXiv preprint arXiv:2409.14074},
    year={2024}
}
```
To load the labeled data, please refer to our [Hugging Face](https://huggingface.co/datasets/leduckhai/MultiMed) and [Papers with Code](https://paperswithcode.com/dataset/multimed) pages.
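A minimal loading sketch with the Hugging Face `datasets` library is shown below; the configuration name and split used here are assumptions, so please check the dataset card for the exact names.

```python
# Minimal sketch: load the MultiMed labeled data from the Hugging Face Hub.
# Requires the `datasets` library (pip install datasets).
# The configuration name ("English") and the "train" split are assumptions;
# see https://huggingface.co/datasets/leduckhai/MultiMed for the exact names.
from datasets import load_dataset

ds = load_dataset("leduckhai/MultiMed", "English")

print(ds)                 # available splits and their sizes
sample = ds["train"][0]   # one labeled example (audio + transcript)
print(sample.keys())
```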
## Contact:
If any links are broken, please contact me so that I can fix them!
```
Le Duc Khai
University of Toronto, Canada
Email: [email protected]
GitHub: https://github.com/leduckhai
```