tags:
- medical
---
<p align="center">
  <img src="./MultiMedST_icon.png" alt="MultiMedST_icon" width="70">
</p>

<h1 align="center">MultiMed-ST: Large-scale Many-to-many Multilingual Medical Speech Translation</h1>

<p align="center">
  <a href="https://arxiv.org/abs/2504.03546">
    <img src="https://img.shields.io/badge/Paper-arXiv%3A2504.03546-b31b1b?logo=arxiv&logoColor=white" alt="Paper">
  </a>
  <a href="https://huggingface.co/datasets/leduckhai/MultiMed-ST">
    <img src="https://img.shields.io/badge/Dataset-HuggingFace-blue?logo=huggingface&logoColor=white" alt="Dataset">
  </a>
  <a href="https://huggingface.co/leduckhai/MultiMed-ST">
    <img src="https://img.shields.io/badge/Models-HuggingFace-green?logo=huggingface&logoColor=white" alt="Models">
  </a>
  <a href="https://github.com/leduckhai/MultiMed-ST/blob/main/LICENSE">
    <img src="https://img.shields.io/badge/License-MIT-yellow" alt="License">
  </a>
  <a href="https://github.com/leduckhai/MultiMed-ST/stargazers">
    <img src="https://img.shields.io/github/stars/leduckhai/MultiMed-ST?style=social" alt="Stars">
  </a>
</p>

<p align="center">
  <strong>EMNLP 2025</strong>
</p>

<p align="center">
  <b>Khai Le-Duc*</b>, <b>Tuyen Tran*</b>, Bach Phan Tat, Nguyen Kim Hai Bui, Quan Dang, Hung-Phong Tran, Thanh-Thuy Nguyen, Ly Nguyen, Tuan-Minh Phan, Thi Thu Phuong Tran, Chris Ngo, Nguyen X. Khanh**, Thanh Nguyen-Tang**
</p>

<p align="center">
  <sub>*Equal contribution | **Equal supervision</sub>
</p>

---

> **If you find this work useful, please consider starring the repo and citing our paper!**

---
## Abstract

Multilingual speech translation (ST) in the **medical domain** enhances patient care by enabling effective communication across language barriers, alleviating workforce shortages, and improving diagnosis and treatment, especially during global health emergencies.

In this work, we introduce **MultiMed-ST**, the *first large-scale multilingual medical speech translation dataset*, spanning **all translation directions** across **five languages**: Vietnamese, English, German, French, and Traditional & Simplified Chinese.

With **290,000 samples**, *MultiMed-ST* represents:

- the **largest medical MT dataset** to date
- the **largest many-to-many multilingual ST dataset** across all domains
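The many-to-many setup can be made concrete: five languages yield 5 × 4 = 20 ordered translation directions. A minimal sketch (the two-letter codes below are illustrative, not the dataset's official config names):

```python
# Enumerate all ordered source->target pairs over the five MultiMed-ST
# languages. The ISO-style codes here are illustrative only.
from itertools import permutations

LANGUAGES = ["vi", "en", "de", "fr", "zh"]

def translation_directions(langs=LANGUAGES):
    """Return every ordered source->target pair, e.g. 'vi-en'."""
    return [f"{src}-{tgt}" for src, tgt in permutations(langs, 2)]

print(len(translation_directions()))  # 20
```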
To the best of our knowledge, we also conduct **the most comprehensive ST analysis in the field's history**, covering:

- Empirical baselines
- Bilingual vs. multilingual study
- End-to-end vs. cascaded models
- Task-specific vs. multi-task seq2seq approaches
- Code-switching analysis
- Quantitative & qualitative error analysis

All **code, data, and models** are publicly available: [**GitHub Repository**](https://github.com/leduckhai/MultiMed-ST)

<p align="center">
  <img src="./poster_MultiMed-ST_EMNLP2025.png" alt="poster_MultiMed-ST_EMNLP2025" width="85%">
</p>

---
## Repository Overview

This repository provides scripts for:

- **Automatic Speech Recognition (ASR)**
- **Machine Translation (MT)**
- **Speech Translation (ST)**: both **cascaded** and **end-to-end** seq2seq models

It includes:

- Model preparation & fine-tuning
- Training & inference scripts
- Evaluation & benchmarking utilities

---
## Dataset & Models

- **Dataset:** [Hugging Face Dataset](https://huggingface.co/datasets/leduckhai/MultiMed-ST)
- **Fine-tuned Models:** [Hugging Face Models](https://huggingface.co/leduckhai/MultiMed-ST)

You can explore and download all fine-tuned models for **MultiMed-ST** directly from our Hugging Face repository:
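A minimal loading sketch for the dataset, assuming only the repo id from the links above (split and config names are left to the dataset card):

```python
# Minimal sketch: pull MultiMed-ST from the Hugging Face Hub.
# Requires `pip install datasets`; the download is large (~290k samples).
DATASET_ID = "leduckhai/MultiMed-ST"

def load_multimed_st():
    from datasets import load_dataset
    return load_dataset(DATASET_ID)

if __name__ == "__main__":
    ds = load_multimed_st()
    print(ds)  # DatasetDict with the available splits
```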
<details>
<summary><b>Whisper ASR Fine-tuned Models (click to expand)</b></summary>

| Language | Model Link |
|-----------|------------|
| Chinese | [whisper-small-chinese](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/asr/whisper-small-chinese) |
| English | [whisper-small-english](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/asr/whisper-small-english) |
| French | [whisper-small-french](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/asr/whisper-small-french) |
| German | [whisper-small-german](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/asr/whisper-small-german) |
| Multilingual | [whisper-small-multilingual](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/asr/whisper-small-multilingual) |
| Vietnamese | [whisper-small-vietnamese](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/asr/whisper-small-vietnamese) |

</details>
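To fetch a single ASR checkpoint rather than the whole repo, the per-language subfolder layout in the table above can be addressed with `from_pretrained(..., subfolder=...)`. A sketch, assuming that layout holds (`whisper_subfolder` is a helper we define here, not part of any library):

```python
# Map a language name to its checkpoint subfolder in leduckhai/MultiMed-ST,
# following the table above.
ASR_LANGUAGES = {"chinese", "english", "french", "german", "multilingual", "vietnamese"}

def whisper_subfolder(language: str) -> str:
    lang = language.strip().lower()
    if lang not in ASR_LANGUAGES:
        raise ValueError(f"no fine-tuned Whisper checkpoint for {language!r}")
    return f"asr/whisper-small-{lang}"

if __name__ == "__main__":
    # Requires `pip install transformers` and network access.
    from transformers import WhisperForConditionalGeneration
    model = WhisperForConditionalGeneration.from_pretrained(
        "leduckhai/MultiMed-ST", subfolder=whisper_subfolder("English")
    )
```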
<details>
<summary><b>LLaMA-based MT Fine-tuned Models (click to expand)</b></summary>

| Source → Target | Model Link |
|------------------|------------|
| Chinese → English | [llama_Chinese_English](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_Chinese_English) |
| Chinese → French | [llama_Chinese_French](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_Chinese_French) |
| Chinese → German | [llama_Chinese_German](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_Chinese_German) |
| Chinese → Vietnamese | [llama_Chinese_Vietnamese](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_Chinese_Vietnamese) |
| English → Chinese | [llama_English_Chinese](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_English_Chinese) |
| English → French | [llama_English_French](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_English_French) |
| English → German | [llama_English_German](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_English_German) |
| English → Vietnamese | [llama_English_Vietnamese](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_English_Vietnamese) |
| French → Chinese | [llama_French_Chinese](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_French_Chinese) |
| French → English | [llama_French_English](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_French_English) |
| French → German | [llama_French_German](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_French_German) |
| French → Vietnamese | [llama_French_Vietnamese](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_French_Vietnamese) |
| German → Chinese | [llama_German_Chinese](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_German_Chinese) |
| German → English | [llama_German_English](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_German_English) |
| German → French | [llama_German_French](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_German_French) |
| German → Vietnamese | [llama_German_Vietnamese](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_German_Vietnamese) |
| Vietnamese → Chinese | [llama_Vietnamese_Chinese](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_Vietnamese_Chinese) |
| Vietnamese → English | [llama_Vietnamese_English](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_Vietnamese_English) |
| Vietnamese → French | [llama_Vietnamese_French](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_Vietnamese_French) |
| Vietnamese → German | [llama_Vietnamese_German](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/llama_Vietnamese_German) |

</details>
<details>
<summary><b>m2m100_418M MT Fine-tuned Models (click to expand)</b></summary>

| Source → Target | Model Link |
|------------------|------------|
| de → en | [m2m100_418M-finetuned-de-to-en](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-de-to-en) |
| de → fr | [m2m100_418M-finetuned-de-to-fr](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-de-to-fr) |
| de → vi | [m2m100_418M-finetuned-de-to-vi](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-de-to-vi) |
| de → zh | [m2m100_418M-finetuned-de-to-zh](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-de-to-zh) |
| en → de | [m2m100_418M-finetuned-en-to-de](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-en-to-de) |
| en → fr | [m2m100_418M-finetuned-en-to-fr](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-en-to-fr) |
| en → vi | [m2m100_418M-finetuned-en-to-vi](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-en-to-vi) |
| en → zh | [m2m100_418M-finetuned-en-to-zh](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-en-to-zh) |
| fr → de | [m2m100_418M-finetuned-fr-to-de](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-fr-to-de) |
| fr → en | [m2m100_418M-finetuned-fr-to-en](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-fr-to-en) |
| fr → vi | [m2m100_418M-finetuned-fr-to-vi](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-fr-to-vi) |
| fr → zh | [m2m100_418M-finetuned-fr-to-zh](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-fr-to-zh) |
| vi → de | [m2m100_418M-finetuned-vi-to-de](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-vi-to-de) |
| vi → en | [m2m100_418M-finetuned-vi-to-en](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-vi-to-en) |
| vi → fr | [m2m100_418M-finetuned-vi-to-fr](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-vi-to-fr) |
| vi → zh | [m2m100_418M-finetuned-vi-to-zh](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-vi-to-zh) |
| zh → de | [m2m100_418M-finetuned-zh-to-de](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-zh-to-de) |
| zh → en | [m2m100_418M-finetuned-zh-to-en](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-zh-to-en) |
| zh → fr | [m2m100_418M-finetuned-zh-to-fr](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-zh-to-fr) |
| zh → vi | [m2m100_418M-finetuned-zh-to-vi](https://huggingface.co/leduckhai/MultiMed-ST/tree/main/m2m100_418M-finetuned-zh-to-vi) |

</details>
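The MT checkpoints above follow two naming schemes: full language names for the LLaMA models and two-letter codes for m2m100. A sketch of helpers that build the corresponding subfolder paths (the helper names are ours; the loading call under the main guard assumes transformers' `subfolder=` argument):

```python
# Build checkpoint subfolder names for the MT models listed above.
M2M_CODES = {"de", "en", "fr", "vi", "zh"}

def m2m100_subfolder(src: str, tgt: str) -> str:
    """e.g. ('en', 'vi') -> 'm2m100_418M-finetuned-en-to-vi'."""
    if src not in M2M_CODES or tgt not in M2M_CODES or src == tgt:
        raise ValueError(f"unsupported direction: {src}->{tgt}")
    return f"m2m100_418M-finetuned-{src}-to-{tgt}"

def llama_subfolder(src: str, tgt: str) -> str:
    """Full language names, e.g. ('Chinese', 'English') -> 'llama_Chinese_English'."""
    return f"llama_{src}_{tgt}"

if __name__ == "__main__":
    # Requires `pip install transformers` and network access.
    from transformers import M2M100ForConditionalGeneration
    model = M2M100ForConditionalGeneration.from_pretrained(
        "leduckhai/MultiMed-ST", subfolder=m2m100_subfolder("en", "vi")
    )
```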
---

## Core Developers

1. **Khai Le-Duc**

   University of Toronto, Canada

   [duckhai.le@mail.utoronto.ca](mailto:duckhai.le@mail.utoronto.ca)
   [https://github.com/leduckhai](https://github.com/leduckhai)

2. **Tuyen Tran**: [tuyencbt@gmail.com](mailto:tuyencbt@gmail.com)

   Hanoi University of Science and Technology, Vietnam

3. **Nguyen Kim Hai Bui**: [htlulem185@gmail.com](mailto:htlulem185@gmail.com)

   Eötvös Loránd University, Hungary

## Citation

If you use our dataset or models, please cite:

[arXiv:2504.03546](https://arxiv.org/abs/2504.03546)

```bibtex
@inproceedings{le2025multimedst,
  title={MultiMed-ST: Large-scale Many-to-many Multilingual Medical Speech Translation},
  author={Le-Duc, Khai and Tran, Tuyen and Tat, Bach Phan and Bui, Nguyen Kim Hai and Anh, Quan Dang and Tran, Hung-Phong and Nguyen, Thanh Thuy and Nguyen, Ly and Phan, Tuan Minh and Tran, Thi Thu Phuong and others},
  booktitle={Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing},
  pages={11838--11963},
  year={2025}
}
```