---
library_name: transformers
license: mit
base_model: facebook/w2v-bert-2.0
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: w2v-bert-2.0-mdc-rutoro-asr-3.0.0
    results: []
---

# w2v-bert-2.0-mdc-rutoro-asr-3.0.0

This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on an unknown dataset. It achieves the following results on the evaluation set (an inference sketch follows the list):

- Loss: 0.2288
- WER: 0.2059
- CER: 0.0411
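
A minimal inference sketch. The card does not say where the checkpoint is hosted or how it should be invoked, so the repo id below is a hypothetical placeholder; the model class follows from the base model being W2V-BERT 2.0 with a CTC head, which is the standard fine-tuning setup for this architecture in transformers:

```python
import numpy as np
import torch
from transformers import AutoProcessor, Wav2Vec2BertForCTC

# Hypothetical repo id -- not stated in this card.
model_id = "Alvin-Nahabwe/w2v-bert-2.0-mdc-rutoro-asr-3.0.0"

processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id)
model.eval()

# Placeholder: one second of 16 kHz silence. Replace with real mono speech
# loaded at 16 kHz (e.g. via librosa or soundfile).
audio = np.zeros(16_000, dtype=np.float32)

inputs = processor(audio, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats/blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```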

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50.0
- mixed_precision_training: Native AMP
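
The same settings can be expressed as transformers `TrainingArguments`. A hedged sketch, assuming the standard `Trainer` API was used (the training script itself is not published, so treat this as illustrative rather than the author's exact configuration):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-mdc-rutoro-asr-3.0.0",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",        # AdamW, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    fp16=True,                  # assumption: "Native AMP" here means fp16
    eval_strategy="epoch",      # assumption: the table below has one row per epoch
)
```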

### Training results

| Training Loss | Epoch | Step | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 4.5614        | 1.0   | 84   | 2.8697          | 1.0    | 0.9556 |
| 2.9558        | 2.0   | 168  | 1.1423          | 0.9883 | 0.2675 |
| 1.3943        | 3.0   | 252  | 0.4472          | 0.5657 | 0.1024 |
| 1.0741        | 4.0   | 336  | 0.3785          | 0.4383 | 0.0772 |
| 1.1728        | 5.0   | 420  | 0.3127          | 0.3658 | 0.0654 |
| 0.8781        | 6.0   | 504  | 0.2809          | 0.2998 | 0.0558 |
| 0.8013        | 7.0   | 588  | 0.2611          | 0.2898 | 0.0543 |
| 0.7441        | 8.0   | 672  | 0.2340          | 0.2698 | 0.0513 |
| 0.7162        | 9.0   | 756  | 0.2467          | 0.2529 | 0.0484 |
| 0.7013        | 10.0  | 840  | 0.2505          | 0.2421 | 0.0466 |
| 0.5552        | 11.0  | 924  | 0.2313          | 0.2270 | 0.0433 |
| 0.5324        | 12.0  | 1008 | 0.2374          | 0.2204 | 0.0435 |
| 0.4722        | 13.0  | 1092 | 0.2206          | 0.2189 | 0.0425 |
| 0.638         | 14.0  | 1176 | 0.2197          | 0.2070 | 0.0413 |
| 0.6088        | 15.0  | 1260 | 0.2083          | 0.2129 | 0.0416 |
| 0.4534        | 16.0  | 1344 | 0.2137          | 0.2085 | 0.0412 |
| 0.5218        | 17.0  | 1428 | 0.2246          | 0.2062 | 0.0403 |
| 0.3967        | 18.0  | 1512 | 0.2257          | 0.2021 | 0.0396 |
| 0.6081        | 19.0  | 1596 | 0.2431          | 0.1957 | 0.0398 |
| 0.239         | 20.0  | 1680 | 0.2029          | 0.2023 | 0.0392 |
| 0.3622        | 21.0  | 1764 | 0.2479          | 0.1985 | 0.0395 |
| 0.3429        | 22.0  | 1848 | 0.2335          | 0.1987 | 0.0403 |
| 0.5635        | 23.0  | 1932 | 0.2226          | 0.1982 | 0.0397 |
| 0.4587        | 24.0  | 2016 | 0.2288          | 0.2059 | 0.0411 |
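
WER and CER in the table are word- and character-level edit-distance error rates. A minimal sketch of how such scores are typically computed with the `evaluate` library (an assumption: the card does not state the metric implementation used, and the strings below are hypothetical placeholders):

```python
import evaluate

wer_metric = evaluate.load("wer")  # requires the jiwer package
cer_metric = evaluate.load("cer")

predictions = ["the predicted transcript"]  # hypothetical model output
references = ["the reference transcript"]   # hypothetical ground truth

print(wer_metric.compute(predictions=predictions, references=references))
print(cer_metric.compute(predictions=predictions, references=references))
```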

### Framework versions

- Transformers 4.57.1
- PyTorch 2.9.0+cu128
- Datasets 4.4.0
- Tokenizers 0.22.1