whisper-small-sorani-v1

This model is a fine-tuned version of openai/whisper-small for Central Kurdish (Sorani) speech recognition; the training dataset is not named in this card. It achieves the following results on the evaluation set:

  • Loss: 0.2918
  • Wer: 23.4484 (word error rate, in percent)
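The Wer figure is a percentage, so roughly 23.4% of reference words are substituted, deleted, or inserted. As an illustration of the metric (not the exact scoring script used for this card, which is unspecified), WER is the word-level edit distance between reference and hypothesis divided by the reference length:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / reference word count.

    Assumes a non-empty reference. Uses a rolling 1-D dynamic-programming
    row to keep memory at O(len(hypothesis)).
    """
    ref = reference.split()
    hyp = hypothesis.split()
    # d[j] holds the edit distance between the processed reference prefix
    # and the first j hypothesis words.
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev = d[0]          # distance for (i-1, j-1)
        d[0] = i             # deleting all i reference words so far
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            d[j] = min(
                d[j] + 1,                            # deletion
                d[j - 1] + 1,                        # insertion
                prev + (ref[i - 1] != hyp[j - 1]),   # substitution or match
            )
            prev = cur
    return d[len(hyp)] / len(ref)
```

For example, `wer("hello world", "hello word")` gives 0.5 (one substitution over two reference words), which this card would report as 50.0.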

Model description

More information needed

Intended uses & limitations

More information needed
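As a starting point for inference (a hedged sketch: the audio file name is a placeholder and the decoding options are illustrative, not taken from this card), the checkpoint can be loaded with the transformers ASR pipeline:

```python
MODEL_ID = "samil24/whisper-small-sorani-v1"  # repo id from this card

def build_transcriber():
    """Build an ASR pipeline for this checkpoint (downloads weights on first use)."""
    from transformers import pipeline  # deferred import: heavy dependency
    return pipeline(
        "automatic-speech-recognition",
        model=MODEL_ID,
        chunk_length_s=30,  # Whisper processes audio in 30-second windows
    )

if __name__ == "__main__":
    asr = build_transcriber()
    print(asr("sample_sorani.wav")["text"])  # placeholder audio path
```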

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1250
  • num_epochs: 15
  • mixed_precision_training: Native AMP
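These settings map onto Seq2SeqTrainingArguments roughly as follows (a sketch assuming the standard transformers Seq2SeqTrainer setup; output_dir is a placeholder and the 500-step eval cadence is read off the results table below):

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the training configuration reported above.
args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-sorani-v1",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",          # AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    warmup_steps=1250,
    num_train_epochs=15,
    fp16=True,                    # native AMP mixed precision
    eval_strategy="steps",
    eval_steps=500,
    predict_with_generate=True,   # decode with generate() so WER can be computed
)
```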

Training results

Training Loss  Epoch    Step   Validation Loss  Wer
0.3599         0.3365     500  0.3697           52.7891
0.2722         0.6729    1000  0.2894           41.9277
0.2465         1.0094    1500  0.2676           40.5501
0.1811         1.3459    2000  0.2399           37.7610
0.1754         1.6824    2500  0.2218           34.5224
0.0896         2.0188    3000  0.2165           32.1297
0.1024         2.3553    3500  0.2097           31.3563
0.0998         2.6918    4000  0.2038           30.7038
0.0606         3.0283    4500  0.2031           29.3745
0.052          3.3647    5000  0.2153           29.5292
0.0633         3.7012    5500  0.2065           28.5818
0.0305         4.0377    6000  0.2216           28.3014
0.0323         4.3742    6500  0.2163           27.3395
0.0347         4.7106    7000  0.2180           27.9099
0.0215         5.0471    7500  0.2291           27.5232
0.0237         5.3836    8000  0.2320           27.0978
0.0236         5.7201    8500  0.2318           27.0253
0.0126         6.0565    9000  0.2399           26.6193
0.0141         6.3930    9500  0.2468           26.3486
0.0123         6.7295   10000  0.2493           25.7299
0.009          7.0659   10500  0.2435           26.5371
0.0082         7.4024   11000  0.2500           25.9039
0.0128         7.7389   11500  0.2517           26.3679
0.0042         8.0754   12000  0.2594           26.0731
0.003          8.4118   12500  0.2690           26.1698
0.0056         8.7483   13000  0.2602           25.4592
0.0057         9.0848   13500  0.2617           24.4103
0.0046         9.4213   14000  0.2621           25.2078
0.0046         9.7577   14500  0.2672           24.9468
0.0016         10.0942  15000  0.2693           24.4538
0.0019         10.4307  15500  0.2692           24.0719
0.0014         10.7672  16000  0.2674           24.5795
0.0008         11.1036  16500  0.2768           24.5601
0.0007         11.4401  17000  0.2743           23.8786
0.0007         11.7766  17500  0.2751           24.2411
0.0003         12.1131  18000  0.2808           24.1638
0.0002         12.4495  18500  0.2815           24.1589
0.0006         12.7860  19000  0.2827           23.5934
0.0002         13.1225  19500  0.2847           23.3759
0.0005         13.4590  20000  0.2881           23.6466
0.0001         13.7954  20500  0.2900           23.3227
0.0            14.1319  21000  0.2903           23.3372
0.0            14.4684  21500  0.2914           23.2744
0.0            14.8048  22000  0.2918           23.4484

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.6.0
  • Tokenizers 0.21.2