Whisper Small ko

This model is a PEFT adapter fine-tuned from openai/whisper-large-v3-turbo on a custom dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1904
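
Since this is a PEFT adapter rather than a standalone checkpoint, it must be loaded on top of the base model. Below is a minimal loading-and-transcription sketch with peft and transformers; the adapter id comes from the model tree at the end of this card, "sample.wav" is a hypothetical input file, and forcing language="ko" assumes the Korean target implied by the title:

```python
import librosa  # assumption: any loader yielding 16 kHz mono float audio works
import torch
from peft import PeftModel
from transformers import AutoModelForSpeechSeq2Seq, AutoProcessor

BASE_ID = "openai/whisper-large-v3-turbo"
# Adapter repo id taken from the model tree at the bottom of this card.
ADAPTER_ID = "nomnoos37/stt-turbo-1227-v1.1-peft-eng-2k-rank64-full"

processor = AutoProcessor.from_pretrained(BASE_ID)
model = AutoModelForSpeechSeq2Seq.from_pretrained(
    BASE_ID, torch_dtype=torch.float16, device_map="auto"
)
# Attach the fine-tuned PEFT adapter on top of the frozen base weights.
model = PeftModel.from_pretrained(model, ADAPTER_ID)
model.eval()

# Hypothetical input file; Whisper expects 16 kHz mono audio.
speech, _ = librosa.load("sample.wav", sr=16000)
inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
features = inputs.input_features.to(model.device, dtype=torch.float16)

# Force Korean transcription, matching the "ko" in the card title.
generated = model.generate(features, language="ko", task="transcribe")
print(processor.batch_decode(generated, skip_special_tokens=True)[0])
```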

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the Seq2SeqTrainingArguments sketch after this list):

  • learning_rate: 0.0001
  • train_batch_size: 64
  • eval_batch_size: 256
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 200
  • training_steps: 2000
  • mixed_precision_training: Native AMP
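
For reference, these values map onto transformers' Seq2SeqTrainingArguments roughly as sketched below; output_dir is a placeholder, and the evaluation cadence (every 10 steps, inferred from the results table) is an assumption:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-ko-peft",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=256,
    seed=42,
    optim="adamw_torch",             # AdamW with the betas/epsilon listed above
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=200,
    max_steps=2000,
    fp16=True,                       # "Native AMP" mixed-precision training
    eval_strategy="steps",           # assumption: eval every 10 steps per the table
    eval_steps=10,
)
```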

Training results

Training Loss | Epoch | Step | Validation Loss
0.831 0.0319 10 1.5891
0.8371 0.0639 20 1.5741
0.8014 0.0958 30 1.5449
0.7445 0.1278 40 1.4933
0.6808 0.1597 50 1.3938
0.5229 0.1917 60 1.1658
0.3057 0.2236 70 0.9399
0.2167 0.2556 80 0.8860
0.1855 0.2875 90 0.8494
0.1614 0.3195 100 0.8240
0.1227 0.3514 110 0.7880
0.1179 0.3834 120 0.7530
0.0939 0.4153 130 0.7150
0.0857 0.4473 140 0.6848
0.0653 0.4792 150 0.6642
0.0779 0.5112 160 0.6487
0.0644 0.5431 170 0.6472
0.0701 0.5751 180 0.6389
0.0545 0.6070 190 0.6243
0.0606 0.6390 200 0.6031
0.0581 0.6709 210 0.5788
0.0582 0.7029 220 0.5645
0.0507 0.7348 230 0.5589
0.0476 0.7668 240 0.5435
0.0431 0.7987 250 0.5336
0.0452 0.8307 260 0.5239
0.0425 0.8626 270 0.5211
0.035 0.8946 280 0.5237
0.0413 0.9265 290 0.5049
0.0642 0.9585 300 0.4803
0.0356 0.9904 310 0.4834
0.0461 1.0224 320 0.4719
0.0321 1.0543 330 0.4745
0.0384 1.0863 340 0.4579
0.0363 1.1182 350 0.4500
0.0301 1.1502 360 0.4383
0.0469 1.1821 370 0.4324
0.0347 1.2141 380 0.4253
0.0307 1.2460 390 0.4142
0.0341 1.2780 400 0.4111
0.0252 1.3099 410 0.4044
0.0372 1.3419 420 0.4048
0.0346 1.3738 430 0.4000
0.029 1.4058 440 0.3963
0.0277 1.4377 450 0.3899
0.0322 1.4696 460 0.3875
0.0241 1.5016 470 0.3878
0.0424 1.5335 480 0.3835
0.0323 1.5655 490 0.3781
0.0456 1.5974 500 0.3796
0.0326 1.6294 510 0.3735
0.0318 1.6613 520 0.3689
0.03 1.6933 530 0.3510
0.0307 1.7252 540 0.3461
0.0318 1.7572 550 0.3425
0.03 1.7891 560 0.3332
0.0299 1.8211 570 0.3359
0.0262 1.8530 580 0.3376
0.0337 1.8850 590 0.3369
0.0344 1.9169 600 0.3427
0.0236 1.9489 610 0.3365
0.0229 1.9808 620 0.3318
0.0211 2.0128 630 0.3369
0.0248 2.0447 640 0.3299
0.0346 2.0767 650 0.3179
0.0223 2.1086 660 0.3230
0.0251 2.1406 670 0.3253
0.0192 2.1725 680 0.3259
0.0219 2.2045 690 0.3240
0.0284 2.2364 700 0.3269
0.0246 2.2684 710 0.3208
0.0281 2.3003 720 0.3202
0.0277 2.3323 730 0.3147
0.0249 2.3642 740 0.3068
0.0184 2.3962 750 0.3018
0.0279 2.4281 760 0.2991
0.0178 2.4601 770 0.2980
0.0234 2.4920 780 0.2977
0.0231 2.5240 790 0.2951
0.0242 2.5559 800 0.2949
0.0279 2.5879 810 0.2947
0.0216 2.6198 820 0.2950
0.0192 2.6518 830 0.2924
0.0273 2.6837 840 0.2881
0.0192 2.7157 850 0.2865
0.0267 2.7476 860 0.2822
0.0276 2.7796 870 0.2771
0.0234 2.8115 880 0.2784
0.0236 2.8435 890 0.2826
0.0255 2.8754 900 0.2762
0.0306 2.9073 910 0.2703
0.0213 2.9393 920 0.2699
0.0242 2.9712 930 0.2692
0.0231 3.0032 940 0.2690
0.0184 3.0351 950 0.2697
0.0145 3.0671 960 0.2674
0.0196 3.0990 970 0.2671
0.0205 3.1310 980 0.2668
0.0212 3.1629 990 0.2666
0.0218 3.1949 1000 0.2618
0.0202 3.2268 1010 0.2658
0.0187 3.2588 1020 0.2593
0.0161 3.2907 1030 0.2588
0.0175 3.3227 1040 0.2603
0.0162 3.3546 1050 0.2572
0.0346 3.3866 1060 0.2437
0.0199 3.4185 1070 0.2499
0.0235 3.4505 1080 0.2497
0.0175 3.4824 1090 0.2467
0.0187 3.5144 1100 0.2458
0.0171 3.5463 1110 0.2461
0.0189 3.5783 1120 0.2446
0.0229 3.6102 1130 0.2440
0.021 3.6422 1140 0.2422
0.0163 3.6741 1150 0.2400
0.0223 3.7061 1160 0.2406
0.0241 3.7380 1170 0.2367
0.0166 3.7700 1180 0.2372
0.0187 3.8019 1190 0.2378
0.0286 3.8339 1200 0.2396
0.0244 3.8658 1210 0.2357
0.0239 3.8978 1220 0.2317
0.026 3.9297 1230 0.2311
0.0203 3.9617 1240 0.2312
0.0177 3.9936 1250 0.2275
0.0199 4.0256 1260 0.2284
0.0174 4.0575 1270 0.2299
0.0195 4.0895 1280 0.2284
0.0167 4.1214 1290 0.2288
0.0197 4.1534 1300 0.2278
0.0194 4.1853 1310 0.2258
0.0233 4.2173 1320 0.2188
0.018 4.2492 1330 0.2154
0.0181 4.2812 1340 0.2146
0.0177 4.3131 1350 0.2157
0.0172 4.3450 1360 0.2168
0.02 4.3770 1370 0.2166
0.0144 4.4089 1380 0.2127
0.0166 4.4409 1390 0.2121
0.0183 4.4728 1400 0.2131
0.0159 4.5048 1410 0.2126
0.0137 4.5367 1420 0.2128
0.0218 4.5687 1430 0.2130
0.0145 4.6006 1440 0.2106
0.0192 4.6326 1450 0.2061
0.0134 4.6645 1460 0.2058
0.0204 4.6965 1470 0.2062
0.0157 4.7284 1480 0.2050
0.0142 4.7604 1490 0.2054
0.0192 4.7923 1500 0.2051
0.0137 4.8243 1510 0.2047
0.0296 4.8562 1520 0.2062
0.0176 4.8882 1530 0.2060
0.0146 4.9201 1540 0.2050
0.0197 4.9521 1550 0.2036
0.0173 4.9840 1560 0.2026
0.0183 5.0160 1570 0.2031
0.0177 5.0479 1580 0.2034
0.0145 5.0799 1590 0.2035
0.015 5.1118 1600 0.2024
0.0173 5.1438 1610 0.2015
0.0201 5.1757 1620 0.2015
0.0138 5.2077 1630 0.2017
0.0141 5.2396 1640 0.2012
0.0164 5.2716 1650 0.2015
0.0166 5.3035 1660 0.2004
0.0147 5.3355 1670 0.1997
0.0216 5.3674 1680 0.1997
0.0132 5.3994 1690 0.1990
0.0113 5.4313 1700 0.1980
0.0159 5.4633 1710 0.1977
0.0125 5.4952 1720 0.1978
0.0138 5.5272 1730 0.1973
0.0099 5.5591 1740 0.1972
0.0296 5.5911 1750 0.1969
0.0224 5.6230 1760 0.1961
0.0156 5.6550 1770 0.1952
0.0238 5.6869 1780 0.1944
0.0112 5.7188 1790 0.1940
0.0133 5.7508 1800 0.1935
0.0261 5.7827 1810 0.1924
0.0146 5.8147 1820 0.1919
0.0136 5.8466 1830 0.1921
0.0118 5.8786 1840 0.1913
0.0163 5.9105 1850 0.1914
0.0199 5.9425 1860 0.1915
0.017 5.9744 1870 0.1914
0.0163 6.0064 1880 0.1912
0.0189 6.0383 1890 0.1910
0.0146 6.0703 1900 0.1910
0.0266 6.1022 1910 0.1909
0.0114 6.1342 1920 0.1908
0.017 6.1661 1930 0.1908
0.0147 6.1981 1940 0.1907
0.0125 6.2300 1950 0.1907
0.0201 6.2620 1960 0.1906
0.011 6.2939 1970 0.1905
0.0169 6.3259 1980 0.1905
0.0148 6.3578 1990 0.1904
0.0113 6.3898 2000 0.1904
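
Validation loss drops steeply over the first ~100 steps and then decays smoothly toward the final 0.1904. A throwaway sketch for plotting the curve, assuming the four-column rows above are saved to a hypothetical results.txt:

```python
# Plot validation loss vs. step from the table above.
# Assumes the whitespace-separated rows were pasted into results.txt.
import matplotlib.pyplot as plt

steps, val_loss = [], []
with open("results.txt") as f:
    for line in f:
        train, epoch, step, val = line.split()
        steps.append(int(step))
        val_loss.append(float(val))

plt.plot(steps, val_loss)
plt.xlabel("Step")
plt.ylabel("Validation loss")
plt.title("Validation loss over training")
plt.show()
```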

Framework versions

  • PEFT 0.14.0
  • Transformers 4.47.1
  • PyTorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0
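
A quick sanity check of a local environment against these pins (assumes all five packages are importable):

```python
# Print installed versions to compare against the list above.
import datasets
import peft
import tokenizers
import torch
import transformers

for name, module in [
    ("PEFT", peft),
    ("Transformers", transformers),
    ("PyTorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```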

Model tree for nomnoos37/stt-turbo-1227-v1.1-peft-eng-2k-rank64-full

  • This model is an adapter of openai/whisper-large-v3-turbo (one of 37 adapters built on that base).