my_awesome_asr_mind_model_2

This model is a fine-tuned version of facebook/wav2vec2-base on the MINDS-14 (minds14) dataset. It achieves the following results on the evaluation set:

  • Loss: 11.9803
  • WER: 1.0 (a 100% word error rate, i.e., no reference words were transcribed correctly)
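
For completeness, a minimal inference sketch using the Transformers ASR pipeline is shown below; the audio path is a placeholder. Given the reported WER of 1.0, outputs from this checkpoint should not be expected to be usable transcriptions.

```python
from transformers import pipeline

# Load this checkpoint from the Hub (the model id matches this repository).
asr = pipeline(
    "automatic-speech-recognition",
    model="ryos17/my_awesome_asr_mind_model_2",
)

# "audio.wav" is a placeholder path; the pipeline resamples input audio
# to the 16 kHz sampling rate that wav2vec2 expects.
print(asr("audio.wav")["text"])
```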

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • distributed_type: multi-GPU
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 2000
  • mixed_precision_training: Native AMP
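
As a rough guide, the hyperparameters above map onto Transformers `TrainingArguments` as in the minimal sketch below. The `output_dir` value is a placeholder, the multi-GPU distributed setup comes from the launcher (e.g. `torchrun` or `accelerate`) rather than from these arguments, and the actual training script is not part of this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="my_awesome_asr_mind_model_2",  # placeholder output directory
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 8 x 2 = total train batch size of 16
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=2000,                 # training_steps
    fp16=True,                      # Native AMP mixed-precision training
)
```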

Training results

| Training Loss | Epoch | Step | Validation Loss | WER |
|:---:|:---:|:---:|:---:|:---:|
| No log | 0.3509 | 10 | 72.3784 | 1.0325 |
| No log | 0.7018 | 20 | 69.9115 | 1.0282 |
| 120.859 | 1.0351 | 30 | 62.1146 | 1.0276 |
| 120.859 | 1.3860 | 40 | 68.2888 | 1.0393 |
| 123.4983 | 1.7368 | 50 | 62.4194 | 1.0319 |
| 123.4983 | 2.0702 | 60 | 63.7456 | 1.0282 |
| 123.4983 | 2.4211 | 70 | 61.1313 | 1.0362 |
| 114.5327 | 2.7719 | 80 | 57.3109 | 1.0393 |
| 114.5327 | 3.1053 | 90 | 63.8890 | 1.0405 |
| 92.0841 | 3.4561 | 100 | 54.7270 | 1.0368 |
| 92.0841 | 3.8070 | 110 | 52.6792 | 1.0166 |
| 92.0841 | 4.1404 | 120 | 48.9400 | 1.0043 |
| 82.4037 | 4.4912 | 130 | 45.7991 | 1.0006 |
| 82.4037 | 4.8421 | 140 | 42.2452 | 1.0 |
| 66.3204 | 5.1754 | 150 | 42.3656 | 1.0 |
| 66.3204 | 5.5263 | 160 | 35.6008 | 1.0 |
| 66.3204 | 5.8772 | 170 | 33.9067 | 1.0 |
| 54.6729 | 6.2105 | 180 | 35.5143 | 1.0 |
| 54.6729 | 6.5614 | 190 | 29.4490 | 1.0 |
| 47.0622 | 6.9123 | 200 | 31.8126 | 1.0 |
| 47.0622 | 7.2456 | 210 | 29.0946 | 1.0 |
| 47.0622 | 7.5965 | 220 | 29.9088 | 1.0 |
| 45.3162 | 7.9474 | 230 | 31.5442 | 1.0 |
| 45.3162 | 8.2807 | 240 | 32.5329 | 1.0 |
| 42.1556 | 8.6316 | 250 | 23.7687 | 1.0 |
| 42.1556 | 8.9825 | 260 | 26.9307 | 1.0 |
| 42.1556 | 9.3158 | 270 | 28.8010 | 1.0 |
| 37.6117 | 9.6667 | 280 | 25.4160 | 1.0 |
| 37.6117 | 10.0 | 290 | 31.1950 | 1.0 |
| 44.1981 | 10.3509 | 300 | 27.4846 | 1.0 |
| 44.1981 | 10.7018 | 310 | 27.9379 | 1.0 |
| 44.1981 | 11.0351 | 320 | 29.6825 | 1.0 |
| 37.2733 | 11.3860 | 330 | 28.1058 | 1.0 |
| 37.2733 | 11.7368 | 340 | 32.6282 | 1.0 |
| 43.2232 | 12.0702 | 350 | 26.3374 | 1.0 |
| 43.2232 | 12.4211 | 360 | 27.2462 | 1.0 |
| 43.2232 | 12.7719 | 370 | 29.3861 | 1.0 |
| 39.612 | 13.1053 | 380 | 35.5876 | 1.0 |
| 39.612 | 13.4561 | 390 | 26.4560 | 1.0 |
| 38.569 | 13.8070 | 400 | 30.3207 | 1.0 |
| 38.569 | 14.1404 | 410 | 24.5547 | 1.0 |
| 38.569 | 14.4912 | 420 | 17.0083 | 1.0 |
| 32.3299 | 14.8421 | 430 | 14.2530 | 1.0 |
| 32.3299 | 15.1754 | 440 | 12.6302 | 1.0 |
| 20.7078 | 15.5263 | 450 | 13.9304 | 1.0 |
| 20.7078 | 15.8772 | 460 | 11.8625 | 1.0 |
| 20.7078 | 16.2105 | 470 | 9.0681 | 1.0 |
| 18.1266 | 16.5614 | 480 | 11.8818 | 1.0 |
| 18.1266 | 16.9123 | 490 | 11.3368 | 1.0 |
| 18.9089 | 17.2456 | 500 | 10.6305 | 1.0 |
| 18.9089 | 17.5965 | 510 | 12.9240 | 1.0 |
| 18.9089 | 17.9474 | 520 | 11.0008 | 1.0 |
| 16.9312 | 18.2807 | 530 | 11.0118 | 1.0 |
| 16.9312 | 18.6316 | 540 | 12.8748 | 1.0 |
| 16.9932 | 18.9825 | 550 | 11.6355 | 1.0 |
| 16.9932 | 19.3158 | 560 | 10.7317 | 1.0 |
| 16.9932 | 19.6667 | 570 | 13.6561 | 1.0 |
| 16.7623 | 20.0 | 580 | 11.6266 | 1.0 |
| 16.7623 | 20.3509 | 590 | 11.2916 | 1.0 |
| 17.8217 | 20.7018 | 600 | 11.7152 | 1.0 |
| 17.8217 | 21.0351 | 610 | 9.4275 | 1.0 |
| 17.8217 | 21.3860 | 620 | 13.7821 | 1.0 |
| 17.5088 | 21.7368 | 630 | 10.2990 | 1.0 |
| 17.5088 | 22.0702 | 640 | 9.6245 | 1.0 |
| 19.6078 | 22.4211 | 650 | 12.9627 | 1.0 |
| 19.6078 | 22.7719 | 660 | 15.1354 | 1.0 |
| 19.6078 | 23.1053 | 670 | 9.5284 | 1.0 |
| 16.1741 | 23.4561 | 680 | 12.0476 | 1.0 |
| 16.1741 | 23.8070 | 690 | 11.7502 | 1.0 |
| 17.9308 | 24.1404 | 700 | 10.6731 | 1.0 |
| 17.9308 | 24.4912 | 710 | 7.9770 | 1.0 |
| 17.9308 | 24.8421 | 720 | 12.0266 | 1.0 |
| 14.9285 | 25.1754 | 730 | 12.6227 | 1.0 |
| 14.9285 | 25.5263 | 740 | 13.2743 | 1.0 |
| 16.4239 | 25.8772 | 750 | 15.2944 | 1.0 |
| 16.4239 | 26.2105 | 760 | 10.8676 | 1.0 |
| 16.4239 | 26.5614 | 770 | 10.1779 | 1.0 |
| 16.4257 | 26.9123 | 780 | 9.8973 | 1.0 |
| 16.4257 | 27.2456 | 790 | 9.1979 | 1.0 |
| 15.2576 | 27.5965 | 800 | 10.5384 | 1.0 |
| 15.2576 | 27.9474 | 810 | 10.2605 | 1.0 |
| 15.2576 | 28.2807 | 820 | 10.1279 | 1.0 |
| 16.9401 | 28.6316 | 830 | 14.6968 | 1.0 |
| 16.9401 | 28.9825 | 840 | 11.0547 | 1.0 |
| 16.8678 | 29.3158 | 850 | 9.5137 | 1.0 |
| 16.8678 | 29.6667 | 860 | 8.0353 | 1.0 |
| 16.8678 | 30.0 | 870 | 9.4884 | 1.0 |
| 14.1705 | 30.3509 | 880 | 7.4831 | 1.0 |
| 14.1705 | 30.7018 | 890 | 10.3853 | 1.0 |
| 12.4927 | 31.0351 | 900 | 9.0745 | 1.0 |
| 12.4927 | 31.3860 | 910 | 12.7624 | 1.0 |
| 12.4927 | 31.7368 | 920 | 10.9480 | 1.0 |
| 13.2511 | 32.0702 | 930 | 9.6001 | 1.0 |
| 13.2511 | 32.4211 | 940 | 10.0557 | 1.0 |
| 11.6225 | 32.7719 | 950 | 9.7758 | 1.0 |
| 11.6225 | 33.1053 | 960 | 9.1229 | 1.0 |
| 11.6225 | 33.4561 | 970 | 10.5019 | 1.0 |
| 9.8833 | 33.8070 | 980 | 10.4892 | 1.0 |
| 9.8833 | 34.1404 | 990 | 11.0348 | 1.0 |
| 9.4119 | 34.4912 | 1000 | 8.3344 | 1.0 |
| 9.4119 | 34.8421 | 1010 | 7.9083 | 1.0 |
| 9.4119 | 35.1754 | 1020 | 12.6228 | 1.0 |
| 9.2786 | 35.5263 | 1030 | 10.6666 | 1.0 |
| 9.2786 | 35.8772 | 1040 | 8.6810 | 1.0 |
| 8.7912 | 36.2105 | 1050 | 6.8696 | 1.0006 |
| 8.7912 | 36.5614 | 1060 | 7.5894 | 1.0 |
| 8.7912 | 36.9123 | 1070 | 10.2622 | 1.0 |
| 8.7841 | 37.2456 | 1080 | 5.6630 | 1.0 |
| 8.7841 | 37.5965 | 1090 | 11.6318 | 1.0 |
| 8.6711 | 37.9474 | 1100 | 8.7652 | 1.0018 |
| 8.6711 | 38.2807 | 1110 | 8.7590 | 1.0 |
| 8.6711 | 38.6316 | 1120 | 12.6484 | 1.0 |
| 8.539 | 38.9825 | 1130 | 9.6365 | 1.0 |
| 8.539 | 39.3158 | 1140 | 9.4819 | 1.0 |
| 8.1748 | 39.6667 | 1150 | 12.8334 | 1.0 |
| 8.1748 | 40.0 | 1160 | 13.2719 | 1.0 |
| 8.1748 | 40.3509 | 1170 | 8.2957 | 1.0 |
| 8.1309 | 40.7018 | 1180 | 12.5595 | 1.0 |
| 8.1309 | 41.0351 | 1190 | 11.5797 | 1.0 |
| 8.1256 | 41.3860 | 1200 | 11.6983 | 1.0 |
| 8.1256 | 41.7368 | 1210 | 9.5951 | 1.0 |
| 8.1256 | 42.0702 | 1220 | 11.3728 | 1.0 |
| 7.8759 | 42.4211 | 1230 | 8.7616 | 1.0 |
| 7.8759 | 42.7719 | 1240 | 9.5483 | 1.0 |
| 7.7428 | 43.1053 | 1250 | 9.1343 | 1.0 |
| 7.7428 | 43.4561 | 1260 | 8.3384 | 1.0 |
| 7.7428 | 43.8070 | 1270 | 7.6469 | 1.0 |
| 7.8274 | 44.1404 | 1280 | 13.3626 | 1.0 |
| 7.8274 | 44.4912 | 1290 | 10.8081 | 1.0 |
| 7.7745 | 44.8421 | 1300 | 7.2929 | 1.0 |
| 7.7745 | 45.1754 | 1310 | 13.2838 | 1.0 |
| 7.7745 | 45.5263 | 1320 | 9.0075 | 1.0 |
| 7.5423 | 45.8772 | 1330 | 8.6317 | 1.0 |
| 7.5423 | 46.2105 | 1340 | 6.8053 | 1.0049 |
| 7.8556 | 46.5614 | 1350 | 10.9606 | 1.0 |
| 7.8556 | 46.9123 | 1360 | 11.3875 | 1.0 |
| 7.8556 | 47.2456 | 1370 | 8.8268 | 1.0 |
| 7.435 | 47.5965 | 1380 | 8.9938 | 1.0 |
| 7.435 | 47.9474 | 1390 | 7.1973 | 1.0 |
| 7.5196 | 48.2807 | 1400 | 10.2050 | 1.0 |
| 7.5196 | 48.6316 | 1410 | 7.6926 | 1.0 |
| 7.5196 | 48.9825 | 1420 | 10.3563 | 1.0 |
| 7.2988 | 49.3158 | 1430 | 10.2332 | 1.0 |
| 7.2988 | 49.6667 | 1440 | 9.7126 | 1.0 |
| 7.9996 | 50.0 | 1450 | 10.2664 | 1.0 |
| 7.9996 | 50.3509 | 1460 | 11.0007 | 1.0 |
| 7.9996 | 50.7018 | 1470 | 7.6255 | 1.0 |
| 7.4441 | 51.0351 | 1480 | 12.5091 | 1.0 |
| 7.4441 | 51.3860 | 1490 | 8.5649 | 1.0 |
| 7.158 | 51.7368 | 1500 | 11.0922 | 1.0 |
| 7.158 | 52.0702 | 1510 | 5.5673 | 1.0 |
| 7.158 | 52.4211 | 1520 | 10.2015 | 1.0 |
| 7.3232 | 52.7719 | 1530 | 9.6431 | 1.0 |
| 7.3232 | 53.1053 | 1540 | 12.6101 | 1.0 |
| 7.2851 | 53.4561 | 1550 | 7.4011 | 1.0 |
| 7.2851 | 53.8070 | 1560 | 12.8892 | 1.0 |
| 7.2851 | 54.1404 | 1570 | 12.7549 | 1.0 |
| 7.13 | 54.4912 | 1580 | 7.0643 | 1.0 |
| 7.13 | 54.8421 | 1590 | 13.2393 | 1.0 |
| 7.241 | 55.1754 | 1600 | 7.7935 | 1.0 |
| 7.241 | 55.5263 | 1610 | 8.4647 | 1.0 |
| 7.241 | 55.8772 | 1620 | 9.1026 | 1.0 |
| 7.0737 | 56.2105 | 1630 | 7.7952 | 1.0 |
| 7.0737 | 56.5614 | 1640 | 7.5957 | 1.0 |
| 7.3016 | 56.9123 | 1650 | 11.8872 | 1.0 |
| 7.3016 | 57.2456 | 1660 | 11.6042 | 1.0 |
| 7.3016 | 57.5965 | 1670 | 8.3765 | 1.0 |
| 7.0452 | 57.9474 | 1680 | 11.1004 | 1.0 |
| 7.0452 | 58.2807 | 1690 | 15.3519 | 1.0 |
| 7.1912 | 58.6316 | 1700 | 9.1837 | 1.0 |
| 7.1912 | 58.9825 | 1710 | 9.0901 | 1.0 |
| 7.1912 | 59.3158 | 1720 | 10.7072 | 1.0 |
| 7.1017 | 59.6667 | 1730 | 11.8671 | 1.0 |
| 7.1017 | 60.0 | 1740 | 8.7281 | 1.0 |
| 7.0886 | 60.3509 | 1750 | 11.4665 | 1.0 |
| 7.0886 | 60.7018 | 1760 | 9.1317 | 1.0 |
| 7.0886 | 61.0351 | 1770 | 9.4837 | 1.0 |
| 7.4933 | 61.3860 | 1780 | 11.0652 | 1.0 |
| 7.4933 | 61.7368 | 1790 | 8.6223 | 1.0 |
| 6.9543 | 62.0702 | 1800 | 7.3605 | 1.0 |
| 6.9543 | 62.4211 | 1810 | 10.4706 | 1.0 |
| 6.9543 | 62.7719 | 1820 | 13.1088 | 1.0 |
| 7.1395 | 63.1053 | 1830 | 9.2800 | 1.0 |
| 7.1395 | 63.4561 | 1840 | 9.6354 | 1.0 |
| 7.0871 | 63.8070 | 1850 | 7.5116 | 1.0 |
| 7.0871 | 64.1404 | 1860 | 8.7661 | 1.0 |
| 7.0871 | 64.4912 | 1870 | 10.5800 | 1.0 |
| 6.9937 | 64.8421 | 1880 | 11.2561 | 1.0 |
| 6.9937 | 65.1754 | 1890 | 8.7216 | 1.0 |
| 6.9257 | 65.5263 | 1900 | 9.6056 | 1.0 |
| 6.9257 | 65.8772 | 1910 | 11.6064 | 1.0 |
| 6.9257 | 66.2105 | 1920 | 15.2734 | 1.0 |
| 6.9951 | 66.5614 | 1930 | 12.0877 | 1.0 |
| 6.9951 | 66.9123 | 1940 | 7.6033 | 1.0 |
| 6.8757 | 67.2456 | 1950 | 11.5104 | 1.0 |
| 6.8757 | 67.5965 | 1960 | 9.6040 | 1.0 |
| 6.8757 | 67.9474 | 1970 | 7.8608 | 1.0 |
| 6.9293 | 68.2807 | 1980 | 13.0502 | 1.0 |
| 6.9293 | 68.6316 | 1990 | 7.9725 | 1.0 |
| 7.0953 | 68.9825 | 2000 | 11.9803 | 1.0 |
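
The WER column above is the word error rate. This card does not include the evaluation code, but a minimal sketch of how WER is commonly computed with the `evaluate` library follows; the example strings are hypothetical.

```python
import evaluate

# WER = (substitutions + insertions + deletions) / number of reference words.
wer_metric = evaluate.load("wer")

predictions = [""]  # hypothetical empty model transcription
references = ["i would like to check my balance"]  # hypothetical reference

# An empty prediction deletes every reference word, so WER = 1.0,
# matching the value reported throughout the table above.
print(wer_metric.compute(predictions=predictions, references=references))
```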

Framework versions

  • Transformers 4.48.0.dev0
  • Pytorch 2.1.0a0+32f93b1
  • Datasets 3.2.0
  • Tokenizers 0.21.0