SChem5Labels-google-t5-v1_1-large-inter_model-frequency-human_annots_str

This model is a fine-tuned version of google/t5-v1_1-large on an unspecified dataset (the training configuration did not record a dataset name). It achieves the following results on the evaluation set:

  • Loss: 1.3174

Model description

More information needed

Intended uses & limitations

More information needed
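Although the card does not document usage, the checkpoint is a standard Transformers sequence-to-sequence model, so a minimal loading sketch would look like the following. The input string is illustrative only; the expected prompt format is not documented.

```python
# Minimal inference sketch, assuming the standard Transformers seq2seq API.
# The input text below is a placeholder; the card does not document a prompt format.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

repo_id = "owanr/SChem5Labels-google-t5-v1_1-large-inter_model-frequency-human_annots_str"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

inputs = tokenizer("example input text", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```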

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200
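A sketch of the Trainer configuration these hyperparameters imply is shown below. The dataset loading and preprocessing are not documented on this card, so the `train_dataset` and `eval_dataset` names are hypothetical placeholders, and `evaluation_strategy="epoch"` is an assumption consistent with the per-epoch evaluation log in the next section.

```python
# Sketch of the Trainer setup implied by the hyperparameters above.
# train_dataset / eval_dataset are hypothetical placeholders: the dataset is undocumented.
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("google/t5-v1_1-large")
model = AutoModelForSeq2SeqLM.from_pretrained("google/t5-v1_1-large")

args = Seq2SeqTrainingArguments(
    output_dir="SChem5Labels-google-t5-v1_1-large-inter_model-frequency-human_annots_str",
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    evaluation_strategy="epoch",  # assumption, consistent with the per-epoch eval log
    # Adam betas=(0.9, 0.999) and epsilon=1e-8 are the Trainer defaults.
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,  # hypothetical: dataset not documented
    eval_dataset=eval_dataset,    # hypothetical: dataset not documented
    tokenizer=tokenizer,
)
trainer.train()
```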

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 20.7337 | 1.0 | 25 | 23.6106 |
| 19.8181 | 2.0 | 50 | 21.7021 |
| 18.5121 | 3.0 | 75 | 18.6451 |
| 16.2008 | 4.0 | 100 | 12.8515 |
| 14.7585 | 5.0 | 125 | 10.1513 |
| 11.4985 | 6.0 | 150 | 9.3063 |
| 10.0012 | 7.0 | 175 | 9.0999 |
| 8.6447 | 8.0 | 200 | 8.8536 |
| 8.3766 | 9.0 | 225 | 8.7377 |
| 8.1231 | 10.0 | 250 | 8.6065 |
| 8.0504 | 11.0 | 275 | 8.4953 |
| 8.0051 | 12.0 | 300 | 8.3466 |
| 7.7615 | 13.0 | 325 | 8.1101 |
| 7.6344 | 14.0 | 350 | 7.8434 |
| 7.3869 | 15.0 | 375 | 7.6118 |
| 7.3158 | 16.0 | 400 | 7.4364 |
| 7.1667 | 17.0 | 425 | 7.3245 |
| 6.988 | 18.0 | 450 | 7.2732 |
| 7.0234 | 19.0 | 475 | 7.2125 |
| 6.9602 | 20.0 | 500 | 7.1699 |
| 6.8268 | 21.0 | 525 | 7.1251 |
| 6.8999 | 22.0 | 550 | 7.0694 |
| 6.3358 | 23.0 | 575 | 0.6967 |
| 0.86 | 24.0 | 600 | 0.6708 |
| 0.7148 | 25.0 | 625 | 0.6347 |
| 0.674 | 26.0 | 650 | 0.6297 |
| 0.6683 | 27.0 | 675 | 0.6234 |
| 0.6711 | 28.0 | 700 | 0.6214 |
| 0.6773 | 29.0 | 725 | 0.6170 |
| 0.6596 | 30.0 | 750 | 0.6162 |
| 0.6812 | 31.0 | 775 | 0.6207 |
| 0.6813 | 32.0 | 800 | 0.6121 |
| 0.6655 | 33.0 | 825 | 0.6147 |
| 0.653 | 34.0 | 850 | 0.6112 |
| 0.651 | 35.0 | 875 | 0.6082 |
| 0.6659 | 36.0 | 900 | 0.6075 |
| 0.6639 | 37.0 | 925 | 0.6023 |
| 0.6529 | 38.0 | 950 | 0.5998 |
| 0.6434 | 39.0 | 975 | 0.6023 |
| 0.645 | 40.0 | 1000 | 0.5976 |
| 0.64 | 41.0 | 1025 | 0.5987 |
| 0.6423 | 42.0 | 1050 | 0.5971 |
| 0.6439 | 43.0 | 1075 | 0.5940 |
| 0.6472 | 44.0 | 1100 | 0.5946 |
| 0.6459 | 45.0 | 1125 | 0.5965 |
| 0.6229 | 46.0 | 1150 | 0.5940 |
| 0.6414 | 47.0 | 1175 | 0.6111 |
| 0.6215 | 48.0 | 1200 | 0.5910 |
| 0.6375 | 49.0 | 1225 | 0.5928 |
| 0.6324 | 50.0 | 1250 | 0.6103 |
| 0.6212 | 51.0 | 1275 | 0.6075 |
| 0.6406 | 52.0 | 1300 | 0.5869 |
| 0.631 | 53.0 | 1325 | 0.5866 |
| 0.6227 | 54.0 | 1350 | 0.5833 |
| 0.6255 | 55.0 | 1375 | 0.5837 |
| 0.633 | 56.0 | 1400 | 0.5833 |
| 0.6224 | 57.0 | 1425 | 0.5822 |
| 0.628 | 58.0 | 1450 | 0.5858 |
| 0.62 | 59.0 | 1475 | 0.5827 |
| 0.6211 | 60.0 | 1500 | 0.5834 |
| 0.6236 | 61.0 | 1525 | 0.5794 |
| 0.6136 | 62.0 | 1550 | 0.5820 |
| 0.6132 | 63.0 | 1575 | 0.5800 |
| 0.6098 | 64.0 | 1600 | 0.5788 |
| 0.6167 | 65.0 | 1625 | 0.5785 |
| 0.6271 | 66.0 | 1650 | 0.5794 |
| 0.615 | 67.0 | 1675 | 0.5764 |
| 0.6143 | 68.0 | 1700 | 0.5789 |
| 0.6085 | 69.0 | 1725 | 0.5756 |
| 0.611 | 70.0 | 1750 | 0.5740 |
| 0.6161 | 71.0 | 1775 | 0.5730 |
| 0.5999 | 72.0 | 1800 | 0.5738 |
| 0.6194 | 73.0 | 1825 | 0.5753 |
| 0.6221 | 74.0 | 1850 | 0.5731 |
| 0.6061 | 75.0 | 1875 | 0.5738 |
| 0.6038 | 76.0 | 1900 | 0.5745 |

Framework versions

  • Transformers 4.34.0
  • Pytorch 2.1.0+cu121
  • Datasets 2.6.1
  • Tokenizers 0.14.1