End of training
README.md CHANGED
@@ -15,14 +15,14 @@ This student model is distilled from the teacher model [roneneldan/TinyStories-3
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
-- eval_enwikippl:
-- eval_frwikippl:
-- eval_zhwikippl:
-- eval_tinystoriesppl:
-- eval_loss:
-- eval_runtime: 66.
-- eval_samples_per_second: 75.
-- eval_steps_per_second: 9.
+- eval_enwikippl: 148.8680
+- eval_frwikippl: 21987.7637
+- eval_zhwikippl: 181662.0469
+- eval_tinystoriesppl: 12.2941
+- eval_loss: 25.4402
+- eval_runtime: 66.3462
+- eval_samples_per_second: 75.362
+- eval_steps_per_second: 9.42
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
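
The eval_*ppl values above are perplexities on held-out corpora (English, French, and Chinese Wikipedia, plus TinyStories); lower is better. A minimal sketch of computing one such perplexity, assuming it is exp of the mean token-level cross-entropy and that the checkpoint is transformers-compatible (the model path is hypothetical; Distily's actual evaluation pipeline may differ):

```python
# Minimal perplexity sketch: ppl = exp(mean cross-entropy).
# Assumptions: transformers-compatible checkpoint at a hypothetical path;
# Distily's evaluation may batch and aggregate differently.
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/student-checkpoint"  # hypothetical
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).eval()

text = "Once upon a time, there was a little girl named Lily."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # Passing labels makes the model return mean cross-entropy,
    # with the standard one-token shift applied internally.
    loss = model(**inputs, labels=inputs["input_ids"]).loss

print(f"perplexity: {math.exp(loss.item()):.4f}")
```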
@@ -45,7 +45,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=10.0, loss_fn=
+- distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=10.0, loss_fn=kl, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=10.0, loss_fn=mse, layer_mapper=None, projector=None))
 - train_embeddings: True
 - learning_rate: 0.004
 - train_batch_size: 8
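
The distillation_objective above combines three weighted components: KL divergence on the logits (weight 1), KL divergence on hidden states (weight 10.0), and MSE on attention maps (weight 10.0), with no layer mapping or projection. A minimal PyTorch sketch of that weighted sum, mirroring the config rather than Distily's actual implementation (it assumes both models were run with output_hidden_states=True and output_attentions=True):

```python
# Sketch of the combined objective: kl(logits)*1 + kl(hs)*10.0 + mse(attn)*10.0.
# Assumption: distributions for the KL terms come from a softmax over the
# last dimension; Distily may normalize differently.
import torch.nn.functional as F

def kl(student, teacher):
    # KL(teacher || student), averaged over the batch.
    return F.kl_div(
        F.log_softmax(student, dim=-1),
        F.softmax(teacher, dim=-1),
        reduction="batchmean",
    )

def distillation_loss(student_out, teacher_out):
    loss = kl(student_out.logits, teacher_out.logits)  # logits_loss_component, weight 1
    for s_hs, t_hs in zip(student_out.hidden_states, teacher_out.hidden_states):
        loss = loss + 10.0 * kl(s_hs, t_hs)  # hs_loss_component
    for s_at, t_at in zip(student_out.attentions, teacher_out.attentions):
        loss = loss + 10.0 * F.mse_loss(s_at, t_at)  # attn_loss_component
    return loss
```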
@@ -56,34 +56,34 @@ The following hyperparameters were used during training:
 - num_epochs: 1.0
 
 ### Resource Usage
-Peak GPU Memory: 8.
+Peak GPU Memory: 8.2666 GB
 
 ### Eval-Phase Metrics
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
-| 0 | 0 |
-| 3000 | 0.0485 |
-| 6000 | 0.0970 |
-| 9000 | 0.1455 |
-| 12000 | 0.1939 |
-| 15000 | 0.2424 |
-| 18000 | 0.2909 |
-| 21000 | 0.3394 |
-| 24000 | 0.3879 |
-| 27000 | 0.4364 |
-| 30000 | 0.4848 |
-| 33000 | 0.5333 |
-| 36000 | 0.5818 |
-| 39000 | 0.6303 |
-| 42000 | 0.6788 |
-| 45000 | 0.7273 |
-| 48000 | 0.7758 |
-| 51000 | 0.8242 |
-| 54000 | 0.8727 |
-| 57000 | 0.9212 |
-| 60000 | 0.9697 |
-| 61875 | 1.0 |
+| 0 | 0 | 8167.3613 | 48488.5742 | 38.4688 | 65.8686 | 75.909 | 9.489 | 3345.3254 | 73944.1484 |
+| 3000 | 0.0485 | 145.5666 | 22094.8828 | 25.4406 | 65.8501 | 75.93 | 9.491 | 11.9420 | 179685.7344 |
+| 6000 | 0.0970 | 146.8635 | 22376.7676 | 25.4394 | 66.4135 | 75.286 | 9.411 | 11.9667 | 183024.1719 |
+| 9000 | 0.1455 | 148.8680 | 21987.7637 | 25.4402 | 66.3462 | 75.362 | 9.42 | 12.2941 | 181662.0469 |
+| 12000 | 0.1939 | 151.0636 | 22504.7676 | 25.4400 | 66.2246 | 75.501 | 9.438 | 12.5052 | 181759.0938 |
+| 15000 | 0.2424 | 146.5339 | 22604.8535 | 25.4392 | 66.1192 | 75.621 | 9.453 | 11.8540 | 189888.4375 |
+| 18000 | 0.2909 | 147.3192 | 22481.0215 | 25.4400 | 66.2457 | 75.477 | 9.435 | 12.0058 | 183905.2969 |
+| 21000 | 0.3394 | 150.9525 | 22555.5625 | 25.4390 | 66.2661 | 75.453 | 9.432 | 12.4310 | 188575.7344 |
+| 24000 | 0.3879 | 149.8920 | 22155.6523 | 25.4404 | 66.2363 | 75.487 | 9.436 | 12.4593 | 177493.9531 |
+| 27000 | 0.4364 | 147.1653 | 22531.7402 | 25.4398 | 66.3823 | 75.321 | 9.415 | 11.9514 | 183905.2969 |
+| 30000 | 0.4848 | 150.4855 | 22580.9805 | 25.4400 | 66.2281 | 75.497 | 9.437 | 12.4172 | 183513.1875 |
+| 33000 | 0.5333 | 145.7359 | 22307.5195 | 25.4400 | 66.4448 | 75.25 | 9.406 | 11.9159 | 180165.8438 |
+| 36000 | 0.5818 | 148.7297 | 22495.2617 | 25.4396 | 66.2715 | 75.447 | 9.431 | 12.1426 | 186574.0156 |
+| 39000 | 0.6303 | 147.5820 | 22807.9492 | 25.4406 | 66.6342 | 75.037 | 9.38 | 11.9944 | 187372.1406 |
+| 42000 | 0.6788 | 150.2292 | 22193.125 | 25.4402 | 66.5873 | 75.089 | 9.386 | 12.5202 | 182050.1875 |
+| 45000 | 0.7273 | 146.7725 | 22207.2051 | 25.4400 | 66.1476 | 75.589 | 9.449 | 11.9890 | 181468.2812 |
+| 48000 | 0.7758 | 146.3014 | 22194.6914 | 25.4398 | 66.4166 | 75.282 | 9.41 | 11.9746 | 177588.7812 |
+| 51000 | 0.8242 | 148.6375 | 22533.3301 | 25.4402 | 66.2612 | 75.459 | 9.432 | 12.1471 | 186275.5156 |
+| 54000 | 0.8727 | 147.6220 | 22394.1035 | 25.4404 | 66.4085 | 75.292 | 9.411 | 12.1140 | 185581.1406 |
+| 57000 | 0.9212 | 148.8161 | 22679.8047 | 25.4400 | 66.3328 | 75.377 | 9.422 | 12.1230 | 187872.7812 |
+| 60000 | 0.9697 | 146.8180 | 22345.2695 | 25.4392 | 66.5261 | 75.158 | 9.395 | 12.0317 | 181371.5625 |
+| 61875 | 1.0 | 149.0526 | 22099.5410 | 25.4400 | 66.498 | 75.19 | 9.399 | 12.3048 | 181371.5625 |
 
 ### Framework versions
 - Distily 0.2.0
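
The Peak GPU Memory figure above is reported in GB. A minimal sketch of how such a peak can be captured in PyTorch; this is an assumption about the measurement, not necessarily how Distily records it:

```python
# Capture peak GPU memory around a training/eval run (CUDA only).
import torch

torch.cuda.reset_peak_memory_stats()
# ... run training or evaluation here ...
peak_gb = torch.cuda.max_memory_allocated() / 1024**3
print(f"Peak GPU Memory: {peak_gb:.4f} GB")
```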
logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=kl, hs_weight=10.0, learning_rate=0.004, warmup_ratio=0/events.out.tfevents.1723827667.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:28531dfe2f26a3344850a505958d64116767b0443a5e9b00640e291088d0f056
+size 312
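
The added file is a Git LFS pointer (oid and size only), not the TensorBoard log itself. A minimal sketch of inspecting the underlying tfevents data once it has been fetched (e.g., via `git lfs pull`), using tensorboard's EventAccumulator:

```python
# Read scalar series from the downloaded TensorBoard event log.
# The path below is the log directory from this commit, assumed to be local.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

log_dir = "logs/attn_loss_fn=mse, attn_weight=10.0, hs_loss_fn=kl, hs_weight=10.0, learning_rate=0.004, warmup_ratio=0"
acc = EventAccumulator(log_dir)
acc.Reload()

# List logged scalar tags, then print the first few points of each series.
for tag in acc.Tags()["scalars"]:
    events = acc.Scalars(tag)  # records with .wall_time, .step, .value
    print(tag, [(e.step, e.value) for e in events[:3]])
```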