Update 'Training'
parent f949055598
commit af30bfa560
@@ -118,12 +118,10 @@ After filling in the values, click `Save Training Configuration`, and it should
 ### Suggested Settings
 
 The following settings are robust enough that I can suggest them, for small or large datasets.
-* Epochs: `100` (50 is usually "enough", large datasets can get 20)
+* Epochs: `50` (increase/decrease as needed, according to how big your dataset is)
 * Learning Rate: `0.0001`
 * Learning Rate Scheme: `MultiStepLR`
-* Learning Rate Schedule:
-  - small datasets: `[9, 18, 25, 33, 50, 59]`
-  - large datasets: `[2, 4, 9, 18, 25, 33, 50, 59]`
+* Learning Rate Schedule: `[2, 4, 9, 18, 25, 33, 50, 59]`
 
 However, if you want accuracy, I suggest an LR of 1e-5 (0.00001), as longer training at low LRs definitely makes the best models.
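For readers unfamiliar with `MultiStepLR`, here is a minimal sketch of what the scheme above does: the learning rate starts at the base value and is multiplied by a decay factor each time training passes one of the milestone epochs in the schedule. The `gamma` decay factor is an assumption here (it is not stated in the settings above; PyTorch's `MultiStepLR` defaults to `0.1`), and `lr_at_epoch` is a hypothetical helper, not part of the trainer.

```python
# Sketch of MultiStepLR behavior with the suggested settings.
# NOTE: gamma=0.1 is an assumed decay factor (PyTorch's default),
# not something specified by the settings above.

def lr_at_epoch(epoch,
                base_lr=0.0001,
                milestones=(2, 4, 9, 18, 25, 33, 50, 59),
                gamma=0.1):
    """Learning rate in effect at `epoch`: base_lr multiplied by
    gamma once for every milestone already reached."""
    passed = sum(1 for m in milestones if epoch >= m)
    return base_lr * (gamma ** passed)

# The LR steps down at each milestone and stays flat in between:
for epoch in (0, 2, 9, 59):
    print(epoch, lr_at_epoch(epoch))
```

With eight milestones and a decay of `0.1`, the rate becomes very small by the last milestone, which is why a smaller `gamma` (or fewer milestones) is sometimes preferable for long runs.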