Commit Graph

255 Commits

Author SHA1 Message Date
mrq
008a1f5f8f simplified spawning the training process by having it spawn the distributed training processes in the train.py script, so it should work on Windows too 2023-03-11 01:37:00 +00:00
mrq
2feb6da0c0 cleanups and fixes, fix DLAS throwing errors from '''too short of sound files''' by just culling them during transcription 2023-03-11 01:19:49 +00:00
mrq
7f2da0f5fb rewrote how AIVC gets training metrics (need to clean up later) 2023-03-10 22:35:32 +00:00
mrq
df0edacc60 fix the cleanup actually only doing 2 despite requesting more than 2, surprised no one has pointed it out 2023-03-10 14:04:07 +00:00
mrq
8e890d3023 forgot to fix reset settings to use the new arg-agnostic way 2023-03-10 13:49:39 +00:00
mrq
d250e0ec17 brain fried 2023-03-10 04:27:34 +00:00
mrq
0b364b590e maybe don't --force-reinstall to try and force downgrading, it just forces everything to uninstall then reinstall 2023-03-10 04:22:47 +00:00
mrq
c231d842aa make dependencies after the one in this repo force reinstall to downgrade, I hope; I have other things to do than validate this works 2023-03-10 03:53:21 +00:00
mrq
c92b006129 I really hate YAML 2023-03-10 03:48:46 +00:00
mrq
d3184004fd only God knows why the YAML spec lets you specify string values without quotes 2023-03-10 01:58:30 +00:00
mrq
eb1551ee92 what I thought was an override and not a ternary 2023-03-09 23:04:02 +00:00
mrq
c3b43d2429 today I learned adamw_zero actually negates ANY LR schemes 2023-03-09 19:42:31 +00:00
mrq
cb273b8428 cleanup 2023-03-09 18:34:52 +00:00
mrq
7c71f7239c expose options for CosineAnnealingLR_Restart (seems to be able to train very quickly due to the restarts) 2023-03-09 14:17:01 +00:00
mrq
2f6dd9c076 some cleanup 2023-03-09 06:20:05 +00:00
mrq
5460e191b0 added loss graph, because I'm going to experiment with cosine annealing LR and I need to view my loss 2023-03-09 05:54:08 +00:00
mrq
a182df8f4e is 2023-03-09 04:33:12 +00:00
mrq
a01eb10960 (try to) unload voicefixer if it raises an error while loading it 2023-03-09 04:28:14 +00:00
mrq
dc1902b91c cleanup block that makes embedding latents for random/microphone happen, remove builtin voice options from voice list to avoid duplicates 2023-03-09 04:23:36 +00:00
mrq
797882336b maybe remedy an issue that crops up if you have a non-wav and non-json file in a results folder (assuming) 2023-03-09 04:06:07 +00:00
mrq
b64948d966 while I'm breaking things, migrating dependencies to modules folder for tidiness 2023-03-09 04:03:57 +00:00
mrq
b8867a5fb0 added the mysterious tortoise_compat flag mentioned in DLAS repo 2023-03-09 03:41:40 +00:00
mrq
3b4f4500d1 when you have three separate machines running and you test on one, but you accidentally revert changes because you then test on another 2023-03-09 03:26:18 +00:00
mrq
ef75dba995 I hate that commas make tuples 2023-03-09 02:43:05 +00:00
mrq
f795dd5c20 you might be wondering why so many small commits instead of rolling the HEAD back one to just combine them, i don't want to force push and roll back the paperspace i'm testing in 2023-03-09 02:31:32 +00:00
mrq
51339671ec typo 2023-03-09 02:29:08 +00:00
mrq
1b18b3e335 forgot to save the simplified training input json first before touching any of the settings that dump to the yaml 2023-03-09 02:27:20 +00:00
mrq
221ac38b32 forgot to update to finetune subdir 2023-03-09 02:25:32 +00:00
mrq
0e80e311b0 added VRAM validation for a given batch:gradient accumulation size ratio (based empirically off of 6GiB, 16GiB, and 16x2GiB, would be nice to have more data on what's safe) 2023-03-09 02:08:06 +00:00
mrq
ef7b957fff oops 2023-03-09 00:53:00 +00:00
mrq
b0baa1909a forgot template 2023-03-09 00:32:35 +00:00
mrq
3f321fe664 big cleanup to make my life easier when i add more parameters 2023-03-09 00:26:47 +00:00
mrq
0ab091e7ff oops 2023-03-08 16:09:29 +00:00
mrq
40e8d0774e share if you 2023-03-08 15:59:16 +00:00
mrq
d58b67004a colab notebook uses venv and normal scripts to keep it on parity with a local install (and it literally just works, stop creating issues for something inconsistent with known solutions) 2023-03-08 15:51:13 +00:00
mrq
34dcb845b5 actually make using adamw_zero optimizer for multi-gpus work 2023-03-08 15:31:33 +00:00
mrq
8494628f3c normalize validation batch size because I OOM'd without it getting scaled 2023-03-08 05:27:20 +00:00
mrq
d7e75a51cf I forgot about the changelog and never kept up with it, so I'll just not use a changelog 2023-03-08 05:14:50 +00:00
mrq
ff07f707cb disable validation if validation dataset not found, clamp validation batch size to validation dataset size instead of simply reusing batch size, switch to adamw_zero optimizer when training with multi-gpus (because the yaml comment said to and I think it might be why I'm absolutely having garbage luck training this japanese dataset) 2023-03-08 04:47:05 +00:00
mrq
f1788a5639 lazy wrap around the voicefixer block because sometimes it just an heros itself despite having a specific block to load it beforehand 2023-03-08 04:12:22 +00:00
mrq
83b5125854 fixed notebooks, provided paperspace notebook 2023-03-08 03:29:12 +00:00
mrq
b4098dca73 made validation work (will document later) 2023-03-08 02:58:00 +00:00
mrq
a7e0dc9127 oops 2023-03-08 00:51:51 +00:00
mrq
e862169e7f set validation to save rate and validation file if exists (need to test later) 2023-03-07 20:38:31 +00:00
mrq
fe8bf7a9d1 added helper script to cull short enough lines from training set as a validation set (if it yields good results doing validation during training, i'll add it to the web ui) 2023-03-07 20:16:49 +00:00
mrq
7f89e8058a fixed update checker for dlas+tortoise-tts 2023-03-07 19:33:56 +00:00
mrq
6d7e143f53 added override for large training plots 2023-03-07 19:29:09 +00:00
mrq
3718e9d0fb set NaN alarm to show the iteration it happened at 2023-03-07 19:22:11 +00:00
mrq
c27ee3ce95 added update checking for dlas and tortoise-tts, caching voices (for a given model and voice name) so random latents will remain the same 2023-03-07 17:04:45 +00:00
mrq
166d491a98 fixes 2023-03-07 13:40:41 +00:00