Commit Graph

246 Commits (fe03ae5839d838121f3a8a6aaca14f8a9ed2ed0a)
 

Author SHA1 Message Date
mrq 6925ec731b I don't remember. 2023-02-27 19:20:06 +07:00
mrq 47abde224c compat with python3.10+ finally (and maybe a small perf uplift with using cu117) 2023-02-26 17:46:57 +07:00
mrq 92553973be Added option to disable bitsandbytes optimizations for systems that do not support it (systems without a Turing-onward Nvidia card), saves use of float16 and bitsandbytes for training into the config JSON 2023-02-26 01:57:56 +07:00
mrq aafeb9f96a actually fixed the training output text parser 2023-02-25 16:44:25 +07:00
mrq 65329dba31 oops, epoch increments twice 2023-02-25 15:31:18 +07:00
mrq 8b4da29d5f some adjustments to the training output parser, now updates per iteration for really large batches (like the one I'm doing for a dataset size of 19420) 2023-02-25 13:55:25 +07:00
mrq d5d8821a9d fixed some files not copying for bitsandbytes (I was wrong to assume it copied folders too), fixed stopping generating and training, some other thing that I forgot since it's been slowly worked on in my small free times 2023-02-24 23:13:13 +07:00
mrq e5e16bc5b5 updating gitmodules to latest commits 2023-02-24 19:32:18 +07:00
mrq bedbb893ac clarified import dataset settings button 2023-02-24 16:40:22 +07:00
mrq f31ea9d5bc oops 2023-02-24 16:23:30 +07:00
mrq 2104dbdbc5 ops 2023-02-24 13:05:08 +07:00
mrq f6d0b66e10 finally added model refresh button, also searches in the training folder for outputted models so you don't even need to copy them 2023-02-24 12:58:41 +07:00
mrq 1e0fec4358 god i finally found some time and focus: reworded print/save freq per epoch => print/save freq (in epochs), added import config button to reread the last used settings (will check for the output folder's configs first, then the generated ones) and auto-grab the last resume state (if available), some other cleanups i genuinely don't remember what I did when I spaced out for 20 minutes 2023-02-23 23:22:23 +07:00
mrq 7d1220e83e forgot to mult by batch size 2023-02-23 15:38:04 +07:00
mrq 487f2ebf32 fixed the brain worm discrepancy between epochs, iterations, and steps 2023-02-23 15:31:43 +07:00
mrq 1cbcf14cff oops 2023-02-23 13:18:51 +07:00
mrq 41fca1a101 ugh 2023-02-23 07:20:40 +07:00
mrq 941a27d2b3 removed the logic to toggle BNB capabilities, since I guess I can't do that from outside the module 2023-02-23 07:05:39 +07:00
mrq 225dee22d4 huge success 2023-02-23 06:24:54 +07:00
mrq aa96edde2f Updated notebook to put userdata under a dedicated folder (and some safeties to not nuke them if you double run the script like I did thinking rm -r [symlink] would just remove the symlink) 2023-02-22 15:45:41 +07:00
mrq 526a430c2a how did this revert... 2023-02-22 13:24:03 +07:00
mrq 2aa70532e8 added '''suggested''' voice chunk size (it just updates it to how many files you have, not based on combined voice length, like it should) 2023-02-22 03:31:46 +07:00
mrq cc47ed7242 kmsing 2023-02-22 03:27:28 +07:00
mrq 93b061fb4d oops 2023-02-22 03:21:03 +07:00
mrq c4b41e07fa properly placed the line to extract starting iteration 2023-02-22 01:17:09 +07:00
mrq fefc7aba03 oops 2023-02-21 22:13:30 +07:00
mrq 9e64dad785 clamp batch size to sample count when generating for the sickos that want that, added setting to remove non-final output after a generation, something else I forgot already 2023-02-21 21:50:05 +07:00
mrq f119993fb5 explicitly use python3 because some OSs will not have python alias to python3, allow batch size 1 2023-02-21 20:20:52 +07:00
mrq 8a1a48f31e Added very experimental float16 training for cards with not enough VRAM (10GiB and below, maybe) !NOTE! this is VERY EXPERIMENTAL, I have zero free time to validate it right now, I'll do it later 2023-02-21 19:31:57 +07:00
mrq ed2cf9f5ee wrap checking for metadata when adding a voice in case it throws an error 2023-02-21 17:35:30 +07:00
mrq b6f7aa6264 fixes 2023-02-21 04:22:11 +07:00
mrq bbc2d26289 I finally figured out how to fix gr.Dropdown.change, so a lot of dumb UI decisions are fixed and makes sense 2023-02-21 03:00:45 +07:00
mrq 7d1936adad actually cleaned the notebook 2023-02-20 23:12:53 +07:00
mrq 1fd88afcca updated notebook for newer setup structure, added formatting of getting it/s and last loss rate (have not tested loss rate yet) 2023-02-20 22:56:39 +07:00
mrq bacac6daea handled paths that contain spaces because python for whatever god forsaken reason will always split on spaces even if wrapping an argument in quotes 2023-02-20 20:23:22 +07:00
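The space-in-path fix above boils down to passing argv as a list instead of a single shell string, so no word-splitting ever happens. A minimal sketch, with hypothetical names (`build_command`, `train.py` are assumptions, not the repo's actual code):

```python
def build_command(script: str, config_path: str) -> list[str]:
    # Passing argv as a list avoids shell word-splitting entirely,
    # so a config path like "my voices/train.yaml" reaches the child
    # process as a single argument, quoting not required.
    return ["python3", script, "--config", config_path]

cmd = build_command("train.py", "my voices/train.yaml")
# subprocess.run(cmd) would then receive the path intact.
```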
mrq 37ffa60d14 brain worms forgot a global, hate global semantics 2023-02-20 15:31:38 +07:00
mrq d17f6fafb0 clean up, reordered, added some rather liberal loading/unloading auxiliary models, can't really focus right now to keep testing it, report any issues and I'll get around to it 2023-02-20 00:21:16 +07:00
mrq c99cacec2e oops 2023-02-19 23:29:12 +07:00
mrq 109757d56d I forgot submodules existed 2023-02-19 21:41:51 +07:00
mrq ee95616dfd optimize batch sizes to be as evenly divisible as possible (noticed the calculated epochs mismatched the inputted epochs) 2023-02-19 21:06:14 +07:00
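The even-divisibility idea in the commit above can be sketched as picking the largest batch size at or below the requested one that divides the dataset evenly, so the calculated epoch count matches the inputted one. A hypothetical helper, not the repo's actual implementation:

```python
def best_batch_size(dataset_size: int, requested: int) -> int:
    # Walk down from the requested batch size and return the first
    # value that divides the dataset evenly; fall back to 1.
    for b in range(min(requested, dataset_size), 0, -1):
        if dataset_size % b == 0:
            return b
    return 1

# e.g. for a 19420-sample dataset and a requested batch size of 128,
# the largest even divisor at or below 128 is 20.
```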
mrq 6260594a1e Forgot to base print/save frequencies in terms of epochs in the UI, will get converted when saving the YAML 2023-02-19 20:38:00 +07:00
mrq 4694d622f4 doing something completely unrelated had me realize it's 1000x easier to just base things in terms of epochs, and calculate iterations from there 2023-02-19 20:22:03 +07:00
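The epoch-to-iteration conversion described in the commit above is straightforward arithmetic: one epoch is one full pass over the dataset, so iterations are epochs times steps per epoch. A minimal sketch with assumed names (`epochs_to_iterations` is hypothetical):

```python
import math

def epochs_to_iterations(epochs: int, dataset_size: int, batch_size: int) -> int:
    # One epoch = one full pass over the dataset, so the iteration
    # (optimizer step) count is epochs * ceil(dataset / batch).
    steps_per_epoch = math.ceil(dataset_size / batch_size)
    return epochs * steps_per_epoch

# e.g. 500 epochs over 19420 samples at batch size 128:
# 500 * ceil(19420 / 128) = 500 * 152 = 76000 iterations
```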
mrq ec76676b16 i hate gradio I hate having to specify step=1 2023-02-19 17:12:39 +07:00
mrq 4f79b3724b Fixed model setting not getting updated when TTS is unloaded, for when you change it and then load TTS (sorry for that brain worm) 2023-02-19 16:24:06 +07:00
mrq 092dd7b2d7 added more safeties and parameters to training yaml generator, I think I tested it extensively enough 2023-02-19 16:16:44 +07:00
mrq f4e82fcf08 I swear I committed forwarding arguments from the start scripts 2023-02-19 15:01:16 +07:00
mrq 3891870b5d Update notebook to follow the 'other' way of installing mrq/tortoise-tts 2023-02-19 07:22:22 +07:00
mrq d89b7d60e0 forgot to divide checkpoint freq by iterations to get checkpoint counts 2023-02-19 07:05:11 +07:00
mrq 485319c2bb don't know what brain worms had me throw printing training output under verbose 2023-02-19 06:28:53 +07:00
mrq debdf6049a forgot to copy again from dev folder to git folder 2023-02-19 06:04:46 +07:00