Commit Graph

108 Commits (e205322c8db1808e063ea10f3154dd006ff08395)
 

Author SHA1 Message Date
mrq e205322c8d added setup script for bitsandbytes-rocm (soon: multi-gpu testing, because I am finally making use of my mispurchased second 6800XT) 2023-03-03 02:58:34 +07:00
mrq 59773a7637 just uninstall bitsandbytes on ROCm systems for now, I'll need to get it working tomorrow 2023-03-02 03:04:11 +07:00
mrq c956d81baf added button to just load a training set's loss information, added installing broncotc/bitsandbytes-rocm when running setup-rocm.sh 2023-03-02 01:35:12 +07:00
mrq 534a761e49 added loading/saving of voice latents by model hash, so no more needing to manually regenerate every time you change models 2023-03-02 00:46:52 +07:00
mrq 5a41db978e oops 2023-03-01 19:39:43 +07:00
mrq b989123bd4 leverage tensorboard to parse tb_logger files when starting training (it seems to give a nicer resolution of training data, need to see about reading it directly while training) 2023-03-01 19:32:11 +07:00
mrq c2726fa0d4 added new training tunable: loss_text_ce_loss weight, added option to specify source model in case you want to finetune a finetuned model (for example, train a Japanese finetune on a large dataset, then finetune for a specific voice, need to truly validate if it produces usable output), some bug fixes that came up for some reason now and not earlier 2023-03-01 01:17:38 +07:00
mrq 5037752059 oops 2023-02-28 22:13:21 +07:00
mrq 787b44807a added to embedded metadata: datetime, model path, model hash 2023-02-28 15:36:06 +07:00
mrq 81eb58f0d6 show different losses, rewordings 2023-02-28 06:18:18 +07:00
mrq fda47156ec oops 2023-02-28 01:08:07 +07:00
mrq bc0d9ab3ed added graph to chart loss_gpt_total rate, added option to prune X number of previous models/states, something else 2023-02-28 01:01:50 +07:00
mrq 6925ec731b I don't remember. 2023-02-27 19:20:06 +07:00
mrq 47abde224c compat with python3.10+ finally (and maybe a small perf uplift with using cu117) 2023-02-26 17:46:57 +07:00
mrq 92553973be Added option to disable bitsandbytes optimizations for systems that do not support it (systems without a Turing-onward Nvidia card), saves use of float16 and bitsandbytes for training into the config json 2023-02-26 01:57:56 +07:00
mrq aafeb9f96a actually fixed the training output text parser 2023-02-25 16:44:25 +07:00
mrq 65329dba31 oops, epoch increments twice 2023-02-25 15:31:18 +07:00
mrq 8b4da29d5f some adjustments to the training output parser, now updates per iteration for really large batches (like the one I'm doing for a dataset size of 19420) 2023-02-25 13:55:25 +07:00
mrq d5d8821a9d fixed some files not copying for bitsandbytes (I was wrong to assume it copied folders too), fixed stopping generating and training, some other thing that I forgot since it's been slowly worked on in my small free times 2023-02-24 23:13:13 +07:00
mrq e5e16bc5b5 updating gitmodules to latest commits 2023-02-24 19:32:18 +07:00
mrq bedbb893ac clarified import dataset settings button 2023-02-24 16:40:22 +07:00
mrq f31ea9d5bc oops 2023-02-24 16:23:30 +07:00
mrq 2104dbdbc5 oops 2023-02-24 13:05:08 +07:00
mrq f6d0b66e10 finally added model refresh button, also searches in the training folder for outputted models so you don't even need to copy them 2023-02-24 12:58:41 +07:00
mrq 1e0fec4358 god i finally found some time and focus: reworded print/save freq per epoch => print/save freq (in epochs), added import config button to reread the last used settings (will check for the output folder's configs first, then the generated ones) and auto-grab the last resume state (if available), some other cleanups i genuinely don't remember what I did when I spaced out for 20 minutes 2023-02-23 23:22:23 +07:00
mrq 7d1220e83e forgot to mult by batch size 2023-02-23 15:38:04 +07:00
mrq 487f2ebf32 fixed the brain worm discrepancy between epochs, iterations, and steps 2023-02-23 15:31:43 +07:00
mrq 1cbcf14cff oops 2023-02-23 13:18:51 +07:00
mrq 41fca1a101 ugh 2023-02-23 07:20:40 +07:00
mrq 941a27d2b3 removed the logic to toggle BNB capabilities, since I guess I can't do that from outside the module 2023-02-23 07:05:39 +07:00
mrq 225dee22d4 huge success 2023-02-23 06:24:54 +07:00
mrq aa96edde2f Updated notebook to put userdata under a dedicated folder (and some safeties to not nuke them if you double run the script like I did thinking rm -r [symlink] would just remove the symlink) 2023-02-22 15:45:41 +07:00
mrq 526a430c2a how did this revert... 2023-02-22 13:24:03 +07:00
mrq 2aa70532e8 added '''suggested''' voice chunk size (it just updates it to how many files you have, not based on combined voice length, like it should) 2023-02-22 03:31:46 +07:00
mrq cc47ed7242 kmsing 2023-02-22 03:27:28 +07:00
mrq 93b061fb4d oops 2023-02-22 03:21:03 +07:00
mrq c4b41e07fa properly placed the line to extract starting iteration 2023-02-22 01:17:09 +07:00
mrq fefc7aba03 oops 2023-02-21 22:13:30 +07:00
mrq 9e64dad785 clamp batch size to sample count when generating for the sickos that want that, added setting to remove non-final output after a generation, something else I forgot already 2023-02-21 21:50:05 +07:00
mrq f119993fb5 explicitly use python3 because some OSs will not have python aliased to python3, allow batch size 1 2023-02-21 20:20:52 +07:00
mrq 8a1a48f31e Added very experimental float16 training for cards with not enough VRAM (10GiB and below, maybe) !NOTE! this is VERY EXPERIMENTAL, I have zero free time to validate it right now, I'll do it later 2023-02-21 19:31:57 +07:00
mrq ed2cf9f5ee wrap checking for metadata when adding a voice in case it throws an error 2023-02-21 17:35:30 +07:00
mrq b6f7aa6264 fixes 2023-02-21 04:22:11 +07:00
mrq bbc2d26289 I finally figured out how to fix gr.Dropdown.change, so a lot of dumb UI decisions are fixed and makes sense 2023-02-21 03:00:45 +07:00
mrq 7d1936adad actually cleaned the notebook 2023-02-20 23:12:53 +07:00
mrq 1fd88afcca updated notebook for newer setup structure, added formatting of getting it/s and last loss rate (have not tested loss rate yet) 2023-02-20 22:56:39 +07:00
mrq bacac6daea handled paths that contain spaces because python for whatever god forsaken reason will always split on spaces even if wrapping an argument in quotes 2023-02-20 20:23:22 +07:00
mrq 37ffa60d14 brain worms forgot a global, hate global semantics 2023-02-20 15:31:38 +07:00
mrq d17f6fafb0 clean up, reordered, added some rather liberal loading/unloading auxiliary models, can't really focus right now to keep testing it, report any issues and I'll get around to it 2023-02-20 00:21:16 +07:00
mrq c99cacec2e oops 2023-02-19 23:29:12 +07:00