Commit Graph

303 Commits (fd9b2e082c318a0266de47862a0ee011baef6ce3)
Author SHA1 Message Date
mrq fd9b2e082c x_lim and y_lim for graph 2023-03-25 02:34:14 +07:00
mrq 9856db5900 actually make parsing VALL-E metrics work 2023-03-23 15:42:51 +07:00
mrq 69d84bb9e0 I forget 2023-03-23 04:53:31 +07:00
mrq 444bcdaf62 my sanitizer actually did work, it was just batch sizes leading to problems when transcribing 2023-03-23 04:41:56 +07:00
mrq a6daf289bc when the sanitizer thingy works in testing but it doesn't outside of testing, and you have to retranscribe for the fourth time today 2023-03-23 02:37:44 +07:00
mrq 86589fff91 why does this keep happening to me 2023-03-23 01:55:16 +07:00
mrq 0ea93a7f40 more cleanup, use 24KHz for preparing for VALL-E (encodec will resample to 24Khz anyways, makes audio a little nicer), some other things 2023-03-23 01:52:26 +07:00
mrq d2a9ab9e41 remove redundant phonemize for vall-e (oops), quantize all files and then phonemize all files for cope optimization, load alignment model once instead of for every transcription (speedup with whisperx) 2023-03-23 00:22:25 +07:00
mrq 19c0854e6a do not write current whisper.json if there's no changes 2023-03-22 22:24:07 +07:00
mrq 932eaccdf5 added whisper transcription 'sanitizing' (collapse very short transcriptions to the previous segment) (I really have to stop having several copies spanning several machines for AIVC, I keep reverting shit) 2023-03-22 22:10:01 +07:00
mrq 736cdc8926 disable diarization for whisperx as it's just a useless performance hit (I don't have anything that's multispeaker within the same audio file at the moment) 2023-03-22 20:38:58 +07:00
mrq aa5bdafb06 ugh 2023-03-22 20:26:28 +07:00
mrq 13605f980c now whisperx should output json that aligns with what's expected 2023-03-22 20:01:30 +07:00
mrq 8877960062 fixes for whisperx batching 2023-03-22 19:53:42 +07:00
mrq 4056a27bcb begrudgingly added back whisperx integration (VAD/Diarization testing, I really, really need accurate timestamps before dumping mondo amounts of time on training a dataset) 2023-03-22 19:24:53 +07:00
mrq b8c3c4cfe2 Fixed #167 2023-03-22 18:21:37 +07:00
mrq da96161aaa oops 2023-03-22 18:07:46 +07:00
mrq f822c87344 cleanups, realigning vall-e training 2023-03-22 17:47:23 +07:00
mrq 909325bb5a ugh 2023-03-21 22:18:57 +07:00
mrq 5a5fd9ca87 Added option to unsqueeze sample batches after sampling 2023-03-21 21:34:26 +07:00
mrq 9657c1d4ce oops 2023-03-21 20:31:01 +07:00
mrq 0c2a9168f8 DLAS is PIPified (but I'm still cloning it as a submodule to make updating it easier) 2023-03-21 15:46:53 +07:00
mrq 34ef0467b9 VALL-E config edits 2023-03-20 01:22:53 +07:00
mrq 2e33bf071a forgot to not require it to be relative 2023-03-19 22:05:33 +07:00
mrq 5cb86106ce option to set results folder location 2023-03-19 22:03:41 +07:00
mrq 74510e8623 doing what I do best: sourcing other configs and banging until it works (it doesnt work) 2023-03-18 15:16:15 +07:00
mrq da9b4b5fb5 tweaks 2023-03-18 15:14:22 +07:00
mrq f44895978d brain worms 2023-03-17 20:08:08 +07:00
mrq b17260cddf added japanese tokenizer (experimental) 2023-03-17 20:04:40 +07:00
mrq f34cc382c5 yammed 2023-03-17 18:57:36 +07:00
mrq 96b7f9d2cc yammed 2023-03-17 13:08:34 +07:00
mrq 249c6019af cleanup, metrics are grabbed for vall-e trainer 2023-03-17 05:33:49 +07:00
mrq 1b72d0bba0 forgot to separate phonemes by spaces for [redacted] 2023-03-17 02:08:07 +07:00
mrq d4c50967a6 cleaned up some prepare dataset code 2023-03-17 01:24:02 +07:00
mrq 0b62ccc112 setup bnb on windows as needed 2023-03-16 20:48:48 +07:00
mrq c4edfb7d5e unbump rocm5.4.2 because it does not work for me desu 2023-03-16 15:33:23 +07:00
mrq 520fbcd163 bumped torch up (CUDA: 11.8, ROCm, 5.4.2) 2023-03-16 15:09:11 +07:00
mrq 1a8c5de517 unk hunting 2023-03-16 14:59:12 +07:00
mrq 46ff3c476a fixes v2 2023-03-16 14:41:40 +07:00
mrq 0408d44602 fixed reload tts being broken due to being as untouched as I am 2023-03-16 14:24:44 +07:00
mrq aeb904a800 yammed 2023-03-16 14:23:47 +07:00
mrq f9154c4db1 fixes 2023-03-16 14:19:56 +07:00
mrq 54f2fc792a ops 2023-03-16 05:14:15 +07:00
mrq 0a7d6f02a7 ops 2023-03-16 04:54:17 +07:00
mrq 4ac43fa3a3 I forgot I undid the thing in DLAS 2023-03-16 04:51:35 +07:00
mrq da4f92681e oops 2023-03-16 04:35:12 +07:00
mrq ee8270bdfb preparations for training an IPA-based finetune 2023-03-16 04:25:33 +07:00
mrq 7b80f7a42f fixed not cleaning up states while training (oops) 2023-03-15 02:48:05 +07:00
mrq b31bf1206e oops 2023-03-15 01:51:04 +07:00
mrq d752a22331 print a warning if automatically deduced batch size returns 1 2023-03-15 01:20:15 +07:00