Commit Graph

294 Commits

Author        SHA1        Date                        Message
James Betker  e9fb2ead9a  2022-05-20 11:01:17 -06:00  m2v stuff
James Betker  519151d83f  2022-05-17 15:37:59 -06:00  m2v
James Betker  eb64d18075  2022-05-13 17:56:26 -06:00  Fix phoneme tokenizer
James Betker  1177c35dec  2022-05-08 18:49:39 -06:00  music fid updates
James Betker  58ed27d7a8  2022-05-07 12:44:23 -06:00  new gap_filler
James Betker  1609101a42  2022-05-05 16:47:08 -06:00  musical gap filler
James Betker  47662b9ec5  2022-05-04 20:29:23 -06:00  some random crap
James Betker  c42c53e75a  2022-05-02 09:47:30 -06:00  Add a trainable network for converting a normal distribution into a latent space
James Betker  e402089556  2022-05-02 00:11:26 -06:00  abstractify
James Betker  b712d3b72b  2022-05-01 23:04:44 -06:00  break out get_conditioning_latent from unified_voice
James Betker  f02b01bd9d  2022-04-20 21:37:55 -06:00  reverse univnet classifier
James Betker  419f4d37bd  2022-04-19 23:38:37 -06:00  gen2 music
James Betker  48cb6a5abd  2022-04-16 20:28:04 -06:00  misc
James Betker  efe12cb816  2022-04-15 09:11:23 -06:00  Update clvp to add masking probabilities in conditioning and to support code inputs
James Betker  f2c172291f  2022-04-11 12:08:15 -06:00  fix audio_diffusion_fid for autoregressive latent inputs
James Betker  3f8d7955ef  2022-04-07 20:11:14 -06:00  unified_voice with rotary embeddings
James Betker  e6387c7613  2022-04-07 11:29:57 -06:00  Fix eval logic to not run immediately
James Betker  3d916e7687  2022-04-05 07:51:09 -06:00  Fix evaluation when using multiple batch sizes
James Betker  572d137589  2022-04-04 12:33:25 -06:00  track iteration rate
James Betker  4cdb0169d0  2022-04-04 12:25:00 -06:00  update training data encountered when using force_start_step
James Betker  2b6ff09225  2022-04-02 15:07:39 -06:00  autoregressive_codegen v1
James Betker  2a29a71c37  2022-03-26 08:31:40 -06:00  attempt to force meaningful codes by adding a surrogate loss
James Betker  57da6d0ddf  2022-03-22 11:46:03 -06:00  more simplifications
James Betker  bf08519d71  2022-03-17 10:53:39 -06:00  fixes
James Betker  54202aa099  2022-03-16 09:26:55 -06:00  fix mel normalization
James Betker  d553808d24  2022-03-08 15:52:16 -07:00  misc
James Betker  d1dc8dbb35  2022-03-05 20:14:36 -07:00  Support tts9
James Betker  e1052a5e32  2022-03-04 13:41:32 -07:00  Move log consensus to train for efficiency
James Betker  ce6dfdf255  2022-03-04 12:46:41 -07:00  Distributed "fixes"
James Betker  70fa780edb  2022-03-01 20:19:52 -07:00  Add mechanism to export grad norms
James Betker  db0c3340ac  2022-03-01 11:49:36 -07:00  Implement guidance-free diffusion in eval
              And a few other fixes
James Betker  ac920798bb  2022-02-27 14:49:11 -07:00  misc
James Betker  896ac029ae  2022-02-21 19:12:50 -07:00  allow continuation of samples encountered
James Betker  79e8f36d30  2022-02-15 20:53:07 -07:00  Convert CLIP models into new folder
James Betker  2bdb515068  2022-02-15 06:28:54 -07:00  A few mods to make wav2vec2 trainable with DDP on DLAS
James Betker  29534180b2  2022-02-12 20:00:59 -07:00  w2v fine tuner
James Betker  4abc094b47  2022-02-11 11:18:15 -07:00  fix train bug
James Betker  a930f2576e  2022-02-09 17:25:05 -07:00  Begin a migration to specifying training rate in megasamples instead of arbitrary "steps"
              This should help me greatly in tuning models. It's also necessary now that batch size isn't
              really respected; we simply step once the gradient direction becomes unstable. (A sketch of
              this stepping rule follows the commit list.)
James Betker  3d946356f8  2022-02-09 14:26:23 -07:00  batch_size_optimizer works. sweet! no more tuning batch sizes.
James Betker  f44b064c5e  2022-02-07 19:43:18 -07:00  Update scripts
James Betker  8fb147e8ab  2022-02-04 11:00:15 -07:00  add an autoregressive ctc code generator
James Betker  7f4fc55344  2022-02-03 21:42:53 -07:00  Update SR model
James Betker  4249681c4b  2022-02-03 19:58:54 -07:00  Mods to support an autoregressive CTC code generator
James Betker  fbea6e8eac  2022-01-30 16:14:06 -07:00  Adjustments to diffusion networks
James Betker  e58dab14c3  2022-01-29 11:01:01 -07:00  new diffusion updates from testing
James Betker  0152174c0e  2022-01-27 19:58:58 -07:00  Add wandb_step_factor argument
James Betker  8c255811ad  2022-01-25 17:57:16 -07:00  more fixes
James Betker  798ed7730a  2022-01-24 18:12:08 -07:00  i like wasting time
James Betker  fc09cff4b3  2022-01-24 18:09:29 -07:00  angry
James Betker  3a9e3a9db3  2022-01-24 17:59:31 -07:00  consolidate state
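
Note on commits a930f2576e and 3d946356f8: they describe a batch_size_optimizer that steps the optimizer only once the incoming gradient direction becomes unstable, rather than after a fixed number of samples. The snippet below is a minimal sketch of that idea, assuming a standard PyTorch training loop; the class name GradientStabilityStepper, the cosine-similarity test, and the thresholds are illustrative assumptions, not the repository's actual implementation.

    import torch
    import torch.nn.functional as F


    class GradientStabilityStepper:
        """Hypothetical sketch: accumulate micro-batch gradients and step the
        wrapped optimizer only when the newest micro-batch gradient stops
        agreeing in direction with what has been accumulated so far."""

        def __init__(self, optimizer, model, similarity_threshold=0.5, max_accum=64):
            self.optimizer = optimizer
            self.model = model
            self.similarity_threshold = similarity_threshold  # assumed cutoff for "unstable"
            self.max_accum = max_accum                        # safety cap on accumulation
            self.prev_total = None                            # flattened accumulated gradient
            self.accum_steps = 0

        def _flat_grad(self):
            # Concatenate every parameter gradient into one vector for comparison.
            grads = [p.grad.detach().flatten() for p in self.model.parameters()
                     if p.grad is not None]
            return torch.cat(grads) if grads else None

        def accumulate(self):
            """Call after loss.backward() on each micro-batch; returns True when
            the accumulated gradient direction has become unstable and the
            optimizer should step."""
            total = self._flat_grad()
            if total is None:
                return False
            self.accum_steps += 1
            if self.prev_total is None:
                self.prev_total = total.clone()
                return self.accum_steps >= self.max_accum
            # PyTorch accumulates into .grad, so the latest micro-batch gradient
            # is the difference from the previously seen total.
            latest = total - self.prev_total
            sim = F.cosine_similarity(latest.unsqueeze(0), self.prev_total.unsqueeze(0)).item()
            self.prev_total = total.clone()
            # Low similarity: the new gradient points away from the accumulated
            # direction, i.e. the direction is no longer stable, so step now.
            return sim < self.similarity_threshold or self.accum_steps >= self.max_accum

        def step(self):
            self.optimizer.step()
            self.optimizer.zero_grad()
            self.prev_total = None
            self.accum_steps = 0

A training loop using this would call accumulate() after every loss.backward() and step() whenever it returns True. Measuring training progress in megasamples rather than optimizer steps then follows naturally, since the number of micro-batches consumed per step is no longer fixed.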