Commit Graph

106 Commits

Author        SHA1        Message  Date

James Betker  208a703080  use gelu act  2022-05-18 09:34:01 -06:00
James Betker  b2b37453df  make the codebook bigger  2022-05-17 20:58:56 -06:00
James Betker  9a9c3cafba  Make feature encoder a bit more descriptive  2022-05-17 18:14:52 -06:00
James Betker  ee364f4eeb  just take the mean...  2022-05-17 18:09:23 -06:00
James Betker  6130391a85  fix div  2022-05-17 18:04:20 -06:00
James Betker  7213ad2b89  Do grad reduction  2022-05-17 17:59:40 -06:00
James Betker  7c82e18c6c  darn mpi  2022-05-17 17:16:09 -06:00
James Betker  88ec0512f7  Scale losses  2022-05-17 17:12:20 -06:00
James Betker  a6397ce84a  Fix incorrect projections  2022-05-17 16:53:52 -06:00
James Betker  c37fc3b4ed  m2v grad norm groups  2022-05-17 16:29:36 -06:00
James Betker  c1bdb4f9a1  degrade gumbel softmax over time  2022-05-17 16:23:04 -06:00
James Betker  3853f37257  stable layernorm  2022-05-17 16:07:03 -06:00
James Betker  519151d83f  m2v  2022-05-17 15:37:59 -06:00
James Betker  d1de94d75c  Stash mel2vec work (gonna throw it all away..)  2022-05-17 12:35:01 -06:00
James Betker  ee218ab9b7  uv3  2022-05-13 17:57:47 -06:00
James Betker  545453077e  uv3  2022-05-09 15:36:22 -06:00
James Betker  96a5cc66ee  uv3  2022-05-09 15:35:51 -06:00
James Betker  b42b4e18de  clean up unified voice  2022-05-09 14:45:49 -06:00
                          - remove unused code
                          - fix inference model to use the terms "prior" and "posterior" to properly define the modeling order (they were inverted before)
                          - default some settings I never intend to change in the future
James Betker  7812c23c7a  revert fill_gaps back to old masking behavior  2022-05-08 00:10:19 -06:00
James Betker  58ed27d7a8  new gap_filler  2022-05-07 12:44:23 -06:00
James Betker  6c8032b4be  more work  2022-05-06 21:56:49 -06:00
James Betker  79543e5488  Simpler form of the wavegen model  2022-05-06 16:37:04 -06:00
James Betker  d8925ccde5  few things with gap filling  2022-05-06 14:33:44 -06:00
James Betker  b13d983c24  and mel_head  2022-05-06 00:25:27 -06:00
James Betker  d5fb79564a  remove mel_pred  2022-05-06 00:24:05 -06:00
James Betker  e9bb692490  fixed aligned_latent  2022-05-06 00:20:21 -06:00
James Betker  1609101a42  musical gap filler  2022-05-05 16:47:08 -06:00
James Betker  d66ab2d28c  Remove unused waveform_gens  2022-05-04 21:06:54 -06:00
James Betker  c42c53e75a  Add a trainable network for converting a normal distribution into a latent space  2022-05-02 09:47:30 -06:00
James Betker  ab219fbefb  output variance  2022-05-02 00:10:33 -06:00
James Betker  3b074aac34  add checkpointing  2022-05-02 00:07:42 -06:00
James Betker  ae5f934ea1  diffwave  2022-05-02 00:05:04 -06:00
James Betker  b712d3b72b  break out get_conditioning_latent from unified_voice  2022-05-01 23:04:44 -06:00
James Betker  afa2df57c9  gen3  2022-04-30 10:41:38 -06:00
James Betker  8aa6651fc7  fix surrogate loss return in waveform_gen2  2022-04-28 10:10:11 -06:00
James Betker  f02b01bd9d  reverse univnet classifier  2022-04-20 21:37:55 -06:00
James Betker  9df85c902e  New gen2  2022-04-20 21:37:34 -06:00
                          Which is basically an autoencoder with a giant diffusion appendage attached
James Betker  b4549eed9f  uv2 fix  2022-04-20 00:27:38 -06:00
James Betker  24fdafd855  fix2  2022-04-20 00:03:29 -06:00
James Betker  0af0051399  fix  2022-04-20 00:01:57 -06:00
James Betker  419f4d37bd  gen2 music  2022-04-19 23:38:37 -06:00
James Betker  8fe0dff33c  support tts typing  2022-04-16 23:36:57 -06:00
James Betker  546ecd5aeb  music!  2022-04-15 21:21:37 -06:00
James Betker  8ea5c307fb  Fixes for training the diffusion model on autoregressive inputs  2022-04-11 11:02:44 -06:00
James Betker  a3622462c1  Change latent_conditioner back  2022-04-11 09:00:13 -06:00
James Betker  19ca5b26c1  Remove flat0 and move it into flat  2022-04-10 21:01:59 -06:00
James Betker  81c952a00a  undo relative  2022-04-08 16:32:52 -06:00
James Betker  944b4c3335  more undos  2022-04-08 16:31:08 -06:00
James Betker  032983e2ed  fix bug and allow position encodings to be trained separately from the rest of the model  2022-04-08 16:26:01 -06:00
James Betker  09ab1aa9bc  revert rotary embeddings work  2022-04-08 16:18:35 -06:00
                          I'm not really sure that this is going to work. I'd rather explore re-using what I've already trained