Commit Graph

256 Commits

All commits below are authored by James Betker.

SHA1        Message  Date
536c8558ae  fix  2022-05-28 22:32:38 -06:00
da367da411  df5  2022-05-28 22:30:23 -06:00
6b43915eb8  support projecting to vectors  2022-05-28 22:27:45 -06:00
86694aef4e  tfd5  2022-05-28 22:27:04 -06:00
b6b4f10e1b  ...  2022-05-28 10:59:03 -06:00
0d3b831cf9  big fatty  2022-05-28 10:55:43 -06:00
490d39b967  some stuff  2022-05-27 11:40:31 -06:00
5efeee6b97  fix type bug  2022-05-27 11:19:30 -06:00
0659fe3d1e  tfd3 mods  2022-05-27 11:16:26 -06:00
bed3df4888  propagate type  2022-05-27 11:12:03 -06:00
c46da0285c  Move stuff around  2022-05-27 11:06:58 -06:00
9852599b34  tfd5 - with clvp!  2022-05-27 09:49:10 -06:00
3db862dd32  adf update  2022-05-27 09:25:53 -06:00
8587a18717  fd fix  2022-05-26 20:19:09 -06:00
dd13b883ac  td4  2022-05-26 14:56:03 -06:00
1dbe0b6b2e  a  2022-05-26 10:13:27 -06:00
aa653115f1  tfd3  2022-05-26 10:09:11 -06:00
36c68692a6  forgot to add rotary embeddings  2022-05-26 09:25:42 -06:00
8ce48f04ff  transformer diffusion 2  2022-05-26 09:08:35 -06:00
56f19a23cd  fix nh  2022-05-25 12:31:56 -06:00
52a20f3aa3  und10  2022-05-25 12:19:21 -06:00
8b4b5ffa72  slight rework  2022-05-24 14:38:37 -06:00
48aab2babe  ressurect ctc code gen with some cool new ideas  2022-05-24 14:02:33 -06:00
65b441d74e  transformer diffusion  2022-05-24 14:02:05 -06:00
1e1bbe1a27  whoops  2022-05-23 12:28:36 -06:00
560b83e770  default to residual encoder  2022-05-23 12:24:00 -06:00
f432bdf7ae  deeper resblock encoder  2022-05-23 11:46:40 -06:00
dc471f5c6d  residual features  2022-05-23 09:58:30 -06:00
1f521d6a1d  add reconstruction loss to m2v  2022-05-23 09:28:41 -06:00
2270c89fdc  .  2022-05-23 08:47:15 -06:00
40f844657b  tolong  2022-05-23 08:27:54 -06:00
10f4a742bd  reintroduce attention masks  2022-05-23 08:16:04 -06:00
68c0afcbcc  m2v frequency masking  2022-05-23 07:04:12 -06:00
4093e38717  revert flat diffusion back...  2022-05-22 23:10:58 -06:00
8f28404645  another fix  2022-05-22 21:32:43 -06:00
41809a6330  Add 8x dim reductor  2022-05-22 20:23:16 -06:00
1095248caf  Revert "retest"  2022-05-22 19:23:01 -06:00
            This reverts commit ed7768c73b.
ed7768c73b  retest  2022-05-22 16:30:09 -06:00
2dd0b9e6e9  mel_head should be optional  2022-05-22 12:25:45 -06:00
0c60f22197  fix unused parameters  2022-05-22 08:16:31 -06:00
57d6f6d366  Big rework of flat_diffusion  2022-05-22 08:09:33 -06:00
            Back to the drawing board, boys. Time to waste some resources catching bugs....
be937d202e  new attempt  2022-05-20 17:04:22 -06:00
968660c248  another update  2022-05-20 11:25:00 -06:00
28f950b7d3  fix  2022-05-20 11:18:52 -06:00
b317c68ac9  fix  2022-05-20 11:12:53 -06:00
3121bc4e43  flat diffusion  2022-05-20 11:01:48 -06:00
e9fb2ead9a  m2v stuff  2022-05-20 11:01:17 -06:00
c9c16e3b01  misc updates  2022-05-19 13:39:32 -06:00
10378fc37f  make codebooks specifiable  2022-05-18 11:07:12 -06:00
efc2657b48  fiddle with init  2022-05-18 10:56:01 -06:00
208a703080  use gelu act  2022-05-18 09:34:01 -06:00
b2b37453df  make the codebook bigger  2022-05-17 20:58:56 -06:00
9a9c3cafba  Make feature encoder a bit more descriptive  2022-05-17 18:14:52 -06:00
ee364f4eeb  just take the mean...  2022-05-17 18:09:23 -06:00
6130391a85  fix div  2022-05-17 18:04:20 -06:00
7213ad2b89  Do grad reduction  2022-05-17 17:59:40 -06:00
7c82e18c6c  darn mpi  2022-05-17 17:16:09 -06:00
88ec0512f7  Scale losses  2022-05-17 17:12:20 -06:00
a6397ce84a  Fix incorrect projections  2022-05-17 16:53:52 -06:00
c37fc3b4ed  m2v grad norm groups  2022-05-17 16:29:36 -06:00
c1bdb4f9a1  degrade gumbel softmax over time  2022-05-17 16:23:04 -06:00
3853f37257  stable layernorm  2022-05-17 16:07:03 -06:00
519151d83f  m2v  2022-05-17 15:37:59 -06:00
d1de94d75c  Stash mel2vec work (gonna throw it all away..)  2022-05-17 12:35:01 -06:00
ee218ab9b7  uv3  2022-05-13 17:57:47 -06:00
545453077e  uv3  2022-05-09 15:36:22 -06:00
96a5cc66ee  uv3  2022-05-09 15:35:51 -06:00
b42b4e18de  clean up unified voice  2022-05-09 14:45:49 -06:00
            - remove unused code
            - fix inference model to use the terms "prior" and "posterior" to properly define the modeling order (they were inverted before)
            - default some settings I never intend to change in the future
7812c23c7a  revert fill_gaps back to old masking behavior  2022-05-08 00:10:19 -06:00
58ed27d7a8  new gap_filler  2022-05-07 12:44:23 -06:00
6c8032b4be  more work  2022-05-06 21:56:49 -06:00
79543e5488  Simpler form of the wavegen model  2022-05-06 16:37:04 -06:00
d8925ccde5  few things with gap filling  2022-05-06 14:33:44 -06:00
b13d983c24  and mel_head  2022-05-06 00:25:27 -06:00
d5fb79564a  remove mel_pred  2022-05-06 00:24:05 -06:00
e9bb692490  fixed aligned_latent  2022-05-06 00:20:21 -06:00
1609101a42  musical gap filler  2022-05-05 16:47:08 -06:00
d66ab2d28c  Remove unused waveform_gens  2022-05-04 21:06:54 -06:00
c42c53e75a  Add a trainable network for converting a normal distribution into a latent space  2022-05-02 09:47:30 -06:00
ab219fbefb  output variance  2022-05-02 00:10:33 -06:00
3b074aac34  add checkpointing  2022-05-02 00:07:42 -06:00
ae5f934ea1  diffwave  2022-05-02 00:05:04 -06:00
b712d3b72b  break out get_conditioning_latent from unified_voice  2022-05-01 23:04:44 -06:00
afa2df57c9  gen3  2022-04-30 10:41:38 -06:00
8aa6651fc7  fix surrogate loss return in waveform_gen2  2022-04-28 10:10:11 -06:00
f02b01bd9d  reverse univnet classifier  2022-04-20 21:37:55 -06:00
9df85c902e  New gen2  2022-04-20 21:37:34 -06:00
            Which is basically a autoencoder with a giant diffusion appendage attached
b4549eed9f  uv2 fix  2022-04-20 00:27:38 -06:00
24fdafd855  fix2  2022-04-20 00:03:29 -06:00
0af0051399  fix  2022-04-20 00:01:57 -06:00
419f4d37bd  gen2 music  2022-04-19 23:38:37 -06:00
8fe0dff33c  support tts typing  2022-04-16 23:36:57 -06:00
546ecd5aeb  music!  2022-04-15 21:21:37 -06:00
8ea5c307fb  Fixes for training the diffusion model on autoregressive inputs  2022-04-11 11:02:44 -06:00
a3622462c1  Change latent_conditioner back  2022-04-11 09:00:13 -06:00
19ca5b26c1  Remove flat0 and move it into flat  2022-04-10 21:01:59 -06:00
81c952a00a  undo relative  2022-04-08 16:32:52 -06:00
944b4c3335  more undos  2022-04-08 16:31:08 -06:00
032983e2ed  fix bug and allow position encodings to be trained separately from the rest of the model  2022-04-08 16:26:01 -06:00
09ab1aa9bc  revert rotary embeddings work  2022-04-08 16:18:35 -06:00
            I'm not really sure that this is going to work. I'd rather explore re-using what I've already trained
2fb9ffb0aa  Align autoregressive text using start and stop tokens  2022-04-08 09:41:59 -06:00
e634996a9c  autoregressive_codegen: support key_value caching for faster inference  2022-04-07 23:08:46 -07:00
7c578eb59b  Fix inference in new autoregressive_codegen  2022-04-07 21:22:46 -06:00
3f8d7955ef  unified_voice with rotary embeddings  2022-04-07 20:11:14 -06:00
71b73db044  clean up  2022-04-07 11:34:10 -06:00
6fc4f49e86  some dumb stuff  2022-04-07 11:32:34 -06:00
305dc95e4b  cg2  2022-04-06 21:24:36 -06:00
e011166dd6  autoregressive_codegen r3  2022-04-06 21:04:23 -06:00
37bdfe82b2  Modify x_transformers to do checkpointing and use relative positional biases  2022-04-06 00:35:29 -06:00
cdd12ff46c  Add code validation to autoregressive_codegen  2022-04-04 09:51:41 -06:00
99de63a922  man I'm really on it tonight....  2022-04-02 22:01:33 -06:00
a4bdc80933  moikmadsf  2022-04-02 21:59:50 -06:00
1cf20b7337  sdfds  2022-04-02 21:58:09 -06:00
b6afc4d542  dsfa  2022-04-02 21:57:00 -06:00
4c6bdfc9e2  get rid of relative position embeddings, which do not work with DDP & checkpointing  2022-04-02 21:55:32 -06:00
b6d62aca5d  add inference model on top of codegen  2022-04-02 21:25:10 -06:00
2b6ff09225  autoregressive_codegen v1  2022-04-02 15:07:39 -06:00
00767219fc  undo latent converter change  2022-04-01 20:46:27 -06:00
55c86e02c7  Flat fix  2022-04-01 19:13:33 -06:00
8623c51902  fix bug  2022-04-01 16:11:34 -06:00
f6a8b0a5ca  prep flat0 for feeding from autoregressive_latent_converter  2022-04-01 15:53:45 -06:00
3e97abc8a9  update flat0 to break out timestep-independent inference steps  2022-04-01 14:38:53 -06:00
a6181a489b  Fix loss gapping caused by poor gradients into mel_pred  2022-03-26 22:49:14 -06:00
1feade23ff  support x-transformers in text_voice_clip and support relative positional embeddings  2022-03-26 22:48:10 -06:00
6909f196b4  make code pred returns optional  2022-03-26 08:33:30 -06:00
2a29a71c37  attempt to force meaningful codes by adding a surrogate loss  2022-03-26 08:31:40 -06:00
45804177b8  more stuff  2022-03-25 00:03:18 -06:00
d4218d8443  mods  2022-03-24 23:31:20 -06:00
a15970dd97  disable checkpointing in conditioning encoder  2022-03-24 11:49:04 -06:00
cc5fc91562  flat0 work  2022-03-24 11:46:53 -06:00
b0d2827fad  flat0  2022-03-24 11:30:40 -06:00
8707a3e0c3  drop full layers in layerdrop, not half layers  2022-03-23 17:15:08 -06:00
57da6d0ddf  more simplifications  2022-03-22 11:46:03 -06:00
f3f391b372  undo sandwich  2022-03-22 11:43:24 -06:00
927731f3b4  tts9: fix position embeddings snafu  2022-03-22 11:41:32 -06:00
536511fc4b  unified_voice: relative position encodings  2022-03-22 11:41:13 -06:00
5405ce4363  fix flat  2022-03-22 11:39:39 -06:00
e47a759ed8  .......  2022-03-21 17:22:35 -06:00
cc4c9faf9a  resolve more issues  2022-03-21 17:20:05 -06:00
9e97cd800c  take the conditioning mean rather than the first element  2022-03-21 16:58:03 -06:00
9c7598dc9a  fix conditioning_free signal  2022-03-21 15:29:17 -06:00
2a65c982ca  dont double nest checkpointing  2022-03-21 15:27:51 -06:00
723f324eda  Make it even better  2022-03-21 14:50:59 -06:00
e735d8e1fa  unified_voice fixes  2022-03-21 14:44:00 -06:00
1ad18d29a8  Flat fixes  2022-03-21 14:43:52 -06:00
26dcf7f1a2  r2 of the flat diffusion  2022-03-21 11:40:43 -06:00
c14fc003ed  flat diffusion  2022-03-17 17:45:27 -06:00
428911cd4d  flat diffusion network  2022-03-17 10:53:56 -06:00
d186414566  More spring cleaning  2022-03-16 12:04:00 -06:00
8b376e63d9  More improvements  2022-03-16 10:16:34 -06:00