Commit Graph

1787 Commits

Author | SHA1 | Message | Date
James Betker | 305dc95e4b | cg2 | 2022-04-06 21:24:36 -06:00
James Betker | e011166dd6 | autoregressive_codegen r3 | 2022-04-06 21:04:23 -06:00
James Betker | 33ef17e9e5 | fix context | 2022-04-06 00:45:42 -06:00
James Betker | 37bdfe82b2 | Modify x_transformers to do checkpointing and use relative positional biases | 2022-04-06 00:35:29 -06:00
James Betker | 09879b434d | bring in x_transformers | 2022-04-06 00:21:58 -06:00
James Betker | 3d916e7687 | Fix evaluation when using multiple batch sizes | 2022-04-05 07:51:09 -06:00
James Betker | 572d137589 | track iteration rate | 2022-04-04 12:33:25 -06:00
James Betker | 4cdb0169d0 | update training data encountered when using force_start_step | 2022-04-04 12:25:00 -06:00
James Betker | cdd12ff46c | Add code validation to autoregressive_codegen | 2022-04-04 09:51:41 -06:00
James Betker | 99de63a922 | man I'm really on it tonight.... | 2022-04-02 22:01:33 -06:00
James Betker | a4bdc80933 | moikmadsf | 2022-04-02 21:59:50 -06:00
James Betker | 1cf20b7337 | sdfds | 2022-04-02 21:58:09 -06:00
James Betker | b6afc4d542 | dsfa | 2022-04-02 21:57:00 -06:00
James Betker | 4c6bdfc9e2 | get rid of relative position embeddings, which do not work with DDP & checkpointing | 2022-04-02 21:55:32 -06:00
James Betker | b6d62aca5d | add inference model on top of codegen | 2022-04-02 21:25:10 -06:00
James Betker | 2b6ff09225 | autoregressive_codegen v1 | 2022-04-02 15:07:39 -06:00
James Betker | 00767219fc | undo latent converter change | 2022-04-01 20:46:27 -06:00
James Betker | 55c86e02c7 | Flat fix | 2022-04-01 19:13:33 -06:00
James Betker | 8623c51902 | fix bug | 2022-04-01 16:11:34 -06:00
James Betker | 035bcd9f6c | fwd fix | 2022-04-01 16:03:07 -06:00
James Betker | f6a8b0a5ca | prep flat0 for feeding from autoregressive_latent_converter | 2022-04-01 15:53:45 -06:00
James Betker | 3e97abc8a9 | update flat0 to break out timestep-independent inference steps | 2022-04-01 14:38:53 -06:00
James Betker | a6181a489b | Fix loss gapping caused by poor gradients into mel_pred | 2022-03-26 22:49:14 -06:00
James Betker | 0070867d0f | inference script for diffusion image models | 2022-03-26 22:48:24 -06:00
James Betker | 1feade23ff | support x-transformers in text_voice_clip and support relative positional embeddings | 2022-03-26 22:48:10 -06:00
James Betker | 9b90472e15 | feed direct inputs into gd | 2022-03-26 08:36:19 -06:00
James Betker | 6909f196b4 | make code pred returns optional | 2022-03-26 08:33:30 -06:00
James Betker | 2a29a71c37 | attempt to force meaningful codes by adding a surrogate loss | 2022-03-26 08:31:40 -06:00
James Betker | 45804177b8 | more stuff | 2022-03-25 00:03:18 -06:00
James Betker | d4218d8443 | mods | 2022-03-24 23:31:20 -06:00
James Betker | 9c79fec734 | update adf | 2022-03-24 21:20:29 -06:00
James Betker | 07731d5491 | Fix ET | 2022-03-24 21:20:22 -06:00
James Betker | a15970dd97 | disable checkpointing in conditioning encoder | 2022-03-24 11:49:04 -06:00
James Betker | cc5fc91562 | flat0 work | 2022-03-24 11:46:53 -06:00
James Betker | b0d2827fad | flat0 | 2022-03-24 11:30:40 -06:00
James Betker | 8707a3e0c3 | drop full layers in layerdrop, not half layers | 2022-03-23 17:15:08 -06:00
James Betker | 57da6d0ddf | more simplifications | 2022-03-22 11:46:03 -06:00
James Betker | f3f391b372 | undo sandwich | 2022-03-22 11:43:24 -06:00
James Betker | 927731f3b4 | tts9: fix position embeddings snafu | 2022-03-22 11:41:32 -06:00
James Betker | 536511fc4b | unified_voice: relative position encodings | 2022-03-22 11:41:13 -06:00
James Betker | be5f052255 | misc | 2022-03-22 11:40:56 -06:00
James Betker | 963f0e9cee | fix unscaler | 2022-03-22 11:40:02 -06:00
James Betker | 5405ce4363 | fix flat | 2022-03-22 11:39:39 -06:00
James Betker | e47a759ed8 | ....... | 2022-03-21 17:22:35 -06:00
James Betker | cc4c9faf9a | resolve more issues | 2022-03-21 17:20:05 -06:00
James Betker | 3692c4cae3 | map vocoder into cpu | 2022-03-21 17:10:57 -06:00
James Betker | 9e97cd800c | take the conditioning mean rather than the first element | 2022-03-21 16:58:03 -06:00
James Betker | 9c7598dc9a | fix conditioning_free signal | 2022-03-21 15:29:17 -06:00
James Betker | 2a65c982ca | dont double nest checkpointing | 2022-03-21 15:27:51 -06:00
James Betker | 723f324eda | Make it even better | 2022-03-21 14:50:59 -06:00