Commit Graph

1812 Commits

Author SHA1 Message Date
James Betker
82aad335ba add distributed logic for loss 2022-04-15 09:31:48 -06:00
James Betker
efe12cb816 Update clvp to add masking probabilities in conditioning and to support code inputs 2022-04-15 09:11:23 -06:00
James Betker
3cad1b8114 more fixes 2022-04-11 15:18:44 -06:00
James Betker
6dea7da7a8 another fix 2022-04-11 12:29:43 -06:00
James Betker
f2c172291f fix audio_diffusion_fid for autoregressive latent inputs 2022-04-11 12:08:15 -06:00
James Betker
8ea5c307fb Fixes for training the diffusion model on autoregressive inputs 2022-04-11 11:02:44 -06:00
James Betker
a3622462c1 Change latent_conditioner back 2022-04-11 09:00:13 -06:00
James Betker
03d0b90bda fixes 2022-04-10 21:02:12 -06:00
James Betker
19ca5b26c1 Remove flat0 and move it into flat 2022-04-10 21:01:59 -06:00
James Betker
81c952a00a undo relative 2022-04-08 16:32:52 -06:00
James Betker
944b4c3335 more undos 2022-04-08 16:31:08 -06:00
James Betker
032983e2ed fix bug and allow position encodings to be trained separately from the rest of the model 2022-04-08 16:26:01 -06:00
James Betker
09ab1aa9bc revert rotary embeddings work 2022-04-08 16:18:35 -06:00
I'm not really sure that this is going to work. I'd rather explore re-using what I've already trained
James Betker
2fb9ffb0aa Align autoregressive text using start and stop tokens 2022-04-08 09:41:59 -06:00
James Betker
628569af7b Another fix 2022-04-08 09:41:18 -06:00
James Betker
423293e518 fix xtransformers bug 2022-04-08 09:12:46 -06:00
James Betker
048f6f729a remove lightweight_gan 2022-04-07 23:12:08 -07:00
James Betker
e634996a9c autoregressive_codegen: support key_value caching for faster inference 2022-04-07 23:08:46 -07:00
James Betker
d05e162f95 reformat x_transformers 2022-04-07 23:08:03 -07:00
James Betker
7c578eb59b Fix inference in new autoregressive_codegen 2022-04-07 21:22:46 -06:00
James Betker
3f8d7955ef unified_voice with rotary embeddings 2022-04-07 20:11:14 -06:00
James Betker
573e5552b9 CLVP v1 2022-04-07 20:10:57 -06:00
James Betker
71b73db044 clean up 2022-04-07 11:34:10 -06:00
James Betker
6fc4f49e86 some dumb stuff 2022-04-07 11:32:34 -06:00
James Betker
e6387c7613 Fix eval logic to not run immediately 2022-04-07 11:29:57 -06:00
James Betker
305dc95e4b cg2 2022-04-06 21:24:36 -06:00
James Betker
e011166dd6 autoregressive_codegen r3 2022-04-06 21:04:23 -06:00
James Betker
33ef17e9e5 fix context 2022-04-06 00:45:42 -06:00
James Betker
37bdfe82b2 Modify x_transformers to do checkpointing and use relative positional biases 2022-04-06 00:35:29 -06:00
James Betker
09879b434d bring in x_transformers 2022-04-06 00:21:58 -06:00
James Betker
3d916e7687 Fix evaluation when using multiple batch sizes 2022-04-05 07:51:09 -06:00
James Betker
572d137589 track iteration rate 2022-04-04 12:33:25 -06:00
James Betker
4cdb0169d0 update training data encountered when using force_start_step 2022-04-04 12:25:00 -06:00
James Betker
cdd12ff46c Add code validation to autoregressive_codegen 2022-04-04 09:51:41 -06:00
James Betker
99de63a922 man I'm really on it tonight.... 2022-04-02 22:01:33 -06:00
James Betker
a4bdc80933 moikmadsf 2022-04-02 21:59:50 -06:00
James Betker
1cf20b7337 sdfds 2022-04-02 21:58:09 -06:00
James Betker
b6afc4d542 dsfa 2022-04-02 21:57:00 -06:00
James Betker
4c6bdfc9e2 get rid of relative position embeddings, which do not work with DDP & checkpointing 2022-04-02 21:55:32 -06:00
James Betker
b6d62aca5d add inference model on top of codegen 2022-04-02 21:25:10 -06:00
James Betker
2b6ff09225 autoregressive_codegen v1 2022-04-02 15:07:39 -06:00
James Betker
00767219fc undo latent converter change 2022-04-01 20:46:27 -06:00
James Betker
55c86e02c7 Flat fix 2022-04-01 19:13:33 -06:00
James Betker
8623c51902 fix bug 2022-04-01 16:11:34 -06:00
James Betker
035bcd9f6c fwd fix 2022-04-01 16:03:07 -06:00
James Betker
f6a8b0a5ca prep flat0 for feeding from autoregressive_latent_converter 2022-04-01 15:53:45 -06:00
James Betker
3e97abc8a9 update flat0 to break out timestep-independent inference steps 2022-04-01 14:38:53 -06:00
James Betker
a6181a489b Fix loss gapping caused by poor gradients into mel_pred 2022-03-26 22:49:14 -06:00
James Betker
0070867d0f inference script for diffusion image models 2022-03-26 22:48:24 -06:00
James Betker
1feade23ff support x-transformers in text_voice_clip and support relative positional embeddings 2022-03-26 22:48:10 -06:00