Commit Graph

331 Commits

Author SHA1 Message Date
James Betker
e23c322089 uhh2.0 2022-07-12 22:48:46 -06:00
James Betker
ebfe72d502 fix obo 2022-07-12 22:28:20 -06:00
James Betker
f46d6645da tfdpcv5 updates 2022-07-12 21:48:18 -06:00
James Betker
7b4dcbf136 Support causal diffusion! 2022-07-08 12:30:05 -06:00
James Betker
28d5b6a80a optionally disable checkpointing in x_transformers (and make it so with the cond_encoder in tfdpc_v5) 2022-07-06 16:55:57 -06:00
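The commit above makes gradient checkpointing optional. Below is a minimal sketch of what such a switch typically looks like in plain PyTorch, assuming a simple module with a `use_checkpointing` flag; the flag name and wrapper are illustrative, not the actual x_transformers / tfdpc_v5 code.

```python
# Minimal sketch of an optional gradient-checkpointing switch. The flag name
# `use_checkpointing` and this block are illustrative assumptions, not the
# actual x_transformers / tfdpc_v5 implementation.
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint


class CondEncoderBlock(nn.Module):
    def __init__(self, dim: int, use_checkpointing: bool = True):
        super().__init__()
        self.use_checkpointing = use_checkpointing
        self.net = nn.Sequential(
            nn.Linear(dim, dim * 4), nn.GELU(), nn.Linear(dim * 4, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Checkpointing trades recompute for memory; it is worth disabling
        # when the module is frozen or the recompute cost is not justified.
        if self.use_checkpointing and self.training:
            return checkpoint(self.net, x, use_reentrant=False)
        return self.net(x)
```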
James Betker
48270272e7 Use corner alignment for linear interpolation in TFDPC and TFD12
I noticed from experimentation that when this is not enabled, the interpolation edges are "sticky",
which is to say there is more variance in the center of the interpolation than at the edges.
2022-07-06 16:45:03 -06:00
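The "sticky edges" behavior described in the commit above can be reproduced with plain `torch.nn.functional.interpolate`; the sketch below is a generic illustration of the `align_corners` flag, not the TFDPC/TFD12 code itself.

```python
# Generic illustration of corner alignment for 1D linear interpolation.
import torch
import torch.nn.functional as F

x = torch.tensor([[[0.0, 1.0, 2.0, 3.0]]])  # (batch, channels, length)

# align_corners=False replicates the endpoint samples near the boundaries, so
# the interpolated sequence changes less at the edges than in the middle
# (the "sticky" edges described above).
print(F.interpolate(x, size=8, mode='linear', align_corners=False))
# tensor([[[0.0000, 0.2500, 0.7500, 1.2500, 1.7500, 2.2500, 2.7500, 3.0000]]])

# align_corners=True pins the first and last samples and spaces the rest
# evenly, so variance is spread uniformly across the interpolated sequence.
print(F.interpolate(x, size=8, mode='linear', align_corners=True))
# tensor([[[0.0000, 0.4286, 0.8571, 1.2857, 1.7143, 2.1429, 2.5714, 3.0000]]])
```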
James Betker
5816a4595e ugh 2022-07-05 11:14:09 -06:00
James Betker
7440e43531 Fix some bugs 2022-07-05 10:37:47 -06:00
James Betker
2b128730e7 Improve conditioning separation logic 2022-07-05 10:30:28 -06:00
James Betker
802998674e Fix another edge case 2022-07-04 16:47:57 -06:00
James Betker
455943779b Fix bug in conditioning segment fetching 2022-07-04 08:16:14 -06:00
James Betker
e5859acff7 Rework tfdpc_v5 further.. 2022-07-03 18:19:01 -06:00
James Betker
58f26b1900 mods to support cheater ar prior in tfd12 2022-07-03 17:54:22 -06:00
James Betker
286918c581 conditioning masking is random 2022-07-01 21:43:30 -06:00
James Betker
1953887122 Add conditoning_masking to tfdpcv5 2022-07-01 00:44:40 -06:00
James Betker
f5c246b879 AR cheater gen & centroid injector 2022-06-28 23:52:54 -06:00
James Betker
ee3b426dae Fix tfdpc_v5 conditioning 2022-06-27 14:06:09 -06:00
James Betker
0278576f37 Update mdf to be compatible with cheater_gen 2022-06-27 10:11:23 -06:00
James Betker
69b614e08a tfdpc5 2022-06-26 19:46:57 -06:00
James Betker
d0f2560396 fix 2022-06-25 21:22:08 -06:00
James Betker
f12f0200d6 tfdpc_v4
parametric efficiency improvements, and let's try feeding the timestep into the conditioning encoder
2022-06-25 21:17:00 -06:00
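Feeding the timestep into the conditioning encoder, as the commit above describes, usually means embedding the diffusion timestep and adding it to the conditioning tokens before encoding. The sketch below shows that general pattern; the class and function names are hypothetical, not the tfdpc_v4 implementation.

```python
# Hypothetical sketch: inject the diffusion timestep into a conditioning
# encoder via a sinusoidal embedding. Names are illustrative only.
import math
import torch
import torch.nn as nn


def timestep_embedding(t: torch.Tensor, dim: int) -> torch.Tensor:
    """Standard sinusoidal embedding of integer diffusion timesteps."""
    half = dim // 2
    freqs = torch.exp(-math.log(10000) * torch.arange(half, dtype=torch.float32) / half).to(t.device)
    args = t.float()[:, None] * freqs[None]
    return torch.cat([torch.cos(args), torch.sin(args)], dim=-1)


class ConditioningEncoder(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.time_proj = nn.Linear(dim, dim)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=8, batch_first=True),
            num_layers=2)

    def forward(self, cond: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # cond: (batch, seq, dim); t: (batch,) integer timesteps.
        t_emb = self.time_proj(timestep_embedding(t, cond.shape[-1]))
        return self.encoder(cond + t_emb[:, None])
```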
James Betker
42de09d983 tfdpc_v3 inference 2022-06-25 21:16:13 -06:00
James Betker
b210e5025c Le encoder shalt always be frozen. 2022-06-23 11:34:46 -06:00
James Betker
3330fa2c10 tfdpc_v3 2022-06-20 15:37:48 -06:00
James Betker
2209b4f301 tfdpc2 2022-06-20 09:36:21 -06:00
James Betker
0e5a3f4712 We don't need that encoder either.. 2022-06-19 23:24:42 -06:00
James Betker
56c4a00e71 whoops 2022-06-19 23:22:30 -06:00
James Betker
a659cd865c All the stuff needed for cheater latent generation 2022-06-19 23:12:52 -06:00
James Betker
c5ea2bee52 More mods 2022-06-19 22:30:46 -06:00
James Betker
691ed196da fix codes 2022-06-19 21:29:40 -06:00
James Betker
b9f53a3ff9 don't populate gn5 2022-06-19 21:07:13 -06:00
James Betker
a5d2123daa more cleanup 2022-06-19 21:04:51 -06:00
James Betker
fef1066687 couple more alterations 2022-06-19 20:58:24 -06:00
James Betker
02ead8c05c update params 2022-06-19 20:47:06 -06:00
James Betker
ff8b0533ac gen3 waveform 2022-06-19 19:23:48 -06:00
James Betker
b19b0a74da Merge branch 'master' of https://github.com/neonbjb/DL-Art-School 2022-06-19 18:56:27 -06:00
James Betker
0b9588708c rrdb-inspired waveform gen 2022-06-19 18:56:17 -06:00
James Betker
f425afc965 permute codes 2022-06-19 18:00:30 -06:00
James Betker
8c8efbe131 fix code_emb 2022-06-19 17:54:08 -06:00
James Betker
c000e489fa . 2022-06-17 09:40:11 -06:00
James Betker
7ca532c7cc handle unused encoder parameters 2022-06-17 09:37:07 -06:00
James Betker
e025183bfb and the other ones..
really need to unify this file better.
2022-06-17 09:30:25 -06:00
James Betker
3081c893d4 Don't augment grad scale when the grad doesn't exist! 2022-06-17 09:27:04 -06:00
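A minimal sketch of the guard described in the commit above: only touch per-parameter gradient scaling when a gradient was actually produced. The helper name and loop are assumptions, not the DLAS trainer code.

```python
# Illustrative guard: skip parameters that produced no gradient.
def scale_existing_grads(model, grad_scale: float):
    for p in model.parameters():
        if p.grad is None:
            # Parameters that did not participate in the forward pass
            # (e.g. a frozen or unused encoder) have no .grad to scale.
            continue
        p.grad.mul_(grad_scale)
```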
James Betker
3efd64ed7a expand codes before the code converters for cheater latents 2022-06-17 09:20:29 -06:00
James Betker
f70b16214d allow more interesting code interpolation 2022-06-17 09:12:44 -06:00
James Betker
87a86ae6a8 integrate tfd12 with cheater network 2022-06-17 09:08:56 -06:00
James Betker
9d7ce42630 add tanh to the end of the latent thingy 2022-06-16 20:31:23 -06:00
James Betker
b7758f25a9 and fix 2022-06-16 20:06:27 -06:00
James Betker
e3287c9d95 Reduce severity of gpt_music reduction 2022-06-16 20:05:05 -06:00
James Betker
41abf776d9 better downsampler 2022-06-16 15:19:08 -06:00
James Betker
28d95e3141 gptmusic work 2022-06-16 15:09:47 -06:00
James Betker
781c43c1fc Clean up old TFD models 2022-06-15 16:49:06 -06:00
James Betker
157d5d56c3 another debugging fix 2022-06-15 09:19:34 -06:00
James Betker
3757ff9526 uv back to tortoise days 2022-06-15 09:04:41 -06:00
James Betker
b51ff8a176 whoops! 2022-06-15 09:01:20 -06:00
James Betker
ff5c03b460 tfd12 with ar prior 2022-06-15 08:58:02 -06:00
James Betker
3f10ce275b some tfd12 fixes to support multivae 2022-06-14 23:53:50 -06:00
James Betker
fae05229ec asdf 2022-06-14 21:52:22 -06:00
James Betker
6bc19d1328 multivqvae tfd12 2022-06-14 15:58:18 -06:00
James Betker
d29ea0df5e Update ADF to be compatible with classical mel spectrograms 2022-06-14 15:19:52 -06:00
James Betker
c68669e1e1 uv2 add alignment head 2022-06-14 15:18:58 -06:00
James Betker
47330d603b Pretrained vqvae option for tfd12.. 2022-06-13 11:19:33 -06:00
James Betker
1fde3e5a08 more reworks 2022-06-13 08:40:23 -06:00
James Betker
fc3a7ed5e3 tfd12 2022-06-12 18:09:59 -06:00
James Betker
0c95be1624 Fix MDF evaluator for current generation of 2022-06-12 14:41:06 -06:00
James Betker
11e70dde14 update tfd11 2022-06-11 17:53:27 -06:00
James Betker
5c6c8f6904 back to convs? 2022-06-11 15:26:10 -06:00
James Betker
b684622b06 tfd10 2022-06-11 14:06:19 -06:00
James Betker
2a787ec910 more mods 2022-06-11 11:44:33 -06:00
James Betker
999a140f9f whoops 2022-06-11 11:22:34 -06:00
James Betker
00b9f332ee rework arch 2022-06-11 11:17:16 -06:00
James Betker
36b5e89a69 Rework the diet blocks a bit 2022-06-11 08:28:10 -06:00
James Betker
0dd3883662 one last update.. 2022-06-11 08:06:29 -06:00
James Betker
41170f97e9 one more adjustment 2022-06-11 08:01:46 -06:00
James Betker
df0cdf1a4f tfd9 returns with some optimizations 2022-06-11 08:00:09 -06:00
James Betker
acfe9cf880 fp16 2022-06-10 22:39:15 -06:00
James Betker
aca9024d9b qcodes 2022-06-10 16:23:08 -06:00
James Betker
dca16e6447 . 2022-06-10 15:35:36 -06:00
James Betker
6d85fe05f6 :/ oh well. 2022-06-10 15:17:41 -06:00
James Betker
33178e89c4 harharhack 2022-06-10 15:13:24 -06:00
James Betker
7198bd8bd0 forgot other customizations I want to keep 2022-06-10 15:09:05 -06:00
James Betker
8f40108f5b let's try a different tack 2022-06-10 14:51:59 -06:00
James Betker
2158383fa4 Revert previous changes 2022-06-10 14:34:05 -06:00
James Betker
89bd40d39f eval bug fix 2022-06-10 13:51:06 -06:00
James Betker
84469f3538 get rid of encoder checkpointing 2022-06-10 10:50:34 -06:00
James Betker
97b32dd39d try to make tfd8 be able to be trained e2e in quantizer mode 2022-06-10 10:40:56 -06:00
James Betker
e78c4b422c tfd8 2022-06-10 09:24:41 -06:00
James Betker
d98b895307 loss aware fix and report gumbel temperature 2022-06-09 21:56:47 -06:00
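Reporting the Gumbel temperature, as mentioned in the commit above, typically means returning the annealed tau alongside the quantized output so the trainer can log it. A sketch under those assumptions follows; the linear schedule and the `gumbel_quantize` helper are hypothetical, only `F.gumbel_softmax` is standard PyTorch.

```python
# Illustrative Gumbel-softmax quantization with an annealed, reported
# temperature. Schedule and names are assumptions, not the actual quantizer.
import torch
import torch.nn.functional as F


def gumbel_quantize(logits: torch.Tensor, step: int,
                    tau_start: float = 2.0, tau_min: float = 0.5,
                    anneal_steps: int = 10000):
    # Linearly anneal the temperature, then clamp at the floor.
    tau = max(tau_min, tau_start - (tau_start - tau_min) * step / anneal_steps)
    one_hot = F.gumbel_softmax(logits, tau=tau, hard=True, dim=-1)
    # Returning tau lets the training loop log it alongside the losses.
    return one_hot, tau
```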
James Betker
47b34f5cb9 mup work checkin 2022-06-09 21:15:09 -06:00
James Betker
e67e82be2d misc 2022-06-09 21:14:48 -06:00
James Betker
16936881e5 allow freezing the upper quantizer 2022-06-08 18:30:22 -06:00
James Betker
43f225c35c debug gumbel temperature 2022-06-08 12:12:08 -06:00
James Betker
91be38cba3 . 2022-06-08 11:54:46 -06:00
James Betker
dee2b72786 checkpointing bugs, smh 2022-06-08 11:53:10 -06:00
James Betker
c61cd64bc9 network updates 2022-06-08 09:26:59 -06:00
James Betker
5a54d7db11 unet with ar prior 2022-06-07 17:52:36 -06:00
James Betker
5028703b3d ci not required 2022-06-06 09:26:25 -06:00
James Betker
08597bfaf5 fix 2022-06-06 09:21:58 -06:00
James Betker
49568ee16f some updates 2022-06-06 09:13:47 -06:00
James Betker
602df0abbc revert changes to dietattentionblock 2022-06-05 10:06:17 -06:00