Commit Graph

2014 Commits

Author  SHA1  Message  Date
James Betker  ee3b426dae  Ffix tfdpc_v5 conditioning  2022-06-27 14:06:09 -06:00
James Betker  0278576f37  Update mdf to be compatible with cheater_gen  2022-06-27 10:11:23 -06:00
James Betker  69b614e08a  tfdpc5  2022-06-26 19:46:57 -06:00
James Betker  d0f2560396  fix  2022-06-25 21:22:08 -06:00
James Betker  f12f0200d6  tfdpc_v4 (parametric efficiency improvements and lets try feeding the timestep into the conditioning encoder)  2022-06-25 21:17:00 -06:00
James Betker  42de09d983  tfdpc_v3 inference  2022-06-25 21:16:13 -06:00
James Betker  7a9c4310e8  support reading cheaters directly  2022-06-23 11:39:10 -06:00
James Betker  b210e5025c  Le encoder shalt always be frozen.  2022-06-23 11:34:46 -06:00
James Betker  aeff1a4cc7  divided by zero  2022-06-21 20:26:19 -06:00
James Betker  f0117150d0  produce correct clip_lengths..  2022-06-21 20:21:12 -06:00
James Betker  4a1f3aba31  come on guys... :((  2022-06-21 20:12:54 -06:00
James Betker  fcfb3a1525  fix  2022-06-21 20:09:59 -06:00
James Betker  24e60bd510  report quantile losses for diffusion  2022-06-21 20:04:16 -06:00
James Betker  1394213f1e  allow variable size crops  2022-06-21 19:48:07 -06:00
James Betker  3330fa2c10  tfdpc_v3  2022-06-20 15:37:48 -06:00
James Betker  2209b4f301  tfdpc2  2022-06-20 09:36:21 -06:00
James Betker  0e5a3f4712  We don't need that encoder either..  2022-06-19 23:24:42 -06:00
James Betker  56c4a00e71  whoops  2022-06-19 23:22:30 -06:00
James Betker  a659cd865c  All the stuff needed for cheater latent generation  2022-06-19 23:12:52 -06:00
James Betker  c5ea2bee52  More mods  2022-06-19 22:30:46 -06:00
James Betker  691ed196da  fix codes  2022-06-19 21:29:40 -06:00
James Betker  b9f53a3ff9  don't populate gn5  2022-06-19 21:07:13 -06:00
James Betker  a5d2123daa  more cleanup  2022-06-19 21:04:51 -06:00
James Betker  fef1066687  couple more alterations  2022-06-19 20:58:24 -06:00
James Betker  02ead8c05c  update params  2022-06-19 20:47:06 -06:00
James Betker  ff8b0533ac  gen3 waveform  2022-06-19 19:23:48 -06:00
James Betker  b19b0a74da  Merge branch 'master' of https://github.com/neonbjb/DL-Art-School  2022-06-19 18:56:27 -06:00
James Betker  0b9588708c  rrdb-inspired waveform gen  2022-06-19 18:56:17 -06:00
James Betker  f425afc965  permute codes  2022-06-19 18:00:30 -06:00
James Betker  90b232f965  gen_long_mels  2022-06-19 17:54:37 -06:00
James Betker  8c8efbe131  fix code_emb  2022-06-19 17:54:08 -06:00
James Betker  368dca18b1  mdf fixes + support for tfd-based waveform gen  2022-06-19 15:07:24 -06:00
James Betker  cb7569ee5e  resample real inputs for music_diffusion_fid  2022-06-18 10:40:48 -06:00
James Betker  c000e489fa  .  2022-06-17 09:40:11 -06:00
James Betker  7ca532c7cc  handle unused encoder parameters  2022-06-17 09:37:07 -06:00
James Betker  e025183bfb  and the other ones.. (really need to unify this file better.)  2022-06-17 09:30:25 -06:00
James Betker  3081c893d4  Don't augment grad scale when the grad don't exist!  2022-06-17 09:27:04 -06:00
James Betker  3efd64ed7a  expand codes before the code converters for cheater latents  2022-06-17 09:20:29 -06:00
James Betker  f70b16214d  allow more interesting code interpolation  2022-06-17 09:12:44 -06:00
James Betker  87a86ae6a8  integrate tfd12 with cheater network  2022-06-17 09:08:56 -06:00
James Betker  9d7ce42630  add tanh to the end of the latent thingy  2022-06-16 20:31:23 -06:00
James Betker  b7758f25a9  and fix  2022-06-16 20:06:27 -06:00
James Betker  e3287c9d95  Reduce severity of gpt_music reduction  2022-06-16 20:05:05 -06:00
James Betker  41abf776d9  better downsampler  2022-06-16 15:19:08 -06:00
James Betker  28d95e3141  gptmusic work  2022-06-16 15:09:47 -06:00
James Betker  781c43c1fc  Clean up old TFD models  2022-06-15 16:49:06 -06:00
James Betker  34d9d5f202  adf for ar-latent tfd  2022-06-15 16:41:08 -06:00
James Betker  157d5d56c3  another debugging fix  2022-06-15 09:19:34 -06:00
James Betker  6fc86bbbe7  get rid of unused param  2022-06-15 09:14:06 -06:00
James Betker  3757ff9526  uv back to tortoise days  2022-06-15 09:04:41 -06:00
James Betker  b51ff8a176  whoops!  2022-06-15 09:01:20 -06:00
James Betker  ff5c03b460  tfd12 with ar prior  2022-06-15 08:58:02 -06:00
James Betker  3f10ce275b  some tfd12 fixes to support multivae  2022-06-14 23:53:50 -06:00
James Betker  fae05229ec  asdf  2022-06-14 21:52:22 -06:00
James Betker  804b365d5f  make adf compatible with 7 gpus  2022-06-14 21:49:26 -06:00
James Betker  6bc19d1328  multivqvae tfd12  2022-06-14 15:58:18 -06:00
James Betker  d29ea0df5e  Update ADF to be compatible with classical mel spectrograms  2022-06-14 15:19:52 -06:00
James Betker  c68669e1e1  uv2 add alignment head  2022-06-14 15:18:58 -06:00
James Betker  7ff1fbe2be  channel clipper  2022-06-13 20:37:35 -06:00
James Betker  47330d603b  Pretrained vqvae option for tfd12..  2022-06-13 11:19:33 -06:00
James Betker  1fde3e5a08  more reworks  2022-06-13 08:40:23 -06:00
James Betker  7a36668870  whoops!  2022-06-12 21:11:34 -06:00
James Betker  efabcf5008  When ema is on CPU, only update every 10 steps.  2022-06-12 18:34:58 -06:00
James Betker  fc3a7ed5e3  tfd12  2022-06-12 18:09:59 -06:00
James Betker  798166015a  provide conditioning ijnput as mel_norm  2022-06-12 14:51:56 -06:00
James Betker  0c95be1624  Fix MDF evaluator for current generation of  2022-06-12 14:41:06 -06:00
James Betker  a3da7f186e  add tfd audio diffusion  2022-06-12 13:59:22 -06:00
James Betker  11e70dde14  update tfd11  2022-06-11 17:53:27 -06:00
James Betker  5c6c8f6904  back to convs?  2022-06-11 15:26:10 -06:00
James Betker  b684622b06  tfd10  2022-06-11 14:06:19 -06:00
James Betker  2a787ec910  more mods  2022-06-11 11:44:33 -06:00
James Betker  999a140f9f  whoops  2022-06-11 11:22:34 -06:00
James Betker  00b9f332ee  rework arch  2022-06-11 11:17:16 -06:00
James Betker  36b5e89a69  Rework the diet blocks a bit  2022-06-11 08:28:10 -06:00
James Betker  0dd3883662  one last update..  2022-06-11 08:06:29 -06:00
James Betker  41170f97e9  one more adjustment  2022-06-11 08:01:46 -06:00
James Betker  df0cdf1a4f  tfd9 returns with some optimizations  2022-06-11 08:00:09 -06:00
James Betker  acfe9cf880  fp16  2022-06-10 22:39:15 -06:00
James Betker  aca9024d9b  qcodes  2022-06-10 16:23:08 -06:00
James Betker  38a00f29c0  now theres deprecation warnings, fml  2022-06-10 15:41:39 -06:00
James Betker  561a6b8ff7  damn this sucks  2022-06-10 15:38:59 -06:00
James Betker  0316063e2d  .  2022-06-10 15:37:02 -06:00
James Betker  dca16e6447  .  2022-06-10 15:35:36 -06:00
James Betker  ee2827dee9  Debug warmup state  2022-06-10 15:23:31 -06:00
James Betker  6d85fe05f6  :/ oh well.  2022-06-10 15:17:41 -06:00
James Betker  33178e89c4  harharhack  2022-06-10 15:13:24 -06:00
James Betker  7198bd8bd0  forgot other customizations I want to keep  2022-06-10 15:09:05 -06:00
James Betker  8f40108f5b  lets try a different tact  2022-06-10 14:51:59 -06:00
James Betker  2158383fa4  Revert previous changes  2022-06-10 14:34:05 -06:00
James Betker  89bd40d39f  eval bug fix  2022-06-10 13:51:06 -06:00
James Betker  84469f3538  get rid of encoder checkpointing  2022-06-10 10:50:34 -06:00
James Betker  97b32dd39d  try to make tfd8 be able to be trained e2e in quantizer mode  2022-06-10 10:40:56 -06:00
James Betker  e78c4b422c  tfd8  2022-06-10 09:24:41 -06:00
James Betker  d98b895307  loss aware fix and report gumbel temperature  2022-06-09 21:56:47 -06:00
James Betker  6e57eaa186  fix bug  2022-06-09 21:52:57 -06:00
James Betker  07bdd865dc  some checks  2022-06-09 21:46:32 -06:00
James Betker  34005367fd  setup for partial channel diffusion  2022-06-09 21:41:20 -06:00
James Betker  47b34f5cb9  mup work checkin  2022-06-09 21:15:09 -06:00
James Betker  e67e82be2d  misc  2022-06-09 21:14:48 -06:00
James Betker  16936881e5  allow freezing the upper quantizer  2022-06-08 18:30:22 -06:00