Author | Commit | Message | Date
James Betker | 691ed196da | fix codes | 2022-06-19 21:29:40 -06:00
James Betker | b9f53a3ff9 | don't populate gn5 | 2022-06-19 21:07:13 -06:00
James Betker | a5d2123daa | more cleanup | 2022-06-19 21:04:51 -06:00
James Betker | fef1066687 | couple more alterations | 2022-06-19 20:58:24 -06:00
James Betker | 02ead8c05c | update params | 2022-06-19 20:47:06 -06:00
James Betker | ff8b0533ac | gen3 waveform | 2022-06-19 19:23:48 -06:00
James Betker | b19b0a74da | Merge branch 'master' of https://github.com/neonbjb/DL-Art-School | 2022-06-19 18:56:27 -06:00
James Betker | 0b9588708c | rrdb-inspired waveform gen | 2022-06-19 18:56:17 -06:00
James Betker | f425afc965 | permute codes | 2022-06-19 18:00:30 -06:00
James Betker | 8c8efbe131 | fix code_emb | 2022-06-19 17:54:08 -06:00
James Betker | c000e489fa | . | 2022-06-17 09:40:11 -06:00
James Betker | 7ca532c7cc | handle unused encoder parameters | 2022-06-17 09:37:07 -06:00
James Betker | e025183bfb | and the other ones.. really need to unify this file better. | 2022-06-17 09:30:25 -06:00
James Betker | 3081c893d4 | Don't augment grad scale when the grad don't exist! | 2022-06-17 09:27:04 -06:00
James Betker | 3efd64ed7a | expand codes before the code converters for cheater latents | 2022-06-17 09:20:29 -06:00
James Betker | f70b16214d | allow more interesting code interpolation | 2022-06-17 09:12:44 -06:00
James Betker | 87a86ae6a8 | integrate tfd12 with cheater network | 2022-06-17 09:08:56 -06:00
James Betker | 9d7ce42630 | add tanh to the end of the latent thingy | 2022-06-16 20:31:23 -06:00
James Betker | b7758f25a9 | and fix | 2022-06-16 20:06:27 -06:00
James Betker | e3287c9d95 | Reduce severity of gpt_music reduction | 2022-06-16 20:05:05 -06:00
James Betker | 41abf776d9 | better downsampler | 2022-06-16 15:19:08 -06:00
James Betker | 28d95e3141 | gptmusic work | 2022-06-16 15:09:47 -06:00
James Betker | 781c43c1fc | Clean up old TFD models | 2022-06-15 16:49:06 -06:00
James Betker | 157d5d56c3 | another debugging fix | 2022-06-15 09:19:34 -06:00
James Betker | 3757ff9526 | uv back to tortoise days | 2022-06-15 09:04:41 -06:00
James Betker | b51ff8a176 | whoops! | 2022-06-15 09:01:20 -06:00
James Betker | ff5c03b460 | tfd12 with ar prior | 2022-06-15 08:58:02 -06:00
James Betker | 3f10ce275b | some tfd12 fixes to support multivae | 2022-06-14 23:53:50 -06:00
James Betker | fae05229ec | asdf | 2022-06-14 21:52:22 -06:00
James Betker | 6bc19d1328 | multivqvae tfd12 | 2022-06-14 15:58:18 -06:00
James Betker | d29ea0df5e | Update ADF to be compatible with classical mel spectrograms | 2022-06-14 15:19:52 -06:00
James Betker | c68669e1e1 | uv2 add alignment head | 2022-06-14 15:18:58 -06:00
James Betker | 47330d603b | Pretrained vqvae option for tfd12.. | 2022-06-13 11:19:33 -06:00
James Betker | 1fde3e5a08 | more reworks | 2022-06-13 08:40:23 -06:00
James Betker | fc3a7ed5e3 | tfd12 | 2022-06-12 18:09:59 -06:00
James Betker | 0c95be1624 | Fix MDF evaluator for current generation of | 2022-06-12 14:41:06 -06:00
James Betker | 11e70dde14 | update tfd11 | 2022-06-11 17:53:27 -06:00
James Betker | 5c6c8f6904 | back to convs? | 2022-06-11 15:26:10 -06:00
James Betker | b684622b06 | tfd10 | 2022-06-11 14:06:19 -06:00
James Betker | 2a787ec910 | more mods | 2022-06-11 11:44:33 -06:00
James Betker | 999a140f9f | whoops | 2022-06-11 11:22:34 -06:00
James Betker | 00b9f332ee | rework arch | 2022-06-11 11:17:16 -06:00
James Betker | 36b5e89a69 | Rework the diet blocks a bit | 2022-06-11 08:28:10 -06:00
James Betker | 0dd3883662 | one last update.. | 2022-06-11 08:06:29 -06:00
James Betker | 41170f97e9 | one more adjustment | 2022-06-11 08:01:46 -06:00
James Betker | df0cdf1a4f | tfd9 returns with some optimizations | 2022-06-11 08:00:09 -06:00
James Betker | acfe9cf880 | fp16 | 2022-06-10 22:39:15 -06:00
James Betker | aca9024d9b | qcodes | 2022-06-10 16:23:08 -06:00
James Betker | dca16e6447 | . | 2022-06-10 15:35:36 -06:00
James Betker | 6d85fe05f6 | :/ oh well. | 2022-06-10 15:17:41 -06:00
James Betker | 33178e89c4 | harharhack | 2022-06-10 15:13:24 -06:00
James Betker | 7198bd8bd0 | forgot other customizations I want to keep | 2022-06-10 15:09:05 -06:00
James Betker | 8f40108f5b | lets try a different tact | 2022-06-10 14:51:59 -06:00
James Betker | 2158383fa4 | Revert previous changes | 2022-06-10 14:34:05 -06:00
James Betker | 89bd40d39f | eval bug fix | 2022-06-10 13:51:06 -06:00
James Betker | 84469f3538 | get rid of encoder checkpointing | 2022-06-10 10:50:34 -06:00
James Betker | 97b32dd39d | try to make tfd8 be able to be trained e2e in quantizer mode | 2022-06-10 10:40:56 -06:00
James Betker | e78c4b422c | tfd8 | 2022-06-10 09:24:41 -06:00
James Betker | d98b895307 | loss aware fix and report gumbel temperature | 2022-06-09 21:56:47 -06:00
James Betker | 47b34f5cb9 | mup work checkin | 2022-06-09 21:15:09 -06:00
James Betker | e67e82be2d | misc | 2022-06-09 21:14:48 -06:00
James Betker | 16936881e5 | allow freezing the upper quantizer | 2022-06-08 18:30:22 -06:00
James Betker | 43f225c35c | debug gumbel temperature | 2022-06-08 12:12:08 -06:00
James Betker | 91be38cba3 | . | 2022-06-08 11:54:46 -06:00
James Betker | dee2b72786 | checkpointing bugs, smh | 2022-06-08 11:53:10 -06:00
James Betker | c61cd64bc9 | network updates | 2022-06-08 09:26:59 -06:00
James Betker | 5a54d7db11 | unet with ar prior | 2022-06-07 17:52:36 -06:00
James Betker | 5028703b3d | ci not required | 2022-06-06 09:26:25 -06:00
James Betker | 08597bfaf5 | fix | 2022-06-06 09:21:58 -06:00
James Betker | 49568ee16f | some updates | 2022-06-06 09:13:47 -06:00
James Betker | 602df0abbc | revert changes to dietattentionblock | 2022-06-05 10:06:17 -06:00
James Betker | 51d1908e94 | update | 2022-06-05 09:35:43 -06:00
James Betker | f9ebcf11d8 | fix2 | 2022-06-05 01:31:37 -06:00
James Betker | aac92b01b3 | fix | 2022-06-05 01:27:28 -06:00
James Betker | 38d8b17d18 | tfd8 gets real verbose grad norm metrics | 2022-06-04 23:09:54 -06:00
James Betker | 0a9d4d4afc | bunch of new stuff | 2022-06-04 22:23:08 -06:00
James Betker | 8f8b189025 | Support legacy vqvae quantizer in music_quantizer | 2022-06-04 10:16:24 -06:00
James Betker | 40ba802104 | padding | 2022-06-03 12:09:59 -06:00
James Betker | 581bc7ac5c | udmc update | 2022-06-03 12:02:22 -06:00
James Betker | 9d8c2bddb1 | classical unet for music | 2022-06-03 11:03:14 -06:00
James Betker | 2f4d990ad1 | tfd7 | 2022-06-02 09:27:40 -06:00
James Betker | b2a83efe50 | a few fixes | 2022-06-01 16:35:15 -06:00
James Betker | 712e0e82f7 | fix bug | 2022-06-01 14:21:44 -06:00
James Betker | de54be5570 | propagate diversity loss | 2022-06-01 14:18:50 -06:00
James Betker | 4c6ef42b38 | freeze quantizer until step | 2022-06-01 08:06:05 -06:00
James Betker | 64b6ae2f4a | fix | 2022-06-01 01:01:32 -06:00
James Betker | 1ac02acdc3 | tfd7 | 2022-06-01 00:50:40 -06:00
James Betker | e8cb93a4e9 | fix size issues | 2022-05-31 21:23:26 -06:00
James Betker | 8a1b8e3e62 | add checkpointing | 2022-05-31 21:09:05 -06:00
James Betker | c0db85bf4f | music quantizer | 2022-05-31 21:06:54 -06:00
James Betker | 29b55d42a5 | one more | 2022-05-30 16:33:49 -06:00
James Betker | 71cf654957 | fix unused parameters | 2022-05-30 16:31:40 -06:00
James Betker | f7d237a50a | train quantizer with diffusion | 2022-05-30 16:25:33 -06:00
James Betker | 136021bf8d | tfd6 | 2022-05-30 09:09:42 -06:00
James Betker | eab1162d2b | hmm.. | 2022-05-29 22:32:25 -06:00
James Betker | 2e72fddaeb | td_tts_2 | 2022-05-29 22:22:14 -06:00
James Betker | 536c8558ae | fix | 2022-05-28 22:32:38 -06:00
James Betker | da367da411 | df5 | 2022-05-28 22:30:23 -06:00
James Betker | 6b43915eb8 | support projecting to vectors | 2022-05-28 22:27:45 -06:00
James Betker | 86694aef4e | tfd5 | 2022-05-28 22:27:04 -06:00