James Betker
89bd40d39f
eval bug fix
2022-06-10 13:51:06 -06:00
James Betker
84469f3538
get rid of encoder checkpointing
2022-06-10 10:50:34 -06:00
James Betker
97b32dd39d
try to make tfd8 be able to be trained e2e in quantizer mode
2022-06-10 10:40:56 -06:00
James Betker
e78c4b422c
tfd8
2022-06-10 09:24:41 -06:00
James Betker
d98b895307
loss aware fix and report gumbel temperature
2022-06-09 21:56:47 -06:00
James Betker
47b34f5cb9
mup work checkin
2022-06-09 21:15:09 -06:00
James Betker
e67e82be2d
misc
2022-06-09 21:14:48 -06:00
James Betker
16936881e5
allow freezing the upper quantizer
2022-06-08 18:30:22 -06:00
James Betker
43f225c35c
debug gumbel temperature
2022-06-08 12:12:08 -06:00
James Betker
91be38cba3
.
2022-06-08 11:54:46 -06:00
James Betker
dee2b72786
checkpointing bugs, smh
2022-06-08 11:53:10 -06:00
James Betker
c61cd64bc9
network updates
2022-06-08 09:26:59 -06:00
James Betker
5a54d7db11
unet with ar prior
2022-06-07 17:52:36 -06:00
James Betker
5028703b3d
ci not required
2022-06-06 09:26:25 -06:00
James Betker
08597bfaf5
fix
2022-06-06 09:21:58 -06:00
James Betker
49568ee16f
some updates
2022-06-06 09:13:47 -06:00
James Betker
602df0abbc
revert changes to dietattentionblock
2022-06-05 10:06:17 -06:00
James Betker
51d1908e94
update
2022-06-05 09:35:43 -06:00
James Betker
f9ebcf11d8
fix2
2022-06-05 01:31:37 -06:00
James Betker
aac92b01b3
fix
2022-06-05 01:27:28 -06:00
James Betker
38d8b17d18
tfd8 gets real verbose grad norm metrics
2022-06-04 23:09:54 -06:00
James Betker
0a9d4d4afc
bunch of new stuff
2022-06-04 22:23:08 -06:00
James Betker
8f8b189025
Support legacy vqvae quantizer in music_quantizer
2022-06-04 10:16:24 -06:00
James Betker
40ba802104
padding
2022-06-03 12:09:59 -06:00
James Betker
581bc7ac5c
udmc update
2022-06-03 12:02:22 -06:00
James Betker
9d8c2bddb1
classical unet for music
2022-06-03 11:03:14 -06:00
James Betker
2f4d990ad1
tfd7
2022-06-02 09:27:40 -06:00
James Betker
b2a83efe50
a few fixes
2022-06-01 16:35:15 -06:00
James Betker
712e0e82f7
fix bug
2022-06-01 14:21:44 -06:00
James Betker
de54be5570
propagate diversity loss
2022-06-01 14:18:50 -06:00
James Betker
4c6ef42b38
freeze quantizer until step
2022-06-01 08:06:05 -06:00
James Betker
64b6ae2f4a
fix
2022-06-01 01:01:32 -06:00
James Betker
1ac02acdc3
tfd7
2022-06-01 00:50:40 -06:00
James Betker
e8cb93a4e9
fix size issues
2022-05-31 21:23:26 -06:00
James Betker
8a1b8e3e62
add checkpointing
2022-05-31 21:09:05 -06:00
James Betker
c0db85bf4f
music quantizer
2022-05-31 21:06:54 -06:00
James Betker
29b55d42a5
one more
2022-05-30 16:33:49 -06:00
James Betker
71cf654957
fix unused parameters
2022-05-30 16:31:40 -06:00
James Betker
f7d237a50a
train quantizer with diffusion
2022-05-30 16:25:33 -06:00
James Betker
136021bf8d
tfd6
2022-05-30 09:09:42 -06:00
James Betker
eab1162d2b
hmm..
2022-05-29 22:32:25 -06:00
James Betker
2e72fddaeb
td_tts_2
2022-05-29 22:22:14 -06:00
James Betker
536c8558ae
fix
2022-05-28 22:32:38 -06:00
James Betker
da367da411
df5
2022-05-28 22:30:23 -06:00
James Betker
6b43915eb8
support projecting to vectors
2022-05-28 22:27:45 -06:00
James Betker
86694aef4e
tfd5
2022-05-28 22:27:04 -06:00
James Betker
b6b4f10e1b
...
2022-05-28 10:59:03 -06:00
James Betker
0d3b831cf9
big fatty
2022-05-28 10:55:43 -06:00
James Betker
490d39b967
some stuff
2022-05-27 11:40:31 -06:00
James Betker
5efeee6b97
fix type bug
2022-05-27 11:19:30 -06:00
James Betker
0659fe3d1e
tfd3 mods
2022-05-27 11:16:26 -06:00
James Betker
bed3df4888
propagate type
2022-05-27 11:12:03 -06:00
James Betker
c46da0285c
Move stuff around
2022-05-27 11:06:58 -06:00
James Betker
9852599b34
tfd5 - with clvp!
2022-05-27 09:49:10 -06:00
James Betker
3db862dd32
adf update
2022-05-27 09:25:53 -06:00
James Betker
8587a18717
fd fix
2022-05-26 20:19:09 -06:00
James Betker
dd13b883ac
td4
2022-05-26 14:56:03 -06:00
James Betker
1dbe0b6b2e
a
2022-05-26 10:13:27 -06:00
James Betker
aa653115f1
tfd3
2022-05-26 10:09:11 -06:00
James Betker
36c68692a6
forgot to add rotary embeddings
2022-05-26 09:25:42 -06:00
James Betker
8ce48f04ff
transformer diffusion 2
2022-05-26 09:08:35 -06:00
James Betker
56f19a23cd
fix nh
2022-05-25 12:31:56 -06:00
James Betker
52a20f3aa3
und10
2022-05-25 12:19:21 -06:00
James Betker
8b4b5ffa72
slight rework
2022-05-24 14:38:37 -06:00
James Betker
48aab2babe
resurrect ctc code gen with some cool new ideas
2022-05-24 14:02:33 -06:00
James Betker
65b441d74e
transformer diffusion
2022-05-24 14:02:05 -06:00
James Betker
1e1bbe1a27
whoops
2022-05-23 12:28:36 -06:00
James Betker
560b83e770
default to residual encoder
2022-05-23 12:24:00 -06:00
James Betker
f432bdf7ae
deeper resblock encoder
2022-05-23 11:46:40 -06:00
James Betker
dc471f5c6d
residual features
2022-05-23 09:58:30 -06:00
James Betker
1f521d6a1d
add reconstruction loss to m2v
2022-05-23 09:28:41 -06:00
James Betker
2270c89fdc
.
2022-05-23 08:47:15 -06:00
James Betker
40f844657b
tolong
2022-05-23 08:27:54 -06:00
James Betker
10f4a742bd
reintroduce attention masks
2022-05-23 08:16:04 -06:00
James Betker
68c0afcbcc
m2v frequency masking
2022-05-23 07:04:12 -06:00
James Betker
4093e38717
revert flat diffusion back...
2022-05-22 23:10:58 -06:00
James Betker
8f28404645
another fix
2022-05-22 21:32:43 -06:00
James Betker
41809a6330
Add 8x dim reductor
2022-05-22 20:23:16 -06:00
James Betker
1095248caf
Revert "retest"
This reverts commit ed7768c73b.
2022-05-22 19:23:01 -06:00
James Betker
ed7768c73b
retest
2022-05-22 16:30:09 -06:00
James Betker
2dd0b9e6e9
mel_head should be optional
2022-05-22 12:25:45 -06:00
James Betker
0c60f22197
fix unused parameters
2022-05-22 08:16:31 -06:00
James Betker
57d6f6d366
Big rework of flat_diffusion
Back to the drawing board, boys. Time to waste some resources catching bugs....
2022-05-22 08:09:33 -06:00
James Betker
be937d202e
new attempt
2022-05-20 17:04:22 -06:00
James Betker
968660c248
another update
2022-05-20 11:25:00 -06:00
James Betker
28f950b7d3
fix
2022-05-20 11:18:52 -06:00
James Betker
b317c68ac9
fix
2022-05-20 11:12:53 -06:00
James Betker
3121bc4e43
flat diffusion
2022-05-20 11:01:48 -06:00
James Betker
e9fb2ead9a
m2v stuff
2022-05-20 11:01:17 -06:00
James Betker
c9c16e3b01
misc updates
2022-05-19 13:39:32 -06:00
James Betker
10378fc37f
make codebooks specifiable
2022-05-18 11:07:12 -06:00
James Betker
efc2657b48
fiddle with init
2022-05-18 10:56:01 -06:00
James Betker
208a703080
use gelu act
2022-05-18 09:34:01 -06:00
James Betker
b2b37453df
make the codebook bigger
2022-05-17 20:58:56 -06:00
James Betker
9a9c3cafba
Make feature encoder a bit more descriptive
2022-05-17 18:14:52 -06:00
James Betker
ee364f4eeb
just take the mean...
2022-05-17 18:09:23 -06:00
James Betker
6130391a85
fix div
2022-05-17 18:04:20 -06:00
James Betker
7213ad2b89
Do grad reduction
2022-05-17 17:59:40 -06:00
James Betker
7c82e18c6c
darn mpi
2022-05-17 17:16:09 -06:00
James Betker
88ec0512f7
Scale losses
2022-05-17 17:12:20 -06:00
James Betker
a6397ce84a
Fix incorrect projections
2022-05-17 16:53:52 -06:00
James Betker
c37fc3b4ed
m2v grad norm groups
2022-05-17 16:29:36 -06:00
James Betker
c1bdb4f9a1
degrade gumbel softmax over time
2022-05-17 16:23:04 -06:00
James Betker
3853f37257
stable layernorm
2022-05-17 16:07:03 -06:00
James Betker
519151d83f
m2v
2022-05-17 15:37:59 -06:00
James Betker
d1de94d75c
Stash mel2vec work (gonna throw it all away..)
2022-05-17 12:35:01 -06:00
James Betker
ee218ab9b7
uv3
2022-05-13 17:57:47 -06:00
James Betker
545453077e
uv3
2022-05-09 15:36:22 -06:00
James Betker
96a5cc66ee
uv3
2022-05-09 15:35:51 -06:00
James Betker
b42b4e18de
clean up unified voice
- remove unused code
- fix inference model to use the terms "prior" and "posterior" to properly define the modeling order (they were inverted before)
- default some settings I never intend to change in the future
2022-05-09 14:45:49 -06:00
James Betker
7812c23c7a
revert fill_gaps back to old masking behavior
2022-05-08 00:10:19 -06:00
James Betker
58ed27d7a8
new gap_filler
2022-05-07 12:44:23 -06:00
James Betker
6c8032b4be
more work
2022-05-06 21:56:49 -06:00
James Betker
79543e5488
Simpler form of the wavegen model
2022-05-06 16:37:04 -06:00
James Betker
d8925ccde5
few things with gap filling
2022-05-06 14:33:44 -06:00
James Betker
b13d983c24
and mel_head
2022-05-06 00:25:27 -06:00
James Betker
d5fb79564a
remove mel_pred
2022-05-06 00:24:05 -06:00
James Betker
e9bb692490
fixed aligned_latent
2022-05-06 00:20:21 -06:00
James Betker
1609101a42
musical gap filler
2022-05-05 16:47:08 -06:00
James Betker
d66ab2d28c
Remove unused waveform_gens
2022-05-04 21:06:54 -06:00
James Betker
c42c53e75a
Add a trainable network for converting a normal distribution into a latent space
2022-05-02 09:47:30 -06:00
James Betker
ab219fbefb
output variance
2022-05-02 00:10:33 -06:00
James Betker
3b074aac34
add checkpointing
2022-05-02 00:07:42 -06:00
James Betker
ae5f934ea1
diffwave
2022-05-02 00:05:04 -06:00
James Betker
b712d3b72b
break out get_conditioning_latent from unified_voice
2022-05-01 23:04:44 -06:00
James Betker
afa2df57c9
gen3
2022-04-30 10:41:38 -06:00
James Betker
8aa6651fc7
fix surrogate loss return in waveform_gen2
2022-04-28 10:10:11 -06:00
James Betker
f02b01bd9d
reverse univnet classifier
2022-04-20 21:37:55 -06:00
James Betker
9df85c902e
New gen2
Which is basically an autoencoder with a giant diffusion appendage attached
2022-04-20 21:37:34 -06:00
James Betker
b4549eed9f
uv2 fix
2022-04-20 00:27:38 -06:00
James Betker
24fdafd855
fix2
2022-04-20 00:03:29 -06:00
James Betker
0af0051399
fix
2022-04-20 00:01:57 -06:00
James Betker
419f4d37bd
gen2 music
2022-04-19 23:38:37 -06:00
James Betker
8fe0dff33c
support tts typing
2022-04-16 23:36:57 -06:00
James Betker
546ecd5aeb
music!
2022-04-15 21:21:37 -06:00
James Betker
8ea5c307fb
Fixes for training the diffusion model on autoregressive inputs
2022-04-11 11:02:44 -06:00
James Betker
a3622462c1
Change latent_conditioner back
2022-04-11 09:00:13 -06:00
James Betker
19ca5b26c1
Remove flat0 and move it into flat
2022-04-10 21:01:59 -06:00
James Betker
81c952a00a
undo relative
2022-04-08 16:32:52 -06:00
James Betker
944b4c3335
more undos
2022-04-08 16:31:08 -06:00
James Betker
032983e2ed
fix bug and allow position encodings to be trained separately from the rest of the model
2022-04-08 16:26:01 -06:00
James Betker
09ab1aa9bc
revert rotary embeddings work
I'm not really sure that this is going to work. I'd rather explore re-using what I've already trained
2022-04-08 16:18:35 -06:00
James Betker
2fb9ffb0aa
Align autoregressive text using start and stop tokens
2022-04-08 09:41:59 -06:00
James Betker
e634996a9c
autoregressive_codegen: support key_value caching for faster inference
2022-04-07 23:08:46 -07:00
James Betker
7c578eb59b
Fix inference in new autoregressive_codegen
2022-04-07 21:22:46 -06:00
James Betker
3f8d7955ef
unified_voice with rotary embeddings
2022-04-07 20:11:14 -06:00
James Betker
71b73db044
clean up
2022-04-07 11:34:10 -06:00
James Betker
6fc4f49e86
some dumb stuff
2022-04-07 11:32:34 -06:00
James Betker
305dc95e4b
cg2
2022-04-06 21:24:36 -06:00
James Betker
e011166dd6
autoregressive_codegen r3
2022-04-06 21:04:23 -06:00