James Betker | 79543e5488 | Simpler form of the wavegen model | 2022-05-06 16:37:04 -06:00
James Betker | d8925ccde5 | few things with gap filling | 2022-05-06 14:33:44 -06:00
James Betker | b13d983c24 | and mel_head | 2022-05-06 00:25:27 -06:00
James Betker | d5fb79564a | remove mel_pred | 2022-05-06 00:24:05 -06:00
James Betker | e9bb692490 | fixed aligned_latent | 2022-05-06 00:20:21 -06:00
James Betker | 1609101a42 | musical gap filler | 2022-05-05 16:47:08 -06:00
James Betker | d66ab2d28c | Remove unused waveform_gens | 2022-05-04 21:06:54 -06:00
James Betker | c42c53e75a | Add a trainable network for converting a normal distribution into a latent space | 2022-05-02 09:47:30 -06:00
James Betker | ab219fbefb | output variance | 2022-05-02 00:10:33 -06:00
James Betker | 3b074aac34 | add checkpointing | 2022-05-02 00:07:42 -06:00
James Betker | ae5f934ea1 | diffwave | 2022-05-02 00:05:04 -06:00
James Betker | b712d3b72b | break out get_conditioning_latent from unified_voice | 2022-05-01 23:04:44 -06:00
James Betker | afa2df57c9 | gen3 | 2022-04-30 10:41:38 -06:00
James Betker | 8aa6651fc7 | fix surrogate loss return in waveform_gen2 | 2022-04-28 10:10:11 -06:00
James Betker | f02b01bd9d | reverse univnet classifier | 2022-04-20 21:37:55 -06:00
James Betker | 9df85c902e | New gen2 | 2022-04-20 21:37:34 -06:00
    Which is basically a autoencoder with a giant diffusion appendage attached
James Betker | b4549eed9f | uv2 fix | 2022-04-20 00:27:38 -06:00
James Betker | 24fdafd855 | fix2 | 2022-04-20 00:03:29 -06:00
James Betker | 0af0051399 | fix | 2022-04-20 00:01:57 -06:00
James Betker | 419f4d37bd | gen2 music | 2022-04-19 23:38:37 -06:00
James Betker | 8fe0dff33c | support tts typing | 2022-04-16 23:36:57 -06:00
James Betker | 546ecd5aeb | music! | 2022-04-15 21:21:37 -06:00
James Betker | 8ea5c307fb | Fixes for training the diffusion model on autoregressive inputs | 2022-04-11 11:02:44 -06:00
James Betker | a3622462c1 | Change latent_conditioner back | 2022-04-11 09:00:13 -06:00
James Betker | 19ca5b26c1 | Remove flat0 and move it into flat | 2022-04-10 21:01:59 -06:00
James Betker | 81c952a00a | undo relative | 2022-04-08 16:32:52 -06:00
James Betker | 944b4c3335 | more undos | 2022-04-08 16:31:08 -06:00
James Betker | 032983e2ed | fix bug and allow position encodings to be trained separately from the rest of the model | 2022-04-08 16:26:01 -06:00
James Betker | 09ab1aa9bc | revert rotary embeddings work | 2022-04-08 16:18:35 -06:00
    I'm not really sure that this is going to work. I'd rather explore re-using what I've already trained
James Betker | 2fb9ffb0aa | Align autoregressive text using start and stop tokens | 2022-04-08 09:41:59 -06:00
James Betker | e634996a9c | autoregressive_codegen: support key_value caching for faster inference | 2022-04-07 23:08:46 -07:00
James Betker | 7c578eb59b | Fix inference in new autoregressive_codegen | 2022-04-07 21:22:46 -06:00
James Betker | 3f8d7955ef | unified_voice with rotary embeddings | 2022-04-07 20:11:14 -06:00
James Betker | 71b73db044 | clean up | 2022-04-07 11:34:10 -06:00
James Betker | 6fc4f49e86 | some dumb stuff | 2022-04-07 11:32:34 -06:00
James Betker | 305dc95e4b | cg2 | 2022-04-06 21:24:36 -06:00
James Betker | e011166dd6 | autoregressive_codegen r3 | 2022-04-06 21:04:23 -06:00
James Betker | 37bdfe82b2 | Modify x_transformers to do checkpointing and use relative positional biases | 2022-04-06 00:35:29 -06:00
James Betker | cdd12ff46c | Add code validation to autoregressive_codegen | 2022-04-04 09:51:41 -06:00
James Betker | 99de63a922 | man I'm really on it tonight.... | 2022-04-02 22:01:33 -06:00
James Betker | a4bdc80933 | moikmadsf | 2022-04-02 21:59:50 -06:00
James Betker | 1cf20b7337 | sdfds | 2022-04-02 21:58:09 -06:00
James Betker | b6afc4d542 | dsfa | 2022-04-02 21:57:00 -06:00
James Betker | 4c6bdfc9e2 | get rid of relative position embeddings, which do not work with DDP & checkpointing | 2022-04-02 21:55:32 -06:00
James Betker | b6d62aca5d | add inference model on top of codegen | 2022-04-02 21:25:10 -06:00
James Betker | 2b6ff09225 | autoregressive_codegen v1 | 2022-04-02 15:07:39 -06:00
James Betker | 00767219fc | undo latent converter change | 2022-04-01 20:46:27 -06:00
James Betker | 55c86e02c7 | Flat fix | 2022-04-01 19:13:33 -06:00
James Betker | 8623c51902 | fix bug | 2022-04-01 16:11:34 -06:00
James Betker | f6a8b0a5ca | prep flat0 for feeding from autoregressive_latent_converter | 2022-04-01 15:53:45 -06:00