James Betker | 032983e2ed | fix bug and allow position encodings to be trained separately from the rest of the model | 2022-04-08 16:26:01 -06:00
James Betker | 09ab1aa9bc | revert rotary embeddings work | 2022-04-08 16:18:35 -06:00
    I'm not really sure that this is going to work. I'd rather explore re-using what I've already trained
James Betker | 2fb9ffb0aa | Align autoregressive text using start and stop tokens | 2022-04-08 09:41:59 -06:00
James Betker | 628569af7b | Another fix | 2022-04-08 09:41:18 -06:00
James Betker | 423293e518 | fix xtransformers bug | 2022-04-08 09:12:46 -06:00
James Betker | 048f6f729a | remove lightweight_gan | 2022-04-07 23:12:08 -07:00
James Betker | e634996a9c | autoregressive_codegen: support key_value caching for faster inference | 2022-04-07 23:08:46 -07:00
James Betker | d05e162f95 | reformat x_transformers | 2022-04-07 23:08:03 -07:00
James Betker | 7c578eb59b | Fix inference in new autoregressive_codegen | 2022-04-07 21:22:46 -06:00
James Betker | 3f8d7955ef | unified_voice with rotary embeddings | 2022-04-07 20:11:14 -06:00
James Betker | 573e5552b9 | CLVP v1 | 2022-04-07 20:10:57 -06:00
James Betker | 71b73db044 | clean up | 2022-04-07 11:34:10 -06:00
James Betker | 6fc4f49e86 | some dumb stuff | 2022-04-07 11:32:34 -06:00
James Betker | e6387c7613 | Fix eval logic to not run immediately | 2022-04-07 11:29:57 -06:00
James Betker | 305dc95e4b | cg2 | 2022-04-06 21:24:36 -06:00
James Betker | e011166dd6 | autoregressive_codegen r3 | 2022-04-06 21:04:23 -06:00
James Betker | 33ef17e9e5 | fix context | 2022-04-06 00:45:42 -06:00
James Betker | 37bdfe82b2 | Modify x_transformers to do checkpointing and use relative positional biases | 2022-04-06 00:35:29 -06:00
James Betker | 09879b434d | bring in x_transformers | 2022-04-06 00:21:58 -06:00
James Betker | 3d916e7687 | Fix evaluation when using multiple batch sizes | 2022-04-05 07:51:09 -06:00
James Betker | 572d137589 | track iteration rate | 2022-04-04 12:33:25 -06:00
James Betker | 4cdb0169d0 | update training data encountered when using force_start_step | 2022-04-04 12:25:00 -06:00
James Betker | cdd12ff46c | Add code validation to autoregressive_codegen | 2022-04-04 09:51:41 -06:00
James Betker | 99de63a922 | man I'm really on it tonight.... | 2022-04-02 22:01:33 -06:00
James Betker | a4bdc80933 | moikmadsf | 2022-04-02 21:59:50 -06:00
James Betker | 1cf20b7337 | sdfds | 2022-04-02 21:58:09 -06:00
James Betker | b6afc4d542 | dsfa | 2022-04-02 21:57:00 -06:00
James Betker | 4c6bdfc9e2 | get rid of relative position embeddings, which do not work with DDP & checkpointing | 2022-04-02 21:55:32 -06:00
James Betker | b6d62aca5d | add inference model on top of codegen | 2022-04-02 21:25:10 -06:00
James Betker | 2b6ff09225 | autoregressive_codegen v1 | 2022-04-02 15:07:39 -06:00
James Betker | 00767219fc | undo latent converter change | 2022-04-01 20:46:27 -06:00
James Betker | 55c86e02c7 | Flat fix | 2022-04-01 19:13:33 -06:00
James Betker | 8623c51902 | fix bug | 2022-04-01 16:11:34 -06:00
James Betker | 035bcd9f6c | fwd fix | 2022-04-01 16:03:07 -06:00
James Betker | f6a8b0a5ca | prep flat0 for feeding from autoregressive_latent_converter | 2022-04-01 15:53:45 -06:00
James Betker | 3e97abc8a9 | update flat0 to break out timestep-independent inference steps | 2022-04-01 14:38:53 -06:00
James Betker | a6181a489b | Fix loss gapping caused by poor gradients into mel_pred | 2022-03-26 22:49:14 -06:00
James Betker | 0070867d0f | inference script for diffusion image models | 2022-03-26 22:48:24 -06:00
James Betker | 1feade23ff | support x-transformers in text_voice_clip and support relative positional embeddings | 2022-03-26 22:48:10 -06:00
James Betker | 9b90472e15 | feed direct inputs into gd | 2022-03-26 08:36:19 -06:00
James Betker | 6909f196b4 | make code pred returns optional | 2022-03-26 08:33:30 -06:00
James Betker | 2a29a71c37 | attempt to force meaningful codes by adding a surrogate loss | 2022-03-26 08:31:40 -06:00
James Betker | 45804177b8 | more stuff | 2022-03-25 00:03:18 -06:00
James Betker | d4218d8443 | mods | 2022-03-24 23:31:20 -06:00
James Betker | 9c79fec734 | update adf | 2022-03-24 21:20:29 -06:00
James Betker | 07731d5491 | Fix ET | 2022-03-24 21:20:22 -06:00
James Betker | a15970dd97 | disable checkpointing in conditioning encoder | 2022-03-24 11:49:04 -06:00
James Betker | cc5fc91562 | flat0 work | 2022-03-24 11:46:53 -06:00
James Betker | b0d2827fad | flat0 | 2022-03-24 11:30:40 -06:00
James Betker | 8707a3e0c3 | drop full layers in layerdrop, not half layers | 2022-03-23 17:15:08 -06:00