Author | Commit | Message | Date
James Betker | aa653115f1 | tfd3 | 2022-05-26 10:09:11 -06:00
James Betker | 36c68692a6 | forgot to add rotary embeddings | 2022-05-26 09:25:42 -06:00
James Betker | 8ce48f04ff | transformer diffusion 2 | 2022-05-26 09:08:35 -06:00
James Betker | 56f19a23cd | fix nh | 2022-05-25 12:31:56 -06:00
James Betker | 52a20f3aa3 | und10 | 2022-05-25 12:19:21 -06:00
James Betker | 8b4b5ffa72 | slight rework | 2022-05-24 14:38:37 -06:00
James Betker | 48aab2babe | ressurect ctc code gen with some cool new ideas | 2022-05-24 14:02:33 -06:00
James Betker | 65b441d74e | transformer diffusion | 2022-05-24 14:02:05 -06:00
James Betker | 1e1bbe1a27 | whoops | 2022-05-23 12:28:36 -06:00
James Betker | 560b83e770 | default to residual encoder | 2022-05-23 12:24:00 -06:00
James Betker | f432bdf7ae | deeper resblock encoder | 2022-05-23 11:46:40 -06:00
James Betker | dc471f5c6d | residual features | 2022-05-23 09:58:30 -06:00
James Betker | 1f521d6a1d | add reconstruction loss to m2v | 2022-05-23 09:28:41 -06:00
James Betker | 2270c89fdc | . | 2022-05-23 08:47:15 -06:00
James Betker | 40f844657b | tolong | 2022-05-23 08:27:54 -06:00
James Betker | 10f4a742bd | reintroduce attention masks | 2022-05-23 08:16:04 -06:00
James Betker | 68c0afcbcc | m2v frequency masking | 2022-05-23 07:04:12 -06:00
James Betker | 4093e38717 | revert flat diffusion back... | 2022-05-22 23:10:58 -06:00
James Betker | 8f28404645 | another fix | 2022-05-22 21:32:43 -06:00
James Betker | 41809a6330 | Add 8x dim reductor | 2022-05-22 20:23:16 -06:00
James Betker | 1095248caf | Revert "retest" (This reverts commit ed7768c73b.) | 2022-05-22 19:23:01 -06:00
James Betker | ed7768c73b | retest | 2022-05-22 16:30:09 -06:00
James Betker | 2dd0b9e6e9 | mel_head should be optional | 2022-05-22 12:25:45 -06:00
James Betker | 0c60f22197 | fix unused parameters | 2022-05-22 08:16:31 -06:00
James Betker | 57d6f6d366 | Big rework of flat_diffusion (Back to the drawing board, boys. Time to waste some resources catching bugs....) | 2022-05-22 08:09:33 -06:00
James Betker | be937d202e | new attempt | 2022-05-20 17:04:22 -06:00
James Betker | 968660c248 | another update | 2022-05-20 11:25:00 -06:00
James Betker | 28f950b7d3 | fix | 2022-05-20 11:18:52 -06:00
James Betker | b317c68ac9 | fix | 2022-05-20 11:12:53 -06:00
James Betker | 3121bc4e43 | flat diffusion | 2022-05-20 11:01:48 -06:00
James Betker | e9fb2ead9a | m2v stuff | 2022-05-20 11:01:17 -06:00
James Betker | c9c16e3b01 | misc updates | 2022-05-19 13:39:32 -06:00
James Betker | 10378fc37f | make codebooks specifiable | 2022-05-18 11:07:12 -06:00
James Betker | efc2657b48 | fiddle with init | 2022-05-18 10:56:01 -06:00
James Betker | 208a703080 | use gelu act | 2022-05-18 09:34:01 -06:00
James Betker | b2b37453df | make the codebook bigger | 2022-05-17 20:58:56 -06:00
James Betker | 9a9c3cafba | Make feature encoder a bit more descriptive | 2022-05-17 18:14:52 -06:00
James Betker | ee364f4eeb | just take the mean... | 2022-05-17 18:09:23 -06:00
James Betker | 6130391a85 | fix div | 2022-05-17 18:04:20 -06:00
James Betker | 7213ad2b89 | Do grad reduction | 2022-05-17 17:59:40 -06:00
James Betker | 7c82e18c6c | darn mpi | 2022-05-17 17:16:09 -06:00
James Betker | 88ec0512f7 | Scale losses | 2022-05-17 17:12:20 -06:00
James Betker | a6397ce84a | Fix incorrect projections | 2022-05-17 16:53:52 -06:00
James Betker | c37fc3b4ed | m2v grad norm groups | 2022-05-17 16:29:36 -06:00
James Betker | c1bdb4f9a1 | degrade gumbel softmax over time | 2022-05-17 16:23:04 -06:00
James Betker | 3853f37257 | stable layernorm | 2022-05-17 16:07:03 -06:00
James Betker | 519151d83f | m2v | 2022-05-17 15:37:59 -06:00
James Betker | d1de94d75c | Stash mel2vec work (gonna throw it all away..) | 2022-05-17 12:35:01 -06:00
James Betker | ee218ab9b7 | uv3 | 2022-05-13 17:57:47 -06:00
James Betker | 3d7e2a2846 | fix collection | 2022-05-11 21:50:05 -06:00