6676c89c0e  I sucked off the hypothetical wizard again: just using BNB's ADAM optimizer nets HUGE savings, but I don't know the output costs; will need to test  (2023-02-23 02:42:17 +00:00)
4427d7fb84  initial conversion (errors out)  (2023-02-22 23:07:05 +00:00)
James Betker:
  9502e0755e  ugh  (2022-10-10 12:15:51 -06:00)
  fce2c8f5db  and listify them  (2022-10-10 12:13:49 -06:00)
  3cf78e3c44  train mel head even when not  (2022-10-10 12:10:56 -06:00)
  cc74a43675  Checkin  (2022-10-10 11:30:20 -06:00)
  ff5c03b460  tfd12 with ar prior  (2022-06-15 08:58:02 -06:00)
  c42c53e75a  Add a trainable network for converting a normal distribution into a latent space  (2022-05-02 09:47:30 -06:00)
  a3622462c1  Change latent_conditioner back  (2022-04-11 09:00:13 -06:00)
  19ca5b26c1  Remove flat0 and move it into flat  (2022-04-10 21:01:59 -06:00)
  8707a3e0c3  drop full layers in layerdrop, not half layers  (2022-03-23 17:15:08 -06:00)
  57da6d0ddf  more simplifications  (2022-03-22 11:46:03 -06:00)
  f3f391b372  undo sandwich  (2022-03-22 11:43:24 -06:00)
  5405ce4363  fix flat  (2022-03-22 11:39:39 -06:00)
  cc4c9faf9a  resolve more issues  (2022-03-21 17:20:05 -06:00)
  9e97cd800c  take the conditioning mean rather than the first element  (2022-03-21 16:58:03 -06:00)
  9c7598dc9a  fix conditioning_free signal  (2022-03-21 15:29:17 -06:00)
  723f324eda  Make it even better  (2022-03-21 14:50:59 -06:00)
  1ad18d29a8  Flat fixes  (2022-03-21 14:43:52 -06:00)
  26dcf7f1a2  r2 of the flat diffusion  (2022-03-21 11:40:43 -06:00)
  c14fc003ed  flat diffusion  (2022-03-17 17:45:27 -06:00)
  428911cd4d  flat diffusion network  (2022-03-17 10:53:56 -06:00)