a4afad8837
PIP-ified (credit to https://git.ecker.tech/eschmidbauer)
mrq
2023-03-21 15:39:28 +0000
fe24641763
PIP-ified part 1
mrq
2023-03-21 15:38:42 +0000
b0b3d6c626
fix setup.py so pip install does not fail
Emmanuel Schmidbauer
2023-03-20 10:41:57 -0400
efd038c076
forgot the other things that were in the tortoise implementation but not here
mrq
2023-03-17 20:24:17 +0000
64a41fde24
added japanese preprocessor for tokenizer
mrq
2023-03-17 20:03:57 +0000
7b5e0592f8
migrated bitsandbytes out, since AIVC technically uses this more
mrq
2023-03-16 20:42:32 +0000
0db8ebc543
deduce if preprocessing text by checking the JSON itself instead
mrq
2023-03-16 14:41:21 +0000
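The commit above infers whether to preprocess text by inspecting the tokenizer JSON itself rather than relying on a flag. A minimal sketch of one such heuristic, assuming a tokenizers-style `model.vocab` layout; the function name and the non-ASCII check are illustrative, not the repo's actual logic:

```python
import json

def vocab_uses_ipa(tokenizer_json_path: str) -> bool:
    """Guess from the tokenizer vocab whether it expects IPA input.

    Hypothetical heuristic: treat the vocab as IPA-based if any token
    contains non-ASCII characters (IPA symbols fall outside ASCII).
    """
    with open(tokenizer_json_path, encoding="utf-8") as f:
        data = json.load(f)
    vocab = data.get("model", {}).get("vocab", {})
    return any(not token.isascii() for token in vocab)
```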
730f56aa87
some day I'll get a commit right on the first try
mrq
2023-03-16 04:37:49 +0000
730a04708d
added flag to disable preprocessing (because some IPAs will turn into ASCII; implicitly enabled when using the specific ipa.json tokenizer vocab)
mrq
2023-03-16 04:24:32 +0000
bea6174a19
fix for torch2.0.0 suddenly being imposed on us
mrq
2023-03-15 19:24:35 +0000
b253da6e35
save when training completes
mrq
2023-03-15 02:47:12 +0000
3fdf2a63aa
fixes
mrq
2023-03-11 01:18:25 +0000
b5c6acec9e
gutted bloat loggers, now all my useful metrics update per step
mrq
2023-03-10 22:34:37 +0000
bf94744514
I am going to scream
mrq
2023-03-09 22:47:46 +0000
84c8196da5
Shamelessly nabbed from ae80992817
(if this makes a big enough difference in training i'm going to cum)
mrq
2023-03-09 03:39:23 +0000
0ee0f46596
.
mrq
2023-03-09 00:29:25 +0000
6eb7ebf847
silence printing the model because it's just useless noise
mrq
2023-03-04 16:38:24 +0000
71cc43e65c
added a flag (thanks gannybal)
mrq
2023-02-26 14:56:26 +0000
0f04206aa2
added ability to toggle some settings with envvars for later testing without needing to manually edit this file (and some other things like disabling it when a user requests it in the future)
mrq
2023-02-24 23:08:56 +0000
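The envvar-toggle commit above can be sketched with a small helper; the variable names here are hypothetical stand-ins, not the repo's actual settings:

```python
import os

def env_flag(name: str, default: bool = False) -> bool:
    """Read a boolean toggle from the environment.

    Unset -> default; otherwise "1"/"true"/"yes" (case-insensitive) -> True.
    """
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in ("1", "true", "yes")

# Illustrative toggles only; the actual variable names in the repo may differ.
USE_BITSANDBYTES = env_flag("AI_VOICE_USE_BNB", default=False)
VERBOSE_MODEL_PRINT = env_flag("AI_VOICE_PRINT_MODEL", default=False)
```

This lets a user flip behavior per-run without editing the file, which matches the stated motivation of the commit.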
1433b7c0ea
working Embedding override
mrq
2023-02-23 07:28:27 +0000
94aefa3e4c
silence
mrq
2023-02-23 07:25:09 +0000
fd66c4104b
ugh
mrq
2023-02-23 07:18:07 +0000
7bcedca771
I guess I can't easily toggle it outside of here, but it works
mrq
2023-02-23 07:02:06 +0000
0ef8ab6872
shut up
mrq
2023-02-23 06:12:27 +0000
58600274ac
Disabling bitsandbytes optimization by default for now, on the off chance that it actually produces garbage (which shouldn't happen; if training at float16 from a float16 model works fine, then this has to work)
mrq
2023-02-23 03:22:59 +0000
918473807f
Merge pull request 'bitsandbytes' (#2) from bitsandbytes into master
mrq
2023-02-23 03:16:25 +0000
6676c89c0e
I sucked off the hypothetical wizard again; just using BNB's ADAM optimizer nets HUGE savings, but I don't know the output costs, will need to test
mrq
2023-02-23 02:42:17 +0000
01c0941a40
binaries
mrq
2023-02-22 23:09:27 +0000
4427d7fb84
initial conversion (errors out)
mrq
2023-02-22 23:07:05 +0000
6c284ef8ec
oops
mrq
2023-02-18 03:27:04 +0000
8db762fa17
thought I copied this over
mrq
2023-02-18 03:15:44 +0000
73d9c3bd46
set output folder to be sane with the cwd as a reference point
mrq
2023-02-18 02:01:09 +0000
5ecf7da881
Fix later
mrq
2023-02-17 20:49:29 +0000
e3e8801e5f
Fix I thought wasn't needed since it literally worked without it earlier
mrq
2023-02-17 20:41:20 +0000
535549c3f3
add some snark about the kludge I had to fix, and the kludge I used to fix it
mrq
2023-02-17 19:20:19 +0000
a09cf98c7f
more cleanup, pip-ifying won't work, got an alternative
mrq
2023-02-17 15:47:55 +0000
6afa2c299e
break if your dataset size is smaller than your batch size
mrq
2023-02-17 04:08:27 +0000
94d0f16608
Necessary fixes to get it to work
mrq
2023-02-17 02:03:00 +0000
49e23b226b
pip-ify
mrq
2023-02-17 00:33:50 +0000
f31a333c4f
more sampling fixes
James Betker
2022-10-10 20:11:28 -0600
5d172fbf7e
Fix eval
James Betker
2022-10-10 14:22:36 -0600
9502e0755e
ugh
James Betker
2022-10-10 12:15:51 -0600
fce2c8f5db
and listify them
James Betker
2022-10-10 12:13:49 -0600
3cf78e3c44
train mel head even when not
James Betker
2022-10-10 12:10:56 -0600
cc74a43675
Checkin
James Betker
2022-10-10 11:30:20 -0600
3cb14123bc
glc fix
James Betker
2022-07-29 11:24:36 -0600
4ddd01a7fb
support generating cheaters from the new cheater network
James Betker
2022-07-29 09:19:20 -0600
27a9b1b750
rename perplexity->log perplexity
James Betker
2022-07-28 09:48:40 -0600
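The rename above reflects a real distinction: what these evals track is the mean negative log-likelihood (log perplexity), while perplexity proper is its exponential. A minimal illustrative sketch, not the repo's actual metric code:

```python
import math

def log_perplexity(token_log_probs: list[float]) -> float:
    """Mean negative log-likelihood over the evaluated tokens."""
    return -sum(token_log_probs) / len(token_log_probs)

def perplexity(token_log_probs: list[float]) -> float:
    """Perplexity is exp(log perplexity)."""
    return math.exp(log_perplexity(token_log_probs))
```

For example, if every token has probability 0.5, log perplexity is ln 2 and perplexity is exactly 2.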
1d68624828
fix some imports..
James Betker
2022-07-28 02:35:32 -0600
cfe907f13f
i like this better
James Betker
2022-07-28 02:33:23 -0600
d44ed5d12d
probably too harsh on ninfs
James Betker
2022-07-28 01:33:20 -0600
4509cfc705
track logperp for diffusion evals
James Betker
2022-07-28 01:30:44 -0600
19eb939ccf
gd perplexity
James Betker
2022-07-28 00:23:35 -0600
a1bbde8a43
few things
James Betker
2022-07-26 11:52:03 -0600
f8108cfdb2
update environment and fix a bunch of deps
James Betker
2022-07-24 23:43:25 -0600
45afefabed
fix booboo
James Betker
2022-07-24 18:00:14 -0600
cc62ba9cba
few more tfd13 things
James Betker
2022-07-24 17:39:33 -0600
f3d967dbf5
remove eta from mdf
James Betker
2022-07-24 17:21:20 -0600
76464ca063
some fixes to mdf to support new archs
James Betker
2022-07-21 10:55:50 -0600
13c263e9fb
go all in on m2wv3
James Betker
2022-07-21 00:51:27 -0600
24a78bd7d1
update tfd14 too
James Betker
2022-07-21 00:45:33 -0600
02ebda42f2
#yolo
James Betker
2022-07-21 00:43:03 -0600
b92ff8de78
misc
James Betker
2022-07-20 23:59:32 -0600
a1743d26aa
Revert "Try to squeeze a bit more performance out of this arch"
James Betker
2022-07-20 23:57:56 -0600
767f963392
Try to squeeze a bit more performance out of this arch
James Betker
2022-07-20 23:51:11 -0600
b9d0f7e6de
simplify parameterization a bit
James Betker
2022-07-20 23:41:54 -0600
ee8ceed6da
rework tfd13 further
James Betker
2022-07-20 23:28:29 -0600
40427de8e3
update tfd13 for inference
James Betker
2022-07-20 21:51:25 -0600
dbebe18602
Fix ts=0 with new formulation
James Betker
2022-07-20 12:12:33 -0600
82bd62019f
diffuse the cascaded prior for continuous sr model
James Betker
2022-07-20 11:54:09 -0600
b0e3be0a17
transition to nearest interpolation mode for downsampling
James Betker
2022-07-20 10:56:17 -0600
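The commit above switches the downsampling interpolation mode to "nearest", which copies the closest input sample instead of blending a neighborhood, so no new values are invented. A pure-Python 1D sketch of that index mapping (illustrative only; the repo presumably uses its framework's built-in interpolation):

```python
def downsample_nearest(xs: list[float], out_len: int) -> list[float]:
    """Nearest-neighbor 1D downsampling.

    Each output sample copies one input sample chosen by floor scaling,
    the index rule common "nearest" implementations use.
    """
    in_len = len(xs)
    return [xs[int(i * in_len / out_len)] for i in range(out_len)]
```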
7b3fc79737
iq checkin
James Betker
2022-07-20 10:19:32 -0600
9a37f3ba42
reminder to future self
James Betker
2022-07-20 10:19:15 -0600
15decfdb98
misc
James Betker
2022-07-20 10:19:02 -0600
2997a640b0
fix mdf
James Betker
2022-07-19 19:39:29 -0600
c14bf6dfb2
fix conditioning free
James Betker
2022-07-19 18:04:49 -0600
fc0b291b21
do masking up proper
James Betker
2022-07-19 16:32:17 -0600
b203a7dc97
And remove unused parameters
James Betker
2022-07-19 15:05:12 -0600
17a07b2e33
readd one mdf function
James Betker
2022-07-19 15:04:36 -0600
c00398e955
scope attention in tfd13 as well
James Betker
2022-07-19 14:59:43 -0600
b157b28c7b
tfd14
James Betker
2022-07-19 13:30:05 -0600
4597447178
add assertions to mel generator script
James Betker
2022-07-19 11:23:54 -0600
1b6fe88bcb
spit out overages in GDI
James Betker
2022-07-19 11:19:59 -0600
73d7211a4c
fix script
James Betker
2022-07-19 11:17:43 -0600
6b1cfe8e66
ugh
James Betker
2022-07-19 11:14:20 -0600
da9e47ca0e
new bounds for MEL normalization and multi-resolution SR in MDF
James Betker
2022-07-19 11:11:46 -0600
eecb534e66
a few fixes to multiresolution sr
James Betker
2022-07-19 11:11:15 -0600
2fb85526bc
mdf cleanup
James Betker
2022-07-19 09:57:05 -0600
4aa840a494
be more stringent on min and max ranges in GDI... this is gonna break some things probably
James Betker
2022-07-19 09:14:08 -0600
eab7dc339d
iq checkin
James Betker
2022-07-19 09:13:27 -0600
625d7b6f38
music joiner checkin
James Betker
2022-07-18 18:40:25 -0600
0824708dc7
iq checkin
James Betker
2022-07-18 18:40:14 -0600
df27b98730
ddp doesn't like dropout on checkpointed values
James Betker
2022-07-18 17:17:04 -0600
8d7692c1e0
uh
James Betker
2022-07-18 17:15:27 -0600
c959e530cb
good ole ddp..
James Betker
2022-07-18 17:13:45 -0600
cf57c352c8
Another fix
James Betker
2022-07-18 17:09:13 -0600
83a4ef4149
default to use input for conditioning & add preprocessed input to GDI
James Betker
2022-07-18 17:01:19 -0600
1b4d9567f3
tfd13 for multi-resolution superscaling
James Betker
2022-07-18 16:36:22 -0600
1b648abd7c
iq2
James Betker
2022-07-18 10:12:23 -0600
7a10c3fed8
commit my own version of vq, with a fix for cosine similarity and support for masking
James Betker
2022-07-18 10:12:17 -0600