vall-e/vall_e
2024-11-13 09:07:10 -06:00
emb fixes 2024-11-10 20:37:50 -06:00
engines fixes 2024-11-10 20:37:50 -06:00
ext maybe final tweaks; I really needed to unify my JSON read/write, and orjson has proven fast enough for me to try to rely on it more 2024-09-17 22:57:04 -05:00
models do not pass the timestep token/embedding since it doesn't seem to matter at all after all; fixed the training masking rate to 80% because a paper said so 2024-11-13 09:07:10 -06:00
utils new meme sampler PogChamp (it sort of helps?) 2024-11-12 22:30:09 -06:00
__init__.py Rewrite init 2023-08-02 21:53:35 +00:00
__main__.py new meme sampler PogChamp (it sort of helps?) 2024-11-12 22:30:09 -06:00
config.py overhauled inference/sampler kwargs to stop being a bloated mess 2024-11-11 20:21:16 -06:00
data.py new meme sampler PogChamp (it sort of helps?) 2024-11-12 22:30:09 -06:00
demo.py new meme sampler PogChamp (it sort of helps?) 2024-11-12 22:30:09 -06:00
export.py tweaks and fixes for lora stuffs 2024-09-08 18:05:21 -05:00
inference.py overhauled inference/sampler kwargs to stop being a bloated mess 2024-11-11 20:21:16 -06:00
plot.py very, very naive layerskip speculative sampling (it just checks if the current layer's state is good enough) 2024-11-02 11:49:05 -05:00
samplers.py new meme sampler PogChamp (it sort of helps?) 2024-11-12 22:30:09 -06:00
train.py dropped the subtrain dataloader since it's useless to duplicate 2024-11-11 17:00:49 -06:00
webui.py set it to zero because it'll make the stop token hide more often than not 2024-11-12 22:30:50 -06:00