vall-e/vall_e/models/arch
attention/
mamba_vasqu/
retnet_syncdoth/
__init__.py layer skip training implemented (need to gut the inferencing from the repo, and to actually see if the model can benefit from this) 2024-10-30 20:05:45 -05:00
bitnet.py
llama.py Windows specific fixes (to-do: find libespeak-ng.dll automatically because it cannot be trusted to do it by default) 2024-11-03 19:19:15 -06:00
mamba.py
mixtral.py
retnet.py
transformer.py
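The `__init__.py` commit above mentions layer-skip training. As a rough illustration of the general idea (not the repo's actual implementation), one common form is LayerDrop-style stochastic depth: whole layers are randomly skipped during training so the model learns to tolerate missing layers, which is what makes early exit at inference plausible. The function and the toy layers below are hypothetical stand-ins, not code from vall-e:

```python
import random

def forward_with_layer_skip(layers, x, skip_prob=0.5, training=True):
    """Stochastically skip whole layers during training (LayerDrop-style sketch).

    `layers` is a list of callables standing in for transformer blocks.
    A skipped layer acts as an identity shortcut: its input passes through
    unchanged, so later layers must cope with representations that missed
    earlier processing.
    """
    for layer in layers:
        if training and random.random() < skip_prob:
            continue  # identity shortcut: this layer is skipped for this step
        x = layer(x)
    return x

# Toy "layers": each adds 1, so the output counts how many layers actually ran.
layers = [lambda x: x + 1 for _ in range(12)]

random.seed(0)
full = forward_with_layer_skip(layers, 0, skip_prob=0.0)  # nothing skipped: all 12 run
some = forward_with_layer_skip(layers, 0, skip_prob=0.5)  # roughly half skipped
```

Whether the trained model actually benefits is exactly the open question the commit message raises; this sketch only shows the training-time mechanism, not the inference-time early-exit path.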