vall-e/vall_e/models/arch
Last commit: 2024-08-19 01:03:35 -05:00

mamba_vasqu/
retnet_syncdoth/
__init__.py
bitnet.py
llama.py        fixed xformers and flash_attn to actually work now  (2024-08-19 01:03:35 -05:00)
mamba.py
mixtral.py      added flash_attn LlamaAttention (including flash_attn==1.0.9)  (2024-08-18 20:51:14 -05:00)
retnet.py
transformer.py