vall-e/vall_e/models/arch
attention/  - added fused_attn (Triton-based fused attention) and simply query for flash_attn under ROCm (2024-08-26 19:13:34 -05:00)
mamba_vasqu/
retnet_syncdoth/
__init__.py
bitnet.py
llama.py
mamba.py
mixtral.py
retnet.py
transformer.py
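The commit above selects an attention backend by probing for the flash_attn package under ROCm rather than assuming a CUDA-only build, with the Triton-based fused_attn as a fallback. A minimal sketch of that selection logic, assuming hypothetical names (`pick_attention_backend`, the `on_rocm` flag, and the backend labels are illustrative, not the repo's actual API):

```python
import importlib.util


def pick_attention_backend(on_rocm: bool) -> str:
    """Choose an attention implementation by probing installed packages.

    Illustrative only: the function and backend names are assumptions,
    not vall-e's actual API.
    """
    # flash_attn publishes ROCm builds as well, so a simple import probe
    # answers "is it usable here?" on either platform.
    if importlib.util.find_spec("flash_attn") is not None:
        return "flash_attn"
    if on_rocm:
        # Fall back to a Triton-based fused attention kernel on ROCm.
        return "fused_attn"
    # Otherwise defer to PyTorch's built-in scaled_dot_product_attention.
    return "sdpa"
```

The probe uses `importlib.util.find_spec` instead of a bare `import` so the check is cheap and does not execute the package's import-time code.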