vall-e/vall_e/models/arch/attention
fused.py    added fused_attn (Triton-based fused attention) and simply query for flash_attn under ROCm    2024-08-26 19:13:34 -05:00
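
The commit message describes a backend-selection pattern: use flash_attn when it is available under a ROCm build of PyTorch, otherwise fall back to another fused-attention path. Below is a minimal sketch of that pattern, not the repository's actual fused.py; the helper names `pick_attention_backend` and `fused_attention` are hypothetical, and the fallback shown here is PyTorch's built-in scaled_dot_product_attention rather than a Triton kernel.

```python
# Sketch only: illustrates querying for flash_attn under ROCm, as the
# commit message describes. Not the actual code from fused.py.
import importlib.util

import torch
import torch.nn.functional as F


def pick_attention_backend() -> str:
    """Return "flash_attn" when running a ROCm build of PyTorch with the
    flash-attn package installed, else "sdpa"."""
    # torch.version.hip is a string on ROCm builds and None otherwise.
    is_rocm = getattr(torch.version, "hip", None) is not None
    has_flash = importlib.util.find_spec("flash_attn") is not None
    return "flash_attn" if (is_rocm and has_flash) else "sdpa"


def fused_attention(q, k, v, causal: bool = False):
    """Dispatch q/k/v of shape (batch, seqlen, heads, head_dim) to the
    selected attention backend."""
    if pick_attention_backend() == "flash_attn":
        from flash_attn import flash_attn_func  # provided by the flash-attn package
        return flash_attn_func(q, k, v, causal=causal)
    # SDPA expects (batch, heads, seqlen, head_dim), so transpose around the call.
    out = F.scaled_dot_product_attention(
        q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2), is_causal=causal
    )
    return out.transpose(1, 2)
```

Probing `torch.version.hip` and `importlib.util.find_spec` keeps the check cheap and avoids importing flash_attn on systems where it is absent or unusable.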