mrq / vall-e
vall_e / models / arch / attention
Latest commit 0d706ec6a1 by mrq (2024-08-26 19:13:34 -05:00): added fused_attn (Triton-based fused attention); under ROCm, simply query for flash_attn
fused.py (last changed in commit 0d706ec6a1, 2024-08-26 19:13:34 -05:00)
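
Per the commit message, fused.py adds fused_attn, a Triton-based fused-attention kernel, and under ROCm the code simply queries for the flash_attn package rather than probing CUDA capabilities. Below is a minimal sketch of what such backend selection could look like; it is an illustration under stated assumptions, not the repository's actual code. flash_attn_func is the real entry point of the flash-attn package and torch.version.hip is genuine PyTorch API; the attention() dispatcher and its layout conventions are hypothetical.

# Sketch only (not the repository's code): pick an attention backend,
# simply querying for the flash_attn package under ROCm
# (torch.version.hip is a string on ROCm builds, None otherwise),
# and falling back to PyTorch's built-in fused SDPA when it is absent.
import importlib.util

import torch
import torch.nn.functional as F

IS_ROCM = torch.version.hip is not None
# Under ROCm, availability is decided by a plain import query rather than
# any CUDA-specific capability check.
HAS_FLASH_ATTN = importlib.util.find_spec("flash_attn") is not None

if HAS_FLASH_ATTN:
    from flash_attn import flash_attn_func  # entry point of the flash-attn package


def attention(q, k, v, causal=True):
    """Hypothetical dispatcher: route q/k/v of shape
    (batch, seqlen, heads, head_dim) to the best available fused kernel
    (fp16/bf16 tensors expected on GPU)."""
    if HAS_FLASH_ATTN:
        # flash_attn_func takes (batch, seqlen, heads, head_dim) directly.
        return flash_attn_func(q, k, v, causal=causal)
    # Fallback: PyTorch's fused SDPA, which wants (batch, heads, seqlen, head_dim).
    q, k, v = (t.transpose(1, 2) for t in (q, k, v))
    out = F.scaled_dot_product_attention(q, k, v, is_causal=causal)
    return out.transpose(1, 2)

A real implementation would presumably expose the Triton fused_attn kernel as a further branch in this dispatch; the sketch only shows the selection logic the commit message describes.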