vall-e / vall_e / models / arch

Latest commit: 40e1799adc by mrq, "fixed xformers and flash_attn to actually work now" (2024-08-19 01:03:35 -05:00)
mamba_vasqu/
retnet_syncdoth/
__init__.py
bitnet.py
llama.py: fixed xformers and flash_attn to actually work now (2024-08-19 01:03:35 -05:00)
mamba.py
mixtral.py: added flash_attn LlamaAttention (including flash_attn==1.0.9) (2024-08-18 20:51:14 -05:00)
retnet.py
transformer.py