mrq / vall-e
vall_e / models (history at commit 24d888c47c)
Latest commit 24d888c47c by mrq, 2024-11-22 11:29:12 -06:00: temporarily dropping support for xformers because it's breaking when using an attention mask (which I don't remember commenting out when it gets passed); default to not using wandb because it's a pain when running tests rather than actual sessions.
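The attention-mask breakage called out in this commit is essentially a backend-selection problem: the xformers memory-efficient kernel is only usable when no mask is involved, and masked attention has to go through PyTorch's built-in scaled dot-product attention. The snippet below is a hypothetical sketch of that fallback, not code from this repository; the `attend` helper and its signature are assumptions.

```python
# Hypothetical sketch of the fallback described in the commit message above:
# use xformers' memory-efficient attention only when no attention mask is
# passed, otherwise fall back to PyTorch SDPA, which accepts a mask directly.
import torch
import torch.nn.functional as F

try:
    from xformers.ops import memory_efficient_attention
    HAS_XFORMERS = True
except ImportError:
    HAS_XFORMERS = False

def attend(q, k, v, attn_mask=None):
    # q, k, v: [batch, heads, seq, head_dim], the layout SDPA expects
    if attn_mask is None and HAS_XFORMERS:
        # xformers wants [batch, seq, heads, head_dim]
        out = memory_efficient_attention(
            q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2)
        )
        return out.transpose(1, 2)
    # masked (or xformers-less) path: plain PyTorch attention
    return F.scaled_dot_product_attention(q, k, v, attn_mask=attn_mask)
```

Dropping xformers support, as the commit does, amounts to always taking the second return path in a sketch like this.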
arch: temporarily dropping support for xformers because it's breaking when using an attention mask (which I don't remember commenting out when it gets passed); default to not using wandb because it's a pain when running tests rather than actual sessions (2024-11-22 11:29:12 -06:00)
__init__.py: unified nar.py into ar_nar.py (2024-11-10 12:19:48 -06:00)
ar_nar.py: added mixed modality AR+NAR-len to generate a short prefix through the AR, then inference with said prefix through the NAR-len; need to experiment with it more to ensure that the masked-off tokens are the only tokens getting updated (a rough sketch of this flow follows the listing) (2024-11-20 14:22:12 -06:00)
base.py: don't use time embedding (2024-11-21 23:14:52 -06:00)
experimental.py
lora.py
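The ar_nar.py change above describes a two-stage decode: a short prefix is produced autoregressively, then the remainder of the sequence is filled non-autoregressively from mask tokens, and only masked positions should ever be rewritten. The following is a minimal sketch of that control flow under assumed interfaces: `model.ar_step`, `model.nar_len_step`, and `MASK_ID` are hypothetical names, not the vall_e API, and the NAR fill is collapsed to a single pass for brevity.

```python
# Minimal sketch of the mixed AR + NAR-len generation described above.
# The model interface here is assumed, not the repository's actual API.
import torch

MASK_ID = 1024  # assumed id of the special "masked" token

def mixed_ar_nar_generate(model, prompt, prefix_len=64, total_len=512):
    # 1) AR stage: extend the prompt with a short autoregressive prefix.
    seq = prompt.clone()                      # [batch, prompt_len] token ids
    for _ in range(prefix_len):
        next_tok = model.ar_step(seq)         # assumed: returns [batch] ids
        seq = torch.cat([seq, next_tok[:, None]], dim=1)

    # 2) NAR-len stage: pad to the target length with mask tokens, then fill
    #    every masked slot in one non-autoregressive pass.
    pad = torch.full((seq.shape[0], total_len - seq.shape[1]), MASK_ID,
                     dtype=seq.dtype, device=seq.device)
    seq = torch.cat([seq, pad], dim=1)
    masked = seq.eq(MASK_ID)                  # positions still to be filled
    pred = model.nar_len_step(seq)            # assumed: [batch, total_len] ids
    # only masked positions are overwritten; the AR prefix stays untouched
    return torch.where(masked, pred, seq)
```

Routing the update through `torch.where(masked, pred, seq)` is what enforces the property the commit message says still needs verifying: unmasked tokens, including the AR prefix, cannot change.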