vall-e/vall_e/models
Name        | Last commit message                                                                        | Last commit date
arch        | I should really just grab modelling_llama wholesale (fix for the adapted attention class) | 2025-01-28 21:55:05 -06:00
__init__.py |                                                                                            |
ar_nar.py   | updated mixtral backend (need this for something else)                                     | 2025-01-20 21:50:56 -06:00
base.py     | updated mixtral backend (need this for something else)                                     | 2025-01-20 21:50:56 -06:00
lora.py     |                                                                                            |
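The `arch` commit note above mentions an "adapted attention class" versus grabbing `modelling_llama` wholesale. As a purely illustrative sketch (not the repository's actual code), adapting the stock attention by subclassing might look like the following; it assumes `transformers` >= 4.36, where `LlamaAttention` lives in `transformers.models.llama.modeling_llama` and accepts a `layer_idx` argument. The class name `AdaptedLlamaAttention` is hypothetical.

```python
# Illustrative sketch only: subclass the upstream LlamaAttention instead of
# re-implementing it, so upstream fixes are picked up automatically.
from transformers import LlamaConfig
from transformers.models.llama.modeling_llama import LlamaAttention


class AdaptedLlamaAttention(LlamaAttention):
    """Hypothetical adapted attention that delegates to the stock forward pass."""

    def forward(self, *args, **kwargs):
        # Any customization (alternative masking, cache handling, etc.)
        # would wrap this call; here we simply delegate to the upstream code.
        return super().forward(*args, **kwargs)


if __name__ == "__main__":
    # Tiny config just to show the class instantiates like the original.
    config = LlamaConfig(hidden_size=256, num_attention_heads=4, num_key_value_heads=4)
    attn = AdaptedLlamaAttention(config, layer_idx=0)
    print(attn)
```

Subclassing keeps the adapted class in sync with upstream `transformers` releases, which is presumably the trade-off the commit message weighs against copying the whole module into the repo.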