Commit Graph

16 Commits

Author SHA1 Message Date
mrq 6b0891448c pain (some shit to try and get some flash attention for ROCm (gfx1100) through triton fused attention but no good) 2024-08-25 20:07:27 -05:00
mrq 40e1799adc fixed xformers and flash_attn to actually work now 2024-08-19 01:03:35 -05:00
mrq 29c35528e5 the sooner I accept there's no FA for V100s the sooner I'll go to bed 2024-08-18 23:54:33 -05:00
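The V100 (compute capability 7.0) sits below the hardware floor of the FlashAttention kernels, which is the wall this commit hits. A minimal sketch of the kind of capability probe that settles it, assuming the usual floors (sm75 for flash-attn 1.x, sm80 for 2.x); illustrative only, not the repo's code:

```python
import torch

def flash_attn_usable(version: int = 2) -> bool:
    """Best-effort guess at whether flash-attn kernels can run on this GPU."""
    if not torch.cuda.is_available():
        return False
    capability = torch.cuda.get_device_capability()
    # flash-attn 2.x targets Ampere (sm80) and newer; 1.x reached down to
    # Turing (sm75). The V100 is sm70, so both checks fall through to False.
    floor = (8, 0) if version >= 2 else (7, 5)
    return capability >= floor
```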
mrq d636edd3a2 added flash_attn LlamaAttention (including flash_attn==1.0.9) 2024-08-18 20:51:14 -05:00
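Supporting flash_attn 1.0.9 alongside 2.x means tolerating two different import paths. A hedged sketch of the version probe (the repo's actual LlamaAttention adaptation is more involved than this):

```python
# flash_attn 2.x and 1.x expose different entry points; probe for both.
try:
    from flash_attn import flash_attn_func  # 2.x API
    FLASH_ATTN_MAJOR = 2
except ImportError:
    try:
        # 1.x (e.g. flash_attn==1.0.9) only ships the unpadded interface
        from flash_attn.flash_attn_interface import flash_attn_unpadded_func
        FLASH_ATTN_MAJOR = 1
    except ImportError:
        FLASH_ATTN_MAJOR = 0  # flash_attn not installed
```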
mrq 2a1794c084 ughghghhhh 2024-08-09 21:15:01 -05:00
mrq d04f6911b4 oops 2024-08-08 19:38:55 -05:00
mrq 949339a3fa do not include SDPA attention if there's no available SDPA backends 2024-08-06 20:42:39 -05:00
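Gating on SDPA availability can be done with torch's own backend toggles (torch >= 2.0); a minimal sketch, with the caveat that the math backend is enabled on nearly every build, so this mostly matters on unusual ones. The repo's exact check may differ:

```python
import torch

def any_sdpa_backend_enabled() -> bool:
    cuda = torch.backends.cuda
    return any((
        cuda.flash_sdp_enabled(),          # fused FlashAttention kernel
        cuda.mem_efficient_sdp_enabled(),  # memory-efficient kernel
        cuda.math_sdp_enabled(),           # plain-math fallback
    ))

# hypothetical registry: only offer "sdpa" when something actually backs it
ATTENTIONS = ["flash_attn", "xformers"] + (["sdpa"] if any_sdpa_backend_enabled() else [])
```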
mrq debcc93e7e add adapted MixtralAttention for when I make a bad decision to actually train a MoE 2024-08-04 22:03:22 -05:00
mrq 10aaf840e7 added export option to convert Llama to MixtralMoE for another dumb experiment 2024-08-04 20:25:06 -05:00
mrq 11fa3da665 some cleanup, fixed the wrapper attention to explicitly use other sdpa backends 2024-08-03 19:51:00 -05:00
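Explicitly selecting an SDPA backend, rather than letting torch auto-dispatch, looks roughly like this on the torch >= 2.3 API (older releases used the torch.backends.cuda.sdp_kernel context manager instead). A sketch, not the repo's wrapper class:

```python
import torch.nn.functional as F
from torch.nn.attention import SDPBackend, sdpa_kernel

def wrapped_attention(q, k, v, backend=SDPBackend.EFFICIENT_ATTENTION):
    # Pin SDPA to one explicit backend instead of torch's automatic choice.
    with sdpa_kernel(backend):
        return F.scaled_dot_product_attention(q, k, v, is_causal=True)
```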
mrq 9564ecda43 wrapper attention class for other sdpa backends + xformers seems to have broke... 2024-08-03 15:12:11 -05:00
mrq ccb14c06ef mamba2-hf using vasqu/mamba2-torch because it lets me use mamba2 without triton ops (training with my 4xV100s are not happy with mamba2 because of triton) 2024-06-14 19:42:17 -05:00
mrq 83eab4fa59 actually going for the suggested "2x layers, no intermediate scaling" is wrong for VALL-E, directly copying the normal transformer structure fixes mamba2 performance in the test trainer 2024-06-13 20:08:22 -05:00
mrq 65a8960305 option to split classifier per-level instead of sharing one (at this point I'm just scrambling to try and cope with training a DAC model, the NAR is being a pain) 2024-06-11 22:28:59 -05:00
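The split-vs-shared classifier choice amounts to one linear head per RVQ level versus a single head reused across levels. A sketch under assumed names (`d_model`, `n_tokens`, `n_levels` are placeholders, not the repo's identifiers):

```python
import torch.nn as nn

class Classifier(nn.Module):
    def __init__(self, d_model: int, n_tokens: int, n_levels: int, split: bool = True):
        super().__init__()
        if split:
            # independent head per quantizer level
            self.heads = nn.ModuleList(nn.Linear(d_model, n_tokens) for _ in range(n_levels))
        else:
            # the same head object reused for every level
            shared = nn.Linear(d_model, n_tokens)
            self.heads = nn.ModuleList(shared for _ in range(n_levels))

    def forward(self, x, level: int):
        return self.heads[level](x)
```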
mrq b2194b859a re-added loading multiple models because I'm now entertaining having split AR/NAR models again (and need a way to load both at once) 2024-06-06 09:48:43 -05:00
mrq ff6fe6f1bc cleanup 2024-06-05 20:30:43 -05:00