Commit Graph

168 Commits

James Betker  a2afb25e42  2021-08-07 20:11:10 -06:00
    Fix inference, always flow full text tokens through transformer

James Betker  4c678172d6  2021-08-06 22:10:18 -06:00
    ugh

James Betker  e723137273  2021-08-06 22:08:51 -06:00
    Make gpttts more configurable

James Betker  a7496b661c  2021-08-06 22:01:06 -06:00
    combined dvae ftw

James Betker  0237e96b34  2021-08-06 14:17:01 -06:00
    Fix dvae bug

James Betker  0799d95af5  2021-08-06 14:06:26 -06:00
    Use quantizer from rosinality/vqvae with openai dvae
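
The quantizer referenced in 0799d95af5 is the vector-quantization step that maps encoder outputs to discrete codebook indices. As context only, a generic straight-through quantizer looks roughly like the sketch below; this is not the rosinality/vqvae module (which, as published, also maintains its codebook with EMA updates), and all names, shapes, and defaults here are illustrative.

```python
# Generic vector-quantization sketch, for context only; NOT the
# rosinality/vqvae implementation. Names and shapes are illustrative.
import torch
import torch.nn as nn

class SimpleQuantizer(nn.Module):
    def __init__(self, num_codes=512, dim=64):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, dim)

    def forward(self, z):                        # z: (batch, seq, dim)
        flat = z.reshape(-1, z.shape[-1])        # (batch*seq, dim)
        # Squared distance from every vector to every codebook entry.
        dist = (flat.pow(2).sum(1, keepdim=True)
                - 2 * flat @ self.codebook.weight.t()
                + self.codebook.weight.pow(2).sum(1))
        codes = dist.argmin(dim=1)                     # discrete token ids
        quantized = self.codebook(codes).view_as(z)    # nearest codebook vectors
        # Straight-through estimator: gradients flow back to the encoder
        # as if quantization were the identity.
        quantized = z + (quantized - z).detach()
        return quantized, codes.view(z.shape[:-1])
```
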
James Betker  d3ace153af  2021-08-06 12:04:12 -06:00
    Add logic for performing inference using gpt_tts with dual-encoder modes

James Betker  b43683b772  2021-08-06 12:03:46 -06:00
    Add lucidrains_dvae

James Betker  89d15c9e74  2021-08-05 22:15:13 -06:00
    Move gpt-tts back to lucidrains implementation
    Much better performance.

James Betker  c0f61a2e15  2021-08-05 07:07:17 -06:00
    Rework how DVAE tokens are ordered
    It might make more sense to have top tokens, then bottom tokens
    with top tokens having different discretized values.
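
A minimal sketch of the ordering described in c0f61a2e15, assuming a two-level DVAE that emits separate top and bottom code sequences; the function and argument names below are illustrative and not taken from this repository.

```python
# Hypothetical illustration: top-level codes first, then bottom-level codes,
# with the top codes shifted into their own id range so the two levels never
# share "discretized values". Names are illustrative, not from the repo.
import torch

def order_dvae_tokens(top_codes, bottom_codes, num_bottom_tokens):
    """top_codes, bottom_codes: LongTensors of shape (batch, seq)."""
    shifted_top = top_codes + num_bottom_tokens   # disjoint value range for top codes
    return torch.cat([shifted_top, bottom_codes], dim=1)  # top tokens, then bottom
```
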
James Betker  4017236ba9  2021-08-05 06:46:30 -06:00
    Fix up inference for gpt_tts

James Betker  341f28dd82  2021-08-04 20:07:51 -06:00
    It works!

James Betker  36c7c1fbdb  2021-08-04 10:28:09 -06:00
    Fix training flow for NEXT TOKEN prediction instead of same token prediction
    doh
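
The fix in 36c7c1fbdb concerns the standard autoregressive objective: logits produced at position t must be scored against the token at position t+1, not the token at position t. A generic illustration of that shift follows; it is not the repository's actual training code.

```python
# Illustration of next-token prediction: score each position's logits
# against the following token. Not the repository's actual loss code.
import torch.nn.functional as F

def next_token_loss(logits, tokens):
    """logits: (batch, seq, vocab); tokens: (batch, seq) LongTensor."""
    shift_logits = logits[:, :-1]     # predictions made at positions 0..T-2
    shift_targets = tokens[:, 1:]     # ground truth at positions 1..T-1
    return F.cross_entropy(shift_logits.transpose(1, 2), shift_targets)
```
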
James Betker  d9936df363  2021-08-04 00:44:04 -06:00
    Add gpt_tts dataset and implement inference
    - Adds a script which preprocesses quantized mels given a DVAE
    - Adds a dataset which can consume preprocessed qmels
    - Reworks GPT TTS to consume the outputs of that dataset (removes logic to add padding and start/end tokens)
    - Adds inference to gpt_tts
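
A hypothetical sketch of a dataset that consumes preprocessed qmels, in the spirit of d9936df363; the on-disk layout and field names below are assumptions for illustration, not the repository's actual format.

```python
# Hypothetical dataset consuming preprocessed quantized mels ("qmels").
# Assumed layout: one .pth file per utterance containing
# {'text_tokens': LongTensor, 'qmel_tokens': LongTensor}.
import os
import torch
from torch.utils.data import Dataset

class PreprocessedQmelDataset(Dataset):
    def __init__(self, root):
        self.paths = sorted(os.path.join(root, f)
                            for f in os.listdir(root) if f.endswith('.pth'))

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        sample = torch.load(self.paths[idx])
        return sample['text_tokens'], sample['qmel_tokens']
```
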
James Betker  4c98b9703f  2021-08-03 21:08:27 -06:00
    Get dalle-style TTS to "work"

James Betker  0c9e75bc69  2021-07-31 15:57:57 -06:00
    Improvements to GptTts

James Betker  31ee9ae262  2021-07-30 23:07:35 -06:00
    Checkin

James Betker  dadc54795c  2021-07-27 20:33:30 -06:00
    Add gpt_tts