Commit Graph

134 Commits

Author SHA1 Message Date
James Betker
8d01f7685c Get rid of absolute positional embeddings in unifiedvoice 2021-12-26 00:10:24 -07:00
James Betker
6700f8851d moar verbosity 2021-12-25 23:23:21 -07:00
James Betker
8acf3b3097 Better dimensional asserting 2021-12-25 23:18:25 -07:00
James Betker
e959541494 Add position embeddings back into unified_voice
I think this may be the solution to the day's problems.
2021-12-25 23:10:56 -07:00
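
For context, learned absolute position embeddings of the kind this commit reinstates usually look like the sketch below; the maximum sequence length and embedding width are illustrative placeholders, not values taken from unified_voice.

```python
import torch
from torch import nn

class LearnedPositionEmbeddings(nn.Module):
    """Adds a trainable vector per sequence index to the token embeddings."""
    def __init__(self, max_seq_len: int = 1024, dim: int = 512):
        super().__init__()
        self.emb = nn.Embedding(max_seq_len, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim) token embeddings.
        positions = torch.arange(x.shape[1], device=x.device)
        return x + self.emb(positions)
```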
James Betker
ab9cafa572 Make tokenization configs more configurable 2021-12-25 12:17:50 -07:00
James Betker
52410fd9d9 256-bpe tokenizer 2021-12-25 08:52:08 -07:00
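
A small BPE vocabulary like this can be trained with the HuggingFace `tokenizers` library; the sketch below is an assumption about the approach, with a placeholder corpus file and special tokens rather than the repo's actual training script.

```python
from tokenizers import Tokenizer, models, pre_tokenizers, trainers

tokenizer = Tokenizer(models.BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = pre_tokenizers.Whitespace()
trainer = trainers.BpeTrainer(vocab_size=256, special_tokens=["[UNK]", "[STOP]"])
tokenizer.train(files=["transcripts.txt"], trainer=trainer)  # hypothetical corpus file
tokenizer.save("bpe_256_tokenizer.json")
```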
James Betker
8e26400ce2 Add inference for unified gpt 2021-12-24 13:27:06 -07:00
James Betker
8b19c37409 UnifiedGptVoice! 2021-12-23 15:20:26 -07:00
James Betker
e55d949855 GrandConjoinedDataset 2021-12-23 14:32:33 -07:00
James Betker
c737632eae Train and use a bespoke tokenizer 2021-12-22 15:06:14 -07:00
James Betker
66bc60aeff Re-add start_text_token 2021-12-22 14:10:35 -07:00
James Betker
a9629f7022 Try out using the GPT tokenizer rather than nv_tacotron
This results in a significant compression of the text domain; I'm curious what the
effect on speech quality will be.
2021-12-22 14:03:18 -07:00
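
The compression being referred to is easy to see by comparing sequence lengths; the snippet below assumes the HuggingFace GPT-2 tokenizer as a stand-in for "the GPT tokenizer" and plain character tokenization as a stand-in for the nv_tacotron text pipeline.

```python
from transformers import GPT2TokenizerFast

text = "the quick brown fox jumps over the lazy dog"
bpe = GPT2TokenizerFast.from_pretrained("gpt2")
print(len(bpe.encode(text)))  # ~9 BPE tokens
print(len(text))              # 43 character-level tokens
```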
James Betker
7ae7d423af VoiceCLIP model 2021-12-22 13:44:11 -07:00
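
A CLIP-style model pairs text and voice embeddings with a contrastive objective; the loss below is a generic sketch of that idea, not the VoiceCLIP implementation itself, and the temperature value is illustrative.

```python
import torch
import torch.nn.functional as F

def contrastive_clip_loss(text_emb, voice_emb, temperature: float = 0.07):
    """Symmetric InfoNCE loss over a batch of matched (text, voice) embedding pairs."""
    text_emb = F.normalize(text_emb, dim=-1)
    voice_emb = F.normalize(voice_emb, dim=-1)
    logits = text_emb @ voice_emb.t() / temperature
    labels = torch.arange(logits.shape[0], device=logits.device)
    return (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels)) / 2
```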
James Betker
09f7f3e615 Remove obsolete lucidrains DALLE stuff, re-create it in a dedicated folder 2021-12-22 13:44:02 -07:00
James Betker
a42b94ab72 gpt_tts_hf inference fixes 2021-12-22 13:22:15 -07:00
James Betker
48e3ee9a5b Shuffle conditioning inputs along the positional axis to reduce fitting on prosody and other positional information
The mels should still retain some short-range positional information the model can use
for tone and frequencies, for example.
2021-12-20 19:05:56 -07:00
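
One way to implement that shuffle is to permute whole mel frames (or small chunks of frames) along the time axis, which scrambles long-range prosody while keeping per-frame spectral content; the function below is a sketch under that assumption, not the repo's exact code.

```python
import torch

def shuffle_conditioning(cond_mel: torch.Tensor, chunk: int = 4) -> torch.Tensor:
    """Randomly permute chunks of frames of a (batch, n_mels, time) conditioning mel."""
    b, n_mels, t = cond_mel.shape
    n_chunks = t // chunk
    perm = torch.randperm(n_chunks, device=cond_mel.device)
    chunks = cond_mel[:, :, : n_chunks * chunk].reshape(b, n_mels, n_chunks, chunk)
    return chunks[:, :, perm].reshape(b, n_mels, -1)
```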
James Betker
53858b2055 Fix gpt_tts_hf inference 2021-12-20 17:45:26 -07:00
James Betker
712d746e9b gpt_tts: format conditioning inputs more for contextual voice cues and less for prosody
also support single conditioning inputs
2021-12-19 17:42:29 -07:00
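
Supporting a single conditioning clip alongside a list of clips is mostly an input-normalization detail; the helper below sketches one way to do it, with a placeholder frame budget rather than the repo's actual value.

```python
import torch
import torch.nn.functional as F

def format_conditioning(cond_mels, max_frames: int = 400) -> torch.Tensor:
    """Accept one (n_mels, T) conditioning mel or a list of them and return a
    (num_clips, n_mels, max_frames) stack, clipping each input to a short window."""
    if not isinstance(cond_mels, (list, tuple)):
        cond_mels = [cond_mels]  # single conditioning input
    clips = []
    for mel in cond_mels:
        if mel.shape[-1] < max_frames:
            mel = F.pad(mel, (0, max_frames - mel.shape[-1]))
        clips.append(mel[:, :max_frames])
    return torch.stack(clips, dim=0)
```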
James Betker
c813befd53 Remove dedicated position embeddings 2021-12-19 09:01:31 -07:00
James Betker
b4ddcd7111 More inference improvements 2021-12-19 09:01:19 -07:00
James Betker
f9c45d70f0 Fix mel terminator 2021-12-18 17:18:06 -07:00
James Betker
937045cb63 Fixes 2021-12-18 16:45:38 -07:00
James Betker
9b9f7ea61b GptTtsHf: Make the input/target placement easier to reason about 2021-12-17 10:24:14 -07:00
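
For an autoregressive model, "input/target placement" usually means the target sequence is the input shifted by one position, with padding masked out of the loss; the sketch below shows that standard arrangement, not GptTtsHf's specific layout.

```python
import torch

def build_inputs_and_targets(tokens: torch.Tensor, pad_id: int):
    """Next-token prediction: inputs are tokens[:-1], targets are tokens[1:],
    with padded positions mapped to -100 so CrossEntropyLoss ignores them."""
    inputs = tokens[:, :-1]
    targets = tokens[:, 1:].clone()
    targets[targets == pad_id] = -100
    return inputs, targets
```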
James Betker
2fb4213a3e More lossy fixes 2021-12-17 10:01:42 -07:00
James Betker
9e8a9bf6ca Various fixes to gpt_tts_hf 2021-12-16 23:28:44 -07:00
James Betker
62c8ed9a29 move speech utils 2021-12-16 20:47:37 -07:00
James Betker
4f8c4d130c gpt_tts_hf: pad mel tokens with an <end_of_sequence> token. 2021-12-12 20:04:50 -07:00
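
Padding the mel token sequence with a dedicated <end_of_sequence> id gives the autoregressive decoder an explicit stopping signal; the sketch below assumes a hypothetical token id and a simple pad-to-length scheme.

```python
import torch
import torch.nn.functional as F

MEL_EOS_TOKEN = 8192  # hypothetical id; the real value depends on the mel codebook size

def pad_mel_tokens(mel_tokens: torch.Tensor, target_len: int) -> torch.Tensor:
    """Append <end_of_sequence> to a (batch, seq) tensor of mel codes and pad to target_len."""
    eos = torch.full((mel_tokens.shape[0], 1), MEL_EOS_TOKEN, dtype=mel_tokens.dtype)
    seq = torch.cat([mel_tokens, eos], dim=1)
    return F.pad(seq, (0, target_len - seq.shape[1]), value=MEL_EOS_TOKEN)
```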
James Betker
8917c02a4d gpt_tts_hf inference first pass 2021-12-12 19:51:44 -07:00
James Betker
5a664aa56e misc 2021-12-11 08:17:26 -07:00
James Betker
6ccff3f49f Record codes more often 2021-12-07 09:22:45 -07:00
James Betker
d0b2f931bf Add feature to diffusion vocoder where the spectrogram conditioning layers can be re-trained apart from the rest of the model 2021-12-07 09:22:30 -07:00
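
Retraining only the conditioning layers amounts to freezing everything else before building the optimizer; the sketch below assumes those layers can be picked out by a name prefix, which is a placeholder rather than the repo's actual module names.

```python
import torch

def trainable_conditioning_params(model: torch.nn.Module, prefix: str = "spectrogram_cond"):
    """Freeze the vocoder except parameters whose names start with `prefix`
    and return the still-trainable parameters for the optimizer."""
    for name, param in model.named_parameters():
        param.requires_grad = name.startswith(prefix)
    return [p for p in model.parameters() if p.requires_grad]

# e.g. optimizer = torch.optim.Adam(trainable_conditioning_params(vocoder), lr=1e-4)
```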
James Betker
662920bde3 Log codes when simply fetching codebook_indices 2021-12-06 09:21:43 -07:00
James Betker
380a5d5475 gdi.. 2021-12-03 08:53:09 -07:00
James Betker
101a01f744 Fix dvae codes issue 2021-12-02 23:28:36 -07:00
James Betker
07b0124712 GptTtsHf! 2021-12-02 21:48:42 -07:00
James Betker
85542ec547 One last fix for gpt_asr_hf2 2021-12-02 21:19:28 -07:00
James Betker
04454ee63a Add evaluation logic for gpt_asr_hf2 2021-12-02 21:04:36 -07:00
James Betker
5956eb757c ffffff 2021-11-24 00:19:47 -07:00
James Betker
f1ed0588e3 another fix 2021-11-24 00:11:21 -07:00
James Betker
7a3c4a4fc6 Fix lr quantizer decode 2021-11-24 00:01:26 -07:00
James Betker
3f6ecfe0db q fix 2021-11-23 23:50:27 -07:00
James Betker
d9747fe623 Integrate with lr_quantizer 2021-11-23 19:48:22 -07:00
James Betker
82d0e7720e Add choke to lucidrains_dvae 2021-11-23 18:53:37 -07:00
James Betker
934395d4b8 A few fixes for gpt_asr_hf2 2021-11-23 09:29:29 -07:00
James Betker
01e635168b whoops 2021-11-22 17:24:13 -07:00
James Betker
973f47c525 misc nonfunctional 2021-11-22 17:16:39 -07:00
James Betker
3125ca38f5 Further wandb logs 2021-11-22 16:40:19 -07:00
James Betker
0604060580 Finish up mods for next version of GptAsrHf 2021-11-20 21:33:49 -07:00
James Betker
14f3155ec4 misc 2021-11-20 17:45:14 -07:00
James Betker
555b7e52ad Add rev2 of GptAsrHf 2021-11-18 20:02:24 -07:00