@@ -16,4 +16,19 @@ Please consult [the wiki](https://git.ecker.tech/mrq/ai-voice-cloning/wiki) for
## Bug Reporting
If you run into any problems, please refer to the [issues you may encounter](https://git.ecker.tech/mrq/ai-voice-cloning/wiki/Issues) wiki page first. Please don't hesitate to submit an issue.
## Changelogs
Below is a rather loose changelog, as I don't think I have a way to chronicle changes outside of commit messages:
### `2023.02.22`
* greatly reduced VRAM consumption through the use of [TimDettmers/bitsandbytes](https://github.com/TimDettmers/bitsandbytes)
* cleaned up section of code that handled parsing output from training script
* added button to reconnect to the training script's output (sometimes skips a line to update, but it's better than nothing)
* actually update submodules from the update script (somehow forgot to pass `--remote`)
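The output-parsing cleanup mentioned above hinges on pulling the iteration count out of trainer log lines like `iter: 1,234`. A minimal sketch of that extraction, with an illustrative helper name (the real logic lives in the training-monitor code shown below):

```python
import re

def parse_iteration(line):
    """Extract the iteration count from a trainer log line such as 'iter: 1,234'."""
    match = re.findall(r'iter: ([\d,]+)', line)
    if match:
        # strip thousands separators before converting to int
        return int(match[0].replace(",", ""))
    return None
```

For example, `parse_iteration("iter: 1,234")` yields `1234`, while lines without an iteration marker yield `None`.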
 			self.training_started = True # could just leverage the above variable, but this is python, and there's no point in these aggressive microoptimizations
 			match = re.findall(r'iter: ([\d,]+)', line)
 			if match and len(match) > 0:
-				it = int(match[0].replace(",", ""))
+				self.it = int(match[0].replace(",", ""))
 			elif progress is not None:
 				if line.find(' 0%|') == 0:
-					open_state = True
-				elif line.find('100%|') == 0 and open_state:
-					open_state = False
-					it = it + 1
+					self.open_state = True
+				elif line.find('100%|') == 0 and self.open_state:
+					self.open_state = False
+					self.it = self.it + 1
-					it_time_end = time.time()
-					it_time_delta = it_time_end - it_time_start
-					it_time_start = time.time()
-					it_rate = f'[{"{:.3f}".format(it_time_delta)}s/it]' if it_time_delta >= 1 else f'[{"{:.3f}".format(1/it_time_delta)}it/s]' # I doubt anyone will have it/s rates, but it's here
+					self.it_time_end = time.time()
+					self.it_time_delta = self.it_time_end - self.it_time_start
+					self.it_time_start = time.time()
+					self.it_rate = f'[{"{:.3f}".format(self.it_time_delta)}s/it]' if self.it_time_delta >= 1 else f'[{"{:.3f}".format(1/self.it_time_delta)}it/s]' # I doubt anyone will have it/s rates, but it's here
 					# it would be nice for losses to be shown at every step
-					if 'loss_gpt_total' in info:
-						status = f"Total loss at step {int(info['step'])}: {info['loss_gpt_total']}"
+					if 'loss_gpt_total' in self.info:
+						# self.info['step'] returns the steps, not iterations, so we won't even bother ripping the reported step count, as iteration count won't get ripped from the regex
+						self.status = f"Total loss at iteration {self.it}: {self.info['loss_gpt_total']}"
 			elif line.find('Saving models and training states') >= 0: