forked from mrq/ai-voice-cloning
I forgot about the changelog and never kept up with it, so I'll just not use a changelog
parent ff07f707cb
commit d7e75a51cf
README.md (19 changed lines)
@@ -12,23 +12,8 @@ This is not endorsed by [neonbjb](https://github.com/neonbjb/). I do not expect
 ## Documentation

-Please consult [the wiki](https://git.ecker.tech/mrq/ai-voice-cloning/wiki) for the documentation, including how to install, prepare voices for, and use the software.
+Please consult [the wiki](https://git.ecker.tech/mrq/ai-voice-cloning/wiki) for the documentation.

 ## Bug Reporting

-If you run into any problems, please refer to the [issues you may encounter](https://git.ecker.tech/mrq/ai-voice-cloning/wiki/Issues) wiki page first. Please don't hesitate to submit an issue.
-
-## Changelogs
-
-Below will be a rather-loose changelog, as I don't think I have a way to chronicle them outside of commit messages:
-
-### `2023.02.22`
-
-* greatly reduced VRAM consumption through the use of [TimDettmers/bitsandbytes](https://github.com/TimDettmers/bitsandbytes)
-* cleaned up section of code that handled parsing output from training script
-* added button to reconnect to the training script's output (sometimes skips a line to update, but it's better than nothing)
-* actually update submodules from the update script (somehow forgot to pass `--remote`)
-
-### `Before 2023.02.22`
-
-Refer to commit logs.
+If you run into any problems, please refer to the [issues you may encounter](https://git.ecker.tech/mrq/ai-voice-cloning/wiki/Issues) wiki page first.
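The removed changelog entry above credits [TimDettmers/bitsandbytes](https://github.com/TimDettmers/bitsandbytes) for the VRAM reduction. As a rough sketch of the idea (not the code this repo actually ships), bitsandbytes' 8-bit AdamW can be swapped in where a full-precision `torch.optim.AdamW` would otherwise keep fp32 optimizer state:

```python
import torch
import bitsandbytes as bnb

# Stand-in model; the real project trains a much larger GPT-style network.
model = torch.nn.Linear(1024, 1024).cuda()

# 8-bit optimizer states take roughly a quarter of the memory of their
# fp32 equivalents, which is where most of the VRAM savings come from.
optimizer = bnb.optim.AdamW8bit(model.parameters(), lr=1e-4)

for _ in range(10):
    loss = model(torch.randn(8, 1024, device="cuda")).pow(2).mean()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

The `BITSANDBYTES_OVERRIDE_*` environment variables in the hunk below appear to gate similar layer-level replacements; they belong to this fork's training backend rather than to upstream bitsandbytes.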
@@ -21,9 +21,14 @@ if __name__ == "__main__":
    args = parser.parse_args()
    args.opt = " ".join(args.opt) # absolutely disgusting

    with open(args.opt, 'r') as file:
        opt_config = yaml.safe_load(file)

    if "WORLD_SIZE" in os.environ:
        if int(os.environ["WORLD_SIZE"]) > 1 and opt_config["steps"]["gpt_train"]["optimizer"] == "adamw":
            opt_config["steps"]["gpt_train"]["optimizer"] = "adamw_zero"

    if "ext" in opt_config and "bitsandbytes" in opt_config["ext"] and not opt_config["ext"]["bitsandbytes"]:
        os.environ['BITSANDBYTES_OVERRIDE_LINEAR'] = '0'
        os.environ['BITSANDBYTES_OVERRIDE_EMBEDDING'] = '0'
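For readers skimming the hunk above, here is the same config-patching logic pulled out as a standalone sketch; the `patch_config` helper and its defaulting behaviour are illustrative only, not part of the repository:

```python
import os
import yaml

def patch_config(path):
    # Parse the training YAML, as the launcher does with yaml.safe_load.
    with open(path, 'r') as f:
        cfg = yaml.safe_load(f)

    # When launched across multiple processes (WORLD_SIZE > 1), swap plain
    # AdamW for the distributed "adamw_zero" variant, mirroring the check above.
    if int(os.environ.get("WORLD_SIZE", "1")) > 1:
        gpt_train = cfg["steps"]["gpt_train"]
        if gpt_train.get("optimizer") == "adamw":
            gpt_train["optimizer"] = "adamw_zero"

    # If the config explicitly turns bitsandbytes off, export the override
    # flags the rest of the training stack consults (semantics inferred from
    # the variable names) so the bitsandbytes replacements stay disabled.
    if not (cfg.get("ext") or {}).get("bitsandbytes", True):
        os.environ["BITSANDBYTES_OVERRIDE_LINEAR"] = "0"
        os.environ["BITSANDBYTES_OVERRIDE_EMBEDDING"] = "0"

    return cfg
```

Called as `patch_config(args.opt)`, this would reproduce the behaviour of the new lines without touching the rest of the launcher.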