DL-Art-School/codes/utils
Latest commit: 93a3302819 by James Betker, 2022-03-04 17:57:33 -07:00
"Push training_state data to CPU memory before saving it"

For whatever reason, keeping this in GPU memory just doesn't work: when you load it, it consumes a large amount of GPU memory, and that utilization doesn't go away. Saving to CPU should fix this.
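The commit above describes a common PyTorch pattern: recursively move every tensor in the training state to CPU before `torch.save()`, so that a later `torch.load()` does not allocate (and keep) GPU memory. A minimal sketch of that idea — the helper name `state_to_cpu` is hypothetical, not the repo's actual function:

```python
import torch

def state_to_cpu(obj):
    """Recursively move any tensors in a (possibly nested) training state to CPU."""
    if isinstance(obj, torch.Tensor):
        return obj.detach().cpu()
    if isinstance(obj, dict):
        return {k: state_to_cpu(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):
        return type(obj)(state_to_cpu(v) for v in obj)
    return obj

# Usage sketch:
#   torch.save(state_to_cpu(training_state), 'training_state.pth')
```

An alternative on the load side is `torch.load(path, map_location='cpu')`, which remaps stored GPU tensors to CPU at restore time; saving CPU-side, as the commit does, fixes the problem for all consumers of the checkpoint.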
__init__.py — mmsr — 2019-08-23 21:42:47 +08:00
audio_resampler.py — Stop dataset - attempt #2 — 2021-08-18 18:29:38 -06:00
audio.py — misc nonfunctional — 2021-11-22 17:16:39 -07:00
colors.py — Add FDPL Loss — 2020-07-30 20:47:57 -06:00
convert_model.py — Add "dataset_debugger" support — 2022-01-06 12:38:20 -07:00
distributed_checkpont.py — Add distributed_checkpoint for more efficient checkpoints — 2020-10-06 20:38:38 -06:00
gpu_mem_track.py — Add GPU mem tracing module — 2020-06-14 11:02:54 -06:00
kmeans.py — test uresnet playground mods — 2021-01-23 13:46:43 -07:00
loss_accumulator.py — Move log consensus to train for efficiency — 2022-03-04 13:41:32 -07:00
numeric_stability.py — More refactor changes — 2020-12-18 09:24:31 -07:00
options.py — Fix options.py bug — 2021-10-29 14:47:31 -06:00
util.py — Push training_state data to CPU memory before saving it — 2022-03-04 17:57:33 -07:00
weight_scheduler.py — Add scheduling to quantizer, enable cudnn_benchmarking to be disabled — 2021-09-24 17:01:36 -06:00