DL-Art-School/codes
James Betker 8ab595e427 Add FlatProcessorNet
After doing some thinking and reading on the subject, it occurred to me that
I was treating the generator like a discriminator by concentrating the network's
complexity at the feature levels. It makes far more sense for the generator to
process each conv level equally, hence the FlatProcessorNet in this commit. This
network borrows some of the residual pass-through logic from RRDB, which keeps
the gradient path exceptionally short for nearly all model parameters and lets
the network train in O1 (mixed-precision) optimization mode without overflows again.
2020-04-28 11:49:21 -06:00
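As a rough illustration of the RRDB-style residual pass-through mentioned in the commit message (this is my own sketch, not the actual FlatProcessorNet code; the class name, channel count, and 0.2 residual scale are assumptions, the scale following the convention used in ESRGAN's RRDB blocks): each conv level adds its scaled output back onto its input, so the identity shortcut gives every block a direct gradient path to the loss.

```python
import torch
import torch.nn as nn

class ResidualPassthroughBlock(nn.Module):
    """One conv level with an identity shortcut (hypothetical name/shape)."""
    def __init__(self, channels: int, residual_scale: float = 0.2):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.act = nn.LeakyReLU(0.2)
        self.residual_scale = residual_scale

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x passes straight through the addition, so gradients reach earlier
        # layers without attenuation -- the "exceptionally short" path above.
        return x + self.residual_scale * self.act(self.conv(x))

# Stacking several such levels keeps every parameter one addition away
# from the residual trunk.
trunk = nn.Sequential(*[ResidualPassthroughBlock(16) for _ in range(4)])
x = torch.randn(1, 16, 32, 32)
y = trunk(x)
```

The scaled residual (rather than a plain sum) also helps keep activation magnitudes bounded, which is one reason such trunks tend to survive mixed-precision training without overflow.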
data Implement downsample GAN 2020-04-24 00:00:46 -06:00
data_scripts Some random fixes/adjustments 2020-04-22 00:38:53 -06:00
metrics mmsr 2019-08-23 21:42:47 +08:00
models Add FlatProcessorNet 2020-04-28 11:49:21 -06:00
options Enable HighToLowResNet to do a 1:1 transform 2020-04-25 21:36:32 -06:00
scripts mmsr 2019-08-23 21:42:47 +08:00
utils mmsr 2019-08-23 21:42:47 +08:00
requirements.txt Create requirements.txt 2019-11-24 07:48:52 +00:00
run_scripts.sh mmsr 2019-08-23 21:42:47 +08:00
test_Vid4_REDS4_with_GT_DUF.py mmsr 2019-08-23 21:42:47 +08:00
test_Vid4_REDS4_with_GT_TOF.py mmsr 2019-08-23 21:42:47 +08:00
test_Vid4_REDS4_with_GT.py mmsr 2019-08-23 21:42:47 +08:00
test.py Config changes for discriminator advantage run 2020-04-25 11:24:28 -06:00
train.py Config changes for discriminator advantage run 2020-04-25 11:24:28 -06:00