James Betker
8341bf7646
Enable megabatching
2020-05-02 17:46:59 -06:00
James Betker
61d3040cf5
Add doCrop into LQGT
2020-05-02 17:46:30 -06:00
James Betker
9e1acfe396
Fixup upconv for the next attempt!
2020-05-01 19:56:14 -06:00
James Betker
7eaabce48d
Full resnet corrupt, no BN
...
And it works! Thanks fixup..
2020-04-30 19:17:30 -06:00
James Betker
03258445bc
tblogger..
2020-04-30 12:35:51 -06:00
James Betker
b6e036147a
Add more batch norms to FlatProcessorNet_arch
2020-04-30 11:47:21 -06:00
James Betker
66e91a3d9e
Revert "Enable skip-through connections from disc to gen"
...
This reverts commit b7857f35c3.
2020-04-30 11:45:07 -06:00
James Betker
f027e888ed
Clear out tensorboard on job restart.
2020-04-30 11:44:53 -06:00
James Betker
b7857f35c3
Enable skip-through connections from disc to gen
2020-04-30 11:30:11 -06:00
James Betker
bf634fc9fa
Make resnet w/ BN discriminator use leaky relus
2020-04-30 11:28:59 -06:00
James Betker
3781ea725c
Add Resnet Discriminator with BN
2020-04-29 20:51:57 -06:00
James Betker
a5188bb7ca
Remove fixup code from arch_util
...
Going into its own arch.
2020-04-29 15:17:43 -06:00
James Betker
5b8a77f02c
Discriminator part 1
...
New discriminator. Includes spectral norming.
2020-04-28 23:00:29 -06:00
James Betker
2c145c39b6
Misc changes
2020-04-28 11:50:16 -06:00
James Betker
46f550e42b
Change downsample_dataset to do no image modification
...
I'm preprocessing the images myself now. There's no need to have
the dataset do this processing as well.
2020-04-28 11:50:04 -06:00
James Betker
8ab595e427
Add FlatProcessorNet
...
After doing some thinking and reading on the subject, it occurred to me that
I was treating the generator like a discriminator by focusing the network
complexity at the feature levels. It makes far more sense to process each conv
level equally for the generator, hence the FlatProcessorNet in this commit. This
network borrows some of the residual pass-through logic from RRDB which makes
the gradient path exceptionally short for pretty much all model parameters and
can be trained in O1 optimization mode without overflows again.
2020-04-28 11:49:21 -06:00
James Betker
b8f67418d4
Retool HighToLowResNet
...
The receptive field of the original was *really* low. This new one has a
receptive field of 36x36px patches. It also has some gradient issues
that need to be worked out
2020-04-26 01:13:42 -06:00
James Betker
02ff4a57fd
Enable HighToLowResNet to do a 1:1 transform
2020-04-25 21:36:32 -06:00
James Betker
35bd1ecae4
Config changes for discriminator advantage run
...
Still going from high->low, discriminator discerns on low. Next up disc works on high.
2020-04-25 11:24:28 -06:00
James Betker
d95808f4ef
Implement downsample GAN
...
This bad boy is for a workflow where you train a model on disjoint image sets to
downsample a "good" set of images like a "bad" set of images looks. You then
use that downsampler to generate a training set of paired images for supersampling.
2020-04-24 00:00:46 -06:00
James Betker
ea54c7618a
Print error when image read fails
2020-04-23 23:59:32 -06:00
James Betker
e98d92fc77
Allow test to operate on batches
2020-04-23 23:59:09 -06:00
James Betker
8ead9ae183
Lots more config files
2020-04-23 23:58:27 -06:00
James Betker
ea5f432f5a
Log total gen loss
2020-04-22 14:02:10 -06:00
James Betker
79aff886b5
Modifications that allow developer to explicitly specify a different image set for PIX and feature losses
2020-04-22 10:11:14 -06:00
James Betker
12d92dc443
Add GTLQ dataset
2020-04-22 00:40:38 -06:00
James Betker
4d269fdac6
Support independent PIX dataroot
2020-04-22 00:40:13 -06:00
James Betker
05aafef938
Support variant input sizes and scales
2020-04-22 00:39:55 -06:00
James Betker
ebda70fcba
Fix AMP
2020-04-22 00:39:31 -06:00
James Betker
f4b33b0531
Some random fixes/adjustments
2020-04-22 00:38:53 -06:00
James Betker
2538ca9f33
Add my own configs
2020-04-22 00:37:54 -06:00
James Betker
af5dfaa90d
Change GT_size to target_size
2020-04-22 00:37:41 -06:00
James Betker
cc834bd5a3
Support >128px image squares
2020-04-21 16:32:59 -06:00
James Betker
4f6d3f0dfb
Enable AMP optimizations & write sample train images to folder.
2020-04-21 16:28:06 -06:00
PRAGMA
1fb12871fd
Create requirements.txt
2019-11-24 07:48:52 +00:00
XintaoWang
a25ee9464d
test w/o GT
2019-09-01 22:20:10 +08:00
XintaoWang
0098663b6b
SRGAN model supports dist training
2019-09-01 22:14:29 +08:00
XintaoWang
866a858e59
add deform_conv_cuda_kernel.cu
2019-08-27 17:49:12 +08:00
XintaoWang
037933ba66
mmsr
2019-08-23 21:42:47 +08:00