James Betker
59b0533b06
Fix attention image step size
2020-06-17 18:45:24 -06:00
James Betker
645d0ca767
ResidualGen mods
...
- Add filters_mid spec which allows an expansion->squeeze for the transformation layers.
- Add scale and bias AFTER the switch
- Remove identity transform (models were converging on this)
- Move attention image generation and temperature setting into new function which gets called every step with a save path
2020-06-17 17:18:28 -06:00
James Betker
6f8406fbdc
Fixed ConfigurableSwitchedGenerator bug
2020-06-16 16:53:57 -06:00
James Betker
7d541642aa
Get rid of SwitchedResidualGenerator
...
Just use the configurable one instead.
2020-06-16 16:23:29 -06:00
James Betker
379b96eb55
Output histograms with SwitchedResidualGenerator
...
This also fixes the initialization weight for the configurable generator.
2020-06-16 15:54:37 -06:00
James Betker
f8b67f134b
Get proper contiguous view for backwards compatibility
2020-06-16 14:27:16 -06:00
James Betker
2def96203e
Mods to SwitchedResidualGenerator_arch
...
- Increased processing for high-resolution switches
- Do stride=2 first in HalvingProcessingBlock
2020-06-16 14:19:12 -06:00
James Betker
70c764b9d4
Create a configurable SwitchedResidualGenerator
...
Also move attention image generator out of repo
2020-06-16 13:24:07 -06:00
James Betker
df1046c318
New arch: SwitchedResidualGenerator_arch
...
The concept here is to use switching to split the generator into two functions:
interpretation and transformation. Transformation is done at the pixel level by
relatively simple conv layers, while interpretation is computed at various levels
by far more complicated conv stacks. The two are merged using the switching
mechanism.
This architecture is far less computationally intensive than RRDB.
2020-06-16 11:23:50 -06:00
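The interpretation/transformation split described above comes down to an attention-weighted merge: cheap transform outputs are blended using weights computed by the heavier interpretation stacks. A minimal per-pixel sketch in plain Python (hypothetical names; the real switching mechanism operates on conv feature maps, and the temperature referenced in later commits softens the attention):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def switched_merge(transform_outputs, switch_logits, temperature=1.0):
    """Merge candidate transform outputs with attention weights.

    transform_outputs: one candidate value per transform (illustrative scalars)
    switch_logits: interpretation scores, one per transform
    temperature: >1 softens the attention toward a uniform mix
    """
    weights = softmax([l / temperature for l in switch_logits])
    return sum(w * o for w, o in zip(weights, transform_outputs))

# Two candidate transforms for one pixel; the switch strongly prefers the first.
out = switched_merge([0.8, 0.2], [4.0, 0.0])  # ~0.79, dominated by the first transform
```

A high temperature pushes the weights toward uniform, which is why decaying it over training (see the later temperature-decay commits) gradually hardens the switch.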
James Betker
ddfd7f67a0
Get rid of biggan
...
Not really sure it's a great fit for what is being done here.
2020-06-16 11:21:44 -06:00
James Betker
0a714e8451
Fix initialization in mhead switched rrdb
2020-06-15 21:32:03 -06:00
James Betker
be7982b9ae
Add skip heads to switcher
...
These pass through the input so that it can be selected by the attention mechanism.
2020-06-14 12:46:54 -06:00
James Betker
6c27ddc9b5
Misc
2020-06-14 11:03:02 -06:00
James Betker
6c0e9f45c7
Add GPU mem tracing module
2020-06-14 11:02:54 -06:00
James Betker
48532a0a8a
Fix initial_stride on lowdim models
2020-06-14 11:02:16 -06:00
James Betker
532704af40
Multiple modifications for experimental RRDB architectures
...
- Add LowDimRRDB; essentially a "normal RRDB" but the RDB blocks process at a low dimension using PixelShuffle
- Add switching wrappers around it
- Add support for switching on top of multi-headed inputs and outputs
- Moves PixelUnshuffle to arch_util
2020-06-13 11:37:27 -06:00
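PixelUnshuffle, mentioned above as the mechanism that lets RDB blocks process at low dimension, trades spatial resolution for channels without losing information. A single-channel sketch on nested lists (hypothetical helper; the real arch_util module works on 4D tensors):

```python
def pixel_unshuffle(img, scale):
    """Rearrange an HxW image (list of lists) into scale*scale sub-images
    of size (H/scale)x(W/scale).

    Each sub-image collects the pixels at one (row, col) offset within
    every scale x scale block, so spatial size drops while the "channel"
    count grows by scale**2 -- the inverse of PixelShuffle.
    """
    h, w = len(img), len(img[0])
    assert h % scale == 0 and w % scale == 0
    channels = []
    for dy in range(scale):
        for dx in range(scale):
            channels.append([[img[y * scale + dy][x * scale + dx]
                              for x in range(w // scale)]
                             for y in range(h // scale)])
    return channels

# A 2x2 unshuffle of a 2x2 image yields four 1x1 "channels".
chans = pixel_unshuffle([[1, 2], [3, 4]], 2)
```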
James Betker
e89f28ead0
Update multirrdb to do HR fixing in the base image dimension.
2020-06-11 08:43:39 -06:00
James Betker
d3b2cbfe7c
Fix loading new state dicts for RRDB
2020-06-11 08:25:57 -06:00
James Betker
5ca53e7786
Add alternative first block for PixShuffleRRDB
2020-06-10 21:45:24 -06:00
James Betker
43b7fccc89
Fix mhead attention integration bug for RRDB
2020-06-10 12:02:33 -06:00
James Betker
12e8fad079
Add several new RRDB architectures
2020-06-09 13:28:55 -06:00
James Betker
296135ec18
Add doResizeLoss to dataset
...
doResizeLoss has a 50% chance to resize the LQ image to 50% size,
then back to the original size. This is useful for training a generator
to recover these lost pixel values while also being able to do
repairs on higher-resolution images during training.
2020-06-08 11:27:06 -06:00
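The doResizeLoss behavior above can be sketched in a few lines. This is a plain-Python stand-in on nested lists (nearest-neighbour resampling, hypothetical names; the real dataset code operates on image arrays):

```python
import random

def do_resize_loss(img, p=0.5, rng=random):
    """With probability p, downscale a square LQ image to half size and
    back up (nearest neighbour), destroying half the pixel detail so the
    generator learns to recover it."""
    if rng.random() >= p:
        return img
    n = len(img)  # assumes an even-sized square image
    small = [[img[2 * y][2 * x] for x in range(n // 2)] for y in range(n // 2)]
    return [[small[y // 2][x // 2] for x in range(n)] for y in range(n)]
```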
James Betker
786a4288d6
Allow switched RRDBNet to record metrics and decay temperature
2020-06-08 11:10:38 -06:00
James Betker
ae3301c0ea
SwitchedRRDB work
...
Renames AttentiveRRDB to SwitchedRRDB. Moves SwitchedConv to
an external repo (neonbjb/switchedconv). Switches RDB blocks instead
of conv blocks. Works well!
2020-06-08 08:47:34 -06:00
James Betker
93528ff8df
Merge branch 'gan_lab' of https://github.com/neonbjb/mmsr into gan_lab
2020-06-07 16:59:31 -06:00
James Betker
805bd129b7
Switched conv partial impl
2020-06-07 16:59:22 -06:00
James Betker
9e203d07c4
Merge remote-tracking branch 'origin/gan_lab' into gan_lab
2020-06-07 16:56:12 -06:00
James Betker
299d855b34
Enable forced learning rates
2020-06-07 16:56:05 -06:00
James Betker
efb5b3d078
Add switched_conv
2020-06-07 16:45:07 -06:00
James Betker
063719c5cc
Fix attention conv bugs
2020-06-06 18:31:02 -06:00
James Betker
cbedd6340a
Add RRDB with attention
2020-06-05 21:02:08 -06:00
James Betker
ef5d8a0ed1
Misc
2020-06-05 21:01:50 -06:00
James Betker
318a604405
Allow weighting of input data
...
This essentially allows you to give some datasets more
importance than others for the purposes of reaching a more
refined network.
2020-06-04 10:05:21 -06:00
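Dataset weighting as described above amounts to sampling each batch's source dataset proportionally to its configured weight. A minimal sketch (hypothetical `weighted_pick` helper; the actual implementation lives in the dataset/dataloader code):

```python
import random

def weighted_pick(datasets, weights, rng=random):
    """Pick a dataset proportionally to its weight, so more important
    datasets contribute more samples during training."""
    total = sum(weights)
    r = rng.random() * total
    for ds, w in zip(datasets, weights):
        r -= w
        if r < 0:
            return ds
    return datasets[-1]  # guard against floating-point edge cases
```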
James Betker
edf0f8582e
Fix rrdb bug
2020-06-02 11:15:55 -06:00
James Betker
dc17545083
Add RRDB Initial Stride
...
Allows downsampling immediately before processing, which reduces network complexity on
higher resolution images but keeps a higher filter count.
2020-06-02 10:47:15 -06:00
James Betker
76a38b6a53
Misc
2020-06-02 09:35:52 -06:00
James Betker
726d1913ac
Allow validating in batches, remove val size limit
2020-06-02 08:41:22 -06:00
James Betker
90125f5bed
Allow blurring to be specified
2020-06-02 08:40:52 -06:00
James Betker
8355f3d1b3
Only log discriminator data when gan is activated
2020-06-01 15:48:16 -06:00
James Betker
f1a1fd14b1
Introduce (untested) colab mode
2020-06-01 15:09:52 -06:00
James Betker
a38dd62489
Only train discriminator/gan losses when gan_w > 0
2020-06-01 15:09:10 -06:00
James Betker
1eb9c5a059
Fix grayscale
2020-05-29 22:04:50 -06:00
James Betker
2b03e40f98
Debug process_video
2020-05-29 20:44:50 -06:00
James Betker
74b313aaa9
Add grayscale downsampling option
2020-05-29 20:34:00 -06:00
James Betker
b123ed8a45
Add attention resnet
...
Not ready for prime time, but is a first draft.
2020-05-29 20:02:10 -06:00
James Betker
5e9da65d81
Fix process_video bugs
2020-05-29 12:47:22 -06:00
James Betker
beac71ad18
Allow minivids to start at any specified number
2020-05-28 20:27:42 -06:00
James Betker
57682ebee3
Separate feature extractors out, add resnet feature extractor
2020-05-28 20:26:30 -06:00
James Betker
156cee240a
Remove reliance on magick, only wait for ffmpeg at last second, fix image ordering issue
2020-05-27 23:09:46 -06:00
James Betker
b551e86adb
Encode videos to HEVC
...
MP4 doesn't support 8K.
2020-05-27 20:04:45 -06:00
James Betker
bdeafad8c5
Only run 'convert' not magick
2020-05-27 17:24:10 -06:00
James Betker
41c1efbf56
Add dynamic video processing script
2020-05-27 17:09:11 -06:00
James Betker
f745be9dea
Fix vgg disc arch
2020-05-27 13:31:22 -06:00
James Betker
6962ccb306
Adjust motion blur
...
0 is invalid.
2020-05-27 13:09:46 -06:00
James Betker
f6815df58b
Misc
2020-05-27 08:04:47 -06:00
James Betker
e27a49454e
Enable vertical splitting on inference images to support very large resolutions.
2020-05-27 08:04:35 -06:00
James Betker
96ac26a8b7
Allow injection of random low-amplitude noise & motion blur into generator
2020-05-27 08:04:11 -06:00
James Betker
69cbfa2f0c
Adjust dataset mutations a bit
...
Adjusts the compression artifacts to be more aggressive, and blurring to be less aggressive.
2020-05-26 13:48:34 -06:00
James Betker
2931142458
Allow multiple gt image dirs
2020-05-25 19:21:09 -06:00
James Betker
4e44b8a1aa
Clean up video stuff
2020-05-25 19:20:49 -06:00
James Betker
8464cae168
HQ blurring doesn't actually work right - HQ images aren't the right size when they are blurred
...
Just revert it and blur the LQ images.
2020-05-24 22:32:54 -06:00
James Betker
5fd8749cf2
More updates - need more blurring
2020-05-24 22:13:27 -06:00
James Betker
9627cc2c49
Update HR gaussian blur params
2020-05-24 18:00:31 -06:00
James Betker
2f8b0250b9
Blur HR image before downsizing, when available
2020-05-24 17:18:44 -06:00
James Betker
cc4571eb8d
Randomize blur effect
2020-05-24 12:35:41 -06:00
James Betker
27a548c019
Enable blurring via settings
2020-05-24 11:56:39 -06:00
James Betker
3c2e5a0250
Apply fixes to resgen
2020-05-24 07:43:23 -06:00
James Betker
446322754a
Support generators that don't output intermediary values.
2020-05-23 21:09:54 -06:00
James Betker
987cdad0b6
Misc mods
2020-05-23 21:09:38 -06:00
James Betker
9b44f6f5c0
Add AssistedRRDB and remove RRDBNetXL
2020-05-23 21:09:21 -06:00
James Betker
445e7e7053
Extract subimages mod
2020-05-23 21:07:41 -06:00
James Betker
90073fc761
Update LQ_dataset to support inference on split image videos
2020-05-23 21:05:49 -06:00
James Betker
74bb0fad33
Allow dataset classes to add noise internally
2020-05-23 21:04:24 -06:00
James Betker
af1968f9e5
Allow passthrough discriminator to have passthrough disabled from config
2020-05-19 09:41:16 -06:00
James Betker
67139602f5
Test modifications
...
Allows bifurcating large images put into the test pipeline.
This code is currently hard-coded rather than dynamic and needs some fixes.
2020-05-19 09:37:58 -06:00
James Betker
6400607fc5
ONNX export support
2020-05-19 09:36:04 -06:00
James Betker
89c71293ce
IDEA update
2020-05-19 09:35:26 -06:00
James Betker
9cde58be80
Make RRDB usable in the current iteration
2020-05-16 18:36:30 -06:00
James Betker
b95c4087d1
Allow an alt_path for saving models and states
2020-05-16 09:10:51 -06:00
James Betker
f911ef0d3e
Add corruptor_usage_probability
...
Governs how often a corruptor is used versus feeding uncorrupted images.
2020-05-16 09:05:43 -06:00
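The corruptor_usage_probability option above reduces to a single coin flip per sample. A minimal sketch (hypothetical names; the real corruptor is a fixed pre-trained network described in an earlier commit, stubbed here as any callable):

```python
import random

def maybe_corrupt(img, corruptor, usage_probability, rng=random):
    """Apply the corruptor with the configured probability; otherwise
    feed the uncorrupted image through unchanged."""
    if rng.random() < usage_probability:
        return corruptor(img)
    return img
```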
James Betker
635c53475f
Restore swapout models just before a checkpoint
2020-05-16 07:45:19 -06:00
James Betker
a33ec3e22b
Fix skips & images samples
...
- Makes skip connections between the generator and discriminator more
extensible by adding additional configuration options for them and supporting
1 and 0 skips.
- Places the temp/ directory, which holds sample images from the training process,
in the training directory instead of the codes/ directory.
2020-05-15 13:50:49 -06:00
James Betker
cdf641e3e2
Remove working options from repo
2020-05-15 07:50:55 -06:00
James Betker
bd4d478572
config changes
2020-05-15 07:41:18 -06:00
James Betker
79593803f2
biggan arch, initial work (not implemented)
2020-05-15 07:40:45 -06:00
James Betker
61ed51d9e4
Improve corruptor logic: switch corruptors randomly
2020-05-14 23:14:32 -06:00
James Betker
d72e154442
Allow more LQ than GT images in corrupt mode
2020-05-14 20:46:20 -06:00
James Betker
8a514b9645
Misc changes
2020-05-14 20:45:38 -06:00
James Betker
a946483f1c
Fix discriminator noise floor
2020-05-14 20:45:06 -06:00
James Betker
c8ab89d243
Add model swapout
...
Model swapout is a feature where, at specified intervals,
a random D and G model will be swapped in place for the
one being trained. After a short period of time, this model
is swapped back out. This is intended to increase training
diversity.
2020-05-13 16:53:38 -06:00
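The swapout schedule described above can be captured as a small step calculation. A sketch under assumed semantics (swap in every `swap_interval` steps, keep the swapped model active for `swap_duration` steps; names are hypothetical):

```python
def swapout_steps(swap_interval, swap_duration, total_steps):
    """Return the set of training steps during which a swapped-in
    D/G model is active in place of the one being trained."""
    active = set()
    for start in range(swap_interval, total_steps, swap_interval):
        for s in range(start, min(start + swap_duration, total_steps)):
            active.add(s)
    return active
```

For example, with an interval of 10 and a duration of 2 over 25 steps, the stored model is active on steps 10-11 and 20-21 and the live model trains everywhere else.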
James Betker
c336d32fd3
Config updates
2020-05-13 15:27:49 -06:00
James Betker
5bcf187fb6
Disable LMDB support
...
It doesn't play nice with multiple dataroots and I don't
really see any need to continue support since I'm not
testing it.
2020-05-13 15:27:33 -06:00
James Betker
e36f22e14a
Allow "corruptor" network to be specified
...
This network is just a fixed (pre-trained) generator
that performs a corruption transformation that the
generator-in-training is expected to undo alongside
SR.
2020-05-13 15:26:55 -06:00
James Betker
f389025b53
Change ResGen noise feature
...
It now injects noise directly into the input filters, rather than a
pure noise filter. The pure noise filter was producing really
poor results (and I'm honestly not quite sure why).
2020-05-13 09:22:06 -06:00
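The change above injects noise into the existing input filters instead of concatenating a pure-noise filter. A scalar sketch of the new behavior (hypothetical names; the real code perturbs conv feature maps):

```python
import random

def inject_input_noise(features, amplitude, rng=random):
    """Add zero-centered noise of the given amplitude directly to each
    input feature value, rather than appending a separate noise channel."""
    return [f + (rng.random() * 2 - 1) * amplitude for f in features]
```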
James Betker
343af70a8d
Add code for compiling model to torchscript
...
I want to be able to export it to other formats too in the future.
2020-05-13 09:21:13 -06:00
James Betker
585b05e66b
Cap test workers at 10
2020-05-13 09:20:45 -06:00
James Betker
037a5a3cdb
Config updates
2020-05-13 09:20:28 -06:00
James Betker
fc3ec8e3a2
Add a noise floor to the discriminator noise factor
2020-05-13 09:19:22 -06:00
James Betker
5d1b4caabf
Allow noise to be injected at the generator inputs for resgen
2020-05-12 16:26:29 -06:00
James Betker
06d18343f7
Allow noise to be added to discriminator inputs
2020-05-12 16:25:38 -06:00