Commit Graph

736 Commits

Author SHA1 Message Date
James Betker
a57ed8e960 Various mods to support better jpeg image filtering 2021-06-25 13:16:15 -06:00
James Betker
e7890dc0ba Misc fixes for diffusion nets 2021-06-21 10:38:07 -06:00
James Betker
65c474eecf Various changes to fix testing 2021-06-11 15:31:10 -06:00
James Betker
220f11a5e4 Half channel sizes in cifar_resnet 2021-06-09 17:06:37 -06:00
James Betker
9b5f4abb91 Add fade in for hard switch 2021-06-07 18:15:09 -06:00
James Betker
108c5d829c Fix dropout norm 2021-06-07 16:13:23 -06:00
James Betker
438217094c Also debug distribution of switch 2021-06-07 15:36:07 -06:00
James Betker
44b09e5f20 Amplify dropout rate 2021-06-07 15:20:53 -06:00
James Betker
f0d4eb9182 Fixor 2021-06-07 11:58:36 -06:00
James Betker
c456a60466 Another go at fixing nan 2021-06-07 11:51:43 -06:00
James Betker
1c574c5bd1 Attempt to fix nan 2021-06-07 11:43:42 -06:00
James Betker
eda796985b Try out dropout norm 2021-06-07 11:33:33 -06:00
James Betker
6c6e82406e Pass a corruption factor through the dataset into the upsampling network
The intuition is that this will help guide the network to make better-informed decisions about how it performs upsampling, based on how it perceives the underlying content.

(I'm giving up on letting networks detect their own quality - I'm not convinced it is
actually feasible)
2021-06-07 09:13:54 -06:00
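
A minimal sketch of what conditioning an upsampler on a per-sample corruption factor could look like; the names here (CondUpsampler, corruption_factor) are hypothetical illustrations, not this repo's actual code:

    import torch
    import torch.nn as nn

    class CondUpsampler(nn.Module):
        # Toy 2x upsampler conditioned on a scalar corruption factor (hypothetical).
        def __init__(self, nf=32):
            super().__init__()
            self.embed = nn.Linear(1, nf)  # project the scalar factor to feature width
            self.body = nn.Conv2d(3, nf, 3, padding=1)
            self.up = nn.Sequential(
                nn.Upsample(scale_factor=2, mode='nearest'),
                nn.Conv2d(nf, 3, 3, padding=1))

        def forward(self, lq, corruption_factor):
            feat = self.body(lq)
            cond = self.embed(corruption_factor.unsqueeze(1))  # (B,) -> (B, nf)
            feat = feat + cond[:, :, None, None]  # broadcast over spatial dims
            return self.up(feat)

    # The dataset would also return the factor it used to degrade each image:
    lq = torch.randn(4, 3, 32, 32)
    factor = torch.rand(4)  # e.g. jpeg quality / blur strength, normalized to [0, 1]
    sr = CondUpsampler()(lq, factor)  # (4, 3, 64, 64)
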
James Betker
061dbcd458 Another fix to anorm 2021-06-06 15:09:49 -06:00
James Betker
9a6991e461 Fix switch norm average 2021-06-06 15:04:28 -06:00
James Betker
57e1a6a0f2 cifar: add hard routing
Also modifies switched_routing to support non-pixel (non-image) inputs
2021-06-06 14:53:43 -06:00
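
A minimal sketch of hard (top-1) routing over a flat, non-image input, to illustrate the idea; the names are hypothetical, and a real trainable version would need a straight-through or Gumbel-style trick so gradients reach the router:

    import torch
    import torch.nn as nn

    class HardRouter(nn.Module):
        # Each sample is processed by exactly one expert, chosen by argmax.
        def __init__(self, dim, num_experts=4):
            super().__init__()
            self.router = nn.Linear(dim, num_experts)
            self.experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_experts))

        def forward(self, x):  # x: (B, dim) - a flat, non-pixel input
            choice = self.router(x).argmax(dim=1)  # (B,) hard branch selection
            out = torch.zeros_like(x)
            for i, expert in enumerate(self.experts):
                mask = choice == i
                if mask.any():
                    out[mask] = expert(x[mask])
            return out

    y = HardRouter(16)(torch.randn(8, 16))  # (8, 16)
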
James Betker
692e9c417b Support diffusion unet 2021-06-06 13:57:22 -06:00
James Betker
a0158ebc69 Simplify cifar resnet further for faster training 2021-06-06 10:02:24 -06:00
James Betker
75567a9814 Only head norm removed 2021-06-05 23:29:11 -06:00
James Betker
65d0376b90 Re-add normalization at the tail of the RRDB 2021-06-05 23:04:05 -06:00
James Betker
184e887122 Remove rrdb normalization 2021-06-05 21:39:19 -06:00
James Betker
f5e75602b9 Add regular attention to cifar_resnet 2021-06-05 21:34:07 -06:00
James Betker
af52751d6b Fix device error 2021-06-05 14:21:32 -06:00
James Betker
5f0cc65f3b Register branched resnet properly 2021-06-05 14:19:03 -06:00
James Betker
fb405d9ef1 CIFAR stuff
- Extract coarse labels for the CIFAR dataset
- Add simple resnet that branches lower layers based on coarse labels (sketched below)
- Some other cleanup
2021-06-05 14:16:02 -06:00
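
CIFAR-100's 100 fine classes group into 20 coarse superclasses, which is what a branched lower-layer design can key on. A minimal hypothetical sketch, not the repo's actual resnet:

    import torch
    import torch.nn as nn

    class CoarseBranchedNet(nn.Module):
        # Shared stem, then one lower branch per coarse label.
        def __init__(self, num_coarse=20, num_fine=100, nf=64):
            super().__init__()
            self.stem = nn.Sequential(nn.Conv2d(3, nf, 3, padding=1), nn.ReLU())
            self.branches = nn.ModuleList(
                nn.Sequential(nn.Conv2d(nf, nf, 3, padding=1), nn.ReLU())
                for _ in range(num_coarse))
            self.head = nn.Linear(nf, num_fine)

        def forward(self, x, coarse):  # coarse: (B,) int labels, known at train time
            feat = self.stem(x)
            out = torch.cat([self.branches[int(c)](feat[i:i + 1])
                             for i, c in enumerate(coarse)])
            return self.head(out.mean(dim=(2, 3)))

    logits = CoarseBranchedNet()(torch.randn(4, 3, 32, 32), torch.randint(0, 20, (4,)))
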
James Betker
80d4404367 A few fixes:
- Output better prediction of xstart from eps (sketched below)
- Support LossAwareSampler
- Support AdamW
2021-06-05 13:40:32 -06:00
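
Recovering x_0 from a predicted noise eps is the standard closed-form inversion of the DDPM forward process; a sketch, with tensor names assumed:

    import torch

    def predict_xstart_from_eps(x_t, t, eps, alphas_cumprod):
        # Forward process: x_t = sqrt(a_bar_t) * x_0 + sqrt(1 - a_bar_t) * eps,
        # so: x_0 = (x_t - sqrt(1 - a_bar_t) * eps) / sqrt(a_bar_t)
        a_bar = alphas_cumprod[t].view(-1, 1, 1, 1)  # (B, 1, 1, 1) for broadcasting
        return (x_t - torch.sqrt(1.0 - a_bar) * eps) / torch.sqrt(a_bar)
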
James Betker
7c251af7a8 Support cifar100 with resnet 2021-06-04 17:29:07 -06:00
James Betker
bf811f80c1 GD mods & fixes
- Report variational loss separately
- Report model prediction from injector
- Log these things
- Use respacing like guided diffusion (sketched below)
2021-06-04 17:13:16 -06:00
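
Respacing, in the spirit of guided diffusion, keeps an evenly spaced subset of timesteps and recomputes betas so the shortened chain matches the original cumulative alphas at each kept step; a simplified sketch:

    import numpy as np

    def respace_betas(betas, num_kept):
        # Collapse a len(betas)-step schedule down to num_kept steps while
        # preserving the cumulative alpha at every kept timestep.
        alphas_cumprod = np.cumprod(1.0 - betas)
        keep = np.linspace(0, len(betas) - 1, num_kept).round().astype(int)
        last, new_betas = 1.0, []
        for i in keep:
            new_betas.append(1.0 - alphas_cumprod[i] / last)
            last = alphas_cumprod[i]
        return np.array(new_betas)

    # e.g. collapse a 1000-step linear schedule to 50 sampling steps:
    short = respace_betas(np.linspace(1e-4, 0.02, 1000), 50)
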
James Betker
6084915af8 Support gaussian diffusion models
Adds support for GD models, courtesy of some maths from OpenAI.

Also:
- Fixes the requirement for eval{} even when it isn't being used
- Adds support for denormalizing an ImageNet norm (sketched below)
2021-06-02 21:47:32 -06:00
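
Denormalizing an ImageNet-normalized tensor just inverts the usual torchvision transform with the standard channel statistics:

    import torch

    IMAGENET_MEAN = torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1)
    IMAGENET_STD = torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1)

    def denormalize_imagenet(x):
        # Inverts x_norm = (x - mean) / std, channel-wise.
        return x * IMAGENET_STD.to(x.device) + IMAGENET_MEAN.to(x.device)
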
James Betker
f129eaa39e Clean up byol a bit
- Remove option to augment in the dataset (there's really no reason for this now that kornia works on GPU on Windows)
- Other stuff
2021-05-24 21:35:46 -06:00
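
The kind of GPU-side batch augmentation the note above alludes to looks roughly like this with kornia (a sketch; the exact transform set is assumed, and a CUDA device is required):

    import torch
    import kornia.augmentation as K

    # BYOL-style augmentations applied to a whole batch on the GPU,
    # instead of per-sample inside the dataset.
    augment = torch.nn.Sequential(
        K.RandomResizedCrop((224, 224)),
        K.RandomHorizontalFlip(p=0.5),
        K.ColorJitter(0.4, 0.4, 0.4, 0.1, p=0.8),
    )

    batch = torch.rand(8, 3, 256, 256, device='cuda')
    view = augment(batch)  # differentiable, runs on the GPU
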
James Betker
1a2b9fa130 Get rid of old byol net wrapping
Simplifies and makes this usable with DLAS' multi-GPU trainer
2021-04-27 12:48:34 -06:00
James Betker
119f17c808 Add testing capabilities for segformer & contrastive feature 2021-04-27 09:59:50 -06:00
James Betker
9bbe6fc81e Get segformer to a trainable state 2021-04-25 11:45:20 -06:00
James Betker
fc623d4b5a Add segformer model. Start work on BYOL adaptation that will support training it. 2021-04-23 17:16:46 -06:00
James Betker
17555e7d07 Misc adjustments for stylegan 2021-04-21 18:14:17 -06:00
James Betker
b687ef4cd0 Misc 2021-04-21 18:09:46 -06:00
James Betker
9fc3df3f5b Switched conv: add conversion function with allowlist 2021-03-13 10:44:56 -07:00
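
A conversion function with an allowlist plausibly walks the module tree and swaps only named convs; a hypothetical sketch (convert_convs and the factory argument are illustrations, not the repo's actual API):

    import torch.nn as nn

    def convert_convs(module, allowlist, factory, prefix=''):
        # Replace Conv2d children whose qualified name is in the allowlist.
        for name, child in module.named_children():
            qual = f'{prefix}.{name}' if prefix else name
            if isinstance(child, nn.Conv2d) and qual in allowlist:
                setattr(module, name, factory(child))
            else:
                convert_convs(child, allowlist, factory, qual)
        return module

    # Dummy factory for illustration; a real one would build a switched conv.
    net = nn.Sequential(nn.Conv2d(3, 8, 3), nn.Conv2d(8, 8, 3))
    convert_convs(net, {'0'}, lambda conv: nn.Sequential(conv))
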
James Betker
cf9a6da889 Fix some bugs, check in work on vqvae3 2021-03-02 20:56:19 -07:00
James Betker
f89ea5f1c6 Mods to support lightweight_gan model 2021-03-02 20:51:48 -07:00
James Betker
39fd755baa New benchmark numbers 2021-02-08 08:09:41 -07:00
James Betker
784b96c059 Misc options to add support for training stylegan2-rosinality models:
- Allow image_folder_dataset to normalize inbound images
- ExtensibleTrainer can denormalize images on the output path
- Support .webp - an output from LSUN
- Support logistic GAN divergence loss (sketched below)
- Support stylegan2 TF weight extraction for discriminator
- New injector that produces latent noise (with separated paths)
- Modify FID evaluator to be operable with rosinality-style GANs
2021-02-08 08:09:21 -07:00
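
The logistic GAN divergence loss (as used by stylegan2) is the softplus-based non-saturating formulation; a sketch:

    import torch.nn.functional as F

    def d_logistic_loss(real_pred, fake_pred):
        # Discriminator: softplus(-D(real)) + softplus(D(fake))
        return F.softplus(-real_pred).mean() + F.softplus(fake_pred).mean()

    def g_logistic_loss(fake_pred):
        # Generator (non-saturating): softplus(-D(fake))
        return F.softplus(-fake_pred).mean()
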
James Betker
e7be4bdff3 Revert 2021-02-05 08:43:07 -07:00
James Betker
6dec1f5968 Back to groupnorm 2021-02-05 08:42:11 -07:00
James Betker
336f807c8e lambda2 2021-02-05 00:00:24 -07:00
James Betker
025a5867c4 Use syncbatchnorm instead 2021-02-04 22:26:36 -07:00
James Betker
bb79fafb89 Fix groupnorm specification 2021-02-04 22:15:38 -07:00
James Betker
43da1f9c4b Convert lambda coupler to use groupnorm instead of batchnorm 2021-02-04 21:59:44 -07:00
James Betker
7070142805 Make vqvae3_hard more configurable 2021-02-04 09:03:22 -07:00
James Betker
b980028ca8 Add get_debug_values for vqvae_3_hardswitch 2021-02-03 14:12:24 -07:00
James Betker
1405ff06b8 Fix SwitchedConvHardRoutingFunction for current cuda router 2021-02-03 14:11:55 -07:00