Commit Graph

1328 Commits

Author SHA1 Message Date
James Betker
60079a1572 Fix saver in distributed mode 2021-06-14 09:41:06 -06:00
James Betker
545f2db170 Distributed FID dataset across processes 2021-06-14 09:33:44 -06:00
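The usual way to split an FID evaluation set across ranks, as this commit describes, is to give each process a disjoint stripe of the items and reduce the partial statistics afterwards. A minimal sketch of that pattern (illustrative names, not DLAS's actual evaluator; assumes torch.distributed provides the process group):

```python
import torch.distributed as dist

def shard_for_this_rank(items):
    """Give each process a disjoint, striped slice of the evaluation set.

    Each rank computes FID statistics on its own shard; the partial
    statistics are then reduced across ranks.
    """
    if dist.is_available() and dist.is_initialized():
        return items[dist.get_rank()::dist.get_world_size()]
    return items  # single-process fallback
```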
James Betker
6b32c87dcb Try to make diffusion fid more deterministic 2021-06-14 09:27:43 -06:00
James Betker
5b4f86293f Add FID evaluator for diffusion models 2021-06-14 09:14:30 -06:00
James Betker
9cfe840872 Attempt to fix syncing multiple times when doing gradient accumulation 2021-06-13 14:30:30 -06:00
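For context on the sync-once-per-accumulation-cycle fix above: the standard PyTorch pattern is to wrap the intermediate backward passes in DDP's no_sync() context so gradients are only all-reduced on the final micro-batch. A hedged sketch of that pattern, not the repository's trainer code:

```python
import contextlib

def accumulate_backward(ddp_model, loss, micro_step, accum_steps):
    # ddp_model is assumed to be a torch.nn.parallel.DistributedDataParallel wrapper.
    # Only all-reduce gradients on the last micro-batch of the accumulation cycle.
    last_step = (micro_step + 1) % accum_steps == 0
    ctx = contextlib.nullcontext() if last_step else ddp_model.no_sync()
    with ctx:
        (loss / accum_steps).backward()
```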
James Betker
1cd75dfd33 Fix ddp bug 2021-06-13 10:25:23 -06:00
James Betker
3e3ad7825f Add support for training an EMA network alongside the main networks 2021-06-12 21:01:41 -06:00
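An EMA network of this kind is conventionally maintained by blending the live parameters into a frozen copy after each optimizer step. A minimal sketch of that update (names are illustrative, not the trainer's API):

```python
import torch

@torch.no_grad()
def update_ema(ema_model, model, decay=0.999):
    # ema_param <- decay * ema_param + (1 - decay) * param
    for ema_p, p in zip(ema_model.parameters(), model.parameters()):
        ema_p.mul_(decay).add_(p, alpha=1.0 - decay)

# ema_model typically starts as a frozen deep copy of the training network:
# ema_model = copy.deepcopy(model).eval()
```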
James Betker
696f320820 Get rid of feature networks 2021-06-11 20:50:07 -06:00
James Betker
65c474eecf Various changes to fix testing 2021-06-11 15:31:10 -06:00
James Betker
220f11a5e4 Half channel sizes in cifar_resnet 2021-06-09 17:06:37 -06:00
James Betker
aea12e1b9c Fix cat eval hack 2021-06-09 17:05:11 -06:00
James Betker
9b5f4abb91 Add fade in for hard switch 2021-06-07 18:15:09 -06:00
James Betker
108c5d829c Fix dropout norm 2021-06-07 16:13:23 -06:00
James Betker
438217094c Also debug distribution of switch 2021-06-07 15:36:07 -06:00
James Betker
44b09e5f20 Amplify dropout rate 2021-06-07 15:20:53 -06:00
James Betker
f0d4eb9182 Fixor 2021-06-07 11:58:36 -06:00
James Betker
c456a60466 Another go at fixing nan 2021-06-07 11:51:43 -06:00
James Betker
1c574c5bd1 Attempt to fix nan 2021-06-07 11:43:42 -06:00
James Betker
eda796985b Try out dropout norm 2021-06-07 11:33:33 -06:00
James Betker
6c6e82406e Pass a corruption factor through the dataset into the upsampling network
The intuition is this will help guide the network to make better informed decisions
about how it performs upsampling based on how it perceives the underlying content.

(I'm giving up on letting networks detect their own quality - I'm not convinced it is
actually feasible)
2021-06-07 09:13:54 -06:00
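A sketch of the idea described in this commit: the dataset records how strongly each image was degraded and hands that scalar to the upsampler as conditioning. The names and the noise-only degradation below are hypothetical stand-ins, not the actual DLAS dataset or injector interfaces:

```python
import random
import torch

def degrade_with_factor(hq):
    """Produce an LQ/HQ pair plus the corruption strength used to make the LQ image.

    hq: float tensor in [0, 1], shape (C, H, W). Additive noise stands in for the
    blur/noise/compression pipeline a super-resolution dataset would really use.
    """
    corruption_factor = random.uniform(0.0, 1.0)
    lq = (hq + 0.1 * corruption_factor * torch.randn_like(hq)).clamp(0.0, 1.0)
    return {'lq': lq, 'hq': hq,
            'corruption_factor': torch.tensor([corruption_factor])}

# The upsampling network can then condition on 'corruption_factor', e.g. by
# broadcasting it to a constant feature plane and concatenating it with the
# LQ input channels, so it knows how aggressively to restore detail.
```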
James Betker
4dd053f694 Add distributed training guide to docs 2021-06-06 16:56:40 -06:00
James Betker
2ad2b56438 Don't do wandb except on rank 0 2021-06-06 16:52:07 -06:00
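The standard guard for the change above is to initialize and log to wandb only from the rank-0 process, since every other rank would otherwise open its own run. A minimal sketch, assuming torch.distributed manages the process group (is_rank_0 is an illustrative helper, not the repo's):

```python
import torch.distributed as dist

def is_rank_0():
    # In a single-process run there is no process group, so treat it as rank 0.
    if dist.is_available() and dist.is_initialized():
        return dist.get_rank() == 0
    return True

# if is_rank_0():
#     import wandb
#     wandb.init(project="my_project")  # likewise guard wandb.log(...) calls
```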
James Betker
7c5478bc2c Formatting issue with gdi 2021-06-06 16:35:37 -06:00
James Betker
061dbcd458 Another fix to anorm 2021-06-06 15:09:49 -06:00
James Betker
9a6991e461 Fix switch norm average 2021-06-06 15:04:28 -06:00
James Betker
57e1a6a0f2 cifar: add hard routing
Also mods switched_routing to support non-pixular inputs
2021-06-06 14:53:43 -06:00
James Betker
692e9c417b Support diffusion unet 2021-06-06 13:57:22 -06:00
James Betker
a0158ebc69 Simplify cifar resnet further for faster training 2021-06-06 10:02:24 -06:00
James Betker
75567a9814 Only head norm removed 2021-06-05 23:29:11 -06:00
James Betker
65d0376b90 Re-add normalization at the tail of the RRDB 2021-06-05 23:04:05 -06:00
James Betker
184e887122 Remove rrdb normalization 2021-06-05 21:39:19 -06:00
James Betker
f5e75602b9 Add regular attention to cifar_resnet 2021-06-05 21:34:07 -06:00
James Betker
16cd92acd5 hack 2021-06-05 14:23:41 -06:00
James Betker
af52751d6b Fix device error 2021-06-05 14:21:32 -06:00
James Betker
5f0cc65f3b Register branched resnet properly 2021-06-05 14:19:03 -06:00
James Betker
fb405d9ef1 CIFAR stuff
- Extract coarse labels for the CIFAR dataset
- Add simple resnet that branches lower layers based on coarse labels
- Some other cleanup
2021-06-05 14:16:02 -06:00
James Betker
80d4404367 A few fixes:
- Output better prediction of xstart from eps
- Support LossAwareSampler
- Support AdamW
2021-06-05 13:40:32 -06:00
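The "prediction of xstart from eps" bullet refers to the standard DDPM inversion of the forward process, x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps. A minimal sketch of that recovery, not the repository's exact implementation:

```python
import torch

def predict_xstart_from_eps(x_t, eps, alpha_bar_t):
    """Recover x_0 from a noised sample x_t and a predicted noise term eps.

    alpha_bar_t is the cumulative product of (1 - beta) up to step t, given as a
    tensor broadcastable against x_t so a batch of timesteps works too.
    """
    return (x_t - torch.sqrt(1.0 - alpha_bar_t) * eps) / torch.sqrt(alpha_bar_t)
```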
James Betker
fa908a6a15 Fix wandb import issue 2021-06-04 23:27:15 -06:00
James Betker
103a88506e Log eval to wandb 2021-06-04 23:23:20 -06:00
James Betker
7d45132f60 fdsa 2021-06-04 21:26:54 -06:00
James Betker
6c8c8087d5 asdf 2021-06-04 21:24:48 -06:00
James Betker
e6c537824a Allow validation for ce 2021-06-04 21:21:04 -06:00
James Betker
7c251af7a8 Support cifar100 with resnet 2021-06-04 17:29:07 -06:00
James Betker
bf811f80c1 GD mods & fixes
- Report variational loss separately
- Report model prediction from injector
- Log these things
- Use respacing like guided diffusion
2021-06-04 17:13:16 -06:00
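The "respacing" bullet follows guided diffusion's timestep respacing, where sampling runs on an evenly spaced subset of the training timesteps. A simplified illustration of picking such a subset (the full technique also rescales the betas for the retained steps):

```python
import numpy as np

def respaced_timesteps(num_train_steps, num_sample_steps):
    # e.g. 1000 training timesteps respaced to 250 sampling timesteps.
    return np.linspace(0, num_train_steps - 1, num_sample_steps).round().astype(int)
```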
James Betker
6084915af8 Support gaussian diffusion models
Adds support for GD models, courtesy of some maths from openai.

Also:
- Fixes requirement for eval{} even when it isn't being used
- Adds support for denormalizing an imagenet norm
2021-06-02 21:47:32 -06:00
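On the "denormalizing an imagenet norm" note above: the inverse of torchvision's conventional ImageNet normalization is just the mean/std transform applied in reverse. A small sketch (helper name is illustrative):

```python
import torch

# Torchvision's conventional ImageNet channel statistics.
IMAGENET_MEAN = torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1)
IMAGENET_STD = torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1)

def denormalize_imagenet(x):
    # Invert (x - mean) / std for a (B, 3, H, W) batch back toward [0, 1] images.
    return x * IMAGENET_STD.to(x.device) + IMAGENET_MEAN.to(x.device)
```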
James Betker
45bc76ba92 Fixes and mods to support training classifiers on imagenet 2021-06-01 17:25:24 -06:00
James Betker
f129eaa39e Clean up byol a bit
- Remove option to aug in dataset (there's really no reason for this now that kornia works on GPU on windows)
- Other stuff
2021-05-24 21:35:46 -06:00
James Betker
6649ef2dae Add zipfilesdataset 2021-05-24 21:35:00 -06:00
James Betker
1a2b9fa130 Get rid of old byol net wrapping
Simplifies and makes this usable with DLAS' multi-gpu trainer
2021-04-27 12:48:34 -06:00
James Betker
119f17c808 Add testing capabilities for segformer & contrastive feature 2021-04-27 09:59:50 -06:00