DL-Art-School/codes/models/vqvae
James Betker 0dca36946f Hard Routing mods
- Turns out my custom convolution was riddled with backwards bugs, which is why the existing implementation wasn't working so well.
- Implements the switch logic from both Mixture of Experts and Switch Transformers for testing purposes (a sketch of this style of routing follows below).
2021-02-02 20:35:58 -07:00
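
The hard-switching referenced in this commit can be illustrated with a small sketch. The code below shows top-1 ("hard") routing in the style of Switch Transformers, applied per spatial position of a convolutional feature map: a selector produces routing logits, the highest-probability expert is chosen, and the gate value keeps the choice differentiable. `HardRouter`, `num_experts`, and the conv-branch experts are illustrative assumptions, not the implementation in `vqvae_no_conv_transpose_hardswitched_lambda.py`.

```python
# Minimal sketch of top-1 "hard" routing (Switch Transformer style) over
# conv feature maps. Names and shapes are illustrative, not from this repo.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HardRouter(nn.Module):
    def __init__(self, channels, num_experts):
        super().__init__()
        # 1x1 conv produces per-position routing logits over the experts.
        self.selector = nn.Conv2d(channels, num_experts, kernel_size=1)
        # Each "expert" here is just an independent conv branch.
        self.experts = nn.ModuleList(
            [nn.Conv2d(channels, channels, kernel_size=3, padding=1)
             for _ in range(num_experts)]
        )

    def forward(self, x):
        probs = F.softmax(self.selector(x), dim=1)        # (B, E, H, W)
        top1 = probs.argmax(dim=1, keepdim=True)          # hard choice per position
        # One-hot mask; multiplying by probs keeps the gate differentiable,
        # as in the Switch Transformer gating formulation.
        mask = torch.zeros_like(probs).scatter_(1, top1, 1.0)
        gate = (mask * probs).unsqueeze(2)                # (B, E, 1, H, W)
        expert_out = torch.stack([e(x) for e in self.experts], dim=1)  # (B, E, C, H, W)
        return (gate * expert_out).sum(dim=1)             # (B, C, H, W)

# Usage example with placeholder sizes.
layer = HardRouter(channels=64, num_experts=4)
y = layer(torch.randn(2, 64, 32, 32))                    # y: (2, 64, 32, 32)
```

A production MoE would dispatch only the selected positions to each expert rather than evaluating every expert densely; this sketch trades that efficiency for clarity.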
| File | Last commit | Date |
| --- | --- | --- |
| `__init__.py` | VQVAE | 2021-01-07 10:20:15 -07:00 |
| `kmeans_mask_producer.py` | Support training imagenet classifier | 2021-01-11 20:09:16 -07:00 |
| `scaled_weight_conv.py` | Support training imagenet classifier | 2021-01-11 20:09:16 -07:00 |
| `vqvae_3.py` | Register vqvae3 | 2021-01-29 15:26:28 -07:00 |
| `vqvae_no_conv_transpose_hardswitched_lambda.py` | Hard Routing mods | 2021-02-02 20:35:58 -07:00 |
| `vqvae_no_conv_transpose_switched_lambda.py` | Add switch norm, up dropout rate, detach selector | 2021-01-26 09:31:53 -07:00 |
| `vqvae_no_conv_transpose.py` | update vqvae to double codebook size for bottom quantizer | 2021-01-23 13:47:07 -07:00 |
| `vqvae.py` | VQVAE | 2021-01-07 10:20:15 -07:00 |
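
For context on the `vqvae*.py` variants listed above, the vector-quantization step they all share can be sketched as follows: encoder outputs are snapped to their nearest codebook entries, with a straight-through estimator carrying gradients around the non-differentiable nearest-neighbour lookup. This is a generic VQ-VAE sketch under assumed shapes and codebook size, not code from this repository; the commitment and codebook losses are omitted for brevity.

```python
# Generic VQ-VAE quantizer sketch: nearest-codebook lookup with a
# straight-through gradient. Sizes are placeholders, not repo values.
import torch
import torch.nn as nn

class Quantizer(nn.Module):
    def __init__(self, codebook_size=512, dim=64):
        super().__init__()
        self.codebook = nn.Embedding(codebook_size, dim)

    def forward(self, z):                             # z: (B, N, dim) encoder outputs
        flat = z.reshape(-1, z.size(-1))              # (B*N, dim)
        d = torch.cdist(flat, self.codebook.weight)   # distance to every code
        idx = d.argmin(dim=-1).view(z.shape[:-1])     # nearest-code indices, (B, N)
        z_q = self.codebook(idx)                      # quantized vectors, (B, N, dim)
        # Straight-through estimator: copy gradients from z_q back to z.
        z_q = z + (z_q - z).detach()
        return z_q, idx

# Usage example with placeholder sizes.
quant = Quantizer(codebook_size=512, dim=64)
z_q, codes = quant(torch.randn(8, 256, 64))
```

The "double codebook size for bottom quantizer" commit on `vqvae_no_conv_transpose.py` would correspond, in this sketch, to constructing the lower-resolution quantizer with a larger `codebook_size` than the top one.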