Change default ReLU slope to .2 BREAKS COMPATIBILITY

This brings my ConvGnLelu implementation in line with the generally accepted negative_slope=.2; I have no idea where the .1 came from. This will break backwards compatibility with some older models, but freshly trained models will likely perform better. I did some auditing to find which models might be affected, and I am not actively using any of them, so this is probably OK.
James Betker 2020-12-19 08:28:03 -07:00
parent 9377d34ac3
commit 4328c2f713


@@ -366,7 +366,7 @@ class ConvGnLelu(nn.Module):
         else:
             self.gn = None
         if activation:
-            self.lelu = nn.LeakyReLU(negative_slope=.1)
+            self.lelu = nn.LeakyReLU(negative_slope=.2)
         else:
             self.lelu = None
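
For context, a minimal sketch of what a ConvGnLelu-style block looks like with this default. The constructor signature, group count, padding, and layer ordering below are assumptions for illustration; only the LeakyReLU default comes from the diff above.

    import torch
    import torch.nn as nn

    class ConvGnLeluSketch(nn.Module):
        # Hypothetical simplification: conv -> GroupNorm -> LeakyReLU.
        # The real ConvGnLelu has more options; only the activation
        # default (negative_slope=.2) is taken from this commit.
        def __init__(self, in_ch, out_ch, kernel_size=3, norm=True, activation=True):
            super().__init__()
            self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, padding=kernel_size // 2)
            # Group count of 8 is an assumption; out_ch must be divisible by it.
            self.gn = nn.GroupNorm(8, out_ch) if norm else None
            self.lelu = nn.LeakyReLU(negative_slope=.2) if activation else None

        def forward(self, x):
            x = self.conv(x)
            if self.gn is not None:
                x = self.gn(x)
            if self.lelu is not None:
                x = self.lelu(x)
            return x

    # Example: y = ConvGnLeluSketch(3, 64)(torch.randn(1, 3, 32, 32))

Because the slope only affects how negative pre-activations are scaled (0.2x instead of 0.1x), weights from older checkpoints still load, but the forward pass no longer matches what those models were trained against, hence the compatibility break.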