Change default ReLU slope to .2 BREAKS COMPATIBILITY
This brings my ConvGnLelu implementation in line with the generally accepted negative_slope=.2; I have no idea where .1 came from. The change breaks backwards compatibility with some older models, but should improve their performance when they are trained from scratch. I audited which models might be affected, and I am not actively using any of them, so this is probably OK.
parent 9377d34ac3
commit 4328c2f713
@@ -366,7 +366,7 @@ class ConvGnLelu(nn.Module):
         else:
             self.gn = None
         if activation:
-            self.lelu = nn.LeakyReLU(negative_slope=.1)
+            self.lelu = nn.LeakyReLU(negative_slope=.2)
         else:
             self.lelu = None
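For context, here is a minimal sketch of how a ConvGnLelu-style block is typically wired up in PyTorch, showing where the LeakyReLU slope lives. The constructor arguments, padding choice, and group count below are illustrative assumptions, not the project's exact code:

import torch.nn as nn

# Illustrative sketch only; the real ConvGnLelu in this repo has more options.
class ConvGnLelu(nn.Module):
    def __init__(self, filters_in, filters_out, kernel_size=3, norm=True, activation=True):
        super().__init__()
        self.conv = nn.Conv2d(filters_in, filters_out, kernel_size, padding=kernel_size // 2)
        # GroupNorm with 8 groups is an assumption for this sketch.
        self.gn = nn.GroupNorm(8, filters_out) if norm else None
        # This commit changes the slope from .1 to the conventional .2.
        self.lelu = nn.LeakyReLU(negative_slope=.2) if activation else None

    def forward(self, x):
        x = self.conv(x)
        if self.gn is not None:
            x = self.gn(x)
        if self.lelu is not None:
            x = self.lelu(x)
        return x

Checkpoints trained with the old .1 slope will produce slightly different activations under the new default, which is why the commit is flagged as breaking compatibility.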