Add Parametric Soft Exponential Unit (PSEU) activation layer #451
Conversation
Merge latest master
@gabrieldemarmiesse I have added the Parametric Soft Exponential Unit (PSEU) activation layer. Here is the research paper on arXiv. The build is passing now. Do you know someone who can review this?
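For context, the soft exponential activation described in the linked paper is piecewise in the parameter alpha. A minimal sketch of the idea, assuming Keras 2.x and a fixed scalar alpha (this is an illustrative reimplementation, not the exact code submitted in this PR):

```python
# Illustrative sketch of the soft exponential activation from the paper,
# not the code from this PR. Assumes a fixed scalar alpha rather than the
# trainable per-unit weights a parametric layer would normally use.
from keras import backend as K
from keras.layers import Layer


class PSEUSketch(Layer):
    def __init__(self, alpha=-0.9, **kwargs):  # example default, hypothetical
        super(PSEUSketch, self).__init__(**kwargs)
        self.alpha = alpha

    def call(self, inputs):
        a = self.alpha
        if a < 0:
            # negative branch: -ln(1 - a * (x + a)) / a
            return -K.log(1.0 - a * (inputs + a)) / a
        if a > 0:
            # positive branch: (exp(a * x) - 1) / a + a
            return (K.exp(a * inputs) - 1.0) / a + a
        # a == 0 reduces to the identity
        return inputs

    def compute_output_shape(self, input_shape):
        return input_shape
```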
@SriRangaTarun We're currently focused on getting the design of this repo right to have a v1.0, so there might be delays in reviewing PRs that add new features.
keras_contrib/layers/__init__.py
Outdated
@@ -4,6 +4,7 @@
from .advanced_activations.srelu import SReLU
from .advanced_activations.swish import Swish
from .advanced_activations.sinerelu import SineReLU
from .advanced_activations.pseu import PSEU
You should insert the import in alphabetical order
Fixed
        self.build = True

    def call(self, x, mask=None):
        if self.alpha_init is not None and self.alpha_init < 0:
self.alpha_init can't be None as it has a default value. You can remove the first condition.
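For illustration, a hypothetical before/after of the suggested simplification (example values, not the PR's actual defaults):

```python
# Hypothetical illustration: alpha_init always has a default value and is
# therefore never None, so checking its sign alone is sufficient.
alpha_init = -0.9  # example default value

# before: redundant None check
if alpha_init is not None and alpha_init < 0:
    branch = "negative-alpha formula"

# after: the sign check is enough
if alpha_init < 0:
    branch = "negative-alpha formula"
```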
Removed
        self.build = True

    def call(self, x, mask=None):
What happens if mask is not None? Is there any change in the behaviour?
Removed. It has no impact.
@@ -0,0 +1,87 @@
# -*- coding: utf-8 -*-
from keras import initializers
Usually, we like to have the imports sorted alphabetically, with all the import XXX statements first and then all the from XX import YY statements. See this StackOverflow thread.
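For illustration, a hypothetical import header following that convention (the module names are examples, not necessarily what pseu.py imports):

```python
# Hypothetical example of the suggested ordering: plain "import" statements
# first, then "from ... import ..." statements, each group sorted alphabetically.
import numpy as np

from keras import backend as K
from keras import initializers
from keras.layers import Layer
```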
Fixed
@SriRangaTarun Thanks for the contribution! I've left some comments. Also, as soon as #441 gets merged, you can rebase on it and add yourself as the code owner for this feature. Nice work!
@RaphaelMeudec and @gabrieldemarmiesse Ever since the codeowners PR was merged, this PR has been failing the codeowners test, but every other test is passing. Please check the latest CI build here.
Now the build is passing after adding pseu.py to CODEOWNERS.
@RaphaelMeudec and @gabrieldemarmiesse Today I tried to add initializer and trainable options instead of only having a constant alpha initializer, but I removed them and reverted to the previous version because they cause numerical instability in the function (it is piecewise in alpha, so it becomes unstable when the negative formula is used for positive alpha weights and vice versa). A custom initializer or trainability can cause the signs of the weights to mismatch the formula and result in NaNs. So I am sticking with constant initial alpha weights that cannot be trained.
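A small numerical sketch of that failure mode (illustrative only, using the negative-alpha branch of the soft exponential; not code from the PR):

```python
# Illustrative numpy sketch: applying the negative-alpha branch with a
# positive alpha makes the log argument negative, which produces NaN.
import numpy as np

x = np.array([2.0, 3.0, 4.0])

def negative_branch(alpha, x):
    # -ln(1 - alpha * (x + alpha)) / alpha, intended for alpha < 0
    return -np.log(1.0 - alpha * (x + alpha)) / alpha

print(negative_branch(-0.5, x))  # well-defined when alpha < 0
print(negative_branch(0.5, x))   # here 1 - alpha * (x + alpha) < 0, so NaN
```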
@RaphaelMeudec Do you suggest any other changes? The most recent build failed because the instancenormalization test suddenly failed for some reason.
@SriRangaTarun All good, nothing to add IMO! @gabrieldemarmiesse The failing test doesn't seem linked to this PR; the code looks good to me!
Somehow, the build is passing now.
@gabrieldemarmiesse I think we can merge this!
We'll merge this after I merge the tf.keras PR. It's getting quite painful to resolve conflicts every two days.
@SriRangaTarun Please merge master into your branches to make sure that it's compatible with tf.keras.
@RaphaelMeudec and @gabrieldemarmiesse I am closing this PR and have created an alias here. There, I have merged master into my branch.
- What I did
See above
- How I did it
- How you can verify it
I added unit tests (a hypothetical sketch of such a test is shown below)
This pull request fixes #issue_number_here
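For reference, a hypothetical shape/finiteness test along those lines (the PSEU constructor arguments and the actual test file in the PR may differ):

```python
# Hypothetical sketch of a unit test for the layer; the real tests in the PR
# and the PSEU constructor signature may differ from what is assumed here.
import numpy as np
from keras.models import Sequential
from keras_contrib.layers import PSEU


def test_pseu_output_is_finite_and_shape_preserving():
    # Build a one-layer model so the activation runs on a known input shape.
    model = Sequential([PSEU(input_shape=(10,))])
    x = np.random.random((4, 10)).astype("float32")
    y = model.predict(x)
    assert y.shape == (4, 10)
    assert np.all(np.isfinite(y))
```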