about the activation function #7
Comments
In other words, I am talking about the activation function in vgg_spiking.py
The activation function in vgg_spiking.py is a placeholder. The actual activation is integrate-and-fire (IF), implemented in the LinearSpike/STDB class.
If I change the activation function used by the ANN from ReLU to PReLU, and change AvgPool2d to MaxPool2d, what should I change in vgg_spiking.py accordingly?
If I apply batch normalization in the ANN, do I need to make corresponding changes in the SNN?
This ANN-SNN conversion method only works for ReLU, average pooling, and dropout. If you plan to include batch norm, you may need to define additional blocks for SNN. |
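One common workaround, rather than defining a new SNN block, is to fold the batch-norm parameters into the preceding layer's weights and bias before conversion, so the SNN never sees a BN layer at all. The sketch below shows the per-channel arithmetic for a single scalar weight; the function name `fold_bn` is illustrative and not from this repo.

```python
import math

def fold_bn(w, b, gamma, beta, mean, var, eps=1e-5):
    """Fold y = gamma * (w*x + b - mean) / sqrt(var + eps) + beta
    into an equivalent affine layer y = w_f * x + b_f (per channel)."""
    scale = gamma / math.sqrt(var + eps)
    return w * scale, (b - mean) * scale + beta

# Verify the folded layer matches the conv+BN composition for one input.
w_f, b_f = fold_bn(w=2.0, b=0.5, gamma=1.5, beta=0.1, mean=0.4, var=0.04)
x = 3.0
bn_out = 1.5 * (2.0 * x + 0.5 - 0.4) / math.sqrt(0.04 + 1e-5) + 0.1
assert abs((w_f * x + b_f) - bn_out) < 1e-9
```

After folding, the network contains only layers the conversion already supports (affine layers followed by ReLU), at the cost of fixing the BN statistics at their inference-time values.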
Thank you a lot for your reply, I'll try to figure this out.
Is it possible to replace the ReLU activation function in snn.py with PReLU?