SINGA-140: Fixed bug in CollectAll() function #141
Conversation
Would you please change the commit message to follow this format "SINGA-xxx "?
I'll change the commit message.
Here are the instructions: http://singa.apache.org/docs/general-rnn.html
Dear sir,
Have you tried to run the example?
Original code (no changes):
`$ ./bin/singa-run.sh -conf examples/char-rnn/job.conf`
Changed code:
`$ ./bin/singa-run.sh -conf examples/char-rnn/job.conf`
@nudles This is the output before and after the changes were made.
Hi,
If you do not have a GPU (or CUDA), then comment out one line in job.conf.
Hey, thanks for the tip!
Changed the code.
In SINGA_HOME/src/worker.cc, in the `int Worker::CollectAll(int step, NeuralNet* net)` function, the unrolled layers (except the first one) should not collect parameters, because parameters are shared across the unrolled copies of a layer.
Previous:
`if (layer->partition_id() == id_)`
Current change:
`if (layer->partition_id() == id_ && layer->unroll_index() == 0)`
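For context, below is a minimal sketch of how the fixed function could look in src/worker.cc. The loop structure and the helper names (`net->layers()`, `layer->GetParams()`, `Collect()`) are assumptions for illustration, not taken from the actual file; only the guard condition comes from this patch.

```cpp
// Hypothetical sketch of the fixed CollectAll(); loop structure and helper
// names (net->layers(), layer->GetParams(), Collect()) are assumed here,
// only the guard condition is from the patch.
int Worker::CollectAll(int step, NeuralNet* net) {
  for (auto& layer : net->layers()) {
    // Collect only for layers owned by this worker partition, and only for
    // the first unrolled copy (unroll_index() == 0). Later unrolled copies
    // share the same Param objects, so collecting them again is redundant.
    if (layer->partition_id() == id_ && layer->unroll_index() == 0) {
      for (Param* param : layer->GetParams())
        Collect(step, param);
    }
  }
  return 1;
}
```

With this guard, each shared parameter is collected exactly once per step, no matter how many times its layer is unrolled.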
@kaiping