Question: So what about the neural network? #1
Comments
Hello,
@kimhc6028 I think the issue @ryanpeach brought up was that the paper is mainly concerned with the distillation aspect. This project currently trains a novel decision tree that uses a convolutional filter to make its decisions, but the labels it trains on come from the MNIST data itself rather than from the outputs of a neural network. Happy to submit a PR next week to rectify this and add some argument parsing as well.
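To make the point above concrete: the distillation step replaces the raw MNIST labels with the teacher network's temperature-softened outputs as training targets for the tree, typically blended with the true labels. A minimal NumPy sketch of that target construction and loss (the helper names `soft_targets` and `distillation_loss`, the `temperature`, and the `alpha` blend weight are illustrative, not part of this repo):

```python
import numpy as np

def soft_targets(logits, temperature=2.0):
    """Temperature-scaled softmax over the teacher NN's logits.
    Higher temperature -> softer, more uniform target distribution."""
    z = logits / temperature
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(student_probs, teacher_logits, hard_labels,
                      temperature=2.0, alpha=0.5):
    """Blend cross-entropy against the teacher's softened distribution
    with cross-entropy against the true one-hot labels."""
    t = soft_targets(teacher_logits, temperature)
    eps = 1e-12
    soft_ce = -np.mean(np.sum(t * np.log(student_probs + eps), axis=1))
    hard_ce = -np.mean(np.log(
        student_probs[np.arange(len(hard_labels)), hard_labels] + eps))
    return alpha * soft_ce + (1 - alpha) * hard_ce
```

The tree would then minimize `distillation_loss` over its leaf distributions instead of plain cross-entropy on the dataset labels; the exact temperature and blending schedule would need to match whatever the paper prescribes.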
If it's not too late, I would be interested in seeing such a PR.
Would be interested too.
@jnclayiii @benbetze if still relevant, see my recent implementation that includes distillation as well as visualization of learned parameters and follows the paper exactly in every detail: https://github.com/lmartak/distill-nn-tree |
So this seems to be an implementation of the tree as described in the paper. However, how do we then distill the knowledge from a neural network into the tree?
Thanks