
GRU functions #227

Closed
wants to merge 5 commits

Conversation

tengyifei
Contributor

1. Support linear_before_reset by implementing a different GRU with that option
2. Fix errors using bidirectional GRU

@tjingrant
Collaborator

@fumihwh will probably give a better review.
I have two general questions/comments:

1. Does the GRU from Keras https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/keras/layers/recurrent.py with the reset_after attr do what you want?

   The mapping between the ONNX attribute linear_before_reset and the Keras attribute would be (see the sketch after this list):

   g(Xt*(Wh^T) + (rt . Ht-1)*(Rh^T) + Rbh + Wbh)    # default, when linear_before_reset = 0 -> reset before matrix multiplication
   g(Xt*(Wh^T) + (rt . (Ht-1*(Rh^T) + Rbh)) + Wbh)  # when linear_before_reset != 0 -> reset after matrix multiplication

2. It's fine with me to have this implementation here in onnx-tensorflow. But do you feel like it would be better to move this under tensorflow?
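To make the two formulas concrete, here is a minimal NumPy sketch of the candidate-hidden-state computation in both modes (the names x_t, h_prev, r_t, W_h, R_h, W_bh, R_bh are illustrative, not taken from this PR; g is assumed to be the default tanh activation):

```python
import numpy as np

def gru_candidate(x_t, h_prev, r_t, W_h, R_h, W_bh, R_bh, linear_before_reset=0):
    """Candidate hidden state for one GRU step, following the two ONNX formulas above.

    linear_before_reset = 0: the reset gate multiplies h_prev *before* the
    recurrent matmul (the ONNX default).
    linear_before_reset != 0: the reset gate multiplies the already-projected
    h_prev @ R_h.T + R_bh, i.e. reset is applied *after* the matmul.
    """
    if not linear_before_reset:
        # g(Xt*(Wh^T) + (rt . Ht-1)*(Rh^T) + Rbh + Wbh)
        return np.tanh(x_t @ W_h.T + (r_t * h_prev) @ R_h.T + R_bh + W_bh)
    # g(Xt*(Wh^T) + (rt . (Ht-1*(Rh^T) + Rbh)) + Wbh)
    return np.tanh(x_t @ W_h.T + r_t * (h_prev @ R_h.T + R_bh) + W_bh)
```

Note that the two expressions are not algebraically equivalent in general, which is why a converter has to handle linear_before_reset explicitly rather than reuse one cell for both cases.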

@fumihwh
Collaborator

fumihwh commented Jul 20, 2018

@tengyifei
Thanks for the contribution.
I agree with @tjingrant; I would prefer moving this implementation into tensorflow.
In my opinion, since we are not a new framework but a wrapper around tensorflow, any enhancement should be added to tensorflow itself.
If ONNX has such a feature, there must be demand for it, so adding it to tensorflow is the better option.

@ibm-jenkins
Collaborator

Can one of the admins verify this patch?

@sinkpoint

I'm running into this issue trying to translate PyTorch -> Tensorflow. Since they have different GRU implementations, this is a roadblock.
It's unlikely at this point that Tensorflow will include this in its codebase, so I vote for merging it here.

@tjingrant
Collaborator

@sinkpoint I'm fine either way, but there is a legitimate concern around the maintenance of such a piece of code. Is @tengyifei still around to take this patch to its conclusion?

@fumihwh what do you say?

@tengyifei
Contributor Author

@tjingrant Sorry I was distracted by other things! I would love to land this if possible. Personally I don't expect the TensorFlow team to merge the GRU extension into their library. Is there another place for such "adaptors"?

@iiSeymour

iiSeymour commented Jun 27, 2019

This is also an issue for me, so it would be great to see it merged here. I've opened #452, which uses the cuDNN-compatible GRU already in TensorFlow.
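For reference, a minimal sketch of that kind of approach, assuming the TensorFlow 1.x tf.contrib.cudnn_rnn API (this illustrates the idea, not the actual code in #452):

```python
import tensorflow as tf

# The cuDNN-compatible GRU applies the reset gate after the recurrent matmul,
# which corresponds to ONNX linear_before_reset = 1.
inputs = tf.placeholder(tf.float32, [None, None, 32])  # [batch, time, features]
cell = tf.contrib.cudnn_rnn.CudnnCompatibleGRUCell(num_units=64)
outputs, state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
```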

iiSeymour added a commit to iiSeymour/onnx-tensorflow that referenced this pull request Jul 3, 2019
@CLAassistant

CLAassistant commented Jul 24, 2019

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution.
1 out of 2 committers have signed the CLA.

✅ tjingrant
❌ tengyifei

@chinhuang007
Collaborator

Closing due to inactivity. Please reopen or create a new PR if needed.
