
Multiplication following a int8 transposed convolutions isn't constant folded #57680

Closed
lgeiger opened this issue Sep 13, 2022 · 9 comments
Assignees
Labels
comp:lite TF Lite related issues stat:awaiting tensorflower Status - Awaiting response from tensorflower TF 2.11 Issues related to TF 2.11 TFLiteConverter For issues related to TFLite converter type:bug Bug

Comments

@lgeiger
Contributor

lgeiger commented Sep 13, 2022

When converting a model that uses int8 quantization-aware training, conversion of transposed convolutions followed by a scalar multiplication fails.

The converter isn't able to correctly constant fold the per-tensor fake-quantized weights and the scalar multiplication, which is a common pattern when transposed convolutions are followed by batch normalisation layers. This is a follow-up to #53766.
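For intuition, the fold the converter is expected to perform relies on the op being linear in its weights: a scalar multiply after a (transposed) convolution can be absorbed into the weights ahead of time. A minimal pure-Python sketch of this identity (illustrative only, not the converter's actual constant-folding pass):

```python
# s * conv_transpose(x, W) == conv_transpose(x, s * W), because the op is
# linear in W. This is the algebraic basis for folding a scalar multiply
# (e.g. a batch-norm scale) into the convolution weights.

def conv1d_transpose(x, w):
    """Minimal 1D transposed convolution, stride 1, no padding."""
    out = [0.0] * (len(x) + len(w) - 1)
    for i, xi in enumerate(x):
        for j, wj in enumerate(w):
            out[i + j] += xi * wj
    return out

x = [1.0, 2.0, -1.0]
w = [0.5, -0.25]
scale = 3.0  # hypothetical folded batch-norm multiplier

unfused = [scale * v for v in conv1d_transpose(x, w)]          # Mul left in graph
fused = conv1d_transpose(x, [scale * wi for wi in w])          # Mul folded into W

assert all(abs(a - b) < 1e-9 for a, b in zip(unfused, fused))
```

With float weights the converter performs exactly this fold; the bug reported here is that it fails to do so once the weights are per-tensor fake-quantized.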

1. System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macOS / Ubuntu
  • TensorFlow installation (pip package or built from source): pip
  • TensorFlow library (version, if pip package or github SHA, if built from source): 2.10.0 / 2.11.0-dev20220913

2. Code

A minimal reproduction of the issue is available in this notebook. Re-run the notebook to generate Netron visualisations of the conversion problem.

@lgeiger lgeiger added the TFLiteConverter For issues related to TFLite converter label Sep 13, 2022
@mohantym mohantym assigned mohantym and unassigned tilakrayal Sep 13, 2022
@mohantym mohantym added comp:lite TF Lite related issues TF 2.10 labels Sep 13, 2022
@mohantym
Contributor

Hi @lgeiger !
Thanks for bringing this up.
The issue reproduces in the nightly version as you pointed out, but not in the 2.10 stable release.
Could we stick with 2.10 (the stable version) for a while?
Thank you!

@mohantym mohantym added type:bug Bug stat:awaiting response Status - Awaiting response from author labels Sep 13, 2022
@lgeiger
Contributor Author

lgeiger commented Sep 13, 2022

@mohantym Thanks for the fast reply. However, I am able to reproduce this issue in 2.10; in fact, that is the version I am using locally. Check out this notebook.

With per-tensor fake quantization I am getting the following TFLite graph:
[Screenshot: TFLite graph with the scalar multiplication left unfused]
However, the multiplication should be fused into the transposed convolution, as is done when no weight quantization is used:
[Screenshot: TFLite graph with the multiplication fused into the transposed convolution]

I get the same result when re-running your linked notebook.

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Status - Awaiting response from author label Sep 13, 2022
@mohantym
Contributor

mohantym commented Sep 15, 2022

Hi @lgeiger !
Can you check again with tf-nightly? I deleted the runtime and re-ran the code in nightly (the issue seems to be fixed in nightly, though; I think it might be specific to macOS).

@sachinprasadhs ! Could you look into this issue? Attached 2.9, 2.10, and nightly runs for reference.

Thank you!

@mohantym mohantym assigned sachinprasadhs and unassigned mohantym Sep 15, 2022
@lgeiger
Contributor Author

lgeiger commented Sep 15, 2022

Can you check again with tf-nightly? I deleted the runtime and re-ran the code in nightly (the issue seems to be fixed in nightly, though; I think it might be specific to macOS).

I double-checked by re-running your Colab notebook and the issue persists even on nightly 2.11.0-dev20220915. The scalar multiplication following a transposed convolution with per-tensor quantized weights is still not fused correctly, which leaves dequantization and multiplication operations in the graph, so the transposed convolution is executed in float instead of int8.
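The fold remains valid even with per-tensor int8 weights, because dequantization is itself just a scalar scale that the multiplier could be absorbed into. A hedged pure-Python sketch of the arithmetic (illustrative only, not TFLite's internal representation):

```python
# Hypothetical illustration: with per-tensor int8 weights, a scalar multiply
# after Dequantize can be absorbed into the dequantization scale, leaving the
# int8 weight values untouched and the graph fully quantized.
q = [-128, -5, 0, 64, 127]   # int8 weight values
weight_scale = 0.02          # per-tensor dequantization scale
s = 3.0                      # scalar multiplier (e.g. from batch norm)

# Graph as currently emitted: Dequantize, then Mul in float.
dequant_then_mul = [s * (qi * weight_scale) for qi in q]

# Equivalent folded form: same int8 values, rescaled quantization parameter.
folded = [qi * (weight_scale * s) for qi in q]

assert all(abs(a - b) < 1e-12 for a, b in zip(dequant_then_mul, folded))
```

This is why the unfused Dequantize + Mul pair in the converted graph is pure overhead: rescaling the quantization parameter would give the same numerics without leaving float ops behind.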

@sachinprasadhs sachinprasadhs added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Oct 27, 2022
@pjpratik
Contributor

The issue exists in TF 2.11 and TF nightly 2.12.0-dev20230111 as well. Please find the gist here. Thank you.

@pjpratik pjpratik added TF 2.11 Issues related to TF 2.11 and removed TF 2.10 labels Jan 11, 2023
@pkgoogle

Hi,

Thank you for opening this issue. Since this issue has been open for a long time, the code/debug information for this issue may no longer be relevant to the current state of the code base.

The TFLite team is constantly improving the framework by fixing bugs and adding new features. We suggest you try the latest TensorFlow version with the latest compatible hardware configuration which could potentially resolve the issue. If you are still facing the issue, please create a new GitHub issue with your latest findings, with all the debugging information which could help us investigate.

Please follow the release notes to stay up to date with the latest developments which are happening in the TFLite space.

@pkgoogle pkgoogle added stat:awaiting response Status - Awaiting response from author and removed stat:awaiting tensorflower Status - Awaiting response from tensorflower labels Aug 28, 2023
@lgeiger
Contributor Author

lgeiger commented Aug 29, 2023

@pkgoogle Thanks for the response. I double-checked, and the issue still persists in the latest TF nightly. See this notebook.

@pkgoogle

Hi @lgeiger,

Thanks for raising this issue. Are you aware of AI-Edge-Torch? As we believe this issue is better supported by and more relevant to AI-Edge-Torch, we are moving your issue there. Please follow progress here.

Let us know if you have any questions. Thanks.

@pkgoogle pkgoogle closed this as not planned Nov 27, 2024
Development

No branches or pull requests

7 participants