quantized results description #39
Comments
@burui11087 Could you tell me under which conditions the quantized size will be bigger than the original one?
I am not a researcher in the field of model quantization. At first I thought the official description you posted was correct, but after a lot of experiments (a classification task), I found that the quantized results could go down as well as up (I can't post my experiment results here because of corporate data security). After discussing with my colleague, we think the quantization operation can be treated as reducing overfitting, like L1/L2 regularization, but this is just speculation. However, I think the official description should be verified again. Maybe I should open an issue in the TensorFlow GitHub repository.
I think that's my fault; after rereading https://tf.wiki/zh/deployment/lite.html#quantization, I find that what you are talking about is post-training quantization, which can cause accuracy loss. I use quantization-aware training in my daily work, which can make the accuracy better or worse.
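For reference, post-training quantization in TFLite is applied at conversion time, after training is finished. A minimal sketch, assuming a trained Keras model (the tiny model below is only a placeholder):

```python
import tensorflow as tf

# Placeholder model; in practice this would be a trained Keras model.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(784,)),
])

# Post-training quantization: the DEFAULT optimization flag tells the
# converter to quantize the weights during conversion.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_quant_model = converter.convert()

with open("model_quant.tflite", "wb") as f:
    f.write(tflite_quant_model)
```

Because the weights are quantized without any retraining, some accuracy loss is possible, which is what the handbook description refers to.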
Not the quantized size; I meant the accuracy.
@burui11087 Thank you for your careful review. I am a beginner in TFLite. If you find any issue in this chapter, or have any idea, please let me know.
@snowkylin Maybe we need to add quantization-aware training content later.
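As a possible starting point for that content, here is a minimal quantization-aware training sketch. It assumes the `tensorflow-model-optimization` package is installed; the base model and the commented-out `fit` call are placeholders:

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Placeholder base model; any Keras classification model works similarly.
base_model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Wrap the model so fake-quantization nodes are inserted; the weights
# then learn to compensate for quantization error during training.
qat_model = tfmot.quantization.keras.quantize_model(base_model)
qat_model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
# qat_model.fit(x_train, y_train, epochs=5)  # train as usual
```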
Thanks for your great work in advance!
I have some questions about the following description:
tensorflow-handbook/docs/zh/deployment/lite.html, line 446 at commit 3b4abc3
In my daily experiments, I find that the quantized results (accuracy) can go down as well as up when the task is classification. So I think it's better to modify this description (a sketch for measuring quantized accuracy follows below).
Thanks again.
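For anyone who wants to reproduce this kind of comparison, below is a hedged sketch of a helper that measures top-1 accuracy of a converted `.tflite` model, so the quantized and original models can be scored the same way. `tflite_accuracy`, `x_test`, and `y_test` are hypothetical names for illustration, not part of the handbook:

```python
import numpy as np
import tensorflow as tf

# Hypothetical helper: compute top-1 accuracy of a .tflite model on a
# labeled test set (x_test: input arrays, y_test: integer class labels).
def tflite_accuracy(tflite_path, x_test, y_test):
    interpreter = tf.lite.Interpreter(model_path=tflite_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    correct = 0
    for x, y in zip(x_test, y_test):
        # Add a batch dimension and cast to the tensor's expected dtype.
        interpreter.set_tensor(inp["index"],
                               np.expand_dims(x, 0).astype(inp["dtype"]))
        interpreter.invoke()
        pred = np.argmax(interpreter.get_tensor(out["index"]))
        correct += int(pred == y)
    return correct / len(y_test)
```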