
Oops, there are some operators not supported yet, including fake_channel_wise_quantize_dequantize_abs_max, fake_quantize_dequantize_moving_average_abs_max, #1105

Open
L-Inkink opened this issue May 22, 2023 · 7 comments

Comments

@L-Inkink

L-Inkink commented May 22, 2023

Problem description
I want to convert PaddleOCR's Multilingual_PP-OCRv3_det_slim_infer model to an ONNX model, but running the following command produces an error:

paddle2onnx --model_dir ./inference/ch_PP-OCRv3_det_infer \
--model_filename inference.pdmodel \
--params_filename inference.pdiparams \
--save_file ./inference/det_onnx/model.onnx \
--opset_version 10 \
--input_shape_dict="{'x':[-1,3,-1,-1]}" \
--enable_onnx_checker True

More information:

  • Inference engine used for deployment: Multilingual_PP-OCRv3_det_slim_infer
  • Why ONNX conversion is needed: to speed up OCR text recognition
  • Paddle2ONNX version: 1.0.6
  • Your contact information (Email/WeChat/Phone):

Error screenshot
[screenshot: Paddle2ONNX error listing the unsupported fake quantize operators]

Other information
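
The operators named in the error are fake-quantization ops that PaddleSlim quantization-aware training inserts into the *_slim_infer models; per the error message, Paddle2ONNX 1.0.6 does not support them yet. A minimal Python sketch (assuming Paddle 2.x is installed; the model path prefix below is illustrative) that lists the operator types in the inference program, so you can confirm the fake quantize ops are present:

import paddle

paddle.enable_static()
exe = paddle.static.Executor(paddle.CPUPlace())
# The path prefix resolves to inference.pdmodel / inference.pdiparams.
prog, feed_names, fetch_targets = paddle.static.load_inference_model(
    "./inference/Multilingual_PP-OCRv3_det_slim_infer/inference", exe)
op_types = sorted({op.type for op in prog.global_block().ops})
print(op_types)  # look for fake_*_quantize_dequantize_* entries

If the quantized model is not strictly required, converting the non-quantized ch_PP-OCRv3_det_infer model avoids these operators entirely.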

@WenmuZhou

I ran into the same problem.

@Nurfen

Nurfen commented Aug 29, 2023

Same error

@HIT-ShuWei

Same error. When will these operators be supported?

@xdd130

xdd130 commented Oct 16, 2024

Has this problem been resolved? I'm running into the same issue.

@hhhappiness

[screenshot: same unsupported-operator error]
I ran into the same problem. I had assumed the slim model would have fewer parameters and run faster at inference time.

@leduy-it

leduy-it commented Dec 6, 2024

Has this problem been resolved? I'm running into the same issue. If it has already been resolved, please share the solution with the community.


This issue is stale because it has been open for 30 days with no activity.

@github-actions github-actions bot added the stale label Feb 11, 2025