
Segmentation model trained with DCNv3_pytorch still contains unsupported ONNX nodes after export to ONNX #286

Open
ZJDATY opened this issue Apr 11, 2024 · 3 comments


ZJDATY commented Apr 11, 2024

I am on Windows 10 and trained the model successfully by following these issues:
#160
#201

I want to deploy with OpenVINO in C++, using ONNX as the intermediate format.
Following the export tutorial, I obtained end2end.onnx:
https://github.com/OpenGVLab/InternImage/tree/master/segmentation#export
When converting to the IR model, I ran into unsupported ONNX nodes. The error is as follows:
mo --input_model end2end.onnx


[ ERROR ]  -------------------------------------------------
[ ERROR ]  ----------------- INTERNAL ERROR ----------------
[ ERROR ]  Unexpected exception happened.
[ ERROR ]  Please contact Model Optimizer developers and forward the following information:
[ ERROR ]  Check 'error_message.empty()' failed at src\frontends\onnx\frontend\src\frontend.cpp:124:
OpenVINO does not support the following ONNX operations: mmdeploy.TRTDCNv3 (the same custom op listed once for every DCNv3 layer in the model)

I then found this issue:
#61

Next, following the reply in the issue below, I changed core_op='DCNv3_pytorch' (a sketch of the config change follows the issue link):
#41
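For reference, a minimal sketch of where that switch lives, assuming a standard InternImage segmentation config; the file name and the other backbone fields are assumptions, only `core_op` is the point here:

```python
# Sketch of the relevant part of an InternImage segmentation config
# (surrounding fields omitted; only core_op is changed):
model = dict(
    backbone=dict(
        type='InternImage',
        core_op='DCNv3_pytorch',  # was 'DCNv3'; the pure-PyTorch path avoids the
                                  # compiled CUDA op and the mmdeploy.TRTDCNv3 custom ONNX node
        # ... remaining backbone / decode_head settings unchanged ...
    ),
)
```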

The error is now as follows: the unsupported node changed from mmdeploy.TRTDCNv3 to mmdeploy.grid_sampler. Could you please advise how to run CPU inference without building the mmdeploy backend?


[ ERROR ]  -------------------------------------------------
[ ERROR ]  ----------------- INTERNAL ERROR ----------------
[ ERROR ]  Unexpected exception happened.
[ ERROR ]  Please contact Model Optimizer developers and forward the following information:
[ ERROR ]  Check 'error_message.empty()' failed at src\frontends\onnx\frontend\src\frontend.cpp:124:
OpenVINO does not support the following ONNX operations: mmdeploy.grid_sampler (the same custom op listed once for every DCNv3 layer in the model)
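On the CPU-inference question above: once the export produces only standard ONNX ops (see the opset note in the reply below), the converted IR can be run directly with the OpenVINO runtime, with no mmdeploy backend involved. A minimal Python sketch (the C++ API follows the same pattern); the file name and input shape are assumptions:

```python
# Minimal sketch: run the converted IR on CPU with the OpenVINO runtime
# (no mmdeploy backend needed); file name and input shape are assumptions.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("end2end.xml")        # IR produced by mo
compiled = core.compile_model(model, "CPU")

dummy = np.random.rand(1, 3, 512, 512).astype(np.float32)  # assumed input shape
result = compiled([dummy])[compiled.output(0)]
print(result.shape)
```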
@tzhang2014

Setting the opset to 16 or above when exporting the ONNX model fixes this.
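For context, the ONNX standard gained a native GridSample op in opset 16, so exporting at opset ≥ 16 lets torch.nn.functional.grid_sample map to it instead of the mmdeploy.grid_sampler custom op. A minimal sketch of where the opset is set, assuming the export goes through an mmdeploy deploy config (the mmdeploy.* node domain suggests it does); the field values other than opset_version are typical defaults and are assumptions here:

```python
# Sketch of the onnx_config section of the mmdeploy deploy config used for export:
onnx_config = dict(
    type='onnx',
    export_params=True,
    keep_initializers_as_inputs=False,
    opset_version=16,        # default is 11; >=16 exports grid_sample as the
                             # standard ONNX GridSample op, which OpenVINO can map
    save_file='end2end.onnx',
    input_names=['input'],
    output_names=['output'],
)
```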


ZJDATY commented May 14, 2024

> Setting the opset to 16 or above when exporting the ONNX model fixes this.

Hi, did your test succeed? Could you share your pip list? I already have recent versions of onnx and onnxruntime installed, but when I change the default export opset_version from 11 to 16, it reports that the opset is unsupported.
[screenshot: the error reported when exporting with opset_version=16]
Could it be that my mmdeploy version is too old?
[screenshot: the installed mmdeploy version]

@tzhang2014

My onnx is 1.14.0 and onnxruntime is 1.16.1.
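If it helps to compare environments, a quick way to print the relevant package versions (the package list is just the ones discussed in this thread):

```python
# Print versions of the packages discussed in this thread
import importlib.metadata as md

for pkg in ("onnx", "onnxruntime", "mmdeploy", "torch"):
    try:
        print(pkg, md.version(pkg))
    except md.PackageNotFoundError:
        print(pkg, "not installed")
```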
