What should I do if OpenVINO 2019 R1 does not support the Slice, Resize, and AvgPool operators during model conversion? #24840
Hello @tanyao15, thank you for reaching out to the OpenVINO team! Unfortunately, the 2019 release is out of support and Model Optimizer is deprecated; convert_model and the OVC tool should be used instead. Could you please take a newer release of OpenVINO and try converting your model by simply running ovc <model.onnx>? Thank you!

But what if OpenVINO 2019 R1 is required? I need it for the matching FPGA development board.
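For reference, the suggested flow with a current release can be sketched roughly as below (the model file name is a placeholder, and the exact CLI options may vary between OpenVINO releases):

```shell
# Sketch: convert an ONNX model to IR with OVC from a recent OpenVINO release.
pip install openvino                # recent releases bundle the ovc CLI
ovc model.onnx --output_model model_ir   # emits model_ir.xml and model_ir.bin
```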
Then you can debug the issue on your own with our support. As you can see in the error message, there were suggestions regarding the specified input shape, among others. You can also take MO from a newer version, check whether it works, and try to port some changes back to the old MO by simply changing the .py files.
After switching to the newer MO, the model does convert, but when deploying it for inference it still isn't recognized and cannot be supported. I get the error: error reading network: cannot parse future versions: 10
@tanyao15 ok, great news! Now it's time to find the MO version closest to 2019 that is still able to produce an IR, so that the delta is minimal. Try going step by step back from 2019 to find the earliest MO that can convert. Please note that there is an option in MO
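The truncated note above most likely refers to an MO flag for emitting an older IR version, since the "cannot parse future versions: 10" error means the 2019 R1 inference engine received an IR newer than it understands. A rough sketch, with the caveat that the flag name and its availability are assumptions tied to particular 2020-era MO releases:

```shell
# Hypothetical sketch: ask a newer Model Optimizer to emit a deprecated,
# older IR version that the 2019 R1 inference engine may be able to parse.
# Flag name and supported IR version depend on the MO release you pick.
python mo.py --input_model model.onnx \
             --generate_deprecated_IR_V7 \
             --output_dir ir_v7/
```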
@tanyao15 any luck?
Closing for now. Feel free to reopen once you have some updates or further questions!
As shown in the figure above, when converting from ONNX to an IR model, there is no registered "infer" function for node "resize_185" with op = "Resize", and this operator cannot be executed during inference.