[Bug]: ONNX model inferencing does not work on Nvidia GPU, only CPU #24927

Closed
3 tasks done
haimat opened this issue Jun 10, 2024 · 1 comment
Labels
bug Something isn't working category: GPU OpenVINO GPU plugin support_request

Comments


haimat commented Jun 10, 2024

OpenVINO Version

2024.1.0

Operating System

Ubuntu 20.04 (LTS)

Device used for inference

GPU

Framework

ONNX

Model used

Anomalib Fastflow

Issue description

I have trained a Fastflow PyTorch model with openvinotoolkit/anomalib and then exported it to ONNX format. Now I want to use OpenVINO to run that ONNX model on an Nvidia GPU (RTX A6000). This does not work, however: the predictions generated that way are empty. The very same model works fine on the CPU.

Is this a bug, or does OpenVINO not support Nvidia GPUs?

Step-by-step reproduction

After training the model via anomalib and exporting it to ONNX format, I am using anomalib's inferencer to load and run the model. This basically involves the following code:

import openvino as ov

core = ov.Core()
onnx = core.read_model("/path/to/model.onnx")
model = core.compile_model(model=onnx, device_name="GPU")  # works with "CPU" here
result = model(image)  # image is the preprocessed input (numpy array)

With CPU, the last line model(image) produces a correct result like this:

{<ConstOutput: names[output] shape[?,1,256,256] type: f32>: array([[[[-0.14703088, -0.14703088, -0.15631244, ...

With GPU, the same call returns an empty result:

{<ConstOutput: names[output] shape[?,1,256,256] type: f32>: array([], shape=(0, 1, 256, 256), dtype=float32)}
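
One way to narrow this down is to list the devices the installed OpenVINO runtime actually exposes; OpenVINO's "GPU" device is served by the Intel GPU plugin, so an Nvidia card is not expected to show up there. A minimal diagnostic sketch (the model path above and the exact output vary per machine):

import openvino as ov

core = ov.Core()

# Devices the installed OpenVINO runtime can use, e.g. ['CPU'] or ['CPU', 'GPU'].
print(core.available_devices)

# Print the full device name for each entry to see what "GPU" actually maps to.
for device in core.available_devices:
    print(device, core.get_property(device, "FULL_DEVICE_NAME"))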

Relevant log output

No response

Issue submission checklist

  • I'm reporting an issue. It's not a question.
  • I checked the problem with the documentation, FAQ, open issues, Stack Overflow, etc., and have not found a solution.
  • There is reproducer code and related data files such as images, videos, models, etc.
@haimat haimat added bug Something isn't working support_request labels Jun 10, 2024
@ilya-lavrenov ilya-lavrenov added the category: GPU OpenVINO GPU plugin label Jun 10, 2024

haimat commented Jun 17, 2024

It seems the problem is related to Nvidia GPUs not being fully supported in OpenVINO.
After exporting the model to ONNX / TensorRT format, everything now works fine.
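
For reference, a minimal sketch of running the exported ONNX model on an Nvidia GPU via ONNX Runtime's TensorRT/CUDA execution providers (not necessarily the exact setup used here; the input name and the placeholder image are assumptions):

import numpy as np
import onnxruntime as ort

# Requires the onnxruntime-gpu package. Providers are tried in order,
# falling back to CUDA and finally CPU if TensorRT is unavailable.
session = ort.InferenceSession(
    "/path/to/model.onnx",
    providers=["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"],
)

input_name = session.get_inputs()[0].name               # actual name depends on the export
image = np.zeros((1, 3, 256, 256), dtype=np.float32)    # placeholder for a preprocessed image
outputs = session.run(None, {input_name: image})
print(outputs[0].shape)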

@haimat haimat closed this as completed Jun 17, 2024