Unable to get working OpenVINO/ONNX GPU acceleration on Ubuntu 24.04, Intel i7 1355U #24797
Comments
Hi @ilya-lavrenov, for installing ONNX Runtime with OpenVINO, rather than using pip install openvino, please follow these steps:
Running the Docker container: please use the following command to run the container:
This command ensures that the container has the same group permissions as the host for the Direct Rendering Manager (DRM) devices, allowing you to perform inference using GPU devices. We have tested this setup on Ubuntu 20.04, and I believe it should work on Ubuntu 24.04 as well.
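The command itself was not captured in this page. A minimal sketch of a docker run invocation that matches the description above (image name is an assumption; the group-matching technique is the point):

```shell
# Hypothetical reconstruction -- the exact command from the comment was not
# captured. Pass the DRM device nodes through and add the host's render
# group so the container user can access the GPU.
docker run -it --rm \
  --device /dev/dri \
  --group-add "$(stat -c '%g' /dev/dri/renderD128)" \
  openvino/ubuntu24_runtime:latest
```

The `--group-add "$(stat -c '%g' ...)"` part is what "same group permissions as the host" refers to: it resolves the numeric GID that owns the render node on the host and grants it inside the container.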
I tried running the more elaborate Docker command, but it does not seem to work:
I also tried to add the group after the container had been created:
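The commands used for this step were not captured either. A hedged sketch of what adding the group after the fact could look like (the container name "openvino" and in-container user name are assumptions):

```shell
# Hypothetical sketch: create the host's render group inside an
# already-running container named "openvino" and add the container
# user to it. Names are assumptions, not from the original report.
RENDER_GID="$(stat -c '%g' /dev/dri/renderD128)"
docker exec -u root openvino groupadd -g "$RENDER_GID" render
docker exec -u root openvino usermod -aG render openvino
```

Note that group membership changes only apply to new processes, so any already-running shell inside the container would still lack access.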
But it still does not work:
Hi, can you share the current GPU driver version installed in your Ubuntu 24.04 OS, and the link you used to install the driver?
I'm having the same issue.
Sure. This is stock Ubuntu 24.04 LTS; I have not installed anything additional, so I don't think I have more details than the kernel version and what the driver reports in dmesg. It does not look like an out-of-tree driver needs to be installed to make it work, as OpenVINO does find the GPU on the host.
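For reference, the kernel and driver details mentioned above can be gathered with a few standard commands (nothing OpenVINO-specific; the stock kernel's i915 driver is assumed for this Intel iGPU):

```shell
# Report the kernel/driver details referenced above.
uname -r                    # kernel version
sudo dmesg | grep -i i915   # i915 driver messages for the Intel iGPU
ls -l /dev/dri              # DRM device nodes and their owning group
```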
@khhsenti I just set up my own Docker image from the base 24.04 image, added their installs, and got GPU acceleration working successfully.
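The Dockerfile itself was not shared. A minimal sketch under the assumption that the Ubuntu 24.04 archive's own Intel compute-runtime packages are what "their installs" refers to:

```dockerfile
# Hypothetical sketch of a base-24.04 image with Intel GPU compute support.
# Package selection is an assumption based on Ubuntu 24.04's own archive.
FROM ubuntu:24.04
RUN apt-get update && apt-get install -y --no-install-recommends \
        intel-opencl-icd ocl-icd-libopencl1 clinfo \
        python3-pip python3-venv \
    && rm -rf /var/lib/apt/lists/*
```

The container would still need to be started with the DRM devices and render group passed through, as described earlier in the thread.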
OpenVINO Version
onnxruntime-openvino 1.17.1
Operating System
Other (Please specify in description)
Hardware Architecture
x86 (64 bits)
Target Platform
DT Research tablet DT302-RP with Intel i7 1355U , running Ubuntu 24.04 LTS
Build issue description
We are trying to evaluate the possible performance improvement when running AI inference through the OpenVINO framework. Our AI models are ONNX-based, and as such the integration between OpenVINO and ONNX needs to work.
We have tried using the provided demonstration Docker container, but it is unable to find the GPU.
Installing the OpenVINO framework directly on the host OS does provide the GPU as an option, but the Python version on Ubuntu 24.04 is too new for a pip package of the OpenVINO ONNX integration to be available. We then built the ONNX integration manually, which worked, but the OpenVINO execution provider does not load.
Build script or step-by-step to reproduce
Run the provided docker container:
Install OpenVINO manually on host OS:
Follow: https://docs.openvino.ai/2022.3/openvino_docs_install_guides_configurations_for_intel_gpu.html
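The linked guide targets older Ubuntu releases; a hedged sketch of the equivalent steps on 24.04, using the distribution's own packages rather than the guide's repositories:

```shell
# Sketch of the GPU configuration step, adapted to Ubuntu 24.04's archive
# (the linked 2022.3 guide predates this release; package choice is an
# assumption). Installs the OpenCL compute runtime and grants GPU access.
sudo apt-get update
sudo apt-get install -y intel-opencl-icd clinfo
sudo usermod -aG render "$USER"   # log out and back in to take effect
clinfo | grep -i 'device name'    # should list the Intel iGPU
```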
pip install openvino (newest version)
root@sentispec-DT302RP-dsv-1:~# source openvino_env/bin/activate
(openvino_env) root@sentispec-DT302RP-dsv-1:~# python3 -c "from openvino import Core; print(Core().available_devices)"
['CPU', 'GPU']
Build the ONNX Runtime OpenVINO wheel and install it manually:
https://onnxruntime.ai/docs/build/eps.html#linux-4
pip install /home/sentispec-admin/onnxruntime_openvino-1.19.0-cp312-cp312-linux_x86_64.whl
Try to use it:
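The usage snippet was not captured here. A minimal sketch of exercising the execution provider, which would surface the loading failure described above (the model path is hypothetical):

```python
# Minimal sketch of "trying to use it" -- model path is hypothetical.
import onnxruntime as ort

# If the wheel installed correctly, this list should include
# "OpenVINOExecutionProvider".
print(ort.get_available_providers())

session = ort.InferenceSession(
    "model.onnx",
    providers=["OpenVINOExecutionProvider"],
    provider_options=[{"device_type": "GPU"}],
)
# Shows which providers actually loaded; a silent fallback to
# CPUExecutionProvider would indicate the reported problem.
print(session.get_providers())
```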
I am happy to provide any further details; I decided to keep this as short as possible for now. I am aware Ubuntu 24.04 is not currently officially supported, but I was under the impression that support is planned, e.g. per #23905.
Relevant log output
No response
Issue submission checklist