
[Training] Support for RKNPU Execution Provider on RK3562 Platform and On-Device Training Capabilities #21060

Open
Leo5050xvjf opened this issue Jun 16, 2024 · 1 comment
Labels
ep:RockchipNPU (issues related to Rockchip execution provider), training (issues related to ONNX Runtime training; typically submitted using template)

Comments

@Leo5050xvjf

Describe the issue

Hi,
I have been reading the documentation for ONNX Runtime and came across the RKNPU Execution Provider (EP). I would like to clarify my understanding:

1. It appears that the RKNPU EP currently only supports the RK1808 Linux platform. Does this mean I cannot use the RKNPU EP on my RK3562 platform?
2. If the answer to the first question is yes, can I still perform ONNX model inference using the CPU on the RK3562?
To further clarify my requirements:

- I intend to use ONNX Runtime with the RKNPU EP for model inference on the RK3562 platform.
- I also plan to use ONNX Runtime's On-Device Training feature to train models on the RK3562, and would like to run inference with the trained models through the RKNPU EP.
Does the current version of ONNX Runtime support these requirements? Thank you for your clarification.
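
For reference, this is roughly the inference path I am targeting: a minimal sketch that assumes a source build with the RKNPU EP enabled (`--use_rknpu`) and uses a placeholder model path.

```python
import onnxruntime as ort

# Check which EPs this build actually exposes; "RknpuExecutionProvider" is
# only present in builds configured with --use_rknpu.
available = ort.get_available_providers()

providers = ["CPUExecutionProvider"]
if "RknpuExecutionProvider" in available:
    # Prefer the NPU; nodes it cannot handle fall back to the CPU EP.
    providers.insert(0, "RknpuExecutionProvider")

# "model.onnx" is a placeholder path for the exported inference model.
session = ort.InferenceSession("model.onnx", providers=providers)
print("Active providers:", session.get_providers())
```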

To reproduce

No

Urgency

No response

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.18

PyTorch Version

1.7

Execution Provider

Other / Unknown

Execution Provider Library Version

RKNN EP

Leo5050xvjf added the training label on Jun 16, 2024
prathikr added the ep:RockchipNPU label on Jun 18, 2024
@prathikr (Contributor) commented Jun 18, 2024

It looks like RKNPU is a community-contributed EP. @jslhcl @pranavsharma, as recent contributors in this space, do you have any guidance on what level of RKNPU inference support we offer?

@Leo5050xvjf as for training, on-device training currently supports only the CUDA and CPU EPs.
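
For reference, a CPU-only on-device training step with the onnxruntime-training Python API looks roughly like the sketch below. The artifact file names are placeholders (they would be generated ahead of time with `onnxruntime.training.artifacts.generate_artifacts`), and the dummy batch shapes are just an assumption for an MNIST-style model.

```python
import numpy as np
from onnxruntime.training.api import CheckpointState, Module, Optimizer

# Placeholder artifact paths; these are generated offline with
# onnxruntime.training.artifacts.generate_artifacts().
state = CheckpointState.load_checkpoint("checkpoint")
module = Module("training_model.onnx", state, "eval_model.onnx", device="cpu")
optimizer = Optimizer("optimizer_model.onnx", module)

# Dummy batch with assumed shapes; real code would iterate over on-device data.
inputs = np.random.rand(8, 784).astype(np.float32)
labels = np.random.randint(0, 10, size=(8,)).astype(np.int64)

module.train()
loss = module(inputs, labels)   # one training step: forward + backward
optimizer.step()                # apply the gradients
module.lazy_reset_grad()        # reset gradients before the next step

# The exported inference model can then be run with a plain InferenceSession
# (e.g. on the CPU EP, since the RKNPU EP is not used for training).
module.export_model_for_inferencing("inference_model.onnx", ["output"])
```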
