I was following the ONNX import example in Burn (examples/onnx-inference), but instead of using mnist.py to generate the ONNX model, I am using the `transformers` Python package to generate the ONNX model, and I am getting this error.
Looking at the panicking statement in question, this seems to be an issue with the Unsqueeze op.
Right now, most of the ONNX ops are parsed during import expecting some of the shapes to be available in the protobuf metadata, but in practice that is not always the case (as seen here).
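To illustrate why the missing shape metadata matters: Unsqueeze inserts size-1 dimensions at the positions given by `axes`, so the importer can only compute the output shape at import time if the input shape is actually recorded in the graph. The following is a minimal stdlib sketch of the Unsqueeze output-shape rule (opset 13 semantics) — it is not Burn's implementation, and the function name is made up for illustration:

```python
def unsqueeze_shape(input_shape, axes):
    """Compute the output shape of ONNX Unsqueeze (opset 13 semantics).

    Each value in `axes` names a position in the OUTPUT shape where a
    dimension of size 1 is inserted; negative axes count from the end
    of the output shape.
    """
    out_rank = len(input_shape) + len(axes)
    # Normalize negative axes against the output rank, then sort.
    norm = sorted(a if a >= 0 else a + out_rank for a in axes)
    if len(set(norm)) != len(norm) or not all(0 <= a < out_rank for a in norm):
        raise ValueError(f"invalid axes {axes} for rank-{len(input_shape)} input")
    dims = iter(input_shape)
    # Emit a 1 at every unsqueezed position, original dims elsewhere.
    return [1 if i in norm else next(dims) for i in range(out_rank)]

print(unsqueeze_shape([3, 4], [0]))   # → [1, 3, 4]
print(unsqueeze_shape([3, 4], [-1]))  # → [3, 4, 1]
```

If `input_shape` is unknown (as with many transformers exports, which use dynamic axes for batch and sequence length), this computation cannot run, which matches the panic seen during import.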
You can find the ONNX model file at https://github.com/7kanak/rust_ml/blob/8ac45531fcb6b789ed685890c94011856671cd80/bert-ft/src/model/albert.onnx