What happened?
There is a runtime error when loading an ONNX model with OnnxModelHandlerNumpy when using ONNX's latest version:
```python
from apache_beam.ml.inference.onnx_inference import OnnxModelHandlerNumpy

model_handler = OnnxModelHandlerNumpy(
    "/tmp/model.onnx"  # Also crashes with a remote model URI gs://...
)
```
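For context, the crash only surfaces once the handler's `load_model()` actually runs inside a pipeline, as the `'RunInference/BeamML_RunInference'` step name in the log suggests. A minimal end-to-end reproduction sketch (the input shape and dtype are placeholders, not taken from the real model):

```python
import numpy as np
import apache_beam as beam
from apache_beam.ml.inference.base import RunInference
from apache_beam.ml.inference.onnx_inference import OnnxModelHandlerNumpy

model_handler = OnnxModelHandlerNumpy("/tmp/model.onnx")

# DirectRunner is the default runner; load_model() is invoked when the
# RunInference step starts processing, which is where the error is raised.
with beam.Pipeline() as p:
    _ = (
        p
        | beam.Create([np.zeros((1, 3), dtype=np.float32)])  # placeholder input
        | RunInference(model_handler)
    )
```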
Here is the error, which happens using the DirectRunner:
```
/3.10.14/lib/python3.10/site-packages/apache_beam/ml/inference/onnx_inference.py", line 119, in load_model
    model_proto_bytes = onnx._serialize(model_proto)
AttributeError: module 'onnx' has no attribute '_serialize' [while running 'RunInference/BeamML_RunInference']
```
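For what it's worth, `onnx._serialize` was a private helper, and `ModelProto` is a regular protobuf message, so its public serialization API still works in recent onnx releases. A minimal sketch of what the failing line appears to need, assuming the goal at that point is just the raw model bytes:

```python
import onnx

model_proto = onnx.load("/tmp/model.onnx")

# onnx._serialize(model_proto) no longer exists in recent onnx releases;
# ModelProto is a protobuf message, so plain byte serialization can use
# the public protobuf API instead:
model_proto_bytes = model_proto.SerializeToString()
```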
Issue Priority
Priority: 1 (data loss / total loss of function)
Issue Components