Describe the issue
After upgrading from onnxruntime 1.22 to 1.23.0, loading a previously working ONNX model throws a shape inference error related to external data.
This issue does not occur in version 1.22.x; the same code and model run successfully there.
[ONNXRuntimeError] : 1 : FAIL : Node (/decoder/Split_33) Op (Split) [ShapeInferenceError] Cannot parse data from external tensors. Please load external data into raw data for tensor: /decoder/Constant_1066_output_0
Related: danielgatis/rembg#790
To reproduce
import onnxruntime as ort

model_path = "BiRefNet-COD-epoch_125.onnx"
session = ort.InferenceSession(
    model_path,
    providers=["CPUExecutionProvider"],
)
Model file: BiRefNet-COD-epoch_125.onnx
Urgency
No response
Platform
Linux
OS Version
Linux Mint 22 Cinnamon
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.23.0
ONNX Runtime API
Python
Architecture
X64
Execution Provider
CPU
Execution Provider Library Version
No response