Running ONNX Models in C# kernel #3310
Unanswered
wpalaceJHA asked this question in Q&A
Replies: 0 comments
Has anyone been successful in getting an ONNX model to run in a C# kernel in a polyglot notebook? The C# code below runs fine from a command-line application, but throws an error in .NET Interactive.
/// in code cell 1
#r "nuget:Microsoft.ML"
#r "nuget:Microsoft.ML.OnnxRuntime"
/// in C# code cell 2
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;
// Create input data
float[] X = new float[] {1.0f, 2.0f, 3.0f, 4.0f}; // Your input data here
var inputTensor = new DenseTensor<float>(X, new int[] {4, 1}); // Your input shape here
// Load a model
string model64 = "CAgSCHNrbDJvbm54GgQxLjEzIgdhaS5vbm54KAAyADrOAQpzCgtmbG9hdF9pbnB1dBIIdmFyaWFibGUaD0xpbmVhclJlZ3Jlc3NvciIPTGluZWFyUmVncmVzc29yKhYKDGNvZWZmaWNpZW50cz0AAABAoAEGKhQKCmludGVyY2VwdHM9AACAP6ABBjoKYWkub25ueC5tbBIgYWRmOTI3NDM1YmY1NDg2NGIzMThlNTk3NmE1OWQ3MzBaGwoLZmxvYXRfaW5wdXQSDAoKCAESBgoACgIIAWIYCgh2YXJpYWJsZRIMCgoIARIGCgAKAggBQg4KCmFpLm9ubngubWwQAUIECgAQEQ==";
byte[] modelBytes = Convert.FromBase64String(model64);
var session = new InferenceSession(modelBytes);
// Get input and output names
string input_name = session.InputMetadata.Keys.First();
string output_name = session.OutputMetadata.Keys.First();
// Run inference and get output
var pred = session.Run(new List<NamedOnnxValue> { NamedOnnxValue.CreateFromTensor(input_name, inputTensor) })[0].AsTensor<float>();
// view results
pred
/// error message
Error: System.TypeInitializationException: The type initializer for 'Microsoft.ML.OnnxRuntime.NativeMethods' threw an exception.
---> System.EntryPointNotFoundException: Unable to find an entry point named 'OrtGetApiBase' in DLL 'onnxruntime'.
at Microsoft.ML.OnnxRuntime.NativeMethods.OrtGetApiBase()
at Microsoft.ML.OnnxRuntime.NativeMethods..cctor() in D:\a_work\1\s\csharp\src\Microsoft.ML.OnnxRuntime\NativeMethods.shared.cs:line 313
--- End of inner exception stack trace ---
at Microsoft.ML.OnnxRuntime.SessionOptions..ctor() in D:\a_work\1\s\csharp\src\Microsoft.ML.OnnxRuntime\SessionOptions.shared.cs:line 53
at Microsoft.ML.OnnxRuntime.InferenceSession..ctor(Byte[] model) in D:\a_work\1\s\csharp\src\Microsoft.ML.OnnxRuntime\InferenceSession.shared.cs:line 130
at Submission#3.<>d__0.MoveNext()
--- End of stack trace from previous location ---
at Microsoft.CodeAnalysis.Scripting.ScriptExecutionState.RunSubmissionsAsync[TResult](ImmutableArray`1 precedingExecutors, Func`2 currentExecutor, StrongBox`1 exceptionHolderOpt, Func`2 catchExceptionOpt, CancellationToken cancellationToken)
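The `EntryPointNotFoundException` for `OrtGetApiBase` usually means the managed wrapper bound to a native onnxruntime binary that does not export that symbol (for example, a stale or mismatched copy resolving ahead of the one restored by NuGet). A diagnostic sketch for seeing what the interactive kernel process actually loads is below; it assumes a .NET Core 3.0+ runtime where `System.Runtime.InteropServices.NativeLibrary` is available, and is a probe, not a fix:

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

// Probe whether a native "onnxruntime" library resolves from this process at all.
// NativeLibrary.TryLoad probes the usual OS search paths (e.g. PATH on Windows).
bool resolved = NativeLibrary.TryLoad("onnxruntime", out IntPtr handle);
Console.WriteLine($"onnxruntime resolved: {resolved}");

// List every onnxruntime binary already mapped into the kernel process;
// an older copy loaded first would explain the missing OrtGetApiBase entry point.
foreach (ProcessModule m in Process.GetCurrentProcess().Modules)
{
    if (m.ModuleName.Contains("onnxruntime", StringComparison.OrdinalIgnoreCase))
        Console.WriteLine($"{m.ModuleName} -> {m.FileName}");
}
```

If this lists an onnxruntime binary from somewhere other than the restored NuGet package's `runtimes/<rid>/native` folder, that path is the first thing to investigate.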