Windows Machine Learning (ML) includes a shared copy of the ONNX Runtime, including its APIs. That means when you install Windows ML via the Windows App SDK, your app has access to the full ONNX Runtime API surface.
To see which version of the ONNX Runtime is included in a given Windows ML version, see ONNX Runtime versions shipped in Windows ML.
This page covers how to use the ONNX APIs included in Windows ML.
Prerequisites
- Follow all the steps in Get started with Windows ML.
Namespaces / headers
The namespaces and headers for the ONNX APIs within Windows ML are as follows. In C#, the namespaces of the ONNX APIs are the same as when using the ONNX Runtime directly:
using Microsoft.ML.OnnxRuntime;
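If you construct tensor inputs directly in C#, the tensor types live in a companion namespace. The using directives below are a minimal sketch; only Microsoft.ML.OnnxRuntime itself is required in every case:
using Microsoft.ML.OnnxRuntime;          // InferenceSession, SessionOptions, NamedOnnxValue
using Microsoft.ML.OnnxRuntime.Tensors;  // DenseTensor<T> and other tensor helpers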
APIs
The ONNX APIs are the same as when using ONNX Runtime directly. For example, to create an inference session:
// Create inference session using compiled model
using InferenceSession session = new(compiledModelPath, sessionOptions);
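The compiledModelPath and sessionOptions values in the snippet above come from earlier setup steps. For orientation, here is a minimal end-to-end sketch using the standard ONNX Runtime C# API; the model path, input shape, and graph-optimization setting are illustrative placeholders rather than values from the Windows ML documentation:
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Configure session options; execution-provider registration and other
// settings would also go here.
using SessionOptions sessionOptions = new();
sessionOptions.GraphOptimizationLevel = GraphOptimizationLevel.ORT_ENABLE_ALL;

// Create the inference session from a model file on disk.
// "model.onnx" is a placeholder path.
using InferenceSession session = new(@"model.onnx", sessionOptions);

// Build an input tensor. The shape here is a placeholder; read the real input
// name and shape from session.InputMetadata for your own model.
string inputName = session.InputMetadata.Keys.First();
var inputTensor = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
var inputs = new List<NamedOnnxValue>
{
    NamedOnnxValue.CreateFromTensor(inputName, inputTensor)
};

// Run inference and read the first output back as a flat float array.
using var results = session.Run(inputs);
float[] output = results.First().AsEnumerable<float>().ToArray();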
We suggest reading the ONNX Runtime docs for more info about how to use the ONNX Runtime APIs within Windows ML.