This folder contains several JavaScript examples. Unless explicitly noted otherwise, the examples are available for all of the NPM packages described below:
- onnxruntime-node: Node.js binding for ONNX Runtime. Can be used in Node.js applications and Node.js-compatible environments (e.g. Electron.js).
- onnxruntime-web: ONNX Runtime for browsers.
- onnxruntime-react-native: ONNX Runtime for React Native applications on Android and iOS.
See each example's README for more details:
- Quick Start - Nodejs Binding - a demonstration of basic usage of ONNX Runtime Node.js binding (see the sketch after this list).
- Quick Start - Nodejs Binding Bundle - a demonstration of basic usage of ONNX Runtime Node.js binding using a bundler.
- Quick Start - Web (using script tag) - a demonstration of basic usage of ONNX Runtime Web using a script tag (see the import sketch after this list).
- Quick Start - Web (using bundler) - a demonstration of basic usage of ONNX Runtime Web using a bundler.
- Importing - Nodejs Binding - a demonstration of how to import ONNX Runtime Node.js binding.
- Importing - Web - a demonstration of how to import ONNX Runtime Web.
- Importing - React Native - a demonstration of how to import ONNX Runtime React Native.
- API usage - Tensor - a demonstration of basic usage of `Tensor` (see the sketch after this list).
- API usage - Tensor <--> Image conversion - a demonstration of conversions between image elements and `Tensor`.
- API usage - InferenceSession - a demonstration of basic usage of `InferenceSession` (see the sketch after this list).
- API usage - SessionOptions - a demonstration of how to configure creation of an `InferenceSession` instance.
- API usage - `ort.env` flags - a demonstration of how to configure a set of global flags (see the sketch after this list).
- OpenAI Whisper - demonstrates how to run whisper tiny.en in your browser using onnxruntime-web and the browser's audio interfaces.
- Facebook Segment-Anything - demonstrates how to run segment-anything in your browser using onnxruntime-web with webgpu.
- Stable Diffusion Turbo - demonstrates how to run Stable Diffusion Turbo in your browser using onnxruntime-web with webgpu.
- Phi-3-mini-4k-instruct - demonstrates how to run Phi-3-mini-4k-instruct in your browser using onnxruntime-web with webgpu.
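
The following is a minimal sketch of what the Node.js binding quick start looks like. It assumes a small model at `./model.onnx` with inputs `a`, `b` and output `c`; the file name, input/output names, and shapes are placeholders for your own model, not required values.

```js
// Minimal sketch: load a model and run one inference with onnxruntime-node.
const ort = require('onnxruntime-node');

async function main() {
  // Create an inference session from an ONNX model file on disk.
  const session = await ort.InferenceSession.create('./model.onnx');

  // Prepare inputs as a map from input name to Tensor (names/shapes are placeholders).
  const dataA = Float32Array.from([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]);
  const dataB = Float32Array.from([10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 110, 120]);
  const feeds = {
    a: new ort.Tensor('float32', dataA, [3, 4]),
    b: new ort.Tensor('float32', dataB, [4, 3]),
  };

  // Run inference and read the output tensor's data.
  const results = await session.run(feeds);
  console.log(results.c.data);
}

main().catch(console.error);
```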
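
The same `ort` API surface is obtained differently depending on the package and environment. A sketch of the import options, covering the quick-start and importing examples above (the script-tag note assumes you serve `dist/ort.min.js` from the onnxruntime-web package yourself or via a CDN of your choice):

```js
// Node.js (CommonJS) with the Node.js binding:
const ort = require('onnxruntime-node');

// Web or React Native with a bundler (ES modules):
// import * as ort from 'onnxruntime-web';
// import * as ort from 'onnxruntime-react-native';

// Web with a plain <script> tag: include dist/ort.min.js from the
// onnxruntime-web package, and a global "ort" object is defined for you.
```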
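
A sketch of basic `Tensor` construction; the shapes and values are arbitrary. For the image conversion example, onnxruntime-web additionally provides helpers such as `Tensor.fromImage()`.

```js
const ort = require('onnxruntime-node'); // or 'onnxruntime-web'

// Construct a tensor from a typed array plus an explicit shape.
const t1 = new ort.Tensor('float32', new Float32Array([1, 2, 3, 4]), [2, 2]);

// The element type can also be inferred from the typed array.
const t2 = new ort.Tensor(new Uint8Array(16), [2, 2, 4]);

// Useful properties.
console.log(t1.dims); // [2, 2]
console.log(t1.type); // 'float32'
console.log(t1.data); // Float32Array [1, 2, 3, 4]
```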
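
A sketch of creating an `InferenceSession` with explicit session options. `./model.onnx` is a placeholder, and the option values shown are illustrative choices, not requirements.

```js
const ort = require('onnxruntime-node');

async function main() {
  const options = {
    // Which execution providers to try, in order of preference.
    executionProviders: ['cpu'],
    // Graph optimization level: 'disabled' | 'basic' | 'extended' | 'all'.
    graphOptimizationLevel: 'all',
    // Number of threads for intra-op parallelism (0 lets ONNX Runtime decide).
    intraOpNumThreads: 0,
  };

  const session = await ort.InferenceSession.create('./model.onnx', options);

  // Input and output names are available on the session once it is created.
  console.log(session.inputNames, session.outputNames);
}

main().catch(console.error);
```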
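
A sketch of setting global `ort.env` flags. The values are examples only; flags such as the WebAssembly ones apply to onnxruntime-web and should be set before the first `InferenceSession` is created.

```js
import * as ort from 'onnxruntime-web';

// Global logging level: 'verbose' | 'info' | 'warning' | 'error' | 'fatal'.
ort.env.logLevel = 'warning';

// WebAssembly backend flags (onnxruntime-web only).
ort.env.wasm.numThreads = 2; // size of the wasm thread pool
ort.env.wasm.simd = true;    // allow the SIMD build if supported
```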