Add TVM application extension with WASM runtime
Signed-off-by: leonwanghui <[email protected]>
Commit ae58ca1 (1 parent: 82d157f)
16 changed files with 926 additions and 0 deletions.
@@ -0,0 +1,3 @@
[build]
target = "wasm32-wasi"
rustflags = ["-C", "link-arg=--whole-archive", "-C", "link-arg=-lops_wasm32"]
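These `rustflags` force the linker to keep the entire `libops_wasm32.a` archive (produced by the build script below) in the final `.wasm`, so the TVM-generated operator symbols and their registration constructors are not dropped as unused. As a side note, newer Cargo releases (1.61+) can express the same whole-archive request from a build script; this is a hedged sketch of an alternative, not part of this commit, and whether it behaves identically to the cargo config flags above is untested:

```rust
// Alternative sketch (not part of this commit): request the same whole-archive
// static link via Cargo's link-lib modifier syntax (requires Cargo 1.61+).
fn main() {
    println!("cargo:rustc-link-search=native=lib");
    println!("cargo:rustc-link-lib=static:+whole-archive=ops_wasm32");
}
```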
@@ -0,0 +1,8 @@
# Built packages
**/lib/


#Added by cargo

**/target/
**/Cargo.lock
@@ -0,0 +1,30 @@
[package]
name = "wasm-dlbackend-tvm"
version = "0.1.0"
authors = ["leonwanghui <[email protected]>"]
edition = "2018"
description = "WebAssembly backend to deep learning framework using TVM runtime"
readme = "README.md"
repository = "https://github.com/apache/incubator-tvm"
license = "Apache-2.0"
keywords = ["wasm", "machine learning", "tvm"]

[package.metadata]
wasm-opt = true
wasm-name-section = false
wasm-producers-section = false

[profile.release]
lto = true
opt-level = 's'

[lib]
crate-type = ['cdylib']

[dependencies]
serde = "1.0.53"
serde_derive = "1.0.53"
serde_json = "1.0.53"
ndarray = "0.12"
tvm-common = { version = "0.1", path = "../../rust/common" }
tvm-runtime = { version = "0.1", path = "../../rust/runtime" }
@@ -0,0 +1,137 @@
# WebAssembly Backend for Deep Learning Framework with TVM Runtime

#### Experimental notice: This project is still *experimental* and only serves as a proof of concept for running deep learning frameworks (such as [MindSpore](https://github.com/mindspore-ai/mindspore)) on a [WebAssembly runtime](https://github.com/bytecodealliance/wasmtime) with the [TVM stack](https://tvm.apache.org/).

- [WebAssembly Backend for Deep Learning Framework with TVM Runtime](#webassembly-backend-for-deep-learning-framework-with-tvm-runtime)
    - [Motivation](#motivation)
    - [Framework Landscape](#framework-landscape)
    - [Project Status](#project-status)
    - [PoC Guidelines](#poc-guidelines)
        - [Pre-installation](#pre-installation)
        - [Build wasm-dlbackend-tvm package](#build-wasm-dlbackend-tvm-package)
        - [Test](#test)
    - [Future Work](#future-work)
        - [Operator enhancement](#operator-enhancement)
        - [Performance benchmark](#performance-benchmark)
        - [Native TVM Rust runtime support](#native-tvm-rust-runtime-support)
    - [Appendix](#appendix)
        - [System packages install](#system-packages-install)
    - [Contribution](#contribution)

## Motivation

<img src="https://github.com/dmlc/web-data/raw/master/tvm/tutorial/tvm_support_list.png" alt="TVM hardware support" width="600"/>

As demonstrated in the TVM runtime [tutorials](https://tvm.apache.org/docs/tutorials/relay_quick_start.html), TVM already supports WASM as an optional hardware backend, so we can leverage the features of WebAssembly (portability, security) and of the TVM runtime (domain-specific optimization) to build a flexible, auto-optimized operator backend for all deep learning frameworks.

## Framework Landscape

The figure below shows the overall landscape of running a deep learning framework on a WASM runtime with the TVM compiler stack.
```
         _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
        |                                |       _ _ _ _ _ _ _ _ _ _ _
        |  Framework Frontend Expression |      |                     |
        |_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ |      | TVM (TE) Python API |
                        ||                      |_ _ _ _ _ _ _ _ _ _ _|
                        \/                                 ||
         _ _ _ _ _ _ _ _ _ _ _ _ _ _                       \/
        |                            |            _ _ _ _ _ _ _ _ _ _ _
        |   Framework WASM Backend   |           |                     |
        |       (WASM runtime)       |           | TVM Compiler Stack  |
        |_ _ _ _ _ _ _ _ _ _ _ _ _ _ |           |_ _ _ _ _ _ _ _ _ _ _|
                        ||                                 ||
                        \/                                 \/
         _ _ _ _ _ _ _ _          _ _ _ _ _ _ _ _ _             _ _ _ _ _ _ _
        |                |       |                  |  llvm-ar  |             |
        |  TVM Runtime   | <---  | libops_wasm32.a  | <-------  | add.o sub.o |
        |_ _ _ _ _ _ _ _ |       |_ _ _ _ _ _ _ _ _ |           |_ _ _ _ _ _ _|
```

## Project Status

This project should be considered **experimental**; it is at a very early stage and its richer features are still under active development. Here is the current operator support matrix:

| Operator Name | FP32 | INT32 | INT8 |
| ------------- | ---- | ----- | ---- |
| Add | ✔️ | <center>—</center> | <center>—</center> |
| Sub | ✔️ | <center>—</center> | <center>—</center> |

**NOTICE**: Currently this project has ONLY been tested on Ubuntu, so an `Ubuntu 16.04+` machine should be prepared as the testing environment.

## PoC Guidelines

### Pre-installation

* Rust

Before running this demo, please make sure [Rust](#system-packages-install) has been installed.

After Rust is installed, execute the commands below to add the `wasm32-wasi` target:
```shell
rustup target add wasm32-wasi
cargo install cargo-wasi
```

* TVM

Please follow the TVM [installation guide](https://tvm.apache.org/docs/install/index.html), `export TVM_HOME=/path/to/tvm` and add `libtvm_runtime` to your `LD_LIBRARY_PATH`.

*Note:* To run the end-to-end examples and tests, `tvm` and `topi` need to be added to your `PYTHONPATH`; this happens automatically if they are installed into an Anaconda environment.

### Build wasm-dlbackend-tvm package

```shell
cd wasm-dlbackend-tvm && cargo wasi build --release
```
### Test

Run the commands below to build and install the frontend package for testing (`rust` REQUIRED):
```shell
cd wasm-dlfrontend/ && cargo build --release
cp ./target/release/wasm-dlfrontend /usr/local/bin/
```
Check the usage of `wasm-dlfrontend`:
```shell
~# wasm-dlfrontend -h
Usage: wasm-dlfrontend [options]
Options:
    -c, --ms-backend-config FILE_PATH
                        set wasm backend config file
    -o, --op-type VALUE set the operator type, currently ONLY support Add and
                        Sub, default: Add.
    -h, --help          print this help menu
```
## Future Work

### Operator enhancement
TODO

### Performance benchmark
TODO

### Native TVM Rust runtime support
TODO

## Appendix

### System packages install

* Rust (latest version)

If you are running Windows, to install Rust, download and run the [RUST-INIT.EXE](https://win.rustup.rs/), and then follow the onscreen instructions.

If you are a Linux user, run the following in your terminal, then follow the on-screen instructions to install Rust.
```shell
curl https://sh.rustup.rs -sSf | sh
```

## Contribution

Lastly, many thanks to [@kazum](https://github.com/kazum) for offering a lot of help when implementing this project.
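The README above describes a `wasm-dlfrontend` binary that drives the compiled backend, but its source is not shown in this page view. For illustration only, here is a minimal, hedged sketch of how a host program could load the generated `.wasm` and call the exported `run` function with wasmtime; the crate versions (roughly the wasmtime/wasmtime-wasi 1.x API), the artifact file name, the memory offsets, and the JSON payload format are all assumptions, not what the commit's frontend necessarily does:

```rust
// Hypothetical host-side sketch (not part of this commit): instantiate the
// wasm backend under WASI and invoke its exported
// `run(op_type, in_addr, in_size, out_addr) -> out_size` entry point.
use anyhow::Result;
use wasmtime::{Engine, Linker, Module, Store};
use wasmtime_wasi::sync::WasiCtxBuilder;
use wasmtime_wasi::WasiCtx;

fn main() -> Result<()> {
    let engine = Engine::default();
    // Path is an assumption; cargo-wasi places the artifact under target/wasm32-wasi/release/.
    let module = Module::from_file(&engine, "wasm_dlbackend_tvm.wasm")?;

    // Satisfy the module's WASI imports (stdio is enough for the println! calls).
    let mut linker: Linker<WasiCtx> = Linker::new(&engine);
    wasmtime_wasi::add_to_linker(&mut linker, |ctx| ctx)?;
    let wasi = WasiCtxBuilder::new().inherit_stdio().build();
    let mut store = Store::new(&engine, wasi);

    let instance = linker.instantiate(&mut store, &module)?;
    let memory = instance
        .get_memory(&mut store, "memory")
        .expect("module should export its linear memory");
    let run = instance.get_typed_func::<(i32, i32, i32, i32), i32>(&mut store, "run")?;

    // Payload format and offsets are placeholders; a real frontend would
    // serialize its input tensors and coordinate memory layout with the module.
    let payload: &[u8] = br#"[{"dtype":"FP32","shape":[2],"data":[1.0,2.0]}]"#;
    let (in_addr, out_addr) = (0x1000usize, 0x10000usize);
    memory.write(&mut store, in_addr, payload)?;

    let out_size = run.call(
        &mut store,
        (0, in_addr as i32, payload.len() as i32, out_addr as i32),
    )?;

    let mut out_buf = vec![0u8; out_size as usize];
    memory.read(&store, out_addr, &mut out_buf)?;
    println!("raw output: {}", String::from_utf8_lossy(&out_buf));
    Ok(())
}
```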
@@ -0,0 +1,32 @@
use std::{path::PathBuf, process::Command};

fn main() {
    let mut out_dir = PathBuf::from(env!("CARGO_MANIFEST_DIR"));
    out_dir.push("lib");

    if !out_dir.is_dir() {
        std::fs::create_dir(&out_dir).unwrap();
    }

    // Generate the TVM operator object files (add.o, sub.o) under lib/.
    Command::new(concat!(
        env!("CARGO_MANIFEST_DIR"),
        "/tools/build_ops_lib.py"
    ))
    .arg(&out_dir)
    .output()
    .expect("Failed to execute command!");

    // Archive the object files into libops_wasm32.a so the --whole-archive
    // rustflags added in this commit can pull them into the final wasm binary.
    let ar = option_env!("LLVM_AR").unwrap_or("llvm-ar-10");
    let add_obj_file = out_dir.join("add.o");
    let sub_obj_file = out_dir.join("sub.o");
    let lib_file = out_dir.join("libops_wasm32.a");
    Command::new(ar)
        .arg("rcs")
        .arg(&lib_file)
        .arg(&add_obj_file)
        .arg(&sub_obj_file)
        .output()
        .expect("Failed to execute command!");

    println!("cargo:rustc-link-search=native={}", out_dir.display());
}
@@ -0,0 +1,31 @@
#[macro_use]
extern crate serde_derive;
#[macro_use]
extern crate tvm_runtime;

mod ops;
use ops::types::Status;
mod utils;

// Entry point exported to the WASM host: reads the input tensors at `in_addr`
// from the module's linear memory, dispatches to the requested operator, and
// writes the result to `out_addr`, returning the output size (0 on failure).
#[no_mangle]
pub extern "C" fn run(op_type: i32, in_addr: i32, in_size: i32, out_addr: i32) -> i32 {
    let inputs = utils::load_inputs(in_addr, in_size as usize);
    if ops::validate_inputs(&inputs) != Status::Succeed {
        return 0i32;
    }

    let op_instance = ops::operator_instantiate(op_type);
    let (a_shape, b_shape, c_shape) = ops::parse_inputs_shape(&inputs);
    if op_instance.init(a_shape, b_shape, c_shape) != Status::Succeed {
        return 0i32;
    };

    let (in_tensors, out_tensor) = ops::parse_inputs_tensor(&inputs);
    let (stat, output) = op_instance.launch(in_tensors, out_tensor);
    if stat != Status::Succeed {
        return 0i32;
    }

    let out_size = utils::store_output(out_addr, output);
    out_size as i32
}
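The `utils` module declared above (providing `load_inputs` and `store_output`) is one of the changed files not shown on this page. Below is a minimal sketch of what it could look like, assuming the tensors cross the WASM boundary as a JSON-serialized `Vec<Tensor>` and that `Tensor` implements serde's `Serialize`/`Deserialize`; both the encoding and the function bodies are assumptions, not the committed code:

```rust
// Hypothetical sketch of src/utils.rs, inferred from how `run` uses it.
use crate::ops::types::Tensor;

pub fn load_inputs(in_addr: i32, in_size: usize) -> Vec<Tensor> {
    // Read `in_size` bytes starting at `in_addr` in the module's own linear
    // memory and decode them into the input tensor list.
    let bytes = unsafe { std::slice::from_raw_parts(in_addr as usize as *const u8, in_size) };
    serde_json::from_slice(bytes).expect("invalid input payload")
}

pub fn store_output(out_addr: i32, output: Tensor) -> usize {
    // Serialize the result and copy it to the caller-provided output region,
    // returning the number of bytes written.
    let bytes = serde_json::to_vec(&output).expect("cannot serialize output");
    unsafe {
        std::ptr::copy_nonoverlapping(bytes.as_ptr(), out_addr as usize as *mut u8, bytes.len());
    }
    bytes.len()
}
```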
@@ -0,0 +1,63 @@
use super::types::*;
use tvm_runtime::{Module as _, SystemLibModule};

extern "C" {
    fn __wasm_call_ctors();
}

pub struct TVMAddOp {}

impl TVMAddOp {
    pub fn new() -> Self {
        Self {}
    }
}

impl Operator for TVMAddOp {
    fn init(&self, a_shape: Vec<i64>, b_shape: Vec<i64>, c_shape: Vec<i64>) -> Status {
        // The two inputs and the output must all have the same rank and shape.
        if !((a_shape.len() == b_shape.len()
            && a_shape
                .iter()
                .zip(&b_shape)
                .filter(|&(a, b)| a == b)
                .count()
                == a_shape.len())
            && (b_shape.len() == c_shape.len()
                && b_shape
                    .iter()
                    .zip(&c_shape)
                    .filter(|&(b, c)| b == c)
                    .count()
                    == c_shape.len()))
        {
            println!("Both dimension size and shape for Add operator should be equal!");
            return Status::InitFailed;
        }

        println!("TVM Add operator init success!");
        Status::Succeed
    }

    fn launch(&self, mut inputs: Vec<Tensor>, output: Tensor) -> (Status, Tensor) {
        if inputs.len() != 2 {
            println!("Inputs tensor length should be 2!");
            return (Status::LaunchFailed, Tensor::default());
        }
        let mut l_tensor = inputs.get_mut(0).unwrap().as_dltensor();
        let mut r_tensor = inputs.get_mut(1).unwrap().as_dltensor();
        let mut out_tensor = output.as_dltensor();

        unsafe {
            // This is necessary to invoke TVMBackendRegisterSystemLibSymbol
            // API calls.
            __wasm_call_ctors();
        }
        let syslib = SystemLibModule::default();
        let add = syslib.get_function("add").expect("add function not found!");
        call_packed!(add, &mut l_tensor, &mut r_tensor, &mut out_tensor).unwrap();

        let output: Tensor = out_tensor.into();
        println!("TVM Add operator run success!");
        (Status::Succeed, output)
    }
}
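The sibling `sub` operator module imported by the ops module below is not visible in this page view; presumably it mirrors the Add operator above, with the TVM system-library kernel name swapped. A hedged sketch of what it might look like follows; everything here is inferred for illustration, not the committed file:

```rust
// Hypothetical sketch of the Sub operator module, inferred from the Add
// operator above; the committed source is not shown in this view.
use super::types::*;
use tvm_runtime::{Module as _, SystemLibModule};

extern "C" {
    fn __wasm_call_ctors();
}

pub struct TVMSubOp {}

impl TVMSubOp {
    pub fn new() -> Self {
        Self {}
    }
}

impl Operator for TVMSubOp {
    fn init(&self, a_shape: Vec<i64>, b_shape: Vec<i64>, c_shape: Vec<i64>) -> Status {
        // Simplified shape check for illustration; the committed code likely
        // performs the same element-wise comparison as the Add operator.
        if a_shape != b_shape || b_shape != c_shape {
            return Status::InitFailed;
        }
        Status::Succeed
    }

    fn launch(&self, mut inputs: Vec<Tensor>, output: Tensor) -> (Status, Tensor) {
        if inputs.len() != 2 {
            return (Status::LaunchFailed, Tensor::default());
        }
        let mut l_tensor = inputs.get_mut(0).unwrap().as_dltensor();
        let mut r_tensor = inputs.get_mut(1).unwrap().as_dltensor();
        let mut out_tensor = output.as_dltensor();

        unsafe {
            // Run constructors so the TVM system-lib symbols get registered.
            __wasm_call_ctors();
        }
        let syslib = SystemLibModule::default();
        // The only real difference from Add: look up the "sub" kernel instead.
        let sub = syslib.get_function("sub").expect("sub function not found!");
        call_packed!(sub, &mut l_tensor, &mut r_tensor, &mut out_tensor).unwrap();

        (Status::Succeed, out_tensor.into())
    }
}
```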
@@ -0,0 +1,46 @@
mod add;
use add::TVMAddOp;
mod sub;
use sub::TVMSubOp;
pub mod types;
use types::*;

use std::boxed::Box;

pub fn operator_instantiate(op_type: i32) -> Box<dyn Operator> {
    match OpType::from(op_type) {
        OpType::Add => Box::new(TVMAddOp::new()),
        OpType::Sub => Box::new(TVMSubOp::new()),
    }
}

// The `inputs` vector carries the operands plus, as its last element, the
// pre-allocated output tensor (so a binary op arrives as three tensors).
pub fn validate_inputs(inputs: &Vec<Tensor>) -> Status {
    if (inputs.len() == 3
        && !(inputs[0].dtype() == inputs[1].dtype() && inputs[0].dtype() == inputs[2].dtype()))
        || (inputs.len() == 2 && inputs[0].dtype() != inputs[1].dtype())
    {
        println!("The dtype of inputs and outputs is not equal!");
        Status::ValidateFailed
    } else {
        Status::Succeed
    }
}

pub fn parse_inputs_shape(inputs: &Vec<Tensor>) -> (Vec<i64>, Vec<i64>, Vec<i64>) {
    if inputs.len() == 3 {
        (inputs[0].shape(), inputs[1].shape(), inputs[2].shape())
    } else {
        (inputs[0].shape(), inputs[1].shape(), Vec::new())
    }
}

pub fn parse_inputs_tensor(inputs: &Vec<Tensor>) -> (Vec<Tensor>, Tensor) {
    if inputs.len() == 3 {
        (
            vec![inputs[0].clone(), inputs[1].clone()],
            inputs[2].clone(),
        )
    } else {
        (vec![inputs[0].clone()], inputs[1].clone())
    }
}
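The `types` module that the code above depends on is among the changed files not rendered on this page. For orientation, here is a hedged sketch of the pieces whose shapes can be inferred from the calling code; the enum discriminant mapping and derives are assumptions, and the real `Tensor` type (data buffer plus `dtype()`, `shape()`, `as_dltensor()`, `Default` and `Clone`) is reduced to a placeholder:

```rust
// Hypothetical sketch of ops/types.rs, reconstructed from how the modules
// above use it; not the committed implementation.

#[derive(Debug, PartialEq)]
pub enum Status {
    Succeed,
    ValidateFailed,
    InitFailed,
    LaunchFailed,
}

pub enum OpType {
    Add,
    Sub,
}

impl From<i32> for OpType {
    fn from(op_type: i32) -> Self {
        // The numeric mapping is an assumption; the frontend's --op-type flag
        // defaults to Add.
        match op_type {
            0 => OpType::Add,
            _ => OpType::Sub,
        }
    }
}

// Placeholder only: the real Tensor wraps dtype, shape and a data buffer and
// provides the accessors used by the operators, all omitted here.
pub struct Tensor;

// Each TVM-backed operator validates shapes up front and then launches the
// corresponding system-library kernel.
pub trait Operator {
    fn init(&self, a_shape: Vec<i64>, b_shape: Vec<i64>, c_shape: Vec<i64>) -> Status;
    fn launch(&self, inputs: Vec<Tensor>, output: Tensor) -> (Status, Tensor);
}
```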