Add TVM application extension with WASM runtime #5892

Merged 6 commits on Jul 28, 2020
1 change: 1 addition & 0 deletions apps/README.md
@@ -26,3 +26,4 @@ If you are interested in writing optimized kernels with TVM, checkout [TOPI: TVM
- [android_rpc](android_rpc) Android RPC server.
- [benchmark](benchmark) Example end to end compilation benchmarks
- [howto_deploy](howto_deploy) Tutorial on how to deploy TVM with minimum code dependency.
- [wasm_standalone](wasm-standalone) WebAssembly standalone for deep learning framework with TVM runtime.
8 changes: 8 additions & 0 deletions apps/wasm-standalone/.gitignore
@@ -0,0 +1,8 @@
# Built packages
**/lib/


# Added by cargo

**/target/
**/Cargo.lock
202 changes: 202 additions & 0 deletions apps/wasm-standalone/README.md
@@ -0,0 +1,202 @@
<!--- Licensed to the Apache Software Foundation (ASF) under one -->
<!--- or more contributor license agreements. See the NOTICE file -->
<!--- distributed with this work for additional information -->
<!--- regarding copyright ownership. The ASF licenses this file -->
<!--- to you under the Apache License, Version 2.0 (the -->
<!--- "License"); you may not use this file except in compliance -->
<!--- with the License. You may obtain a copy of the License at -->

<!--- http://www.apache.org/licenses/LICENSE-2.0 -->

<!--- Unless required by applicable law or agreed to in writing, -->
<!--- software distributed under the License is distributed on an -->
<!--- "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -->
<!--- KIND, either express or implied. See the License for the -->
<!--- specific language governing permissions and limitations -->
<!--- under the License. -->

# WebAssembly Standalone for Deep Learning Framework with TVM Runtime

#### Experimental notice: This project is still *experimental* and only serves as a proof of concept for running deep learning frameworks on a [WebAssembly runtime](https://github.com/bytecodealliance/wasmtime) with the [TVM stack](https://tvm.apache.org/).

- [WebAssembly Standalone for Deep Learning Framework with TVM Runtime](#webassembly-standalone-for-deep-learning-framework-with-tvm-runtime)
- [Motivation](#motivation)
- [Framework Landscape](#framework-landscape)
- [Project Status](#project-status)
- [PoC Guidelines](#poc-guidelines)
- [Pre-installation](#pre-installation)
- [Build ResNet50 model](#build-resnet50-model)
- [Build wasm-graph package](#build-wasm-graph-package)
- [Test](#test)
- [Future Work](#future-work)
- [More networks support](#more-networks-support)
- [Performance benchmark](#performance-benchmark)
- [Native TVM Rust runtime support](#native-tvm-rust-runtime-support)
- [Appendix](#appendix)
- [System packages install](#system-packages-install)

## Motivation

<img src="https://github.com/dmlc/web-data/raw/master/tvm/tutorial/tvm_support_list.png" alt="TVM hardware support" width="600"/>

As demonstrated in the TVM runtime [tutorials](https://tvm.apache.org/docs/tutorials/relay_quick_start.html), TVM already supports WASM as an optional backend, so we can leverage the strengths of WebAssembly (portability, security) and the TVM runtime (domain-specific optimization) to build a flexible, auto-optimized graph compiler for all deep learning frameworks.

## Framework Landscape

The figures below illustrate the overall landscape of running deep learning frameworks on a WASM runtime with the TVM compiler stack.

* WASM graph generation
```
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
| | | | | |
| Framework Model | ---> | ONNX Model | ---> | TVM Relay Python API |
|_ _ _ _ _ _ _ _ _ _| |_ _ _ _ _ _ _| |_ _ _ _ _ _ _ _ _ _ _ _|
||
\/
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
| | | |
| WASM Graph Builder | | TVM Compiler Stack |
| (TVM runtime) | |_ _ _ _ _ _ _ _ _ _ _|
|_ _ _ _ _ _ _ _ _ _ _| ||
|| \/
_ _ _ _ _ _ _ _ _ || _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
| | \/ | | llvm-ar | |
| wasm_graph.wasm | <--- | libgraph_wasm32.a | <------- | graph.o |
|_ _ _ _ _ _ _ _ _| |_ _ _ _ _ _ _ _ _ _| |_ _ _ _ _|
```

* WASM graph loading
```
_ _ _ _ _ _ _ _ _ _ _
| |
| WASM Graph Loader |
| (WASM runtime) |
|_ _ _ _ _ _ _ _ _ _ _|
||
\/
_ _ _ _ _ _ _ _ _ _
| |
| wasm_graph.wasm |
|_ _ _ _ _ _ _ _ _ _|
```
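In this PR the loader is a Rust crate (`wasm-runtime`) built on wasmtime. As a rough illustration only, the snippet below sketches the loading stage using the `wasmtime` Python bindings; the file name and binding details are assumptions, and this is not the code added by this PR.

```python
# Illustrative sketch (not part of this PR): load the generated graph module with
# the wasmtime Python bindings. The module targets wasm32-wasi, so WASI imports
# must be provided before instantiation.
from wasmtime import Engine, Linker, Module, Store, WasiConfig

engine = Engine()
linker = Linker(engine)
linker.define_wasi()

store = Store(engine)
store.set_wasi(WasiConfig())

module = Module.from_file(engine, "wasm_graph_resnet50.wasm")
instance = linker.instantiate(store, module)

exports = instance.exports(store)
memory = exports["memory"]  # linear memory shared with the host
run = exports["run"]        # `run(wasm_addr, in_size) -> out_size` exported by wasm-graph
```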

## Project Status

This project is still **experimental** and at a very early stage; many features are under active development. Here is the current model support matrix:

| Model Name | Status |
| ---------- | ------ |
| ResNet50 | ✔️ |
| LeNet | <center>&mdash;</center> |

**NOTICE**: Currently this project is ONLY tested on Ubuntu, so `Ubuntu 16.04+` is required as the testing environment.

## PoC Guidelines

### Pre-installation

* Rust

Before running this demo, please make sure [Rust](#system-packages-install) has been installed.

After Rust is installed, execute the command below to add the `wasm32-wasi` target:
```shell
rustup target add wasm32-wasi
```

* TVM

Please follow the TVM [installation guide](https://tvm.apache.org/docs/install/index.html) for detailed instructions.

* LLVM

`LLVM 10.0` or later is REQUIRED.

### Build ResNet50 model

- Build DL library in the WebAssembly format.

- Download model

```
cd wasm-graph/tools && wget https://s3.amazonaws.com/onnx-model-zoo/resnet/resnet50v1/resnet50v1.onnx
```

- Compile

```
LLVM_AR=llvm-ar-10 python ./build_graph_lib.py -O3 ./resnet50v1.onnx
```
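The compile step above drives `build_graph_lib.py`, which is not included in this diff excerpt. The sketch below is a rough illustration of what such a script does with the TVM Relay Python API; the input name/shape, the target string, and the output file names are assumptions for illustration, not the script shipped by this PR.

```python
# Rough sketch of a build script like build_graph_lib.py (assumptions noted above).
import onnx
import tvm
from tvm import relay

onnx_model = onnx.load("resnet50v1.onnx")
shape_dict = {"data": (1, 3, 224, 224)}  # assumed input name and shape
mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)

# `--system-lib` lets the generated kernels register themselves as system-library
# symbols, which SystemLibModule inside the wasm-graph crate resolves at runtime.
target = "llvm -mtriple=wasm32-unknown-unknown-wasm --system-lib"  # assumed target string

with tvm.transform.PassContext(opt_level=3):
    graph_json, lib, params = relay.build(mod, target=target, params=params)

lib.save("graph.o")  # object file later archived by llvm-ar
with open("graph.json", "w") as f:
    f.write(graph_json)
with open("graph.params", "wb") as f:
    f.write(relay.save_param_dict(params))

# The static library is then produced with, for example:
#   llvm-ar rcs libgraph_wasm32.a graph.o
```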

### Build wasm-graph package

```shell
cd wasm-graph && cargo build --release
cp ./target/wasm32-wasi/release/wasm_graph.wasm ./lib/wasm_graph_resnet50.wasm
```

### Test

Before running this demo, please make sure [Rust](#system-packages-install) has been installed.

Next, run the command below to build the runtime package for testing:

```shell
cd wasm-runtime/tests/test_graph_resnet50 && cargo build
```

Check the usage of `test_graph_resnet50`:

```shell
~# ./target/debug/test_graph_resnet50 -h

Usage: ./target/debug/test_graph_resnet50 [options]

Options:
-g, --wasm-graph-file FILE_PATH
set the path to wasm graph file
-i, --input-data-file FILE_PATH
set the path to input image file
-l, --label-class-file FILE_PATH
set the path to label class file
-h, --help print this help menu
```

Next, perform model inference using the commands below:
```
$ cp ../../../wasm-graph/lib/wasm_graph_resnet50.wasm ./
$ wget -O cat.png https://github.com/dmlc/mxnet.js/blob/master/data/cat.png?raw=true
$ wget -O synset.csv https://raw.githubusercontent.com/kazum/tvm-wasm/master/synset.csv
$ ./target/debug/test_graph_resnet50 -g ./wasm_graph_resnet50.wasm -i ./cat.png -l ./synset.csv
original image dimensions: (256, 256)
resized image dimensions: (224, 224)
input image belongs to the class `tabby, tabby cat`
```

## Future Work

### More networks support
TODO

### Performance benchmark

We are working on several performance improvements:
* WebAssembly simd128 support (**Done**)
* AutoTVM enhancement for the LLVM target

### Native TVM Rust runtime support
TODO

## Appendix

### System packages install

* Rust (latest version)

If you are running Windows, to install Rust, download and run [rustup-init.exe](https://win.rustup.rs/), then follow the on-screen instructions.

If you are a Linux user, run the following in your terminal, then follow the on-screen instructions to install Rust.

```shell
curl https://sh.rustup.rs -sSf | sh
```
3 changes: 3 additions & 0 deletions apps/wasm-standalone/wasm-graph/.cargo/config
@@ -0,0 +1,3 @@
[build]
target = "wasm32-wasi"
rustflags = ["-C", "link-arg=--whole-archive", "-C", "link-arg=-lgraph_wasm32"]
43 changes: 43 additions & 0 deletions apps/wasm-standalone/wasm-graph/Cargo.toml
@@ -0,0 +1,43 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.

[package]
name = "wasm-graph"
version = "0.1.0"
authors = ["TVM Contributors"]
edition = "2018"
description = "WebAssembly graph to deep learning frameworks using TVM"
readme = "README.md"
repository = "https://github.com/apache/incubator-tvm"
license = "Apache-2.0"
keywords = ["wasm", "machine learning", "tvm"]

[profile.release]
lto = true
opt-level = 's'

[lib]
crate-type = ['cdylib']

[dependencies]
serde = "1.0.53"
serde_derive = "1.0.53"
serde_json = "1.0.53"
ndarray = "0.12"
tvm-sys = { path = "../../../rust/tvm-sys" }
tvm-graph-rt = { path = "../../../rust/tvm-graph-rt" }
lazy_static = "1.1.1"
24 changes: 24 additions & 0 deletions apps/wasm-standalone/wasm-graph/build.rs
@@ -0,0 +1,24 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

fn main() {
let out_dir = concat!(env!("CARGO_MANIFEST_DIR"), "/lib");

println!("cargo:rustc-link-search=native={}", out_dir);
}
83 changes: 83 additions & 0 deletions apps/wasm-standalone/wasm-graph/src/lib.rs
@@ -0,0 +1,83 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

#[macro_use]
extern crate lazy_static;
#[macro_use]
extern crate serde_derive;

mod types;
mod utils;

use std::{collections::HashMap, convert::TryFrom, env, sync::Mutex};

use tvm_graph_rt::{Graph, GraphExecutor, SystemLibModule, Tensor as TVMTensor};

use types::Tensor;

extern "C" {
fn __wasm_call_ctors();
}

lazy_static! {
static ref SYSLIB: SystemLibModule = SystemLibModule::default();
static ref GRAPH_EXECUTOR: Mutex<GraphExecutor<'static, 'static>> = {
unsafe {
// This is necessary to invoke TVMBackendRegisterSystemLibSymbol
// API calls.
__wasm_call_ctors();
}
let graph = Graph::try_from(include_str!(concat!(
env!("CARGO_MANIFEST_DIR"),
"/lib/graph.json"
)))
.unwrap();
let params_bytes =
include_bytes!(concat!(env!("CARGO_MANIFEST_DIR"), "/lib/graph.params"));
let params = tvm_graph_rt::load_param_dict(params_bytes)
.unwrap()
.into_iter()
.map(|(k, v)| (k, v.to_owned()))
.collect::<HashMap<String, TVMTensor<'static>>>();

let mut exec = GraphExecutor::new(graph, &*SYSLIB).unwrap();
exec.load_params(params);

Mutex::new(exec)
};
}

#[no_mangle]
pub extern "C" fn run(wasm_addr: i32, in_size: i32) -> i32 {
let in_tensor = unsafe { utils::load_input(wasm_addr, in_size as usize) };
let input: TVMTensor = in_tensor.as_dltensor().into();

GRAPH_EXECUTOR.lock().unwrap().set_input("data", input);
GRAPH_EXECUTOR.lock().unwrap().run();
let output = GRAPH_EXECUTOR
.lock()
.unwrap()
.get_output(0)
.unwrap()
.as_dltensor(false);

let out_tensor: Tensor = output.into();
let out_size = unsafe { utils::store_output(wasm_addr, out_tensor) };
out_size as i32
}
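The `run` export above defines a simple memory protocol: the host serializes the input tensor, writes the bytes into the module's linear memory at some offset, and calls `run(offset, size)`; the module then writes the serialized output tensor back and returns its size. Continuing the hedged wasmtime sketch from the README section above, the snippet below is illustrative only; the offset and the serialization helpers are hypothetical and not part of this PR.

```python
# Illustrative continuation of the earlier loader sketch (not code from this PR).
wasm_addr = 0x1000                       # assumed scratch offset inside wasm linear memory
input_bytes = serialize_input_tensor()   # hypothetical helper producing the bytes expected
                                         # by the wasm-graph `Tensor` type

memory.write(store, bytearray(input_bytes), wasm_addr)  # copy input into linear memory
out_size = run(store, wasm_addr, len(input_bytes))       # run(addr, size) -> output size
out_bytes = memory.read(store, wasm_addr, wasm_addr + out_size)
output = deserialize_output_tensor(out_bytes)            # hypothetical helper
```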