[DOC] Fix Some Broken Web Links (apache#6475)
Johnson9009 authored and kevinthesun committed Sep 17, 2020
1 parent a50d9ab commit 5698ecf
Showing 5 changed files with 8 additions and 8 deletions.
2 changes: 1 addition & 1 deletion apps/wasm-standalone/README.md
@@ -39,7 +39,7 @@

<img src="https://github.com/dmlc/web-data/raw/master/tvm/tutorial/tvm_support_list.png" alt="TVM hardware support" width="600"/>

-As demonstrated in TVM runtime [tutorials](https://tvm.apache.org/docs/tutorials/relay_quick_start.html), TVM already supports WASM as the optional hardware backend, so we can leverage the features of WebAssembly (portability, security) and TVM runtime (domain-specific, optimization) to build a flexible and auto-optimized graph compiler for all deep learning frameworks.
+As demonstrated in TVM runtime [tutorials](https://tvm.apache.org/docs/tutorials/get_started/relay_quick_start.html), TVM already supports WASM as the optional hardware backend, so we can leverage the features of WebAssembly (portability, security) and TVM runtime (domain-specific, optimization) to build a flexible and auto-optimized graph compiler for all deep learning frameworks.

## Framework Landscape

8 changes: 4 additions & 4 deletions docs/dev/inferbound.rst
@@ -22,7 +22,7 @@ InferBound Pass
*******************************************


-The InferBound pass is run after normalize, and before ScheduleOps `build_module.py <https://github.com/apache/incubator-tvm/blob/master/python/tvm/build_module.py>`_. The main job of InferBound is to create the bounds map, which specifies a Range for each IterVar in the program. These bounds are then passed to ScheduleOps, where they are used to set the extents of For loops, see `MakeLoopNest <https://github.com/apache/incubator-tvm/blob/master/src/op/op_util.cc>`_, and to set the sizes of allocated buffers (`BuildRealize <https://github.com/apache/incubator-tvm/blob/master/src/op/compute_op.cc>`_), among other uses.
+The InferBound pass is run after normalize, and before ScheduleOps `build_module.py <https://github.com/apache/incubator-tvm/blob/master/python/tvm/driver/build_module.py>`_. The main job of InferBound is to create the bounds map, which specifies a Range for each IterVar in the program. These bounds are then passed to ScheduleOps, where they are used to set the extents of For loops, see `MakeLoopNest <https://github.com/apache/incubator-tvm/blob/master/src/te/operation/op_util.cc>`_, and to set the sizes of allocated buffers (`BuildRealize <https://github.com/apache/incubator-tvm/blob/master/src/te/operation/compute_op.cc>`_), among other uses.

The output of InferBound is a map from IterVar to Range:

@@ -53,9 +53,9 @@ Therefore, let's review the Range and IterVar classes:
};
}
-Note that IterVarNode also contains a Range ``dom``. This ``dom`` may or may not have a meaningful value, depending on when the IterVar was created. For example, when ``tvm.compute`` is called, an `IterVar is created <https://github.com/apache/incubator-tvm/blob/master/src/op/compute_op.cc>`_ for each axis and reduce axis, with dom's equal to the shape supplied in the call to ``tvm.compute``.
+Note that IterVarNode also contains a Range ``dom``. This ``dom`` may or may not have a meaningful value, depending on when the IterVar was created. For example, when ``tvm.compute`` is called, an `IterVar is created <https://github.com/apache/incubator-tvm/blob/master/src/te/operation/compute_op.cc>`_ for each axis and reduce axis, with dom's equal to the shape supplied in the call to ``tvm.compute``.

-On the other hand, when ``tvm.split`` is called, `IterVars are created <https://github.com/apache/incubator-tvm/blob/master/src/schedule/schedule_lang.cc>`_ for the inner and outer axes, but these IterVars are not given a meaningful ``dom`` value.
+On the other hand, when ``tvm.split`` is called, `IterVars are created <https://github.com/apache/incubator-tvm/blob/master/src/te/schedule/schedule_lang.cc>`_ for the inner and outer axes, but these IterVars are not given a meaningful ``dom`` value.

In any case, the ``dom`` member of an IterVar is never modified during InferBound. However, keep in mind that the ``dom`` member of an IterVar is sometimes used as default value for the Ranges InferBound computes.
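The relationship the patched doc describes — compute axes carry a ``dom``, split axes do not, and InferBound fills in a Range for every IterVar — can be sketched in plain Python. This is a toy model for intuition only, not TVM's C++ implementation; the class and function names are illustrative:

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass(frozen=True)
class Range:
    min: int
    extent: int

@dataclass(frozen=True)
class IterVar:
    name: str
    dom: Optional[Range] = None  # set by compute(); left None by split()

# An axis created by tvm.compute carries a dom equal to the tensor shape.
i = IterVar("i", dom=Range(0, 64))

def infer_bound(root: IterVar, factor: int) -> Dict[str, Range]:
    """Toy InferBound for a single split: the root's dom is the default
    Range, and the split children get extents derived from it."""
    bounds = {root.name: root.dom}
    bounds[f"{root.name}.outer"] = Range(0, (root.dom.extent + factor - 1) // factor)
    bounds[f"{root.name}.inner"] = Range(0, factor)
    return bounds

bounds = infer_bound(i, factor=16)
```

Here the split IterVars start with no ``dom`` of their own; their Ranges exist only in the bounds map that InferBound produces.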

@@ -117,7 +117,7 @@ Tensors haven't been mentioned yet, but in the context of TVM, a Tensor represen
int value_index;
};
-In the Operation class declaration above, we can see that each operation also has a list of InputTensors. Thus the stages of the schedule form a DAG, where each stage is a node in the graph. There is an edge in the graph from Stage A to Stage B, if the operation of Stage B has an input tensor whose source operation is the op of Stage A. Put simply, there is an edge from A to B, if B consumes a tensor produced by A. See the diagram below. This graph is created at the beginning of InferBound, by a call to `CreateReadGraph <https://github.com/apache/incubator-tvm/blob/master/src/schedule/bound.cc>`_.
+In the Operation class declaration above, we can see that each operation also has a list of InputTensors. Thus the stages of the schedule form a DAG, where each stage is a node in the graph. There is an edge in the graph from Stage A to Stage B, if the operation of Stage B has an input tensor whose source operation is the op of Stage A. Put simply, there is an edge from A to B, if B consumes a tensor produced by A. See the diagram below. This graph is created at the beginning of InferBound, by a call to `CreateReadGraph <https://github.com/apache/incubator-tvm/blob/master/src/te/schedule/bound.cc>`_.
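The read-graph construction described here can be sketched in a few lines of plain Python — a toy analogue of CreateReadGraph, not TVM's implementation; the stage names are made up:

```python
# Each "stage" records the ops that produce its input tensors.
stages = {
    "A": [],          # placeholder input, consumes nothing
    "B": ["A"],       # B consumes a tensor produced by A
    "C": ["A", "B"],  # C consumes tensors produced by A and B
}

def create_read_graph(stages):
    """Toy CreateReadGraph: an edge (X, Y) means stage Y consumes a
    tensor produced by stage X."""
    edges = set()
    for consumer, producers in stages.items():
        for producer in producers:
            edges.add((producer, consumer))
    return edges

edges = create_read_graph(stages)
```

Traversing these edges in reverse (consumers before producers) gives the order in which InferBound visits stages.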

.. image:: https://raw.githubusercontent.com/tvmai/tvmai.github.io/master/images/docs/inferbound/stage_graph.png
:align: center
2 changes: 1 addition & 1 deletion tutorials/frontend/build_gcn.py
@@ -164,7 +164,7 @@ def evaluate(data, logits):
# Define Graph Convolution Layer in Relay
# ---------------------------------------
# To run GCN on TVM, we first need to implement Graph Convolution Layer.
-# You may refer to https://github.com/dmlc/dgl/blob/master/python/dgl/nn/mxnet/conv.py for a GraphConv Layer implemented in DGL with MXNet Backend
+# You may refer to https://github.com/dmlc/dgl/blob/master/python/dgl/nn/mxnet/conv/graphconv.py for a GraphConv Layer implemented in DGL with MXNet Backend
#
# The layer is defined with below operations, note that we apply two transposes to keep adjacency matrix on right hand side of sparse_dense operator,
# this method is temporary and will be updated in next few weeks when we have sparse matrix transpose and support for left sparse operator.
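Conceptually, the GraphConv propagation the tutorial implements boils down to out = A · (X · W): transform node features with the weight matrix, then aggregate over neighbours via the adjacency matrix. A tiny pure-Python sketch of that rule (illustrative only — the adjacency, features, and 1x1 weight are made up, and the tutorial's normalization and sparse_dense/transpose machinery are omitted):

```python
def matmul(a, b):
    """Dense matrix product over nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

# 3-node graph: adjacency with self-loops, one feature per node, 1x1 weight.
A = [[1, 1, 0],
     [1, 1, 1],
     [0, 1, 1]]
X = [[1.0], [2.0], [3.0]]
W = [[0.5]]

# GraphConv propagation: transform features, then aggregate neighbours.
out = matmul(A, matmul(X, W))  # -> [[1.5], [3.0], [2.5]]
```

Keeping the adjacency matrix on the left of a dense product is what the two transposes in the tutorial emulate on the right-hand side of sparse_dense.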
2 changes: 1 addition & 1 deletion tutorials/optimize/opt_matmul_auto_tensorcore.py
@@ -96,7 +96,7 @@ def matmul_nn(A, B, L, dtype="float16", layout="NN"):
# (2) The warp tile size is not 16x16x16 on CUDA9, or not one of {16x16x16, 32x8x16, 8x32x16} on CUDA version >= 10.0.
#
# In this schedule, storage_align is used to reduce bank conflicts of shared memory. Please refer to this
-# `doc <https://tvm.apache.org/docs/api/python/schedule.html#tvm.te.schedule.Stage.storage_align>`_
+# `doc <https://tvm.apache.org/docs/api/python/te.html#tvm.te.Stage.storage_align>`_
# for the usage of storage_align primitive. In short, we need to add an offset to some shared memory buffer
# to reduce bank conflicts.
# According to the `wmma doc <https://docs.nvidia.com/cuda/cuda-c-programming-guide/index.html#wmma-description>`_,
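To see why the offset reduces bank conflicts, consider shared-memory bank arithmetic. This is a simplified, hedged model (32 four-byte banks, float32 elements, single-element accesses); the tutorial itself uses half precision and wmma fragments, so the exact numbers there differ:

```python
BANKS = 32  # word-addressed shared-memory banks on recent NVIDIA GPUs

def bank(row, col, stride):
    """Bank hit by element (row, col) of a row-major shared buffer."""
    return (row * stride + col) % BANKS

# Walking down a column with a stride that is a multiple of the bank
# count: every row lands in the same bank -> 32-way conflict.
conflicted = {bank(r, 0, 32) for r in range(32)}

# Pad the row stride by one word, as storage_align arranges, so the
# stride is no longer a multiple of 32: rows spread across all banks.
aligned = {bank(r, 0, 33) for r in range(32)}
```

With the padded stride each of the 32 rows maps to a distinct bank, so the column walk is conflict-free.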
2 changes: 1 addition & 1 deletion web/README.md
@@ -63,7 +63,7 @@ This command will create the tvmjs library that we can use to interface with the

Check code snippet in

-- [tests/python/prepare_test_libs.py](https://github.com/apache/incubator-tvm/tree/master/web/tests/pythob/prepare_test_libs.py)
+- [tests/python/prepare_test_libs.py](https://github.com/apache/incubator-tvm/tree/master/web/tests/python/prepare_test_libs.py)
shows how to create a wasm library that links with tvm runtime.
- Note that all wasm libraries have to be created using the `--system-lib` option
- emcc.create_wasm will automatically link the runtime library `dist/wasm/libtvm_runtime.bc`
