diff --git a/docs/langref/relay_expr.rst b/docs/langref/relay_expr.rst
index 2a120fe03842..6c4ec5ca2624 100644
--- a/docs/langref/relay_expr.rst
+++ b/docs/langref/relay_expr.rst
@@ -11,7 +11,7 @@ Dataflow and Control Fragments
 For the purposes of comparing Relay to traditional computational graph-based IRs, it can be
 useful to consider Relay exrpessions in terms of dataflow and control fragments.
 Each portion of a Relay program containing expressions that only affect the dataflow can
-be viewed as a traditional comptuation graph when writing and expressing transformations.
+be viewed as a traditional computation graph when writing and expressing transformations.
 
 The dataflow fragment covers the set of Relay expressions that do not involve
 control flow. That is, any portion of a program containing only the following
@@ -31,8 +31,8 @@ fragment in Relay includes the following constructs:
 - Recursive Calls in Functions
 
 From the point of view of a computation graph, a function is a subgraph and a function call inlines the subgraph, substituting its arguments for the free variables in the subgraph with corresponding names.
-Thus if a function's body uses only dataflow constructs
-, a call to that function is in the dataflow fragment; conversely, if the
+Thus, if a function's body uses only dataflow constructs,
+a call to that function is in the dataflow fragment; conversely, if the
 function's body contains control flow, a call to that function is not part of the dataflow fragment.
 
 Variables
@@ -205,6 +205,7 @@ For example, one can define a polymorphic identity function for any Relay type as
 follows:
 
 .. code-block:: python
+
    fn(%x : t) -> t {
        %x
    }
@@ -213,13 +214,14 @@ The below definition is also polymorphic, but restricts its arguments to
 tensor types:
 
 .. code-block:: python
+
    fn(%x : Tensor[s, bt]) {
        %x
    }
 
 Notice that the return type is omitted and will be inferred.
-*Note: :code:`where` syntax is not yet supported in the text format.*
+*Note: "where" syntax is not yet supported in the text format.*
 
 A function may also be subject to one or more type relations, such as in
 the following:
diff --git a/docs/langref/relay_op.rst b/docs/langref/relay_op.rst
index ba59bc1f24a5..43033c355652 100644
--- a/docs/langref/relay_op.rst
+++ b/docs/langref/relay_op.rst
@@ -2,9 +2,9 @@ Relay Core Tensor Operators
 ===========================
 
 This page contains the list of core tensor operator primitives pre-defined in tvm.relay.
-The core tensor operator primitives covers typical workloads in deep learning.
-They can represent workloads in front-end frameworks, and provide basic building blocks for optimization.
-Since deep learning is a fast evolving field and it is that possible to have operators that are not in here.
+The core tensor operator primitives cover typical workloads in deep learning.
+They can represent workloads in front-end frameworks and provide basic building blocks for optimization.
+Since deep learning is a fast-evolving field, some operators may not be covered here.
 
 .. note::
 
diff --git a/docs/langref/relay_type.rst b/docs/langref/relay_type.rst
index 3078682b0d40..8d725b7d97f5 100644
--- a/docs/langref/relay_type.rst
+++ b/docs/langref/relay_type.rst
@@ -2,8 +2,8 @@ Relay's Type System
 ===================
 
-We briefly introduced types while detailing Relay's expression language
-, but have not yet described its type system. Relay is
+We briefly introduced types while detailing Relay's expression language,
+but have not yet described its type system. Relay is
 a statically typed and type-inferred language, allowing programs to be
 fully typed while requiring just a few explicit type annotations.
 
diff --git a/python/tvm/relay/op/nn/nn.py b/python/tvm/relay/op/nn/nn.py
index 0acb656c99ac..f70b0072f7db 100644
--- a/python/tvm/relay/op/nn/nn.py
+++ b/python/tvm/relay/op/nn/nn.py
@@ -101,7 +101,7 @@ def conv2d_transpose(data,
                      kernel_layout="OIHW",
                      output_padding=(0, 0),
                      out_dtype=""):
-    """Two dimensional trnasposed convolution operator.
+    """Two dimensional transposed convolution operator.
 
     Parameters
     ----------
diff --git a/python/tvm/relay/op/transform.py b/python/tvm/relay/op/transform.py
index bc0a42d6ab30..454d335692c6 100644
--- a/python/tvm/relay/op/transform.py
+++ b/python/tvm/relay/op/transform.py
@@ -231,7 +231,7 @@ def full(fill_value, shape=(), dtype=""):
 
 
 def full_like(data, fill_value):
-    """Return an scalar value array with the same shape and type as the input array.
+    """Return a scalar value array with the same shape and type as the input array.
 
     Parameters
     ----------
@@ -288,7 +288,7 @@ def where(condition, x, y):
     return _make.where(condition, x, y)
 
 
 def broadcast_to(data, shape):
-    """Return an scalar value array with the same type, broadcast to
+    """Return a scalar value array with the same type, broadcast to
     the provided shape.
 
     Parameters
     ----------
@@ -307,7 +307,7 @@ def broadcast_to(data, shape):
     return _make.broadcast_to(data, shape)
 
 
 def broadcast_to_like(data, broadcast_type):
-    """Return an scalar value array with the same shape and type as the input array.
+    """Return a scalar value array with the same shape and type as the input array.
 
     Parameters
     ----------
@@ -326,7 +326,7 @@
 
 
 def collapse_sum_like(data, collapse_type):
-    """Return an scalar value array with the same shape and type as the input array.
+    """Return a scalar value array with the same shape and type as the input array.
 
     Parameters
     ----------
@@ -377,7 +377,7 @@ def split(data, indices_or_sections, axis=0):
 
 
 def strided_slice(data, begin, end, strides=None):
-    """Strided slice of an array..
+    """Strided slice of an array.
 
     Parameters
     ----------
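Side note on the `transform.py` docstrings fixed above: `full_like`, `broadcast_to`, and `strided_slice` are named after their NumPy counterparts, so a small NumPy sketch (an analogy only, not `tvm.relay` itself) illustrates the behaviors those docstrings describe:

```python
import numpy as np

data = np.arange(12).reshape(3, 4)

# full_like: an array of a single fill value, matching data's shape and dtype
filled = np.full_like(data, 7)

# broadcast_to: the same values expanded to a larger shape
row = np.array([1, 2, 3])
wide = np.broadcast_to(row, (2, 3))

# strided slice of an array: begin/end/strides applied per axis
sliced = data[0:3:2, 1:4:2]

print(filled.shape, wide.shape, sliced.tolist())
# (3, 4) (2, 3) [[1, 3], [9, 11]]
```

The `_like` variants in Relay take a reference tensor instead of an explicit shape, but the produced values follow the same pattern as the NumPy calls above.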