Fix Relay docs formatting and grammar (apache#2471)
weberlo authored and tqchen committed Jan 25, 2019
1 parent 859bda8 commit 6d6762f
Showing 5 changed files with 17 additions and 15 deletions.
10 changes: 6 additions & 4 deletions docs/langref/relay_expr.rst
@@ -11,7 +11,7 @@ Dataflow and Control Fragments
For the purposes of comparing Relay to traditional computational graph-based IRs, it
can be useful to consider Relay exrpessions in terms of dataflow and control fragments.
Each portion of a Relay program containing expressions that only affect the dataflow can
- be viewed as a traditional comptuation graph when writing and expressing transformations.
+ be viewed as a traditional computation graph when writing and expressing transformations.

The dataflow fragment covers the set of Relay expressions that do not involve
control flow. That is, any portion of a program containing only the following
@@ -31,8 +31,8 @@ fragment in Relay includes the following constructs:
- Recursive Calls in Functions

From the point of view of a computation graph, a function is a subgraph and a function call inlines the subgraph, substituting its arguments for the free variables in the subgraph with corresponding names.
- Thus if a function's body uses only dataflow constructs
- , a call to that function is in the dataflow fragment; conversely, if the
+ Thus, if a function's body uses only dataflow constructs,
+ a call to that function is in the dataflow fragment; conversely, if the
function's body contains control flow, a call to that function is not part of the dataflow fragment.
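As a rough illustration of the distinction described above (not part of this commit), the sketch below builds one function that stays in the dataflow fragment and one whose body uses an if-expression. It assumes the tvm.relay Python API; shapes and names are illustrative.

.. code-block:: python

    # Sketch: a dataflow-only function vs. one containing control flow.
    from tvm import relay

    x = relay.var("x", shape=(2, 2), dtype="float32")
    y = relay.var("y", shape=(2, 2), dtype="float32")

    # Dataflow fragment: only operators and calls -- reads as a plain computation graph.
    dataflow_fn = relay.Function([x, y], relay.add(relay.multiply(x, y), y))

    # Control fragment: the if-expression means a call to this function is no
    # longer part of the dataflow fragment.
    c = relay.var("c", shape=(), dtype="bool")
    control_fn = relay.Function([c, x, y],
                                relay.If(c, relay.add(x, y), relay.subtract(x, y)))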

Variables
@@ -205,6 +205,7 @@ For example, one can define a polymorphic identity function for
any Relay type as follows:

.. code-block:: python
+
    fn<t : Type>(%x : t) -> t {
        %x
    }
@@ -213,13 +214,14 @@ The below definition is also polymorphic, but restricts its
arguments to tensor types:

.. code-block:: python
+
    fn<s : Shape, bt : BaseType>(%x : Tensor[s, bt]) {
        %x
    }
Notice that the return type is omitted and will be inferred.
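For reference (not part of this commit), the polymorphic identity function above can also be built through the tvm.relay Python API, roughly as follows; the constructor names assume a reasonably recent TVM release.

.. code-block:: python

    # Sketch: the polymorphic identity function via the Python API.
    from tvm import relay

    t = relay.TypeVar("t")                    # the type parameter <t : Type>
    x = relay.var("x", type_annotation=t)     # %x : t
    identity = relay.Function([x], x, ret_type=t, type_params=[t])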

- *Note: :code:`where` syntax is not yet supported in the text format.*
+ *Note: "where" syntax is not yet supported in the text format.*

A function may also be subject to one or more type relations, such as in
the following:
6 changes: 3 additions & 3 deletions docs/langref/relay_op.rst
@@ -2,9 +2,9 @@ Relay Core Tensor Operators
===========================

This page contains the list of core tensor operator primitives pre-defined in tvm.relay.
- The core tensor operator primitives covers typical workloads in deep learning.
- They can represent workloads in front-end frameworks, and provide basic building blocks for optimization.
- Since deep learning is a fast evolving field and it is that possible to have operators that are not in here.
+ The core tensor operator primitives cover typical workloads in deep learning.
+ They can represent workloads in front-end frameworks and provide basic building blocks for optimization.
+ Since deep learning is a fast evolving field, it is possible to have operators that are not in here.


.. note::
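To make the "basic building blocks" point above concrete, here is a minimal sketch (not part of this commit) composing two core operators into a small workload; it assumes the tvm.relay Python API and illustrative shapes.

.. code-block:: python

    # Sketch: core operators compose into a typical front-end workload.
    from tvm import relay

    data = relay.var("data", shape=(1, 64), dtype="float32")
    weight = relay.var("weight", shape=(32, 64), dtype="float32")
    out = relay.nn.relu(relay.nn.dense(data, weight))
    func = relay.Function([data, weight], out)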
4 changes: 2 additions & 2 deletions docs/langref/relay_type.rst
@@ -2,8 +2,8 @@
Relay's Type System
===================

- We briefly introduced types while detailing Relay's expression language
- , but have not yet described its type system. Relay is
+ We briefly introduced types while detailing Relay's expression language,
+ but have not yet described its type system. Relay is
a statically typed and type-inferred language, allowing programs to
be fully typed while requiring just a few explicit type annotations.

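As a small sketch of the "type-inferred" claim above (not part of this commit): only the inputs carry annotations, yet the whole function gets a checked type. The pass-level entry points have moved between TVM releases; the form below assumes the current tvm.relay.transform API.

.. code-block:: python

    # Sketch: infer types for a function with no annotations on its result.
    import tvm
    from tvm import relay

    x = relay.var("x", shape=(3, 4), dtype="float32")
    y = relay.var("y", shape=(3, 4), dtype="float32")
    func = relay.Function([x, y], relay.add(x, y))

    mod = tvm.IRModule.from_expr(func)
    mod = relay.transform.InferType()(mod)
    print(mod["main"].checked_type)   # parameter and result types are filled in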
2 changes: 1 addition & 1 deletion python/tvm/relay/op/nn/nn.py
@@ -101,7 +101,7 @@ def conv2d_transpose(data,
kernel_layout="OIHW",
output_padding=(0, 0),
out_dtype=""):
"""Two dimensional trnasposed convolution operator.
"""Two dimensional transposed convolution operator.
Parameters
----------
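A usage sketch for the operator whose docstring is corrected above (not part of this commit); shapes assume the default NCHW/OIHW layouts and are illustrative, so check them against your TVM version.

.. code-block:: python

    # Sketch: calling relay.nn.conv2d_transpose with explicit output_padding.
    from tvm import relay

    data = relay.var("data", shape=(1, 8, 16, 16), dtype="float32")
    # Assumed transposed-convolution weight layout with input channels leading:
    # (8, 4, 3, 3) maps 8 input channels to 4 output channels.
    weight = relay.var("weight", shape=(8, 4, 3, 3), dtype="float32")
    out = relay.nn.conv2d_transpose(data, weight,
                                    channels=4, kernel_size=(3, 3),
                                    strides=(2, 2), padding=(1, 1),
                                    output_padding=(1, 1))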
10 changes: 5 additions & 5 deletions python/tvm/relay/op/transform.py
@@ -231,7 +231,7 @@ def full(fill_value, shape=(), dtype=""):


def full_like(data, fill_value):
"""Return an scalar value array with the same shape and type as the input array.
"""Return a scalar value array with the same shape and type as the input array.
Parameters
----------
@@ -288,7 +288,7 @@ def where(condition, x, y):
return _make.where(condition, x, y)

def broadcast_to(data, shape):
"""Return an scalar value array with the same type, broadcast to
"""Return a scalar value array with the same type, broadcast to
the provided shape.
Parameters
@@ -307,7 +307,7 @@ def broadcast_to(data, shape):
return _make.broadcast_to(data, shape)

def broadcast_to_like(data, broadcast_type):
"""Return an scalar value array with the same shape and type as the input array.
"""Return a scalar value array with the same shape and type as the input array.
Parameters
----------
@@ -326,7 +326,7 @@ def collapse_sum_like(data, collapse_type):


def collapse_sum_like(data, collapse_type):
"""Return an scalar value array with the same shape and type as the input array.
"""Return a scalar value array with the same shape and type as the input array.
Parameters
----------
@@ -377,7 +377,7 @@ def split(data, indices_or_sections, axis=0):


def strided_slice(data, begin, end, strides=None):
"""Strided slice of an array..
"""Strided slice of an array.
Parameters
----------
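A combined usage sketch (not part of this commit) for the transform operators whose docstrings are touched above; shapes and values are illustrative, and the calls assume the tvm.relay Python API.

.. code-block:: python

    # Sketch: exercising full_like, broadcast_to(_like), collapse_sum_like,
    # and strided_slice together.
    from tvm import relay

    x = relay.var("x", shape=(3, 4), dtype="float32")
    y = relay.var("y", shape=(2, 3, 4), dtype="float32")

    ones = relay.full_like(x, relay.const(1.0))      # x's shape/dtype, filled with 1.0
    xb = relay.broadcast_to(x, (2, 3, 4))            # broadcast to an explicit shape
    xlike = relay.broadcast_to_like(x, y)            # broadcast x to y's shape
    back = relay.collapse_sum_like(xb, x)            # sum (2, 3, 4) back down to (3, 4)
    piece = relay.strided_slice(x, begin=[0, 1], end=[3, 4], strides=[1, 2])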
