Coarsening -> Simplification
tbennun committed Jan 6, 2022
1 parent 51d83e2 commit 5e5815b
Showing 117 changed files with 344 additions and 341 deletions.
16 changes: 8 additions & 8 deletions .github/workflows/general-ci.yml
@@ -12,7 +12,7 @@ jobs:
strategy:
matrix:
python-version: [3.7,'3.10']
-coarsening: [0,1,autoopt]
+simplify: [0,1,autoopt]

steps:
- uses: actions/checkout@v2
@@ -43,12 +43,12 @@ jobs:
export DACE_testing_serialization=1
export DACE_cache=unique
export DACE_optimizer_interface=" "
-if [ "${{ matrix.coarsening }}" = "autoopt" ]; then
-export DACE_optimizer_automatic_dataflow_coarsening=1
+if [ "${{ matrix.simplify }}" = "autoopt" ]; then
+export DACE_optimizer_automatic_simplification=1
export DACE_optimizer_autooptimize=1
echo "Auto-optimization heuristics"
else
-export DACE_optimizer_automatic_dataflow_coarsening=${{ matrix.coarsening }}
+export DACE_optimizer_automatic_simplification=${{ matrix.simplify }}
fi
pytest -n auto --cov-report=xml --cov=dace --tb=short -m "not gpu and not verilator and not tensorflow and not mkl and not sve and not papi and not mlir and not lapack and not fpga and not mpi"
codecov
@@ -59,12 +59,12 @@ jobs:
export DACE_testing_serialization=1
export DACE_cache=unique
export DACE_optimizer_interface=" "
-if [ "${{ matrix.coarsening }}" = "autoopt" ]; then
-export DACE_optimizer_automatic_dataflow_coarsening=1
+if [ "${{ matrix.simplify }}" = "autoopt" ]; then
+export DACE_optimizer_automatic_simplification=1
export DACE_optimizer_autooptimize=1
echo "Auto-optimization heuristics"
else
-export DACE_optimizer_automatic_dataflow_coarsening=${{ matrix.coarsening }}
+export DACE_optimizer_automatic_simplification=${{ matrix.simplify }}
fi
pytest -n 1 --cov-report=xml --cov=dace --tb=short -m "lapack"
codecov
@@ -74,7 +74,7 @@ jobs:
export NOSTATUSBAR=1
export DACE_testing_serialization=1
export DACE_cache=single
-export DACE_optimizer_automatic_dataflow_coarsening=${{ matrix.coarsening }}
+export DACE_optimizer_automatic_simplification=${{ matrix.simplify }}
export PYTHON_BINARY="coverage run --source=dace --parallel-mode"
./tests/polybench_test.sh
./tests/xform_test.sh
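The matrix above drives which simplification mode each CI leg tests: `0`, `1`, or the combined `autoopt` mode that also enables the auto-optimization heuristics. A minimal sketch (not part of this commit) of reproducing the `autoopt` leg locally, assuming pytest with the xdist and cov plugins used by the workflow is installed; the environment variables are set before DaCe is imported, mirroring the `export` statements above:

```python
import os

# Mirror the "simplify: autoopt" matrix leg of general-ci.yml.
os.environ["DACE_testing_serialization"] = "1"
os.environ["DACE_cache"] = "unique"
os.environ["DACE_optimizer_interface"] = " "
os.environ["DACE_optimizer_automatic_simplification"] = "1"
os.environ["DACE_optimizer_autooptimize"] = "1"

import pytest

# Same marker expression as the CPU test step above; "-n auto" needs pytest-xdist.
pytest.main([
    "-n", "auto", "--tb=short",
    "-m", ("not gpu and not verilator and not tensorflow and not mkl and not sve "
           "and not papi and not mlir and not lapack and not fpga and not mpi"),
])
```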
2 changes: 1 addition & 1 deletion README.md
@@ -110,7 +110,7 @@ FPGA programming:
SDFG interactive transformation:
* `DACE_optimizer_transform_on_call` (default: False): Uses the transformation command line interface every time a `@dace` function is called.
* `DACE_optimizer_interface` (default: `dace.transformation.optimizer.SDFGOptimizer`): Controls the SDFG optimization process if `transform_on_call` is enabled. By default, uses the transformation command line interface.
-* `DACE_optimizer_automatic_dataflow_coarsening` (default: True): If False, skips automatic dataflow coarsening in the Python frontend (see transformations tutorial for more information).
+* `DACE_optimizer_automatic_simplification` (default: True): If False, skips automatic simplification in the Python frontend (see transformations tutorial for more information).

Profiling:
* `DACE_profiling` (default: False): Enables profiling measurement of the DaCe program runtime in milliseconds. Produces a log file and prints out median runtime.
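Like the other entries listed in this README section, the renamed flag can be toggled through its environment variable without touching `.dace.conf`. A small sketch (the program itself is hypothetical) that disables automatic simplification process-wide:

```python
import os

# Set before DaCe reads its configuration (i.e., before the import below).
os.environ["DACE_optimizer_automatic_simplification"] = "0"

import dace


@dace.program
def add_one(a: dace.float64[10], b: dace.float64[10]):
    b[:] = a + 1


# Parsed without the automatic simplification pass; it can still be
# applied explicitly later via sdfg.simplify().
sdfg = add_one.to_sdfg()
```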
6 changes: 3 additions & 3 deletions dace/config_schema.yml
@@ -86,12 +86,12 @@ required:
title: Save intermediate SDFGs
description: Save SDFG files after every transformation.

-automatic_dataflow_coarsening:
+automatic_simplification:
type: bool
default: true
-title: Automatic dataflow coarsening
+title: Automatic SDFG simplification
description: >
-Automatically performs dataflow coarsening on programs.
+Automatically performs SDFG simplification on programs.
detect_control_flow:
type: bool
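The renamed schema entry keeps its category, so it is addressed as `('optimizer', 'automatic_simplification')` from Python. A short sketch of querying and overriding it via `dace.Config` (`Config.get_bool` appears in this commit's `parser.py` changes; the `Config.set` call below is an assumption about the configuration API's setter):

```python
from dace.config import Config

# Read the current value of the renamed entry.
enabled = Config.get_bool('optimizer', 'automatic_simplification')
print('automatic simplification:', enabled)

# Assumed setter: override the entry for the current process only.
Config.set('optimizer', 'automatic_simplification', value=False)
```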
6 changes: 3 additions & 3 deletions dace/frontend/python/README.md
@@ -44,9 +44,9 @@ There is also upcoming support for NumPy ufuncs. You may preview ufunc support w

## Known Issues

-### Issues when automatic dataflow coarsening is enabled
+### Issues when automatic simplification is enabled

-When automatic dataflow coarsening is enabled, SDFGs created using the
+When automatic simplification is enabled, SDFGs created using the
Python-Frontend are automatically transformed using:
- InlineSDFG
- EndStateElimination
@@ -67,4 +67,4 @@ ranges, leading to RW/WR/WW dependencies, InlineSDFG and StateFusion may violate
- When there are sequential dependencies between statements due to updating a loop variable,
StateFusion may erroneously lead to concurrent execution of those statements (see [#315](https://github.com/spcl/dace/issues/315)).

-Temporary workaround: Disable the automatic dataflow coarsening flag in the configuration file `.dace.conf`.
+Temporary workaround: Disable the automatic simplification pass flag in the configuration file `.dace.conf`.
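Besides flipping the flag in `.dace.conf`, the pass can be skipped for a single problematic program and applied manually once the SDFG is known to be safe, which is the same pattern this commit uses in the BLAS library nodes. A brief sketch with a hypothetical program:

```python
import dace


@dace.program
def saxpy(a: dace.float64, x: dace.float64[100], y: dace.float64[100]):
    y[:] = a * x + y


# Parse without the automatic simplification pass to avoid the
# InlineSDFG/StateFusion issues listed above for this one program.
sdfg = saxpy.to_sdfg(simplify=False)

# Optionally run the pass explicitly afterwards.
sdfg.simplify()
```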
14 changes: 7 additions & 7 deletions dace/frontend/python/newast.py
@@ -138,14 +138,14 @@ def parse_dace_program(name: str,
argtypes: Dict[str, data.Data],
constants: Dict[str, Any],
closure: SDFGClosure,
-coarsen: Optional[bool] = None,
+simplify: Optional[bool] = None,
save=True) -> SDFG:
""" Parses a `@dace.program` function into an SDFG.
:param src_ast: The AST of the Python program to parse.
:param visitor: A ProgramVisitor object returned from
``preprocess_dace_program``.
:param closure: An object that contains the @dace.program closure.
-:param coarsen: If True, dataflow coarsening will be performed.
+:param simplify: If True, simplification pass will be performed.
:param save: If True, saves source mapping data for this SDFG.
:return: A 2-tuple of SDFG and its reduced (used) closure.
"""
@@ -158,7 +158,7 @@ def parse_dace_program(name: str,
scope_arrays=argtypes,
scope_vars={},
closure=closure,
-coarsen=coarsen)
+simplify=simplify)

sdfg, _, _, _ = visitor.parse_program(preprocessed_ast.preprocessed_ast.body[0])
sdfg.set_sourcecode(preprocessed_ast.src, 'python')
@@ -959,7 +959,7 @@ def __init__(self,
closure: SDFGClosure = None,
nested: bool = False,
tmp_idx: int = 0,
-coarsen: Optional[bool] = None):
+simplify: Optional[bool] = None):
""" ProgramVisitor init method
Arguments:
@@ -972,7 +972,7 @@ def __init__(self,
scope_arrays {Dict[str, data.Data]} -- Scope arrays
scope_vars {Dict[str, str]} -- Scope variables
closure {SDFGClosure} -- The closure of this program
-coarsen {bool} -- Whether to apply dataflow coarsening after parsing nested dace programs
+simplify {bool} -- Whether to apply simplification pass after parsing nested dace programs
Keyword Arguments:
nested {bool} -- True, if SDFG is nested (default: {False})
@@ -991,7 +991,7 @@ def __init__(self,
self.globals = global_vars
self.closure = closure
self.nested = nested
-self.coarsen = coarsen
+self.simplify = simplify

# Keeps track of scope arrays, numbers, variables and accesses
self.scope_arrays = OrderedDict()
@@ -3248,7 +3248,7 @@ def _parse_sdfg_call(self, funcname: str, func: Union[SDFG, SDFGConvertible], no

if isinstance(fcopy, DaceProgram):
fcopy.signature = copy.deepcopy(func.signature)
-sdfg = fcopy.to_sdfg(*fargs, **fkwargs, coarsen=self.coarsen, save=False)
+sdfg = fcopy.to_sdfg(*fargs, **fkwargs, simplify=self.simplify, save=False)
else:
sdfg = fcopy.__sdfg__(*fargs, **fkwargs)

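As the `fcopy.to_sdfg(..., simplify=self.simplify, ...)` call above shows, the visitor forwards its `simplify` setting when it parses nested `@dace.program` calls, so one top-level choice covers the whole call tree. A small sketch of that behavior (the programs are hypothetical):

```python
import dace


@dace.program
def inner(x: dace.float64[20]):
    x[:] = x + 1


@dace.program
def outer(x: dace.float64[20]):
    inner(x)
    x[:] = x * 2


# simplify=False is forwarded to the parsing of `inner` as well, so the
# nested SDFG is parsed with the same setting as the outer one.
sdfg = outer.to_sdfg(simplify=False)
```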
35 changes: 20 additions & 15 deletions dace/frontend/python/parser.py
@@ -181,7 +181,7 @@ def _auto_optimize(self, sdfg: SDFG, symbols: Dict[str, int] = None) -> SDFG:
from dace.transformation.auto import auto_optimize as autoopt
return autoopt.auto_optimize(sdfg, self.device, symbols=symbols)

-def to_sdfg(self, *args, coarsen=None, save=False, validate=False, use_cache=False, **kwargs) -> SDFG:
+def to_sdfg(self, *args, simplify=None, save=False, validate=False, use_cache=False, **kwargs) -> SDFG:
""" Parses the DaCe function into an SDFG. """
if use_cache:
# Update global variables with current closure
@@ -204,7 +204,7 @@ def to_sdfg(self, *args, coarsen=None, save=False, validate=False, use_cache=Fal
entry = self._cache.get(cachekey)
return entry.sdfg

-sdfg = self._parse(args, kwargs, coarsen=coarsen, save=save, validate=validate)
+sdfg = self._parse(args, kwargs, simplify=simplify, save=save, validate=validate)

if use_cache:
# Add to cache
@@ -213,11 +213,11 @@ def to_sdfg(self, *args, coarsen=None, save=False, validate=False, use_cache=Fal
return sdfg

def __sdfg__(self, *args, **kwargs) -> SDFG:
-return self._parse(args, kwargs, coarsen=None, save=False, validate=False)
+return self._parse(args, kwargs, simplify=None, save=False, validate=False)

-def compile(self, *args, coarsen=None, save=False, **kwargs):
+def compile(self, *args, simplify=None, save=False, **kwargs):
""" Convenience function that parses and compiles a DaCe program. """
-sdfg = self._parse(args, kwargs, coarsen=coarsen, save=save)
+sdfg = self._parse(args, kwargs, simplify=simplify, save=save)

# Invoke auto-optimization as necessary
if Config.get_bool('optimizer', 'autooptimize') or self.auto_optimize:
@@ -383,15 +383,15 @@ def __call__(self, *args, **kwargs):

return result

-def _parse(self, args, kwargs, coarsen=None, save=False, validate=False) -> SDFG:
+def _parse(self, args, kwargs, simplify=None, save=False, validate=False) -> SDFG:
"""
Try to parse a DaceProgram object and return the `dace.SDFG` object
that corresponds to it.
:param function: DaceProgram object (obtained from the ``@dace.program``
decorator).
:param args: The given arguments to the function.
:param kwargs: The given keyword arguments to the function.
-:param coarsen: Whether to apply dataflow coarsening or not (None
+:param simplify: Whether to apply simplification pass or not (None
uses configuration-defined value).
:param save: If True, saves the generated SDFG to
``_dacegraphs/program.sdfg`` after parsing.
@@ -403,18 +403,18 @@ def _parse(self, args, kwargs, coarsen=None, save=False, validate=False) -> SDFG
from dace.transformation import helpers as xfh

# Obtain DaCe program as SDFG
-sdfg, cached = self._generate_pdp(args, kwargs, coarsen=coarsen)
+sdfg, cached = self._generate_pdp(args, kwargs, simplify=simplify)

-# Apply dataflow coarsening automatically
-if not cached and (coarsen == True or
-(coarsen is None and Config.get_bool('optimizer', 'automatic_dataflow_coarsening'))):
+# Apply simplification pass automatically
+if not cached and (simplify == True or
+(simplify is None and Config.get_bool('optimizer', 'automatic_simplification'))):

# Promote scalars to symbols as necessary
promoted = scal2sym.promote_scalars_to_symbols(sdfg)
if Config.get_bool('debugprint') and len(promoted) > 0:
print('Promoted scalars {%s} to symbols.' % ', '.join(p for p in sorted(promoted)))

-sdfg.coarsen_dataflow()
+sdfg.simplify()

# Split back edges with assignments and conditions to allow richer
# control flow detection in code generation
@@ -619,11 +619,11 @@ def load_precompiled_sdfg(self, path: str, *args, **kwargs) -> None:
# Update SDFG cache with the SDFG and compiled version
self._cache.add(cachekey, csdfg.sdfg, csdfg)

-def _generate_pdp(self, args, kwargs, coarsen=None) -> SDFG:
+def _generate_pdp(self, args, kwargs, simplify=None) -> SDFG:
""" Generates the parsed AST representation of a DaCe program.
:param args: The given arguments to the program.
:param kwargs: The given keyword arguments to the program.
-:param coarsen: Whether to apply dataflow coarsening when parsing
+:param simplify: Whether to apply simplification pass when parsing
nested dace programs.
:return: A 2-tuple of (parsed SDFG object, was the SDFG retrieved
from cache).
@@ -690,7 +690,7 @@ def _generate_pdp(self, args, kwargs, coarsen=None) -> SDFG:
cached = True
else:
cached = False
-sdfg = newast.parse_dace_program(self.name, parsed_ast, argtypes, self.dec_kwargs, closure, coarsen=coarsen)
+sdfg = newast.parse_dace_program(self.name,
+parsed_ast,
+argtypes,
+self.dec_kwargs,
+closure,
+simplify=simplify)

# Set SDFG argument names, filtering out constants
sdfg.arg_names = [a for a in self.argnames if a in argtypes]
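The condition in `_parse` above gives the `simplify` argument three-state semantics: an explicit `True` or `False` wins, and `None` (the default) defers to the `automatic_simplification` configuration entry. A sketch of that resolution logic in isolation:

```python
from typing import Optional

from dace.config import Config


def should_simplify(simplify: Optional[bool]) -> bool:
    """Mirror the check in DaceProgram._parse: explicit values win,
    None falls back to the configuration."""
    return simplify is True or (
        simplify is None and Config.get_bool('optimizer', 'automatic_simplification'))


# should_simplify(True)  -> True   (forced on, regardless of configuration)
# should_simplify(False) -> False  (forced off)
# should_simplify(None)  -> whatever .dace.conf / the DACE_* environment says
```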
2 changes: 1 addition & 1 deletion dace/frontend/tensorflow/tensorflow.py
@@ -329,7 +329,7 @@ def train(

print("Adding connectors")
self.graph.fill_scope_connectors()
-# self.graph.coarsen_dataflow(validate=False)
+# self.graph.simplify(validate=False)
if gpu:
self.graph.apply_gpu_transformations()

6 changes: 3 additions & 3 deletions dace/libraries/blas/nodes/gemv.py
@@ -832,10 +832,10 @@ def _gemTv_pblas(_A: dtype[m, n], _x: dtype[m], _y: dtype[n]):
# in ValueError: Node type "BlockCyclicScatter" not supported for
# promotion
if transA:
-sdfg = _gemTv_pblas.to_sdfg(coarsen=False)
+sdfg = _gemTv_pblas.to_sdfg(simplify=False)
else:
-sdfg = _gemNv_pblas.to_sdfg(coarsen=False)
-sdfg.coarsen_dataflow()
+sdfg = _gemNv_pblas.to_sdfg(simplify=False)
+sdfg.simplify()
return sdfg


2 changes: 1 addition & 1 deletion dace/sdfg/nodes.py
@@ -489,7 +489,7 @@ class NestedSDFG(CodeNode):

no_inline = Property(dtype=bool,
desc="If True, this nested SDFG will not be inlined during "
-"dataflow coarsening",
+"simplification",
default=False)

unique_name = Property(dtype=str, desc="Unique name of the SDFG", default="")
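The reworded `no_inline` property is the per-node escape hatch: a nested SDFG marked this way is left in place when simplification would otherwise inline it. A short sketch that protects every nested SDFG in a graph (the traversal uses standard SDFG/state iteration; `some_program` is a hypothetical name):

```python
import dace
from dace.sdfg import nodes


def protect_nested_sdfgs(sdfg: dace.SDFG) -> None:
    """Mark all nested SDFGs so that simplification does not inline them."""
    for state in sdfg.nodes():
        for node in state.nodes():
            if isinstance(node, nodes.NestedSDFG):
                node.no_inline = True


# Usage sketch:
#   sdfg = some_program.to_sdfg(simplify=False)
#   protect_nested_sdfgs(sdfg)
#   sdfg.simplify()  # nested SDFGs stay as separate nodes
```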
15 changes: 7 additions & 8 deletions dace/sdfg/sdfg.py
@@ -1896,18 +1896,17 @@ def is_valid(self) -> bool:

def apply_strict_transformations(self, validate=True, validate_all=False):
"""
-This method is DEPRECATED in favor of ``coarsen_dataflow``.
+This method is DEPRECATED in favor of ``simplify``.
Applies safe transformations (that will surely increase the
performance) on the SDFG. For example, this fuses redundant states
(safely) and removes redundant arrays.
B{Note:} This is an in-place operation on the SDFG.
"""
-warnings.warn('SDFG.apply_strict_transformations is deprecated, use SDFG.coarsen_dataflow instead.',
-DeprecationWarning)
-return self.coarsen_dataflow(validate, validate_all)
+warnings.warn('SDFG.apply_strict_transformations is deprecated, use SDFG.simplify instead.', DeprecationWarning)
+return self.simplify(validate, validate_all)

-def coarsen_dataflow(self, validate=True, validate_all=False):
+def simplify(self, validate=True, validate_all=False):
""" Applies safe transformations (that will surely increase the
performance) on the SDFG. For example, this fuses redundant states
(safely) and removes redundant arrays.
@@ -1919,7 +1918,7 @@ def coarsen_dataflow(self, validate=True, validate_all=False):
from dace.transformation.dataflow import RedundantReadSlice, RedundantWriteSlice
from dace.sdfg import utils as sdutil
# This is imported here to avoid an import loop
-from dace.transformation.transformation import coarsening_transformations
+from dace.transformation.transformation import simplification_transformations

# First step is to apply multi-state inline, before any state fusion can
# occur
@@ -1930,7 +1929,7 @@
validate=validate,
permissive=False,
validate_all=validate_all)
-self.apply_transformations_repeated(coarsening_transformations(),
+self.apply_transformations_repeated(simplification_transformations(),
validate=validate,
permissive=False,
validate_all=validate_all)
@@ -2141,7 +2140,7 @@ def apply_gpu_transformations(self, states=None, validate=True, validate_all=Fal
generate GPU code.
:note: It is recommended to apply redundant array removal
transformation after this transformation. Alternatively,
-you can coarsen_dataflow() after this transformation.
+you can simplify() after this transformation.
:note: This is an in-place operation on the SDFG.
"""
# Avoiding import loops
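For callers, the migration is a rename: `sdfg.apply_strict_transformations()` now warns and forwards to `sdfg.simplify()`. Internally, the method boils down to running the registered simplification set to a fixpoint, as the `apply_transformations_repeated` call above shows. A rough sketch of that core step, omitting the multi-state inlining and redundant-slice handling the real method also performs:

```python
import dace
from dace.transformation import simplification_transformations


def apply_simplification_set(sdfg: dace.SDFG) -> None:
    """Apply the registered simplification transformations until none match."""
    sdfg.apply_transformations_repeated(simplification_transformations(),
                                        validate=True,
                                        permissive=False,
                                        validate_all=False)
```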
2 changes: 1 addition & 1 deletion dace/transformation/__init__.py
@@ -1,2 +1,2 @@
-from .transformation import (coarsening_transformations, SingleStateTransformation, MultiStateTransformation,
+from .transformation import (simplification_transformations, SingleStateTransformation, MultiStateTransformation,
SubgraphTransformation, ExpandTransformation)