While the egg-like backend fully supports Associative-Commutative (and Distributive) rules (see also #8), the MatchCore backend is still heavily affected by non-terminating (infinite) rewriting loops.
When compiling theories for the MatchCore backend, a few improvements could be made:
- Identify (and disallow for MatchCore?) associative, commutative, and distributive rules, and other rules known to cause rewriting loops.
- Keep a history of the expressions visited by the classic fixpoint recursive rewriting algorithm during reduction. Halt, warn, and return the current expression when a loop is detected. (A vector of hashes suffices for storing the history, but is vulnerable to collisions.)
- Consider a "fuel" mechanism, used for example in GHC. At the beginning of reduction, each rule is assigned a "fuel" counter based on the input expression. When the left-hand side of a rule matches repeatedly, decrement its fuel. If a rule runs out of fuel (e.g. because of a loop due to commutativity or associativity), temporarily remove it from the theory and recompile the pattern-matching block; after the expression changes, reintroduce the rule (in which position??) and reset its fuel counter.
- Consider designing a smarter algorithm that combines both fuel and reduction history.
- (Hardest point) Consider compile-time transformations and rule reordering for the MatchCore backend when a set of rewrite rules can be inferred to be terminating a priori.
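To illustrate the first point, a crude way to flag loop-prone rules is to detect rules whose right-hand side is just the left-hand side with its arguments reordered (commutativity-style rules). The sketch below is a minimal Python illustration with an invented term representation (tuples of operator plus arguments); the actual backend works on Julia expressions, and this check would not catch associativity, which only loops in combination with its inverse rule:

```python
# Hypothetical detector for "permutative" rules such as (+ x y) -> (+ y x),
# which loop forever under naive fixpoint rewriting. The tuple-based term
# representation is illustrative, not the backend's real one.

def is_permutative(lhs, rhs):
    """True when rhs is lhs with the same operator and the same multiset
    of arguments, only in a different order (e.g. commutativity)."""
    return (isinstance(lhs, tuple) and isinstance(rhs, tuple)
            and len(lhs) == len(rhs)
            and lhs[0] == rhs[0]                              # same operator
            and sorted(map(repr, lhs[1:])) == sorted(map(repr, rhs[1:]))
            and lhs != rhs)                                   # but reordered
```

A theory compiler could run such a check over every rule and either reject the rule for the MatchCore backend or route it to the egg-like backend, which handles these rules natively.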
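The history-based idea can be sketched as follows. This is a minimal Python model, not the real backend: terms are ground tuples, `rewrite_step` uses exact equality instead of compiled pattern matching, and the hash-set history stands in for the "vector of hashes" (so hash collisions could cause false loop detections):

```python
# Sketch of fixpoint rewriting with loop detection via a history of
# hashes of visited expressions. All names are illustrative.
import warnings

def rewrite_step(expr, rules):
    """Apply the first matching rule once; return expr unchanged if none match."""
    for lhs, rhs in rules:
        if expr == lhs:
            return rhs
    return expr

def reduce_fixpoint(expr, rules):
    history = {hash(expr)}            # compact, but vulnerable to collisions
    while True:
        new = rewrite_step(expr, rules)
        if new == expr:
            return expr               # fixpoint reached: done
        if hash(new) in history:
            warnings.warn(f"rewriting loop detected at {new!r}")
            return new                # halt, warn, return current expression
        history.add(hash(new))
        expr = new

# A commutativity-style rule pair that loops: a+b -> b+a -> a+b -> ...
loop_rules = [(("+", "a", "b"), ("+", "b", "a")),
              (("+", "b", "a"), ("+", "a", "b"))]
```

With `loop_rules`, reduction fires one step, detects the revisit on the second step, warns, and returns instead of diverging; a terminating theory still reduces to its normal form.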
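The fuel mechanism can likewise be sketched in miniature. Again this is an illustrative Python model (ground terms, equality matching, no recompilation step); one possible answer to the "in which position??" question is tried here by appending reintroduced rules at the end of the priority order, which lets lower-priority rules fire while a looping rule is out of fuel:

```python
# Sketch of the "fuel" mechanism: each rule gets a fuel counter, firing
# decrements it, an exhausted rule is temporarily disabled (the real
# backend would recompile the pattern-matching block here), and disabled
# rules are reintroduced with fresh fuel once the expression changes.
# All names and the reintroduction position are illustrative choices.

def reduce_with_fuel(expr, rules, initial_fuel=2, max_steps=100):
    fuel = [initial_fuel] * len(rules)
    active = list(range(len(rules)))      # rule priority order
    snapshot = {}                         # rule index -> expr when disabled
    for _ in range(max_steps):
        # Reintroduce disabled rules once the expression has changed
        # since they were disabled; append at the end of the order.
        for i, old in list(snapshot.items()):
            if expr != old:
                active.append(i)
                fuel[i] = initial_fuel
                del snapshot[i]
        for i in active:
            lhs, rhs = rules[i]
            if expr == lhs:
                fuel[i] -= 1
                if fuel[i] == 0:
                    active.remove(i)      # out of fuel: disable temporarily
                    snapshot[i] = rhs     # remember where it was disabled
                expr = rhs
                break
        else:
            return expr                   # no active rule matched: done
    return expr                           # step budget exhausted
```

With a commutative loop plus a lower-priority simplification rule on the same left-hand side, the looping rules burn out, the simplification gets its turn, and reduction finishes; note that fuel alone does not guarantee termination (a pure loop still cycles between refuelings), which is why the next point suggests combining fuel with reduction history.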