
[mlir][Linalg] Rethink fusion of linalg ops with reshape ops.

Authored by mravishankar on Oct 14 2020, 1:34 PM.

Description

[mlir][Linalg] Rethink fusion of linalg ops with reshape ops.

The current fusion on tensors fuses reshape ops with generic ops by
linearizing the indexing maps of the fused tensor in the generic
op. This has some limitations:

  • It only works for static shapes
  • The resulting indexing map contains a linearization that could prevent fusion later on (for example, tile + fuse).
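To make the first limitation concrete, here is a minimal Python sketch (not code from the patch) of what linearization does to an indexing map when a reshape collapsing dims (d1, d2) is fused into a generic op. The `inner_size` stride must be a compile-time constant, which is why this only works for static shapes, and the resulting map is no longer a simple permutation of dims.

```python
# Hypothetical sketch: fusing a collapsing reshape into a generic op by
# linearizing the indexing map. Collapsing dims (d1, d2) of a tensor
# whose innermost collapsed extent is `inner_size` turns the access
# (d0, d1, d2) into (d0, d1 * inner_size + d2).
def linearized_map(d0, d1, d2, inner_size=16):
    # The stride `inner_size` must be statically known; with dynamic
    # shapes there is no constant to fold into the affine map.
    return (d0, d1 * inner_size + d2)

# The linearized expression d1 * 16 + d2 is not a permutation of the
# iteration dims, which is what gets in the way of later tile + fuse.
assert linearized_map(1, 2, 3) == (1, 2 * 16 + 3)
```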

Instead, try to fuse the reshape consumer (producer) with the generic op
producer (consumer) by expanding the dimensionality of the generic op
when the reshape is expanding (folding). Since this approach conflicts
with the linearization approach, the expansion method is used in place
of the linearization method.
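The expansion approach can be sketched as follows (an illustrative Python model, not code from the patch): each iteration dim of the generic op is replaced by the group of dims the reshape's reassociation assigns to it, so the rewritten indexing maps stay projections/permutations of the expanded iteration space instead of acquiring linearized expressions.

```python
# Hypothetical sketch of the expansion approach: expand the dims
# referenced by an identity-like indexing map according to the
# reshape's reassociation, keeping the map a permutation/projection.
def expand_indexing_map(result_dims, reassociation):
    # result_dims: dims referenced by the map, e.g. [0, 1] for
    #   (d0, d1) -> (d0, d1).
    # reassociation: original dim -> its expanded dims, e.g.
    #   {0: [0], 1: [1, 2]} when d1 is expanded into (d1, d2).
    expanded = []
    for d in result_dims:
        expanded.extend(reassociation[d])
    return expanded

# (d0, d1) -> (d0, d1) becomes (d0, d1, d2) -> (d0, d1, d2):
# still a permutation, so it does not block later tile + fuse,
# and no static sizes are needed.
assert expand_indexing_map([0, 1], {0: [0], 1: [1, 2]}) == [0, 1, 2]
```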

A further refactoring changes the fusion on tensors into a collection
of patterns.

Differential Revision: https://reviews.llvm.org/D89002

Details

Committed
mravishankar, Oct 14 2020, 1:50 PM
Differential Revision
D89002: [mlir][Linalg] Rethink fusion of linalg ops with reshape ops.
Parents
rG633f9fcb820b: Make header self-contained. NFC.