Fusing a tensor.collapse_shape (tensor.expand_shape) op with a
consumer (producer) elementwise linalg.generic operation results in
the creation of tensor.expand_shape ops. In purely dynamic cases this
can end up expanding a dynamic dimension into more than one
dynamic dimension, which is disallowed by the semantics of the
tensor.expand_shape operation. (The transformation itself is
correct; the issue is a gap in the specification of
tensor.expand_shape.) So disallow fusions which result in such a pattern.
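
For illustration, a minimal sketch (not taken from this patch) of the kind of reshape the fusion would need to create: a single dynamic dimension split into two dynamic dimensions, which tensor.expand_shape cannot express because the split sizes cannot be inferred.

```mlir
// Hypothetical IR sketch: splitting one dynamic dimension into two dynamic
// dimensions. tensor.expand_shape disallows this, since there is no way to
// infer how the single dynamic extent should be distributed across the two
// expanded dimensions.
%0 = tensor.expand_shape %arg0 [[0, 1]]
    : tensor<?xf32> into tensor<?x?xf32>
```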
Details
Diff Detail
- Repository
- rG LLVM Github Monorepo
Event Timeline
| mlir/lib/Dialect/Linalg/Transforms/ElementwiseOpFusion.cpp | |
|---|---|
| 628–632 | Would the op be valid in this case? Should we have a test for this case if it is possible? |
| mlir/lib/Dialect/Linalg/Transforms/ElementwiseOpFusion.cpp | |
|---|---|
| 628–632 | The reshape op that would be generated is not valid (due to shape-propagation related concerns, though the transformation is not actually illegal), but the transformation itself is still valid. The test case below checks that the fusion does not happen and that the (currently) illegal reshape op is not created. |