
[MLIR][TOSA] Fix converting tosa.clamp and tosa.relu to linalg
ClosedPublic

Authored by jungpark-mlir on Jun 27 2022, 2:43 AM.

Details

Summary

The TOSA-to-Linalg conversion crashes when the input tensor has a floating-point element type other than fp32.
tosa.clamp and tosa.reluN carry fp32 min/max attributes, and the lowering created the arith.constant bounds with the attribute's type instead of the operand's element type.
This commit fixes the crash by creating the float constants with the element type of the input tensor.
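
To make the failure mode concrete, here is a hedged sketch (not taken from the patch; op spellings circa mid-2022) of the post-fix lowering for a clamp on an f16 tensor. Before the fix, the two bound constants were materialized as f32, which does not type-check against the f16 values inside the linalg.generic body:

```mlir
#map = affine_map<(d0) -> (d0)>
func.func @clamp_f16_lowered(%arg0: tensor<4xf16>) -> tensor<4xf16> {
  %init = linalg.init_tensor [4] : tensor<4xf16>
  %0 = linalg.generic {indexing_maps = [#map, #map],
                       iterator_types = ["parallel"]}
      ins(%arg0 : tensor<4xf16>) outs(%init : tensor<4xf16>) {
  ^bb0(%in: f16, %out: f16):
    // Before the fix these constants were created as f32 (the attribute
    // type); now they take the input element type.
    %min = arith.constant 0.0 : f16
    %max = arith.constant 6.0 : f16
    %1 = arith.maxf %in, %min : f16
    %2 = arith.minf %1, %max : f16
    linalg.yield %2 : f16
  } -> tensor<4xf16>
  return %0 : tensor<4xf16>
}
```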

Diff Detail

Event Timeline

jungpark-mlir created this revision.Jun 27 2022, 2:43 AM
Herald added a project: Restricted Project.
jungpark-mlir requested review of this revision.Jun 27 2022, 2:43 AM

In "things I'm just now noticing now that we're trying to send this upstream", should we add a test?

This doesn't directly impact this change, but in the TOSA specification, we've removed the ReluN operator, since it's trivially implementable with Clamp. I will look at removing ReluN from the dialect to line up with the specification.
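
To illustrate that equivalence (a hypothetical example; attribute names as in the dialect at the time), a reluN is just a clamp whose lower bound is zero:

```mlir
// The two results are equal element-wise: reluN(x) clamps to [0, N], here N = 6.
func.func @relu_vs_clamp(%x: tensor<4xf32>) -> (tensor<4xf32>, tensor<4xf32>) {
  %relu = "tosa.reluN"(%x) {max_int = 6 : i64, max_fp = 6.0 : f32}
          : (tensor<4xf32>) -> tensor<4xf32>
  %clamp = "tosa.clamp"(%x) {min_int = 0 : i64, max_int = 6 : i64,
                             min_fp = 0.0 : f32, max_fp = 6.0 : f32}
          : (tensor<4xf32>) -> tensor<4xf32>
  return %relu, %clamp : tensor<4xf32>, tensor<4xf32>
}
```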

In "things I'm just now noticing now that we're trying to send this upstream", should we add a test?

That SGTM

Add test clamp f16
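
Such a regression test might look like the following (a hedged sketch, not the exact test or CHECK lines from the diff; the RUN invocation is approximate):

```mlir
// RUN: mlir-opt --split-input-file -pass-pipeline="func.func(tosa-to-linalg)" %s | FileCheck %s

// CHECK-LABEL: @clamp_f16
func.func @clamp_f16(%arg0: tensor<4xf16>) -> tensor<4xf16> {
  // CHECK: linalg.generic
  // CHECK: arith.constant {{.*}} : f16
  %0 = "tosa.clamp"(%arg0) {min_int = 0 : i64, max_int = 255 : i64,
                            min_fp = 0.0 : f32, max_fp = 6.0 : f32}
       : (tensor<4xf16>) -> tensor<4xf16>
  return %0 : tensor<4xf16>
}
```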

@jpienaar Would you be willing to approve this, or are you not the right reviewer for it?

@eric-k256 Would you be willing to approve here?

eric-k256 accepted this revision.Jul 7 2022, 11:17 AM

It looks good to me, although I don't have commit rights, so my approval might not be all you need.

As another side note, we recently released a new version of the TOSA specification, which calls out non-fp32 data types directly for operators and their attributes: https://www.mlplatform.org/tosa/tosa_spec.html. Our goal is to do some fine-tuning to make sure that part of the specification looks good, and then to align the dialect with it. This should give better fp16 support in the long run.

This revision is now accepted and ready to land.Jul 7 2022, 11:17 AM

And I do have the bits, so landed. Thanks for the external pair of eyes!