This is an archive of the discontinued LLVM Phabricator instance.

Preserve more MD_mem_parallel_loop_access and MD_access_group in SROA
Closed, Public

Authored by mmendell on May 27 2021, 7:55 AM.

Details

Summary

SROA sometimes preserves MD_mem_parallel_loop_access and MD_access_group metadata on loads and stores, and sometimes fails to do so. This change adds copying of the metadata after the remaining CreateAlignedLoad/CreateAlignedStore calls, and also fixes a case where the metadata was being copied from a load rather than from the store.
Added a LIT test to catch one case.
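
For reference, a minimal sketch (hypothetical IR, not the committed LIT test) of the two metadata kinds involved; whenever SROA replaces such an access with a newly created load or store, the new instruction should carry the same metadata, otherwise later passes such as the loop vectorizer lose the parallelism information:

; Hypothetical fragment. In practice a frontend emits either the legacy
; !llvm.mem.parallel_loop_access form or the newer !llvm.access.group form;
; the corresponding loop metadata (omitted here) lives on the loop's latch branch.
define void @copy_elem(i32* %dst, i32* %src) {
entry:
  %v = load i32, i32* %src, !llvm.access.group !0, !llvm.mem.parallel_loop_access !1
  store i32 %v, i32* %dst, !llvm.access.group !0, !llvm.mem.parallel_loop_access !1
  ret void
}

!0 = distinct !{}    ; access group (MD_access_group)
!1 = distinct !{!1}  ; loop identifier (MD_mem_parallel_loop_access)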

Diff Detail

Event Timeline

mmendell created this revision. May 27 2021, 7:55 AM
mmendell requested review of this revision. May 27 2021, 7:55 AM
mmendell updated this revision to Diff 348975. Jun 1 2021, 8:00 AM

Re-uploading to see if the Windows build failures go away; they appear to be entirely unrelated to this change.

Can someone please review? I haven't heard from anyone in a week

I'm not an expert on llvm.access.group, so someone please chime in with more thoughts. My one comment here is that since this impacts more than just loads, I'd like to see test cases added for all the relevant behavior (including stores).

I really don't know enough to construct tests that exercise all of the affected code paths. The LIT test I submitted is derived from an actual failure to propagate llvm.access.group in our compiler (and it covers the change in visitIntrinsicInst(IntrinsicInst &II) from LI to SI). The other copies I added simply to be consistent with the metadata-propagation calls that were already present.

llvm.access.group is used together with llvm.loop.parallel_accesses to mark loads/stores that have no loop-carried memory dependencies.
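
A minimal sketch (hypothetical, not part of this patch) of how the two pieces fit together: each parallel access carries !llvm.access.group, and the loop's !llvm.loop metadata lists that group under llvm.loop.parallel_accesses:

define void @add_in_place(float* %a, float* %b, i64 %n) {
entry:
  br label %loop

loop:                                             ; assumes %n > 0
  %i = phi i64 [ 0, %entry ], [ %i.next, %loop ]
  %pa = getelementptr inbounds float, float* %a, i64 %i
  %pb = getelementptr inbounds float, float* %b, i64 %i
  %va = load float, float* %pa, !llvm.access.group !0
  %vb = load float, float* %pb, !llvm.access.group !0
  %sum = fadd float %va, %vb
  store float %sum, float* %pa, !llvm.access.group !0
  %i.next = add nuw nsw i64 %i, 1
  %cond = icmp ult i64 %i.next, %n
  br i1 %cond, label %loop, label %exit, !llvm.loop !1

exit:
  ret void
}

!0 = distinct !{}                                 ; the access group
!1 = distinct !{!1, !2}                           ; loop id
!2 = !{!"llvm.loop.parallel_accesses", !0}        ; declares group !0 parallel in this loop

If a pass rewrites one of these accesses and drops its !llvm.access.group (which is what happens when SROA creates a new load/store without copying the metadata), the loop is no longer provably parallel.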

Meinersbur added inline comments. Jun 7 2021, 10:51 AM
llvm/lib/Transforms/Scalar/SROA.cpp
2478

NewAI is the alloca instruction; it does not have parallel-loop-access or access-group metadata, which only apply to memory accesses.

2538

Did you mean LI instead of NewAI?

llvm/test/Transforms/SROA/mem-par-metadata-sroa-cast.ll
8–9

This tests the presplitLoadsAndStores case. Shouldn't it also have split the store?

mmendell updated this revision to Diff 350374. Jun 7 2021, 11:43 AM

Removed the copies of metadata from NewAI, since an alloca can't carry this metadata; only the copies from the loads/stores are kept.

mmendell added inline comments. Jun 7 2021, 11:46 AM
llvm/test/Transforms/SROA/mem-par-metadata-sroa-cast.ll
8–9

The store is replaced with:

%0 = bitcast i32 %X371 to float
%1 = bitcast i32 %X373 to float

Should I add that to the LIT test? The important part was the loss of the llvm.access.group from the store

Meinersbur added inline comments. Jun 8 2021, 6:54 AM
llvm/lib/Transforms/Scalar/SROA.cpp
2478

This one might have made sense

llvm/test/Transforms/SROA/mem-par-metadata-sroa-cast.ll
8–9

The bitcasts belong to the load. I guess SROA determined that the store is dead (it writes stack memory that is never read again before returning) and removed it. Not an indication of a good test case.

I suggest making %PREV a function pointer so this does not happen. Attach a separate !llvm.access.group !1 to the store and test it as well. Most other instructions in this function seem to be irrelevant.
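
A rough sketch (hypothetical names, heavily simplified; not the actual test) of the shape being suggested: the store carries its own access group and writes through a pointer argument, so SROA cannot delete it as dead and FileCheck lines can verify that both groups survive:

define void @test(i32* %prev, float* %out) {
entry:
  %tmp = alloca i32
  %v = load i32, i32* %prev, !llvm.access.group !0
  store i32 %v, i32* %tmp, !llvm.access.group !0
  %tmpf = bitcast i32* %tmp to float*
  %f = load float, float* %tmpf, !llvm.access.group !0
  store float %f, float* %out, !llvm.access.group !1
  ret void
}

!0 = distinct !{}    ; access group on the loads
!1 = distinct !{}    ; separate access group on the externally visible store

Whether SROA actually rewrites each of these accesses depends on the surrounding code; this only illustrates the metadata arrangement, not the exact rewrite path the real test needs to exercise.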

mmendell added inline comments. Jun 8 2021, 12:56 PM
llvm/lib/Transforms/Scalar/SROA.cpp
2478

Yes, you are right. I will pass in LI

llvm/test/Transforms/SROA/mem-par-metadata-sroa-cast.ll
8–9

If I make %PREV a function pointer, then the cast isn't done (Compiler Explorer link).
I can remove two lines (the definition and use of %r2) and still get it to fail. Anything more, and it no longer generates the sroa_cast/sroa_cast2.

mmendell updated this revision to Diff 350698. Jun 8 2021, 1:00 PM

Restored the copy of metadata for vectorized loads, passing in LI.
Trimmed the LIT test slightly

Meinersbur accepted this revision. Jun 8 2021, 5:46 PM

LGTM

llvm/test/Transforms/SROA/mem-par-metadata-sroa-cast.ll
8–9

OK, I also tried a bit to get SROA to re-emit a store (and therefore test that the store's metadata is preserved), but without success.

This revision is now accepted and ready to land. Jun 8 2021, 5:46 PM
This revision was landed with ongoing or failed builds. Jun 10 2021, 3:47 PM
This revision was automatically updated to reflect the committed changes.