If the original reduction root instruction was vectorized, it might be
removed from the tree. In that case the insertion point may become
invalidated, and the vectorization of the whole reduction produces an
incorrect result.
The ReductionRoot instruction must be marked as externally used so that
it cannot be removed. Otherwise this causes an inconsistency with the
cost model and we may end up with an overly optimistic optimization.
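For orientation, here is a minimal sketch of the approach, written against the usual structure of HorizontalReduction::tryToReduce() in SLPVectorizer.cpp. The identifiers (ExternallyUsedValues, ReducedVals, IgnoreList, V) mirror that surrounding code but are assumptions for illustration, not a copy of the actual diff:

```cpp
// Sketch only: mark the reduction root as externally used before the
// reduction tree is built and vectorized, so the vectorizer does not
// delete it while it is still needed as the insertion point.
BoUpSLP::ExtraValueToDebugLocsMap ExternallyUsedValues;
// The reduction root is used as the insertion point for new instructions,
// so set it as externally used to prevent it from being deleted.
ExternallyUsedValues[ReductionRoot];

V.buildTree(ReducedVals, ExternallyUsedValues, IgnoreList);
// (the actual code also accounts for the reduction cost here)
if (V.getTreeCost() >= -SLPCostThreshold)
  return false; // not profitable, nothing was changed
Value *VectorizedRoot = V.vectorizeTree(ExternallyUsedValues);
// ReductionRoot itself is only erased later, once the whole reduction
// has been vectorized (the deletion referred to around line 5844 below).
```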
Event Timeline
This makes sense to me, but I'm not very familiar with SLP, so hopefully we can get a 2nd opinion.
By making the reduction root externally used, does that mean that some other pass is now responsible for deleting it (assuming it is not actually externally used) where before SLP would delete it?
lib/Transforms/Vectorize/SLPVectorizer.cpp
- Line 5751 (On Diff #175496): Can we add this assert independently of this patch?
- Line 5754 (On Diff #175496): Better to have this code comment explain why this is necessary: "// The reduction root is used as the insertion point for new instructions, so set it as externally used to prevent it from being deleted."

test/Transforms/SLPVectorizer/X86/PR39774.ll
- Line 2 (On Diff #175496): We do not need to specify triple, cpu, or slp-threshold to have this crash currently, right? If this makes the test more robust against future cost model changes, that's good. If not, we could remove those?
No, ReductionRoot is deleted later, at the end of the reduction vectorization; see line 5844. It is marked as externally used by the upcoming reduction vectorization process.
lib/Transforms/Vectorize/SLPVectorizer.cpp
- Line 5751 (On Diff #175496): I think not; this is a replacement for the assert that was deleted on line 5829.
- Line 5754 (On Diff #175496): Ok, will do.

test/Transforms/SLPVectorizer/X86/PR39774.ll
- Line 2 (On Diff #175496): Well, currently we need to specify all of that to reproduce the crash. With the updated cost model, the slp-threshold can be removed.