SROA: Defer migrating the debug info for new allocas until after all partitions are created.
As discussed on #llvm today.
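As a rough illustration of the shape being proposed, here is a minimal sketch that mirrors the review discussion below (AllocaSlices/Partition are SROA-internal types, and migrateDebugInfo is a hypothetical placeholder, not the committed patch):

```cpp
// Sketch only: rewrite every partition first, collecting the new
// allocas, and migrate the dbg.declare once all pieces exist so each
// one can be described with its final offset and size.
bool splitAlloca(AllocaInst &AI, AllocaSlices &AS) {
  bool Changed = false;
  SmallVector<AllocaInst *, 1> Pieces; // often a single piece
  for (auto &P : AS.partitions()) {
    if (AllocaInst *NewAI = rewritePartition(AI, AS, P)) {
      Changed = true;
      if (NewAI != &AI) // same alloca back => rewritten in place
        Pieces.push_back(NewAI);
    }
  }
  // Deferred step: only now is the debug info moved onto the pieces.
  if (Changed)
    migrateDebugInfo(AI, Pieces); // hypothetical helper
  return Changed;
}
```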
Reviewers: chandlerc
Event Timeline
**lib/Transforms/Scalar/SROA.cpp**

| Line | Comment |
|---|---|
| 3594 | Rather than a member, how about a local here... |
| 3608–3612 | And change rewritePartition so that if it makes changes it returns the changed alloca. We can compare against &AI to handle the case where we get back the same alloca that went in, and we'll never get back the same alloca when we end up with more than one. |
Is it possible to craft a test case for this? I'm thinking an alloca which needs rewriting but not splitting...
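One plausible (hypothetical, untested) C-level shape for such an input: accesses at two types but a single offset/size produce one partition whose type gets rewritten, without any splitting:

```cpp
// Hypothetical test input: the union alloca is accessed as two i8
// stores and one i16 load, all within a single overlapping range, so
// SROA forms one partition and rewrites the alloca's type rather than
// splitting it into multiple pieces.
union U {
  short Whole;
  char Bytes[2];
};

short f(char A, char B) {
  U Tmp;
  Tmp.Bytes[0] = A;
  Tmp.Bytes[1] = B;
  return Tmp.Whole;
}
```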
**lib/Transforms/Scalar/SROA.cpp**

| Line | Comment |
|---|---|
| 3601 | Just call the parameters the same name as the members. This does the right thing in the initializer list. I would probably use AI, Offset, and Size. |
| 3604 | I think you at least want a SmallVector with 1 piece in it. |
| 3612–3613 | Any reason not to actually handle the null case for now? I think the code would be pretty clean (see the snippet just below this table). |
| 3627 | I would say: `bool IsSplit = ...;` |

```cpp
if (AllocaInst *NewAI = rewritePartition(AI, AS, P)) {
  Changed = true;
  if (NewAI != &AI)
    Pieces.push_back(...);
}
++NumPartitions;
```
- Addressed all of Chandler's comments.
- Added a testcase where the alloca is rewritten but not repartitioned.
Tiny update: Don't add the dbg.declare to DeadInsts. It's redundant because it will get removed automatically when the alloca is removed. It's also a bad idea to remove the same instruction more than once.
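To make the double-removal hazard concrete, here is a toy model (generic C++, not the patch itself; Inst and the DeadInsts vector are simplified stand-ins):

```cpp
#include <cassert>
#include <vector>

// Toy model of the DeadInsts pattern: each queued object must be
// deleted exactly once. Queueing the dbg.declare here *and* letting
// the alloca's removal delete it would erase it twice.
struct Inst {
  bool Erased = false;
  void eraseFromParent() {
    assert(!Erased && "instruction erased twice!");
    Erased = true; // stand-in for actual deletion
  }
};

int main() {
  std::vector<Inst *> DeadInsts;
  Inst DbgDeclare;
  DeadInsts.push_back(&DbgDeclare); // queued explicitly...
  DbgDeclare.eraseFromParent();     // ...but also erased when the
                                    // alloca's users are cleaned up
  for (Inst *I : DeadInsts)
    I->eraseFromParent();           // asserts: second erase
  return 0;
}
```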
When compiling really large programs, the pointer values in the DbgDeclares map may actually get recycled.
- Clear the pointer map when entering a function.
- Remove values after the dbg.declare has been deleted.
Tested via an assertion.
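To illustrate the recycling hazard this fix guards against, a self-contained toy model (plain C++, not the SROA code; the map name mirrors DbgDeclares for readability):

```cpp
#include <cstdio>
#include <map>

// Toy model of pointer recycling: once an alloca is freed, the
// allocator may hand out the same address for an unrelated alloca,
// so a stale map entry keyed on the old pointer silently matches it.
struct Alloca { int Id; };

int main() {
  std::map<Alloca *, const char *> DbgDeclares;

  Alloca *A = new Alloca{1};
  DbgDeclares[A] = "variable 'x'";
  delete A; // the entry keyed on A is now dangling

  Alloca *B = new Alloca{2}; // may reuse A's address
  if (DbgDeclares.count(B))  // stale hit: 'x' attributed to B
    std::printf("recycled pointer matched: %s\n", DbgDeclares[B]);

  // The fix mirrors the patch: erase entries when the dbg.declare
  // dies, and clear the whole map when entering a function.
  delete B;
  return 0;
}
```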
Just want to say this LGTM in whatever form you decide to commit it.
I know there was a revert in between, so feel free to just commit the new version, or re-apply the old one and then commit this on top. Whatever works.