It seems consistent to always return zero for known poison rather than
varying the value. We do the same elsewhere.
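To illustrate the convention, a minimal sketch of what that looks like with the KnownBits API (this function is hypothetical, not the patch itself):

```cpp
#include "llvm/Support/KnownBits.h"
using namespace llvm;

// Hypothetical sketch: every division result that is known to be
// poison/UB maps to the same known-zero value, rather than varying.
static KnownBits knownSDivSketch(const KnownBits &LHS, const KnownBits &RHS) {
  unsigned BitWidth = LHS.getBitWidth();
  // x / 0 is immediate UB in IR (see the "Poison -> UB" nit below).
  if (RHS.isZero())
    return KnownBits::makeConstant(APInt::getZero(BitWidth));
  // INT_MIN / -1 overflows, which is likewise UB for sdiv.
  if (LHS.isConstant() && LHS.getConstant().isMinSignedValue() &&
      RHS.isConstant() && RHS.getConstant().isAllOnes())
    return KnownBits::makeConstant(APInt::getZero(BitWidth));
  return KnownBits(BitWidth); // placeholder: everything-unknown otherwise
}
```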
llvm/test/CodeGen/WebAssembly/pr59626.ll:18
Can you point to the IR test that's evaluating to just poison? As long as this test case isn't crashing then I'm OK, but I'm probably not the right person to ask about KnownBits. I'm surprised this is the only test case that is affected!
llvm/test/CodeGen/WebAssembly/pr59626.ll:18
This test @f:
I'm just asking whether the effect on the wasm codegen is okay.
IMHO we shouldn't go out of our way to handle such cases -- if we need to handle this for other reasons (e.g. for shift amounts, or for conflicts) that's one thing, but explicitly checking for a poison result in the known-constant cases is pretty pointless -- those will get folded to actual poison values by code that can do so.
The INT_MIN / -1 case isn't necessary, but the LHS.isZero() || RHS.isZero() cases reduce the number of checks we need elsewhere (specifically, we then know MinTZ < BitWidth).
Edit: But I have no particularly strong feelings either way. The zero/zero case just seemed to be the simplest way to handle it, and the INT_MIN / -1 case was just to be consistent. If you feel either or both should be dropped, I have no objections.
Actually, testing the implementation without this, the zero/zero check will still be needed (just later on), so I would prefer to keep that regardless.
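For reference, a minimal sketch of the invariant the early check buys us (the helper name is made up for illustration):

```cpp
#include "llvm/Support/KnownBits.h"
#include <cassert>
using namespace llvm;

// Hypothetical illustration: once neither operand is known zero,
// countMinTrailingZeros() of the divisor is strictly less than the bit
// width, so it is a valid shift amount with no extra range check.
static unsigned divisorMinTZ(const KnownBits &RHS) {
  assert(!RHS.isZero() && "guaranteed by the early LHS/RHS isZero() exit");
  unsigned MinTZ = RHS.countMinTrailingZeros();
  assert(MinTZ < RHS.getBitWidth() && "follows from RHS not known zero");
  return MinTZ;
}
```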
Is this referring to the code in D150923? That is, we do the early isZero() check here, and divComputeLowBit() from that patch then has an implicit precondition that the denominator is not zero? Adding an explicit < BitWidth check there seems more obvious to me, unless it's more than that.
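For concreteness, a sketch of the alternative -- the name mirrors divComputeLowBit() from that patch, but the body here is hypothetical:

```cpp
#include "llvm/Support/KnownBits.h"
using namespace llvm;

// Hypothetical sketch of putting the precondition in the helper itself
// instead of relying on the caller's early isZero() exit: bail out
// explicitly when the shift amount would reach the bit width.
static KnownBits divComputeLowBitGuarded(KnownBits Known,
                                         const KnownBits &RHS) {
  unsigned MinTZ = RHS.countMinTrailingZeros();
  if (MinTZ >= Known.getBitWidth())
    return Known; // divisor known zero: division is UB, leave bits unknown
  // ... safe to use MinTZ as a shift amount from here on ...
  return Known;
}
```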
nit: Poison -> UB