In code
  void foo(int *p) { if (p) {} }
  void bar(int *p) { foo(p); *p = 5; }
we suppress the null dereference warning on *p = 5 because the state split inside the inlined callee is essentially irrelevant to the caller: the if (p) check is merely a defensive check in foo(), and it does not imply that p can actually be null in this calling context.
The suppression works by tracking the null pointer symbol with a bug report visitor and checking whether it was constrained to 0 inside a callee context. Having to rely on such manual suppressions is a downside of the inlining-based approach to interprocedural analysis.
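The idea behind the visitor, as a minimal standalone model (PathNode, frame_depth and should_suppress are made-up names for illustration; the real implementation is a bug report visitor over the analyzer's path, not this simplified structure):

  /* Hypothetical model of the suppression: walk the diagnostic path backwards
   * and find the point where the tracked pointer symbol first became
   * constrained to 0; if that happened inside an inlined callee, suppress. */
  #include <stdbool.h>
  #include <stdio.h>

  struct PathNode {
    int frame_depth;           /* 0 = the frame where the bug is reported */
    bool sym_constrained_to_0; /* tracked pointer symbol known to be null? */
  };

  static bool should_suppress(const struct PathNode *path, int len) {
    for (int i = len - 1; i > 0; --i) {
      bool became_null_here =
          path[i].sym_constrained_to_0 && !path[i - 1].sym_constrained_to_0;
      if (became_null_here)
        return path[i].frame_depth > 0; /* constrained inside a callee */
    }
    return false;
  }

  int main(void) {
    /* Models foo() inlined into bar(): "p == 0" is assumed at depth 1,
     * inside the callee, so the report on *p = 5 is suppressed. */
    struct PathNode path[] = {
        {0, false}, /* bar(): before the call            */
        {1, false}, /* foo(): entry                      */
        {1, true},  /* foo(): false branch of "if (p)"   */
        {0, true},  /* bar(): back in the caller, *p = 5 */
    };
    printf("%s\n", should_suppress(path, 4) ? "suppressed" : "reported");
    return 0;
  }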
The suppression gets confused when the null dereference becomes more complex, e.g.:
  struct S { int x; };
  void foo(struct S *s) { if (s) {} }
  void bar(struct S *s) { foo(s); *(&(s->x)) = 5; }
In this case s is &SymRegion{reg_$0<s>}, and reg_$0<s> is constrained to [0, 0], just as above. However, while computing s->x and then &(s->x), the symbol is collapsed to a constant 0 (Loc), which makes it much harder to track. The visitor is improved to handle this case.
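One plausible way for the visitor to recover in this situation — a sketch of the idea only, not necessarily what the patch does — is to walk the syntactic lvalue back to its base pointer and track that pointer's symbol instead of the already-collapsed value (Expr, ExprKind and base_symbol below are hypothetical names):

  /* Hypothetical model: &(s->x) is peeled down to the DeclRef of s, whose
   * symbol (reg_$0<s>) can still be tracked by the visitor. */
  #include <stddef.h>
  #include <stdio.h>

  enum ExprKind { EK_DeclRef, EK_Member, EK_AddrOf };

  struct Expr {
    enum ExprKind kind;
    const struct Expr *base; /* sub-expression; NULL for a DeclRef */
    const char *symbol;      /* symbol of the pointer value; DeclRef only */
  };

  static const char *base_symbol(const struct Expr *e) {
    while (e && e->kind != EK_DeclRef)
      e = e->base; /* strip &, member accesses, etc. */
    return e ? e->symbol : NULL;
  }

  int main(void) {
    /* Models &(s->x): AddrOf -> Member(x) -> DeclRef(s). */
    struct Expr s = {EK_DeclRef, NULL, "reg_$0<s>"};
    struct Expr sx = {EK_Member, &s, NULL};
    struct Expr addr = {EK_AddrOf, &sx, NULL};
    printf("track %s\n", base_symbol(&addr)); /* prints: track reg_$0<s> */
    return 0;
  }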
The fact that the offset of field x within struct S is 0 in our example is purely accidental; with any other offset it would still be a null dereference worth suppressing. How does the analyzer handle the non-zero-offset case and still see a null dereference? Simply and powerfully: it mis-computes the offset, incorrectly believing that all offsets are equal to 0 (woohoo!). I add a test case to document this behavior; even though it is trivial to fix, the positive side effects of this bug are, for now, too useful to discard.
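A sketch of what such a test case might look like (the struct layout and names below are hypothetical, not the test actually added):

  /* Field y sits at a non-zero offset, yet the defensive-check suppression
   * still kicks in, because the analyzer currently treats the offset as 0
   * while tracking the null base symbol. */
  struct T { int padding; int y; };
  void foo(struct T *t) { if (t) {} }
  void bar(struct T *t) {
    foo(t);
    *(&(t->y)) = 5; /* no-warning: suppressed despite the non-zero offset */
  }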