While calculating a variable's scope, OffsetToFirstDefinition is supposed
to be applied to the scope if the variable's definition doesn't coincide
with the beginning of the scope (this may happen, for example, if a
variable is defined in the middle of a function).
But this also causes inaccuracies in debug info statistics, especially
for optimized code.
This patch attempts to reduce the number of these inaccuracies by:
- applying the offset only to local variables' scopes, since a function argument's scope is the whole function,
- applying the offset before calculating a variable's coverage bucket, since otherwise the statistics look inconsistent,
- taking into account, while calculating the offset, that the scope may be represented as a set of disconnected ranges (see the sketch after this list).
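In sketch form, a range-aware calculation could look like the following
(hypothetical names and signature, not the actual patch code); it counts
only the bytes that actually belong to the scope's ranges:

  #include <cstdint>
  #include <utility>
  #include <vector>

  // Sum only the scope bytes that precede the variable's first definition,
  // skipping the gaps between disconnected ranges. ScopeRanges is assumed
  // to hold [LowPC, HighPC) pairs sorted by address.
  static uint64_t offsetToFirstDefinition(
      const std::vector<std::pair<uint64_t, uint64_t>> &ScopeRanges,
      uint64_t VarLowPC) {
    uint64_t Offset = 0;
    for (const auto &[LowPC, HighPC] : ScopeRanges) {
      if (VarLowPC >= HighPC) {
        // The whole range lies before the definition; count it entirely.
        Offset += HighPC - LowPC;
      } else {
        // The definition starts at, inside, or before this range; count
        // the leading part (if any) and stop.
        if (VarLowPC > LowPC)
          Offset += VarLowPC - LowPC;
        break;
      }
    }
    return Offset;
  }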
Currently, the offset is calculated as the distance between the address where the scope starts
(the scope's LowPC) and the address where the variable is defined (the variable's LowPC).
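Roughly (hypothetical names):

  // Current calculation: ignores any gaps between the scope's ranges.
  uint64_t OffsetToFirstDef = VarLowPC - ScopeLowPC;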
Consider these two cases:
Scope ranges: [0x04,0x08), [0x0a,0x0e); variable location: [0x0a,0x0b)
Scope = (0x08 - 0x04) + (0x0e - 0x0a) = 8
OffsetToFirstDef = 0x0a - 0x04 = 6
If we adjust the scope by the offset, it ends up as 2 bytes instead of
the 4 bytes it should be.
Scope ranges: [0x00,0x02), [0x0a,0x0e); variable location: [0x0a,0x0b)
Scope = (0x02 - 0x00) + (0x0e - 0x0a) = 6
OffsetToFirstDef = 0x0a - 0x00 = 10
The offset is greater than the scope itself, so this is counted as an error
and the offset is set to 0.
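With the range-aware sketch above, both cases come out as expected:

  #include <cassert>

  // Example 1: the offset is 4 (all of [0x04,0x08)), so the adjusted
  // scope is 8 - 4 = 4 bytes.
  assert(offsetToFirstDefinition({{0x04, 0x08}, {0x0a, 0x0e}}, 0x0a) == 4);
  // Example 2: the offset is 2 (all of [0x00,0x02)), not 10, so it no
  // longer exceeds the 6-byte scope; the adjusted scope is 6 - 2 = 4 bytes.
  assert(offsetToFirstDefinition({{0x00, 0x02}, {0x0a, 0x0e}}, 0x0a) == 2);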