Cost-benefit analysis already does something like that, but it's different:
it's off by default, and it needs a profile (as opposed to being happy with static weights).
In PR50099 i reported that the NewPM switch introduced a rather large regression in one benchmark.
That happens because a certain destructor call is determined to be cold,
so it is given a smaller inlining budget (45), while its inlining cost measures at ~55, so inlining is rejected.
We could either raise the budgets or lower the costs.
D101228, and potentially D101229, do the former.
Here i propose to investigate one approach for the latter.
A large portion of that destructor is exception handling,
and as per block frequency it is *exceptionally* unlikely to execute.
So i propose to introduce an impossible-code-rel-freq option,
defaulting to 2 parts per million of the function's entry frequency
(the smallest value that does the job), and to say that if a block's frequency
is less than that, then the block is impossible to execute,
and the costs of the instructions in said block are not added.
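To illustrate the idea (this is a standalone toy model, not the actual patch; the block names, frequencies and costs are made up, and the 30/15/10 split is only meant to roughly mirror the ~55 cost and 45 budget from the regression above):

```
// Toy model of the proposed "impossible block" filter: blocks whose
// frequency is below EntryFreq * 2 / 1'000'000 contribute no inline cost.
#include <cstdint>
#include <iostream>
#include <vector>

struct Block {
  const char *Name;
  uint64_t Freq;       // block frequency, on the same scale as EntryFreq
  int InstructionCost; // simplified per-block inline cost contribution
};

// Hypothetical knob corresponding to impossible-code-rel-freq.
constexpr uint64_t ImpossibleRelFreqPPM = 2;

int computeInlineCost(uint64_t EntryFreq, const std::vector<Block> &Blocks) {
  int Cost = 0;
  for (const Block &B : Blocks) {
    // Compare B.Freq / EntryFreq against ImpossibleRelFreqPPM / 1e6
    // without floating point.
    if (B.Freq * 1'000'000 < EntryFreq * ImpossibleRelFreqPPM) {
      std::cout << B.Name << ": treated as impossible, cost skipped\n";
      continue;
    }
    Cost += B.InstructionCost;
  }
  return Cost;
}

int main() {
  // Toy destructor-like function: hot straight-line code plus an
  // exception-handling landing pad that is essentially never reached.
  const uint64_t EntryFreq = 8'000'000;
  std::vector<Block> Blocks = {
      {"entry", 8'000'000, 30},
      {"dtor.body", 8'000'000, 15},
      {"ehcleanup", 4, 10}, // ~0.5 ppm of entry frequency
  };
  std::cout << "cost = " << computeInlineCost(EntryFreq, Blocks) << "\n";
  return 0;
}
```

In this toy setup the EH block sits at ~0.5 ppm of the entry frequency, so its cost is dropped and the total goes from 55 to 45, which is exactly the kind of shift that would let the cold destructor fit into its reduced budget.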
Does this sound completely insane? Thoughts?