There's a data race on the UninterestingChunks set. The code operates
on the assumption that all the tasks have completed, so ensure the
unused results do complete. This started showing up about 50% of the
time when running operands-skip-parallel.ll after the recent switch to
DenseSet; previously it failed much less frequently with std::set.
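
For illustration only, a minimal sketch of the failure mode and of the
fix in spirit (this is not the actual llvm-reduce code; the task body,
set type, and locking are stand-ins): worker tasks may still be writing
to the shared set, so the main thread has to drain every future,
including the ones whose result it discards, before it touches the set.

  // Hypothetical sketch: names and structure do not match llvm-reduce.
  #include <future>
  #include <mutex>
  #include <set>
  #include <vector>

  int main() {
    std::set<int> UninterestingChunks; // shared across tasks
    std::mutex SetLock;                // sketch-only guard for the inserts
    std::vector<std::future<bool>> Results;

    for (int ChunkIdx = 0; ChunkIdx < 8; ++ChunkIdx) {
      Results.push_back(std::async(std::launch::async, [&, ChunkIdx] {
        bool Interesting = (ChunkIdx % 3 == 0); // stand-in for the real test
        if (!Interesting) {
          std::lock_guard<std::mutex> Guard(SetLock);
          UninterestingChunks.insert(ChunkIdx);
        }
        return Interesting;
      }));
    }

    // The fix in spirit: wait on *all* futures, even ones whose result is
    // discarded, so no task is still writing to the set afterward.
    for (auto &R : Results)
      (void)R.get();

    return 0;
  }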
We should introduce a mechanism to terminate unused results early.
Alternatively, I've been thinking about ways to make the reduction
order smarter. I frequently have tests that take multiple minutes to
compile and hit the failure. It may be helpful to see which chunks
took the least time and prefer those over just taking the first
result.
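
As a rough sketch of that idea (hypothetical names, not existing
llvm-reduce code), the per-chunk test time could be recorded alongside
the interestingness result and the cheapest interesting result
preferred over the first one:

  #include <chrono>
  #include <optional>
  #include <vector>

  struct ChunkResult {
    int ChunkIdx;                      // which chunk was tested (hypothetical)
    bool Interesting;                  // did the variant still reproduce?
    std::chrono::milliseconds Elapsed; // how long the test took
  };

  // Among the interesting results, pick the one whose test ran fastest;
  // reducing along the cheap-to-compile path first could keep tests that
  // take minutes to compile tractable.
  std::optional<ChunkResult>
  pickFastest(const std::vector<ChunkResult> &Results) {
    std::optional<ChunkResult> Best;
    for (const ChunkResult &R : Results) {
      if (!R.Interesting)
        continue;
      if (!Best || R.Elapsed < Best->Elapsed)
        Best = R;
    }
    return Best;
  }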