There's a 64k limit on the number of SDNode operands, and very large functions with 64k or more loads can crash when a TokenFactor with that many operands is created. To fix this, create sub-TokenFactors once the limit is exceeded. No test case is included because it requires a very large function; the reproducer is just this:
  define void @foo() {
    %r1 = load i8, i8* undef
    %r2 = load i8, i8* undef
    %r3 = load i8, i8* undef
    ; ... and so on, 2^16 times
    call void @llvm.trap()
    unreachable
  }

  declare void @llvm.trap()
rdar://45196621
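For reference, here is a minimal C++ sketch of the chunking approach the description outlines. The function and variable names are illustrative, not the committed patch: fold the pending chains into sub-TokenFactors of bounded size, then merge the sub-tokens with one final TokenFactor.

  #include "llvm/CodeGen/SelectionDAG.h"
  #include "llvm/CodeGen/SelectionDAGNodes.h"
  using namespace llvm;

  // Illustrative helper, not the committed code: merge an oversized list
  // of chain values without ever creating a TokenFactor that exceeds the
  // 64k operand limit (SDNode stores its operand count in a 16-bit field).
  static SDValue buildBoundedTokenFactor(SelectionDAG &DAG, const SDLoc &DL,
                                         SmallVectorImpl<SDValue> &Chains) {
    constexpr size_t MaxOps = 65535; // illustrative cap: 2^16 - 1

    while (Chains.size() > MaxOps) {
      // Peel off one full chunk of chains...
      SmallVector<SDValue, 8> Chunk(Chains.end() - MaxOps, Chains.end());
      Chains.resize(Chains.size() - MaxOps);
      // ...merge it into a single sub-TokenFactor...
      SDValue SubTF = DAG.getNode(ISD::TokenFactor, DL, MVT::Other, Chunk);
      // ...and put the sub-token back so it counts as one operand.
      Chains.push_back(SubTF);
    }
    // At most MaxOps operands remain, so this node is within the limit.
    return DAG.getNode(ISD::TokenFactor, DL, MVT::Other, Chains);
  }

Each loop iteration replaces MaxOps operands with a single sub-token, so the list shrinks until one final TokenFactor suffices.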
Instead of making a tree of TokenFactors, could you make a list? It seems a little simpler (less code, and you don't have to worry about the lengths of the intermediate TokenFactors themselves).
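To illustrate what the list shape might look like (again, the names are illustrative, not committed code): each new TokenFactor takes the previous sub-token as its first operand, so the result is a chain rather than a tree, and every node stays under the limit by construction.

  #include "llvm/CodeGen/SelectionDAG.h"
  #include "llvm/CodeGen/SelectionDAGNodes.h"
  #include <algorithm>
  using namespace llvm;

  // Illustrative sketch of the "list" variant: chain each chunk onto the
  // running token instead of building nested sub-trees.
  static SDValue buildTokenFactorList(SelectionDAG &DAG, const SDLoc &DL,
                                      ArrayRef<SDValue> Chains) {
    constexpr size_t MaxOps = 65535; // illustrative cap below the 64k limit

    SDValue Root;
    for (size_t I = 0, E = Chains.size(); I != E;) {
      // Reserve one operand slot for the link to the previous TokenFactor.
      size_t Take = std::min(E - I, MaxOps - (Root ? 1 : 0));
      SmallVector<SDValue, 8> Ops;
      if (Root)
        Ops.push_back(Root); // link back to the token built so far
      Ops.append(Chains.begin() + I, Chains.begin() + I + Take);
      Root = DAG.getNode(ISD::TokenFactor, DL, MVT::Other, Ops);
      I += Take;
    }
    return Root;
  }

The depth of the resulting chain is just the number of chunks, which is tiny: with a 2^16 operand cap, even millions of loads need only a handful of links.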
I'm a little worried that other code dealing with TokenFactors might end up violating the limit if we're very close... any idea if there's other code that could be affected, like DAGCombine? Do we have an assertion somewhere that will reliably catch this issue?
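On the assertion question, here is a sketch of the kind of guard that would catch this reliably (illustrative only; I'm not claiming the tree currently has exactly this check):

  #include <cassert>
  #include <cstddef>
  #include <cstdint>
  #include <limits>

  // Illustrative guard: assert at node-creation time that an operand list
  // fits SDNode's 16-bit operand-count field, so any pass that re-merges
  // TokenFactors past the limit fails loudly in an asserts build instead
  // of silently overflowing.
  inline void assertOperandCountFits(std::size_t NumOps) {
    assert(NumOps <= std::numeric_limits<std::uint16_t>::max() &&
           "operand count overflows SDNode's 16-bit NumOperands field");
    (void)NumOps; // avoid unused-parameter warning in release builds
  }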