Found by PVS-Studio.
Not familiar with this code; no testcase.
I'm not familiar with this part of TableGen, but in trying to narrow down what it is about RISCV that triggers this, it seems that patterns containing sext_inreg nodes cause the infinite loop: with this change, the comparison now fails, we fall through to line 490, and modes get removed.
Reading the comment above, it sounds to me like this comparison should check whether either S or B contains an integer mode, rather than whether both do? If, on top of this change, I switch both this comparison and the one on line 486 from && to ||, the infinite loop disappears. I ran make check-llvm with that change and don't see any regression tests failing.
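For anyone following along without the file open, here is a rough, self-contained sketch of the shape of the logic under discussion. All names, the type-set representation, and the driver loop are invented for illustration; this is not the actual code in utils/TableGen/CodeGenDAGPatterns.cpp:

```cpp
// Toy model of a fixpoint-style type-inference step. Everything here is
// illustrative; names do not match CodeGenDAGPatterns.cpp.
#include <cstdio>
#include <set>

using TypeSet = std::set<int>; // stand-in for a set of integer MVT widths

static bool hasInteger(const TypeSet &TS) { return !TS.empty(); }

// Drop from B every width smaller than the smallest width in S, returning
// whether B actually changed. The real inference re-runs constraints until
// nothing changes, so an honest "changed" result is what guarantees
// termination.
static bool pruneSmallerThan(const TypeSet &S, TypeSet &B) {
  if (S.empty() || B.empty())
    return false;
  size_t Before = B.size();
  B.erase(B.begin(), B.lower_bound(*S.begin()));
  return B.size() != Before;
}

int main() {
  // E.g. the two type sets on either side of a sext_inreg-like constraint.
  TypeSet S = {32, 64};
  TypeSet B = {8, 16, 32, 64};

  bool Changed = true;
  while (Changed) {
    Changed = false;
    // The guard in question: '&&' requires both sets to contain an
    // integer mode before pruning; the comment above asks whether '||'
    // (either set) was intended instead.
    if (hasInteger(S) && hasInteger(B))
      Changed |= pruneSmallerThan(S, B);
  }
  for (int W : B)
    std::printf("i%d ", W); // prints: i32 i64
  std::printf("\n");
}
```

Note that a driver like this only terminates if each step reports "changed" truthfully; a step that claims progress without actually shrinking a set would spin forever, which is one way (purely speculative here) such an inference loop could hang.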
The intent was what this change shows (i.e. S contains an integer && B contains an integer). I don't know what's causing the timeouts, but IIRC the code at lines 505-509 used to cause some weirdness on X86 (it has since been fixed). Without knowing how the timeouts occur, there isn't much that can be done.
Maybe someone from RISC-V could try to figure out which pattern is causing the problem? One approach would be to bisect over RISC-V commits with this patch applied and see where the timeouts start.