This is a regression from https://reviews.llvm.org/D61843, found by the build bot. Here is the reduced test case:
void foo(char *p, char v) { p[0] = v; p[1] = v; }
We have a table, ByteOffsets, that maps each byte offset within the combined value "v" (the vector index) to the corresponding offset from the store address "p" (the vector element value).
In this case, we will have:
ByteOffsets[0] = 0; // The first byte of "v" is stored to p[0].
ByteOffsets[0] = 1; // The first byte of "v" is stored to p[1].
So in the end, ByteOffsets[0] is set to 1. However, because the vector is created with 2 elements whose default element value is 0, the table ends up looking like this:
ByteOffsets[0] = 1; // As described above.
ByteOffsets[1] = 0; // The constructor initialized it to 0.
And this matches the endian check, so we perform the transformation, which is wrong: ByteOffsets[1] was never actually recorded by any store.
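To make the failure concrete, here is a minimal stand-alone sketch. The helper looksConsecutive is an invented, simplified stand-in for the real endian check in DAGCombiner, not the actual code:

  #include <cstdint>
  #include <cstdio>
  #include <vector>

  // Simplified stand-in for the endian check: do the recorded offsets form
  // one consecutive run, either in forward or in reverse order?
  static bool looksConsecutive(const std::vector<int64_t> &ByteOffsets) {
    bool Fwd = true, Rev = true;
    const int64_t N = ByteOffsets.size();
    for (int64_t I = 0; I < N; ++I) {
      Fwd &= ByteOffsets[I] == I;
      Rev &= ByteOffsets[I] == N - 1 - I;
    }
    return Fwd || Rev;
  }

  int main() {
    // Zero-initialized table from the example above: slot 0 was overwritten
    // to 1, and slot 1 still holds the constructor's default 0, giving {1, 0}.
    std::vector<int64_t> ByteOffsets(2, 0);
    ByteOffsets[0] = 1;
    std::printf("matches: %d\n", looksConsecutive(ByteOffsets)); // prints 1
    return 0;
  }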
Therefore, we need to initialize the table's default value to INT64_MAX to avoid this issue. That sentinel can never collide with a real entry: the valid offsets lie in [-63, 63], since each entry is an offset from the base ("p") and we can merge at most 64 stores.
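Again as an illustrative sketch (not the upstream diff), the same simplified check no longer fires once the table is sentinel-initialized:

  #include <cstdint>
  #include <cstdio>
  #include <vector>

  int main() {
    // Same two-byte case, but every slot now starts at INT64_MAX, which lies
    // far outside the only legal offset range [-63, 63].
    std::vector<int64_t> ByteOffsets(2, INT64_MAX);
    ByteOffsets[0] = 1; // table reads {1, INT64_MAX}

    // Re-run the same simplified check: the sentinel matches neither the
    // forward run {0, 1} nor the reverse run {1, 0}, so the combine is
    // rejected.
    bool Fwd = true, Rev = true;
    const int64_t N = ByteOffsets.size();
    for (int64_t I = 0; I < N; ++I) {
      Fwd &= ByteOffsets[I] == I;
      Rev &= ByteOffsets[I] == N - 1 - I;
    }
    std::printf("matches: %d\n", Fwd || Rev); // prints 0
    return 0;
  }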
I have run check-all, the LLVM test-suite, SPEC2017, and a bootstrap build; all are clean.