Currently, getCastInstrCost has limited information about the cast it's rating, often just the opcode and types.
Sometimes an instruction is available as well, but it isn't trustworthy: for instance, when the vectorizer is costing a plan, it calls getCastInstrCost with the old scalar instructions when it is actually trying to evaluate the cost of the instruction post-vectorization.
Thus, the current system isn't good enough, as the cost of a cast can vary greatly based on the context in which it's used.
For example, if the vectorizer queries getCastInstrCost to evaluate the cost of a sext(load) with tail predication enabled, getCastInstrCost will report it as free most of the time, but it isn't always free. On ARM MVE, if the destination type doesn't fit in a 128-bit register, these casts can be very expensive: two or more instructions per vector element, since each element is converted individually. (Note: the fix for this is the child revision.)
To fix that, this patch adds a new parameter to getCastInstrCost to give it a hint about the context of the cast. It adds a CastContextHint enum describing the kind of load/store being created by the vectorizer - one value for each kind of memory access it can produce.
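To make the idea concrete, here is a hedged sketch of the shape of the new hook. The enumerator names and the toy cost model below are illustrative assumptions based on this summary, not the exact upstream API; the real target hook takes opcode and types as well.

```cpp
#include <cassert>

// Illustrative context hint: which kind of memory access (if any) the
// cast is attached to. Names here are an assumption for the sketch.
enum class CastContextHint {
  None,          // the cast's context is unknown, or it isn't a load/store
  Normal,        // feeds / is fed by an ordinary vector load/store
  Masked,        // ... a masked load/store (e.g. tail-predicated loops)
  GatherScatter, // ... a gather/scatter access
  Interleave,    // ... an interleaved access
  Reversed       // ... a reversed access
};

// Toy stand-in for a target's cost hook: a sext(load)-style cast is
// modelled as free only when it is a normal access whose destination
// fits in one 128-bit register; otherwise each element is converted
// individually, so the cost scales with the element count.
int getCastInstrCost(CastContextHint CCH, unsigned DstBits,
                     unsigned NumElts) {
  if (CCH == CastContextHint::Normal && DstBits <= 128)
    return 0;           // folded into the extending load
  return 2 * NumElts;   // two or more instructions per element
}
```

With the hint, the same opcode/type pair can be rated differently: a normal 128-bit extending load is free, while a masked or over-wide one pays per element.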
Considering how this is used, it looks like None would mean "This is not a load/store", not "No context". I guess if we don't have I and we aren't provided with anything better, we have no context at all; but in all other situations we at least try to compute the hint from I.
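The distinction the comment draws can be sketched as follows. This is a toy, stringly-typed derivation (the enum subset and the helper name are hypothetical), showing that when an instruction is present we always derive a hint from it, so None ends up meaning "the cast isn't attached to a load/store" rather than "no context available":

```cpp
#include <cassert>
#include <string>

enum class CastContextHint { None, Normal, Masked };

// Toy derivation of the hint from the kind of the cast's operand.
// A null pointer models "no instruction I at all".
CastContextHint hintFor(const std::string *OperandKind) {
  if (!OperandKind)
    return CastContextHint::None; // no I: genuinely no context
  if (*OperandKind == "load" || *OperandKind == "store")
    return CastContextHint::Normal;
  if (*OperandKind == "masked-load" || *OperandKind == "masked-store")
    return CastContextHint::Masked;
  return CastContextHint::None;   // I exists, but is not a load/store
}
```

Both the "no instruction" case and the "instruction that isn't a memory access" case collapse to None, which is exactly the ambiguity the comment points out.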