This bug affected undefined symbols that were resolved by existing defined symbols. We were skipping the signature check in this case.
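For context, here is a minimal sketch of the resolution path this fixes. All names and types are hypothetical and heavily simplified; this is not the actual lld wasm symbol-table code, only an illustration of where the skipped check belongs.

```cpp
// Illustrative sketch only; names here are hypothetical and simplified,
// not the actual lld wasm symbol-table API.
#include <cstdio>
#include <vector>

struct WasmSignature {
  std::vector<int> Params;   // parameter value types (simplified)
  std::vector<int> Results;  // result value types (simplified)
  bool operator==(const WasmSignature &O) const {
    return Params == O.Params && Results == O.Results;
  }
};

struct Symbol {
  bool Defined = false;
  WasmSignature Sig;   // meaningful once the symbol is defined
};

// Called when an object file references a function symbol and states the
// signature it expects. The bug: when the symbol had already been defined
// by an earlier object, this check was skipped entirely.
void resolveFunctionReference(Symbol &Existing, const WasmSignature &Expected) {
  if (Existing.Defined) {
    if (!(Existing.Sig == Expected))
      std::printf("error: function signature mismatch\n");
    return;
  }
  // Otherwise record the undefined reference as before (omitted).
}

int main() {
  Symbol S;
  S.Defined = true;
  S.Sig = {{1}, {}};                       // defined as (i32) -> void
  resolveFunctionReference(S, {{1}, {1}}); // referenced as (i32) -> i32: mismatch
}
```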
Diff Detail
- Repository: rLLD LLVM Linker

Event Timeline
wasm/Symbols.h:97
This class has setFunctionType and getFunctionType, but this invariant doesn't seem to be guaranteed:
Obj.setFunctionType(X); assert(X == Obj.getFunctionType());
because a value given to setFunctionType is ignored if this class has a non-null Chunk. Isn't that confusing?
wasm/Symbols.h:97
setFunctionType has an assert(!Chunk), which I think covers that case.
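To restate the concern and the reply with a self-contained sketch: this is a simplified, hypothetical model of the class, not the real wasm/Symbols.h. The getter prefers the Chunk's signature, so the set/get round trip only holds while Chunk is null, which is what the assert in the setter enforces.

```cpp
// Simplified, hypothetical model of the discussion; not the real wasm/Symbols.h.
#include <cassert>

struct WasmSignature { int Id; };

struct InputFunction {           // stand-in for the symbol's Chunk
  const WasmSignature *Signature;
};

struct Symbol {
  InputFunction *Chunk = nullptr;
  const WasmSignature *FunctionType = nullptr;

  const WasmSignature *getFunctionType() const {
    // Defined symbols read the type from their chunk, so the FunctionType
    // member is only meaningful while Chunk is null.
    return Chunk ? Chunk->Signature : FunctionType;
  }

  void setFunctionType(const WasmSignature *Type) {
    // Without this assert the value would be silently ignored by
    // getFunctionType() once the symbol has a chunk; with it, the
    // set/get round trip is only allowed for undefined symbols.
    assert(!Chunk);
    FunctionType = Type;
  }
};

int main() {
  WasmSignature Sig{1};
  Symbol Undefined;
  Undefined.setFunctionType(&Sig);
  assert(Undefined.getFunctionType() == &Sig); // holds: Chunk is null
}
```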
If we can guarantee the invariant by the data structure itself, that's better than guaranteeing the same thing in code (i.e., with asserts). Does it make sense to read the function type at symbol instantiation time?
Oh I see, so getFunctionType() can then just return the member rather than looking at the chunk. Sure, I think that makes sense. Can we do that in a separate change, though?
That's great, thanks for splitting it out. The longer-term fix, I think, is to always set FunctionType (for both defined and undefined symbols), rather than have the confusion of a member that's sometimes meaningful and sometimes not; it's less error-prone to keep the values of all the members in sync with each other (where meaningful).
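A rough sketch of that longer-term direction, again with hypothetical, simplified types rather than the actual lld change: the type is captured at symbol construction for both defined and undefined symbols, so the getter never needs to consult the chunk.

```cpp
// Hypothetical sketch of the suggested design; simplified, not the actual lld code.
#include <cassert>

struct WasmSignature { int Id; };
struct InputFunction { const WasmSignature *Signature; };

class FunctionSymbol {
  InputFunction *Chunk;
  const WasmSignature *FunctionType; // always set, defined or undefined

public:
  // Defined symbol: the type is taken from the chunk once, at construction.
  explicit FunctionSymbol(InputFunction *C)
      : Chunk(C), FunctionType(C->Signature) {}

  // Undefined symbol: the type comes from the import/reference site.
  explicit FunctionSymbol(const WasmSignature *Type)
      : Chunk(nullptr), FunctionType(Type) {}

  // No Chunk special case: the getter just returns the member.
  const WasmSignature *getFunctionType() const { return FunctionType; }
};

int main() {
  WasmSignature Sig{7};
  InputFunction Func{&Sig};
  FunctionSymbol Defined(&Func), Undefined(&Sig);
  assert(Defined.getFunctionType() == Undefined.getFunctionType());
}
```

The appeal of this shape is that the invariant lives in the constructors, so there is no window in which FunctionType and the chunk can disagree.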