This is an archive of the discontinued LLVM Phabricator instance.

Add argument to expectedFailureAll decorator to xfail for environment variables, and xfail tests for hard float abi and -mthumb on android arm
Needs ReviewPublic

Authored by chying on Mar 25 2016, 1:12 PM.

Details

Reviewers
tberghammer
Summary

xfail TestCallUserAnonTypedef.test for -mthumb, TestReturnValue.test_with_python for hard fp, TestLldbGdbServer.software_breakpoint_set_and_remove_work for hard fp.

Diff Detail

Event Timeline

chying updated this revision to Diff 51679. Mar 25 2016, 1:12 PM
chying retitled this revision from to Add argument to expectedFailureAll decorator to xfail for environment variables, and xfail tests for hard float abi and -mthumb on android arm.
chying updated this object.
chying added reviewers: chaoren, zturner.
chying added a subscriber: lldb-commits.

This seems very strange to me. What if someone specifies those flags on the command line in the makefile instead of via some environment variables?

It also seems like a very specialized argument that we might use once or twice ever.

+pavel and tamas in case they have some suggestions.

chaoren edited edge metadata. Mar 25 2016, 1:48 PM

This seems very strange to me. What if someone specifies those flags on the command line in the makefile instead of via some environment variables?

I don't think there's anything we can do about that unless we parse the Makefiles from within the test suite. These kinds of flags are bad to hard code into the Makefiles anyway, unless there's a way to mark a test as thumb specific or hard float specific in the test suite, in which case we'd be using *those* indicators instead to filter for xfail. For now I think this is a pretty reasonable way to discriminate on the validity of failures based on compile flags.

It also seems like a very specialized argument that we might use once or twice ever.

This applies to arbitrary compile flags. Though you probably won't ever need it for x86/x86_64, I can see it being useful for ARM and MIPS due to the sheer number of variations.

Couldn't we have a way to say in the Makefile "never use these flags"? Then the test suite could check the environment and remove them if they are present. This would allow the test to run.

Architecture, compiler, etc. are things we don't really have control over. If a flag or an environment variable is causing a test to fail, it seems like the right thing to do is not to use that flag or environment variable.

zturner added a comment. Edited Mar 25 2016, 1:56 PM

I'll still wait and see what Pavel and/or Tamas say, but if we are going to go this route, I would rather the argument be called cflags and not env_flags. Xfailing a test based on an arbitrary environment variable just seems like something we shouldn't be doing. I'd rather prime the environment however we need in order to get the test to run, than disable it and lose test coverage.

Couldn't we have a way to say in the Makefile "never use these flags"? Then the test suite could check the environment and remove them if they are present. This would allow the test to run.

Architecture, compiler, etc. are things we don't really have control over. If a flag or an environment variable is causing a test to fail, it seems like the right thing to do is not to use that flag or environment variable.

I can't find a way to pass compiler flags to the test suite through the current dotest options (please let me know if there is one), so they're passed via the CFLAGS_EXTRAS variable, which is checked and applied in the main Makefile. An alternative would be adding new dotest options that take compiler flags, and then xfailing based on a compiler flag; please let me know what you think about this approach.
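As a rough illustration of the mechanism being discussed (the decorator name `expectedFailureIfCflags` is hypothetical; only the CFLAGS_EXTRAS variable is from the review), a decorator that xfails based on a flag in CFLAGS_EXTRAS might look like:

```python
import os
import unittest


def expectedFailureIfCflags(flag):
    """Sketch (hypothetical name): xfail the decorated test when `flag`
    appears in the CFLAGS_EXTRAS environment variable, which the main
    Makefile applies to every test build per the review discussion."""
    extra_flags = os.environ.get("CFLAGS_EXTRAS", "").split()
    if flag in extra_flags:
        # Mark the test as an expected failure.
        return unittest.expectedFailure
    # Flag not present: leave the test unchanged.
    return lambda func: func
```

A test would then be annotated with, e.g., `@expectedFailureIfCflags("-mthumb")`, which is the kind of specialization the reviewers are debating.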

I'll still wait and see what Pavel and/or Tamas say, but if we are going to go this route, I would rather the argument be called cflags and not env_flags. Xfailing a test based on an arbitrary environment variable just seems like something we shouldn't be doing. I'd rather prime the environment however we need in order to get the test to run, than disable it and lose test coverage.

Yes, let's get more opinions on this.

tberghammer edited edge metadata. Mar 29 2016, 3:53 AM

You can pass extra compiler flags to the test suite with the -E option (though I haven't tested it). I would suggest using that option and then xfailing based on it (xfail if the extra flags contain -foo).

In the long term I would prefer not to add a new option to expectedFailureAll but to implement a generic functional-style solution where we can just pass in an arbitrary number of lambda functions, so that the number of arguments doesn't keep increasing; but that is completely unrelated to this patch.
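The functional-style idea above could be sketched roughly as follows (the name `expectedFailureIf` and its exact shape are assumptions, not anything in LLDB's decorators): the decorator accepts arbitrary zero-argument predicates and xfails only when all of them hold, so new conditions need no new keyword arguments.

```python
import unittest


def expectedFailureIf(*predicates):
    """Sketch of a generic xfail decorator (hypothetical name): xfail
    the test when every supplied zero-argument predicate returns True."""
    def decorator(func):
        # Guard against the no-predicate case, which would otherwise
        # xfail unconditionally (all() of an empty sequence is True).
        if predicates and all(pred() for pred in predicates):
            return unittest.expectedFailure(func)
        return func
    return decorator
```

Usage would be something like `@expectedFailureIf(lambda: "-mthumb" in os.environ.get("CFLAGS_EXTRAS", ""))`, composing conditions instead of growing expectedFailureAll's signature.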

labath edited edge metadata. Mar 29 2016, 7:29 AM

So, the way I would achieve this is to have a way to specify "I want to run the tests with hard-float" or "I want to use the thumb instruction set" when running dotest.py, and then, based on this information, set the correct compiler flags and xfail/skip tests. When we do it this way, we could actually fix some of the tests you are trying to skip here. E.g. the only reason TestLldbGdbServer is failing for thumb is that the test does not know the instruction set and so it cannot set the right kind of breakpoint. As for the dotest flags, we could either invent some new flags (--isa and --abi, perhaps?) or shove this information into --arch somehow (--arch=arm-softfp, --arch=thumb-hardfp?).
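The `--arch=arm-softfp` idea above implies splitting one string into a base architecture plus an ABI/ISA qualifier; a minimal sketch of such parsing (the function name and tuple shape are hypothetical, as neither flag exists in dotest.py) could be:

```python
def parse_arch_spec(spec):
    """Sketch (hypothetical helper): split an arch spec such as
    "arm-softfp" or "thumb-hardfp" into (base_arch, qualifier), so the
    test suite could set compiler flags and xfail/skip on either part.
    A plain spec like "x86_64" yields a qualifier of None."""
    base, _, qualifier = spec.partition("-")
    return base, qualifier or None
```

Tests could then discriminate on the qualifier (e.g. skip hard-float-only failures) rather than on raw environment variables.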

I would prefer a more general solution than the one Pavel proposed, because currently we are adding 2 more options (abi and isa), but in the future we might need a lot of other things as well, and adding a new argument for each one will be problematic (e.g. omit frame pointer, tune for a processor core, specify the fpu instruction set, etc.). I think if we have the option to specify a list of additional command line flags, then tests can change behavior based on that list, which will result in behavior similar to Pavel's suggestion.

zturner resigned from this revision.Jun 2 2016, 8:37 PM
zturner removed a reviewer: zturner.
labath resigned from this revision.Apr 20 2017, 6:13 AM