For relatively uninteresting reasons, it is not possible, when running under the debugger, to invoke the mechanism that converts a Mach exception into its BSD signal equivalent. A Mach exception carries more information than the corresponding signal does, so for most purposes we prefer to stop at the Mach exception raise. But if your application relies on a SIGBUS, SIGSEGV, etc. handler, it won't run properly under the debugger; it will just get stuck on the exception.
A workaround for this was added a while back: passing -U to debugserver masks out EXC_BAD_ACCESS, EXC_BAD_INSTRUCTION, and EXC_ARITHMETIC. But this isn't terribly convenient: you have to invoke debugserver specially, so you have to do this every time you debug, you can't use the feature when you don't control how debugserver is launched, and you can't choose which exceptions to mask out.
I made this more convenient by adding a QSetIgnoredExceptions packet to debugserver, which specifies the exceptions to exclude, so we can direct this from lldb.
Then I added a way for a Platform to provide "extra startup commands" to the remote startup sequence. There was already a way for the user to supply extra startup commands, and I could have had users put the QSetIgnoredExceptions packet in "target.extra-startup-commands", but that would be pretty undiscoverable. Instead, I added a "platform.plugin.darwin.ignored-exceptions" property that users can set with just the exception mask. PlatformDarwin gathers the value of this setting, conses up the appropriate packet, and returns it from PlatformDarwin::ExtraStartupCommands.
I also wanted to add a validator for the property. Since I don't control the property's creation (that all happens through processing the .td file), I couldn't install the validator in the OptionValueString made for the setting at construction, so I added an API to set it after the fact.
What's the point of having this? Isn't PlatformDarwin an abstract class anyway?