After D49309 it became clear that we always need a null-terminated string (for
the Python API), so we might as well change the API to accept an std::string&
instead of taking a StringRef and then always allocating a new std::string.
Reviewers: dblaikie
Seems good to me, though I haven't looked too closely and don't know this code terribly well (so you're welcome to wait if you know someone more knowledgeable who might take a look too; or, if you're fairly confident, commit and they can always follow up post-commit).
Normally this would be clearly a good thing, but the added complication here is that this function is part of a class hierarchy, and so this way you are forcing every implementation to take a std::string, even though only one of them cares about null-termination.
In performance-critical code, LLVM would use llvm::Twine as the argument type, which can avoid copies when the input string happens to be null-terminated already (via toNullTerminatedStringRef). However, this code is hardly that critical (and ScriptInterpreterPython is the only non-trivial class in the hierarchy), so I don't think it really matters what you do here.
A much simpler solution is inlined. Let me know what you think.
source/Plugins/ScriptInterpreter/Python/ScriptInterpreterPython.cpp:859
Remove everything in this patch and just do:
Py_BuildValue("(Os#)", session_dict.get(), command.data(), (int)command.size());
@clayborg Seems reasonable, even though the Python docs say we should use Py_ssize_t instead of int for the "s#" length argument. I'll update the patch.