This is an archive of the discontinued LLVM Phabricator instance.

[CMake] Cache the compiler-rt library search results
Closed · Public

Authored by phosek on Sep 28 2020, 5:41 PM.

Details

Summary

There are a lot of duplicated calls to find various compiler-rt libraries
in the builds of runtime libraries such as libunwind, libc++, libc++abi and
compiler-rt. The compiler-rt helper module already implements caching of
these results to avoid repeated Clang invocations.

This change moves the compiler-rt implementation into a shared location
and reuses it from other runtimes to reduce duplication and speed up
the build.
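
The cached lookup being shared looks roughly like the sketch below. This is an illustration of the pattern only, not the actual HandleCompilerRT.cmake contents; the function name and the exact Clang flags are assumptions.

function(cache_compiler_rt_builtins_library variable)
  # Reuse a previously cached result (from an earlier configure run or from
  # another runtime in the same build) instead of invoking Clang again.
  if(DEFINED ${variable})
    return()
  endif()
  execute_process(
    COMMAND ${CMAKE_CXX_COMPILER} ${ARGN} --rtlib=compiler-rt -print-libgcc-file-name
    RESULT_VARIABLE result
    OUTPUT_VARIABLE library
    OUTPUT_STRIP_TRAILING_WHITESPACE)
  if(result EQUAL 0 AND EXISTS "${library}")
    set(${variable} "${library}" CACHE FILEPATH "compiler-rt builtins library" FORCE)
  else()
    set(${variable} "${variable}-NOTFOUND" CACHE FILEPATH "compiler-rt builtins library" FORCE)
  endif()
endfunction()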

Diff Detail

Event Timeline

phosek created this revision.Sep 28 2020, 5:41 PM
Herald added projects: Restricted Project, Restricted Project, Restricted Project, Restricted Project. · View Herald TranscriptSep 28 2020, 5:41 PM
Herald added a reviewer: Restricted Project. · View Herald Transcript
Herald added a reviewer: Restricted Project. · View Herald Transcript
Herald added a reviewer: Restricted Project. · View Herald Transcript
Herald added subscribers: libcxx-commits, Restricted Project, dexonsmith and 2 others. · View Herald Transcript
phosek requested review of this revision.Sep 28 2020, 5:41 PM
phosek retitled this revision from [CMake] Cache compiler-rt builtin results to [CMake] Cache compiler-rt library results.Sep 28 2020, 5:42 PM

If we require LLVM for any of the projects, perhaps we should hoist the HandleCompilerRT.cmake into LLVM and avoid having to maintain the copy everywhere.

compiler-rt/cmake/Modules/HandleCompilerRT.cmake
0

Why change this to a macro? It's setting a cached variable, which should be possible from within a function, right?
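
For reference, a cache variable set inside a function is indeed visible outside of it, since the CMake cache is global. A minimal illustration, unrelated to the patch itself:

function(remember_answer)
  # Cache entries are global, so callers and later configure runs see this
  # even though it is set inside a function (unless a normal variable of the
  # same name already shadows it in the calling scope).
  set(KNOWN_ANSWER "42" CACHE STRING "previously computed result")
endfunction()

remember_answer()
message(STATUS "cached: ${KNOWN_ANSWER}")  # prints "cached: 42"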

libcxx/cmake/Modules/HandleCompilerRT.cmake
0

Please make this a function as well.

0

Where is target supposed to be defined? This is tautologically if(CMAKE_CXX_COMPILER_TARGET) no?

14

I think that this deserves a comment explaining what is going on. Using clang's runtime with gcc is possible.

libcxxabi/cmake/Modules/HandleCompilerRT.cmake
0

Please make this a function

libunwind/cmake/Modules/HandleCompilerRT.cmake
0

Same

phosek updated this revision to Diff 295150.Sep 29 2020, 4:34 PM
phosek marked 6 inline comments as done.
phosek added inline comments.
compiler-rt/cmake/Modules/HandleCompilerRT.cmake
0

This is an implementation detail which shouldn't be used from outside of this file. I was originally planning on moving it inside find_compiler_rt_library, but then I realized we may also want to use it from find_compiler_rt_dir. Maybe we should rename this function to __cache_compiler_rt_library to signal that it's internal?

If we require LLVM for any of the projects, perhaps we should hoist the HandleCompilerRT.cmake into LLVM and avoid having to maintain the copy everywhere.

That's the eventual plan but we haven't yet figured out how the sharing is going to work.

compnerd added inline comments.Sep 30 2020, 8:32 AM
libunwind/cmake/Modules/HandleCompilerRT.cmake
0

I'm still not connecting where target is set.

ldionne requested changes to this revision.Sep 30 2020, 10:42 AM

All those projects should require the whole LLVM monorepo. Let's have a single copy of HandleCompilerRT.

This revision now requires changes to proceed.Sep 30 2020, 10:42 AM

If we require LLVM for any of the projects, perhaps we should hoist the HandleCompilerRT.cmake into LLVM and avoid having to maintain the copy everywhere.

That's the eventual plan but we haven't yet figured out how the sharing is going to work.

What do you mean? Just include(...) the file from the various projects?
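
For illustration, "just include(...) the file" could be as simple as the snippet below; the relative path is an assumption about the eventual layout.

# In a runtime's CMakeLists.txt, assuming a full monorepo checkout:
list(APPEND CMAKE_MODULE_PATH
     "${CMAKE_CURRENT_SOURCE_DIR}/../cmake/Modules")  # i.e. <llvm-project>/cmake/Modules
include(HandleCompilerRT)
# The functions defined there (e.g. find_compiler_rt_library) are then
# available to every runtime without a per-project copy of the module.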

If we require LLVM for any of the projects, perhaps we should hoist the HandleCompilerRT.cmake into LLVM and avoid having to maintain the copy everywhere.

That's the eventual plan but we haven't yet figured out how the sharing is going to work.

What do you mean? Just include(...) the file from the various projects?

When migrating to the monorepo, we discussed a plan to provide slices for individual projects. While these haven't been provided officially, we already have them for our own purposes (e.g. https://llvm.googlesource.com/llvm-project/libcxx/) and they are used by projects like Chromium. A potential solution would be to extend the machinery so that the content of <llvm-project>/cmake/Modules is copied into each slice?

If we require LLVM for any of the projects, perhaps we should hoist the HandleCompilerRT.cmake into LLVM and avoid having to maintain the copy everywhere.

That's the eventual plan but we haven't yet figured out how the sharing is going to work.

What do you mean? Just include(...) the file from the various projects?

When migrating to the monorepo, we discussed a plan to provide slices for individual projects. While these haven't been provided officially, we already have them for our own purposes (e.g. https://llvm.googlesource.com/llvm-project/libcxx/) and they are used by projects like Chromium. A potential solution would be to extend the machinery so that the content of <llvm-project>/cmake/Modules is copied into each slice?

I wasn't aware of that, and libc++ / libc++abi don't support that. See http://lists.llvm.org/pipermail/libcxx-dev/2020-March/000714.html for example.

In libc++ and libc++abi, we make the assumption that we're checked out at the root of the monorepo. Please sell me the need to support something else, because the only benefit I can think of is "it's faster to clone just libc++", which is not a strong argument to justify such code duplication.

phosek updated this revision to Diff 363162.Jul 30 2021, 11:30 AM
phosek retitled this revision from [CMake] Cache compiler-rt library results to [CMake] Cache the compiler-rt library search results.
phosek edited the summary of this revision. (Show Details)

@ldionne I'm picking up this change again as I'm planning to do a larger cleanup of CMake checks in the coming weeks. I have extracted the compiler-rt logic into a shared module and reused it from the other runtimes; let me know if this looks good to you.

phosek updated this revision to Diff 363172.Jul 30 2021, 11:52 AM

@beanz @smeenai Can you take a look as well?

ldionne accepted this revision.Sep 20 2021, 2:18 PM
ldionne added a subscriber: thakis.

This LGTM, but it's going to break some people who copy only some subdirectories of llvm. I think it's unavoidable, but we'll want to let them know. I think it might apply to e.g. @thakis.

I assume we'll also want to move more common stuff to <root>/cmake in the future? A common place for CMake modules has been sorely needed for a while, thanks for getting the ball rolling!

compiler-rt/cmake/config-ix.cmake
19–23

Wasn't aware of this, it's pretty awesome.

This LGTM, but it's going to break some people who copy only some subdirectories of llvm. I think it's unavoidable, but we'll want to let them know. I think it might apply to e.g. @thakis.

Chromium is building libc++, libc++abi and libunwind outside of the monorepo, but they use their own GN build so this shouldn't affect them.

I assume we'll also want to move more common stuff to <root>/cmake in the future? A common place for CMake modules has been sorely needed for a while, thanks for getting the ball rolling!

Yes, that's the plan. In particular, after D110005 lands, I'd like to move most of the helper macros and functions from libcxx/cmake/Modules/HandleLibcxxFlags.cmake that are shared across runtimes to <root>/cmake so we don't need to keep them in sync manually.
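
As a rough idea of the kind of helper that could move, here is a generic flag-check wrapper. It is a sketch only, not the actual HandleLibcxxFlags.cmake code.

include(CheckCXXCompilerFlag)

# Add a compile option to a target only if the compiler accepts the flag.
function(add_cxx_flag_if_supported target flag)
  # Derive a valid cache variable name for the check, e.g. "-Wextra" -> SUPPORTS__Wextra.
  string(MAKE_C_IDENTIFIER "SUPPORTS_${flag}" check_name)
  check_cxx_compiler_flag("${flag}" ${check_name})
  if(${check_name})
    target_compile_options(${target} PRIVATE "${flag}")
  endif()
endfunction()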

steven_wu accepted this revision.Sep 20 2021, 3:38 PM
steven_wu added a subscriber: steven_wu.

LGTM.

I have wanted to do something like this for a long time. It would be good to add a release note for the change so that people who are distributing/building with a subset of the project directories can update their builds to use the top-level cmake directory (@ldionne that probably includes our internal build system). Some day we might be able to run the CMake configure from the top-level llvm-project directory.

This revision is now accepted and ready to land.Sep 20 2021, 3:38 PM

This LGTM, but it's going to break some people who copy only some subdirectories of llvm. I think it's unavoidable, but we'll want to let them know. I think it might apply to e.g. @thakis.

AFAIK @tstellar builds the runtimes this way too.

If all the cmake files were installed to <root>/cmake, distros like Fedora that build LLVM in standalone mode would likely package all these files into an llvm-cmake package and have the other projects depend on it. I actually think doing this will make standalone builds easier. If we do move to this approach, can we make sure the other projects that depend on them don't hardcode source paths, but have the option to include them from any directory on the system? I don't think supporting this will be too invasive.
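
A sketch of what "don't hardcode source paths" could look like for a standalone build; the variable name and default here are made up for illustration.

# Let a distro point a standalone runtime build at an installed copy of the
# shared modules instead of assuming a sibling source directory.
set(LLVM_COMMON_CMAKE_UTILS "${CMAKE_CURRENT_SOURCE_DIR}/../cmake"
    CACHE PATH "Path to the shared LLVM CMake modules")
list(APPEND CMAKE_MODULE_PATH "${LLVM_COMMON_CMAKE_UTILS}/Modules")
# A distro package could then install these files anywhere and pass
# -DLLVM_COMMON_CMAKE_UTILS=<install path> at configure time.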

smeenai accepted this revision.Sep 21 2021, 2:14 PM

This looks great, and having a top-level CMake modules directory makes a lot of sense.

There's no LICENSE.TXT file at the top level so there isn't one that clearly applies to the new cmake/. Should compiler-rt/LICENSE.TXT be copied to cmake/LICENSE.TXT?

beanz added a comment.Sep 21 2021, 3:42 PM

There's no LICENSE.TXT file at the top level so there isn't one that clearly applies to the new cmake/. Should compiler-rt/LICENSE.TXT be copied to cmake/LICENSE.TXT?

Is there any reason not to just put one at the top now?

I assume we aren't going to add a new per-project mirror for the cmake folder, so we might as well just have a license file at the root and not inside the cmake directory. Since all the licenses are the same, my understanding was that the only reason we kept them per-directory was to ensure they appeared in the mirrors.

There's no LICENSE.TXT file at the top level so there isn't one that clearly applies to the new cmake/. Should compiler-rt/LICENSE.TXT be copied to cmake/LICENSE.TXT?

Is there any reason not to just put one at the top now?

This was suggested before and there wasn't consensus for change. See https://reviews.llvm.org/D64875.

There's no LICENSE.TXT file at the top level so there isn't one that clearly applies to the new cmake/. Should compiler-rt/LICENSE.TXT be copied to cmake/LICENSE.TXT?

Is there any reason not to just put one at the top now?

This was suggested before and there wasn't consensus for change. See https://reviews.llvm.org/D64875.

I don't mind adding a license file, but we also seem to be missing one for other top-level directories that were added recently, like cross-project-tests, runtimes or utils, so I wonder if this is something that needs to be addressed more broadly?

beanz added a comment.Sep 29 2021, 6:25 AM

I don't mind adding a license file, but we also seem to be missing one for other top-level directories that were added recently, like cross-project-tests, runtimes or utils, so I wonder if this is something that needs to be addressed more broadly?

Totally agree with @phosek here. A lot has changed since 2019. One big point is that we are making it so you can't build some parts of the project without a checkout of the monorepo.

Is there any objection to landing this change as is and addressing the licensing aspect later in a separate change?

Is there any objection to landing this change as is and addressing the licensing aspect later in a separate change?

IMO that's reasonable, let's not hold this change back -- the lack of a top-level license in some subdirectories is a pre-existing condition, so when we decide what to do about that issue, we can solve it for all subdirectories.

I'd like to see CI passing before we ship this though.

phosek updated this revision to Diff 377454.Oct 6 2021, 12:56 AM
phosek added a comment.Oct 6 2021, 2:11 AM

Is there any objection to landing this change as is and addressing the licensing aspect later in a separate change?

IMO that's reasonable, let's not hold this change back -- the lack of a top-level license in some subdirectories is a pre-existing condition, so when we decide what to do about that issue, we can solve it for all subdirectories.

I'd like to see CI passing before we ship this though.

Done. I believe I've also addressed PR51389, which was the reason for the failing build on Darwin.

I suspect this is going to impact llvm/utils/release/export.sh, which creates individual tarballs based on git directory content. If we were using cpack there, this wouldn't be an issue, though. @tstellar, can you confirm?

phosek added a comment.Oct 8 2021, 2:43 PM

I suspect this is going to impact llvm/utils/release/export.sh, which creates individual tarballs based on git directory content. If we were using cpack there, this wouldn't be an issue, though. @tstellar, can you confirm?

Is the expectation that these tarballs are separately buildable? Is there any kind of checking done to ensure that it's indeed the case? We could update the script to also create a tarball for the top-level CMake directory.

h-vetinari added a comment.EditedOct 11 2021, 8:04 PM

Given the cross-review of the respective authors, I'm assuming this is compatible / aligned with https://reviews.llvm.org/D111356? I ask because despite the mutual review, this wasn't mentioned in the email to LLVM-dev about the latter.

Given the cross-review of the respective authors, I'm assuming this is compatible / aligned with https://reviews.llvm.org/D111356? I ask because despite the mutual review, this wasn't mentioned in the email to LLVM-dev about the latter.

I sent a separate RFC about this change to llvm-dev last week. These two changes are orthogonal although they both fit our shared goal of cleaning up and simplifying the build.

Are we waiting for something to land this? If not, let's do this!

Are we waiting for something to land this? If not, let's do this!

I was looking at the script for building distribution archives and zorg, but those can be updated separately (we'll need to do it for the runtimes/ directory as well).

This revision was automatically updated to reflect the committed changes.

I think this creates a build bustage when llvm is built with the following flags:

cmake -GNinja \
  -DCMAKE_C_COMPILER=/usr/bin/gcc -DCMAKE_CXX_COMPILER=/usr/bin/g++ \
  -DCMAKE_ASM_COMPILER=/usr/bin/gcc -DCMAKE_LINKER=/usr/bin/ld -DCMAKE_AR=/usr/bin/ar \
  -DCMAKE_C_FLAGS= -DCMAKE_CXX_FLAGS= -DCMAKE_ASM_FLAGS= \
  "-DCMAKE_EXE_LINKER_FLAGS=-Wl,-Bsymbolic-functions -fuse-ld=gold -Wl,--gc-sections -Wl,--icf=safe" \
  "-DCMAKE_SHARED_LINKER_FLAGS=-Wl,-Bsymbolic-functions -fuse-ld=gold -Wl,--gc-sections -Wl,--icf=safe" \
  -DCMAKE_BUILD_TYPE=Release \
  -DCMAKE_INSTALL_PREFIX=/builds/worker/fetches/llvm-project/build/stage1/clang \
  -DLLVM_TARGETS_TO_BUILD=X86 -DLLVM_ENABLE_ASSERTIONS=OFF -DLLVM_TOOL_LIBCXX_BUILD=ON \
  -DLLVM_ENABLE_BINDINGS=OFF -DCLANG_REPOSITORY_STRING=taskcluster-UynxIUZSRCaRcXEcj3dHqA \
  "-DLLVM_ENABLE_PROJECTS=clang;compiler-rt" \
  -DLLVM_INCLUDE_TESTS=OFF -DLLVM_TOOL_LLI_BUILD=OFF \
  -DCOMPILER_RT_BUILD_SANITIZERS=OFF -DCOMPILER_RT_BUILD_XRAY=OFF \
  -DCOMPILER_RT_BUILD_MEMPROF=OFF -DCOMPILER_RT_BUILD_LIBFUZZER=OFF \
  -DLLVM_EXPERIMENTAL_TARGETS_TO_BUILD=WebAssembly -DLLVM_LINK_LLVM_DYLIB=ON \
  -DCMAKE_RANLIB=/usr/bin/ranlib \
  /builds/worker/fetches/llvm-project/llvm

The issues that occur are like:

Determining if the include file histedit.h exists failed with the following output:

Change Dir: /builds/worker/fetches/llvm-project/build/stage1/build/CMakeFiles/CMakeTmp



Run Build Command(s):/usr/bin/ninja cmTC_14991 && [1/2] Building C object CMakeFiles/cmTC_14991.dir/CheckIncludeFile.c.o

FAILED: CMakeFiles/cmTC_14991.dir/CheckIncludeFile.c.o 

/usr/bin/gcc    -o CMakeFiles/cmTC_14991.dir/CheckIncludeFile.c.o -c CheckIncludeFile.c

CheckIncludeFile.c:1:10: fatal error: histedit.h: No such file or directory

    1 | #include <histedit.h>

      |          ^~~~~~~~~~~~

compilation terminated.

ninja: build stopped: subcommand failed.

This is also causing build failures in our ToT builds.

phosek@, can you please revert?
You might be able to see the logs here:
https://logs.chromium.org/logs/chromeos/buildbucket/cr-buildbucket/8832772415946723489/+/steps/update_sdk/0/steps/call_chromite.api.SdkService_Update/0/steps/call_build_API_script/0/stdout

The package is building libc++abi for 32-bit and 64-bit x86.

Note that for 32-bit, compiler-rt detection is going wrong:
libcxxabi-13.0_pre428724-r4: -- Found compiler-rt builtins library: /usr/lib64/clang/14.0.0/lib/linux/libclang_rt.builtins-x86_64.a .

Nowhere is 32-bit detected correctly.

phosek reopened this revision.Oct 21 2021, 10:34 AM

Thanks for the update. I have reverted the change and I'll try to reproduce and debug the issue.

This revision is now accepted and ready to land.Oct 21 2021, 10:34 AM

@manojgupta Where can I find the cache and toolchain files used in your build? I'm not able to reproduce the issue.

I'll send the cache and other details by email.

phosek updated this revision to Diff 382531.Oct 26 2021, 11:43 PM

Properly handle the flags to address the issue seen on some bots when cross-compiling.
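
Roughly, "handle the flags" means forwarding the language flags into the query so that Clang reports the runtime for the configuration actually being built. An illustrative sketch, not the committed code:

# CMAKE_CXX_FLAGS is a single string; split it into a list first, otherwise
# execute_process would pass it to Clang as one mangled argument.
separate_arguments(cxx_flags UNIX_COMMAND "${CMAKE_CXX_FLAGS}")
execute_process(
  COMMAND ${CMAKE_CXX_COMPILER} ${cxx_flags} --rtlib=compiler-rt -print-libgcc-file-name
  OUTPUT_VARIABLE builtins_library
  OUTPUT_STRIP_TRAILING_WHITESPACE)
# With "-m32" among the flags this reports the i386 builtins archive rather
# than the x86_64 one seen in the 32-bit log earlier in this thread.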

This revision was automatically updated to reflect the committed changes.

Properly handle the flags to address the issue seen on some bots when cross-compiling.

I don't know if you noticed my previous comment but this again broke the build.

Properly handle the flags to address the issue seen on some bots when cross-compiling.

I don't know if you noticed my previous comment but this again broke the build.

I tried reproducing the issue locally using the command you provided but I'm not seeing the same error. Would it be possible to share CMakeFiles/CMakeError.log from your build?

Properly handle the flags to address the issue seen on some bots when cross-compiling.

I don't know if you noticed my previous comment but this again broke the build.

I tried reproducing the issue locally using the command you provided but I'm not seeing the same error. Would it be possible to share CMakeFiles/CMakeError.log from your build?

Thanks for looking into this. Please see this.

phosek added a comment.Nov 1 2021, 9:50 AM

Properly handle the flags to address the issue seen on some bots when cross-compiling.

I don't know if you noticed my previous comment but this again broke the build.

I tried reproducing the issue locally using the command you provided but I'm not seeing the same error. Would it be possible to share CMakeFiles/CMakeError.log from your build?

Thanks for looking into this. Please see this.

Thanks, I looked at the CMakeError.log file but there's nothing out of the ordinary. I don't think that the failing histedit.h check is a new issue; that seems like the expected behavior.

Can I see the full build log somewhere?

Abpostelnicu added a comment.EditedNov 4 2021, 7:30 AM

Properly handle the flags to address the issue seen on some bots when cross-compiling.

I don't know if you noticed my previous comment but this again broke the build. It's a tar of the build directory.

I tried reproducing the issue locally using the command you provided but I'm not seeing the same error. Would it be possible to share CMakeFiles/CMakeError.log from your build?

Thanks for looking into this. Please see this.

Thanks, I looked at the CMakeError.log file but there's nothing out of the ordinary. I don't think that the failing histedit.h check is a new issue; that seems like the expected behavior.

Can I see the full build log somewhere?

Please let me know if this helps https://drive.google.com/file/d/1nzKVJOnYWuXKH1nJufClXCq5QHeJBlVp/view?usp=sharing

phosek added a comment.Nov 4 2021, 9:58 AM

Properly handle the flags to address the issue seen on some bots when cross-compiling.

I don't know if you noticed my previous comment but this again broke the build. It's a tar of the build directory.

I tried reproducing the issue locally using the command you provided but I'm not seeing the same error. Would it be possible to share CMakeFiles/CMakeError.log from your build?

Thanks for looking into this. Please see this.

Thanks, I looked at the CMakeError.log file but there's nothing out of the ordinary. I don't think that the failing histedit.h check is a new issue; that seems like the expected behavior.

Can I see the full build log somewhere?

Please let me know if this helps https://drive.google.com/file/d/1nzKVJOnYWuXKH1nJufClXCq5QHeJBlVp/view?usp=sharing

I was hoping to see the output from CMake which is not included.

Properly handle the flags to address the issue seen on some bots when cross-compiling.

I don't know if you noticed my previous comment but this again broke the build. It's a tar of the build directory.

I tried reproducing the issue locally using the command you provided but I'm not seeing the same error. Would it be possible to share CMakeFiles/CMakeError.log from your build?

Thanks for looking into this. Please see this.

Thanks, I looked at the CMakeError.log file but there's nothing out of the ordinary. I don't think that the failing histedit.h check is a new issue; that seems like the expected behavior.

Can I see the full build log somewhere?

Please let me know if this helps https://drive.google.com/file/d/1nzKVJOnYWuXKH1nJufClXCq5QHeJBlVp/view?usp=sharing

I was hoping to see the output from CMake which is not included.

Sure, is this something that helps https://paste.mozilla.org/yxUYUBk3 ?

phosek added a comment.EditedNov 9 2021, 1:56 AM

Sure, is this something that helps https://paste.mozilla.org/yxUYUBk3 ?

Yes, thank you. The issue is this:

CMake Error at projects/compiler-rt/cmake/Modules/AddCompilerRT.cmake:3 (include):
include could not find load file:

/builds/worker/fetches/llvm-project/llvm/projects/compiler-rt/../cmake/Modules/HandleCompilerRT.cmake
Call Stack (most recent call first):
projects/compiler-rt/lib/CMakeLists.txt:4 (include)

It looks to me like you're using the pre-monorepo layout where you move compiler-rt to llvm/projects, is that correct?

We could make it work, but I'm not sure if we should. The monorepo layout is the only supported one and we've already discussed the plans to remove the non-monorepo layout support to simplify our build.
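
"Making it work" would roughly mean falling back to a per-project copy when the shared directory is not at the expected relative path. A sketch of the idea only, not a committed change:

# Hypothetical fallback for layouts where compiler-rt sits under llvm/projects
# and <llvm-project>/cmake/Modules is therefore not one directory up.
set(shared_modules "${CMAKE_CURRENT_SOURCE_DIR}/../cmake/Modules")
if(EXISTS "${shared_modules}/HandleCompilerRT.cmake")
  list(APPEND CMAKE_MODULE_PATH "${shared_modules}")
else()
  list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake/Modules")
endif()
include(HandleCompilerRT)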

Sure, is this something that helps https://paste.mozilla.org/yxUYUBk3 ?

Yes, thank you. The issue is this:

CMake Error at projects/compiler-rt/cmake/Modules/AddCompilerRT.cmake:3 (include):
include could not find load file:

/builds/worker/fetches/llvm-project/llvm/projects/compiler-rt/../cmake/Modules/HandleCompilerRT.cmake
Call Stack (most recent call first):
projects/compiler-rt/lib/CMakeLists.txt:4 (include)

It looks to me like you're using the pre-monorepo layout where you move compiler-rt to llvm/projects, is that correct?

Yes, you are correct.

We could make it work, but I'm not sure if we should. The monorepo layout is the only supported one and we've already discussed the plans to remove the non-monorepo layout support to simplify our build.

It would be better to make it work; removing support for the old build system will break many toolchains.

phosek added a comment.Nov 9 2021, 2:11 AM

It looks to me like you're using the pre-monorepo layout where you move compiler-rt to llvm/projects, is that correct?

Yes, you are correct.

Is there any particular reason for doing this rather than simply using the monorepo layout?

We could make it work, but I'm not sure if we should. The monorepo layout is the only supported one and we've already discussed the plans to remove the non-monorepo layout support to simplify our build.

It would be better to make it work; removing support for the old build system will break many toolchains.

So far you're the only one who has reported being affected by this change, which to me suggests that there may not be many projects using the old layout.

When LLVM adopted the monorepo we made no guarantees that we would keep the old layout working.

We're also planning larger build changes after LLVM 14 which will likely disrupt you, see https://lists.llvm.org/pipermail/llvm-dev/2021-October/153238.html.

It looks to me like you're using the pre-monorepo layout where you move compiler-rt to llvm/projects, is that correct?

Yes, you are correct.

Is there any particular reason for doing this rather than simply using the monorepo layout?

We could make it work, but I'm not sure if we should. The monorepo layout is the only supported one and we've already discussed the plans to remove the non-monorepo layout support to simplify our build.

It would be better to make it work; removing support for the old build system will break many toolchains.

So far you're the only one who has reported being affected by this change, which to me suggests that there may not be many projects using the old layout.

When LLVM adopted the monorepo we made no guarantees that we would keep the old layout working.

True but this is the first change that breaks it.

We're also planning larger build changes after LLVM 14 which will likely disrupt you, see https://lists.llvm.org/pipermail/llvm-dev/2021-October/153238.html.

I think we are getting a bit ahead of ourselves; there is time until then to make our adjustments.

So far you're the only one who has reported being affected by this change, which to me suggests that there may not be many projects using the old layout.

When LLVM adopted the monorepo we made no guarantees that we would keep the old layout working.

True but this is the first change that breaks it.

We're also planning larger build changes after LLVM 14 which will likely disrupt you, see https://lists.llvm.org/pipermail/llvm-dev/2021-October/153238.html.

I think we are getting a bit ahead of ourselves; there is time until then to make our adjustments.

FWIW I 100% support @phosek here. We are making improvements to our build system that allow us to reduce maintenance costs, increase the flexibility of our build system (e.g. for new platforms), reduce the room for mistakes, etc. As is often the case with this kind of change, it requires cooperation from downstream people who use our code.

We have done everything we can to communicate these changes to downstream users and we are giving them a generous grace period; now it is their (your) responsibility to act on that notification. Concretely, please:

  1. Make sure you are building in a vanilla monorepo layout.
  2. Make sure you use one of the documented ways to build the runtimes: https://libcxx.llvm.org/BuildingLibcxx.html (see the example invocation after this comment)

If you don't do that, LLVM 15 is going to break you, and we will not revert that.
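
For reference, a minimal monorepo runtimes configuration along the lines of that documentation; the exact options depend on your setup.

cmake -G Ninja -S runtimes -B build \
    -DLLVM_ENABLE_RUNTIMES="libcxx;libcxxabi;libunwind"
ninja -C build cxx cxxabi unwind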