[question] VirtualRunEnv does not take tool_requires into account?
What is your question?
Hi, I am currently migrating to Conan 2 (i.e. 2.0.12) and facing issues with the VirtualRunEnv generator when using gcc as a Conan package. The compiler is obviously needed in the buildenv, but also in the runenv, since it also provides libs such as libstdc++.so.
My understanding was that in Conan 2 the generator automatically takes care of propagating libdirs to LD_LIBRARY_PATH in the runenv. But this is apparently not the case for tool_requires, although I expected this to be the reason why tool_requires sets the trait run=True.
Specifically I first encountered the issue when building geographiclib (https://github.com/conan-io/conan-center-index/blob/master/recipes/geographiclib/all/test_package/conanfile.py). I provide the compiler dependency using a profile like this:
[settings]
compiler=gcc
compiler.version=11.4
compiler.libcxx=libstdc++11
[tool_requires]
gcc/11.4.0@me/testing
Building geographiclib works fine, but executing the test package leads to the following error:
======== Testing the package: Executing test ========
geographiclib/1.52@me/testing (test package): Running test()
geographiclib/1.52@me/testing (test package): RUN: ./test_package
./test_package: /lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.29' not found (required by /home/me/.conan2/p/b/geogr35b2f1e80aeb9/p/lib/libGeographic.so.19)
ERROR: geographiclib/1.52@me/testing (test package): Error in test() method, line 26
self.run(bin_path, env="conanrun")
ConanException: Error 1 while executing
Obviously the libdir of gcc is not part of LD_LIBRARY_PATH, so at runtime the system libstdc++.so is found instead of the one provided by the Conan package, and it does not match the version used during compilation.
This can be mitigated by hacking an explicit runtime dependency into the test package like self.requires("gcc/11.4.0@me/stable", run=True). Then executing the test executable in the conanrun env works. But this is just a hack.
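For illustration, here is a minimal sketch of that hack in a test package recipe (modeled on the usual Conan 2 test_package pattern; the CMake build parts are omitted for brevity):

from conan import ConanFile
from conan.tools.build import can_run

class TestPackageConan(ConanFile):
    settings = "os", "arch", "compiler", "build_type"
    generators = "VirtualRunEnv"

    def requirements(self):
        self.requires(self.tested_reference_str)
        # The hack: an explicit host-context requirement on the compiler
        # package, so that VirtualRunEnv puts its libdirs on LD_LIBRARY_PATH
        self.requires("gcc/11.4.0@me/stable", run=True)

    # build() omitted for brevity (the usual CMake steps)

    def test(self):
        if can_run(self):
            self.run("./test_package", env="conanrun")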
From looking at the code in conan/tools/env/virtualrunenv.py in the environment() method, I see that only host and test dependencies are taken into account for the env:
host_req = self._conanfile.dependencies.host
test_req = self._conanfile.dependencies.test
for require, dep in list(host_req.items()) + list(test_req.items()):
    if dep.runenv_info:
        runenv.compose_env(dep.runenv_info)
    if require.run:  # Only if the require is run (shared or application to be run)
        _os = self._conanfile.settings.get_safe("os")
        runenv.compose_env(runenv_from_cpp_info(dep, _os))
But those host and test properties filter for build=False, which excludes tool_requires, since those use build=True. See conans/model/dependencies.py line 138:
@property
def host(self):
    return self.filter({"build": False, "test": False, "skip": False})

@property
def test(self):
    # Not needed a direct_test because they are visible=False so only the direct consumer
    # will have them in the graph
    return self.filter({"build": False, "test": True, "skip": False})
Now, is there a misunderstanding on my side? I would have expected that since tool_requires sets run=True, those dependencies should also be reflected in the VirtualRunEnv. Generally I think all dependencies with run=True should be reflected in the VirtualRunEnv.
Indeed, changing the code in conan/tools/env/virtualrunenv.py to something like this makes it work as I expected:
host_req = self._conanfile.dependencies.host
test_req = self._conanfile.dependencies.test
run_req = self._conanfile.dependencies.filter({"run": True})
for require, dep in list(host_req.items()) + list(test_req.items()) + list(run_req.items()):
    if dep.runenv_info:
        runenv.compose_env(dep.runenv_info)
    if require.run:  # Only if the require is run (shared or application to be run)
        _os = self._conanfile.settings.get_safe("os")
        runenv.compose_env(runenv_from_cpp_info(dep, _os))
What is the best practice here for dependencies needed during build and run?
Have you read the CONTRIBUTING guide?
- [X] I've read the CONTRIBUTING guide
Hi @pklip
Thanks for your question.
I think your understanding and analysis are correct, and it is expected that VirtualRunEnv doesn't take tool_requires into account.
The tool_requires are tools that are exclusively necessary at build time, like cmake or meson. Once the binaries are built, they are no longer required.
When a tool_requires defines run=True, it indicates that the package contains binaries that need to run, but those binaries only exist at build time anyway, and they will execute only at build time.
As such, tool_requires are not included in the VirtualRunEnv by default, to avoid potential collisions of common shared libraries from the build and host contexts. This is basically the major difference between VirtualRunEnv (which only contains dependencies from the "host" context, never from the "build" context or tool_requires) and VirtualBuildEnv (which contains environment definitions mostly from the "build" context, i.e. from tool_requires). If VirtualRunEnv added information from tool_requires and the build context, there would basically be no difference with VirtualBuildEnv. This was more or less the legacy design before the host/build contexts existed, and the problems found there were what originated this split between build and run environments.
Now the question is what to do with toolchains. A toolchain that contains libraries necessary at runtime is an exception to this, not covered by the default Conan behavior. There is no evident default for these libraries, like libstdc++.so: most applications don't expect to have to set up any path environment variable at runtime, or any environment at all, because that shared library should already be in the system, not be a part of the application. They are not like other shared libraries at the application level, which will be distributed together with the application, in the same way that it is not expected that such a library will have a find_package(xxxx) in CMake in order to link and use it. Conan virtualenvs mostly try to account for the application layer.
If you could please elaborate a bit more on how you see the runtime, distribution and deployment of your application from Conan packages, that could be useful. There could be different possibilities, like splitting the library into its own package (for the host context, instead of build), or maybe adding an explicit "host" requires to the same package as the tool_requires.
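For example, a minimal sketch of that last possibility, assuming the gcc reference from this thread (the package name and the headers=False/libs=False traits are my assumptions, to pull in only the runtime libraries):

from conan import ConanFile

class MyPkgConan(ConanFile):
    name = "mypkg"  # hypothetical consumer package
    version = "0.1"
    settings = "os", "arch", "compiler", "build_type"

    def requirements(self):
        # Host-context requirement on the same compiler package, so that
        # VirtualRunEnv adds its libdirs (libstdc++.so) to LD_LIBRARY_PATH
        self.requires("gcc/11.4.0@me/testing", run=True, headers=False, libs=False)

    def build_requirements(self):
        # Build-context requirement: the compiler actually used to build
        self.tool_requires("gcc/11.4.0@me/testing")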
Hi @memsharded, I recently ran into the same/similar issues and figured I would piggy-back on this issue.
If you could please elaborate a bit more on how you see the runtime, distribution and deployment of your application from Conan packages, that could be useful. There could be different possibilities, like splitting the library into its own package (for the host context, instead of build), or maybe adding an explicit "host" requires to the same package as the tool_requires.
Our context is that we have an internal conan index repository where we compile various third-party packages used in our builds. What I'd like to do is compile a custom GCC version with Conan, build the rest of our packages using this GCC package, and then also use it when compiling, testing and packaging our product.
Packaging our custom GCC version as a Conan package has worked fine so far. However, when adding it as a [tool_requires] dependency when creating the other Conan packages, I run into the same issue as described above: VirtualRunEnv does not contain the environment variables from the GCC package, so the package tests fail (more specifically, we grab the system libstdc++.so instead of the one we have as part of the package).
When compiling and testing our main product I can "avoid" this problem by sourcing the VirtualBuildEnv via conanbuild.sh rather than conanrun.sh, though I haven't managed to apply a similar workaround when creating our various third-party packages. To be clear, we do not plan to package the main product as a Conan package; it simply consumes other Conan packages as part of its build.
Hopefully this provides some more context. If you want me to expand upon some aspect(s) in order for you to provide suggestions, I'm happy to do that.
Hi @memsharded ,
Thanks for clarifying the strict separation of the build and host contexts and how tool_requires only affect the build context. I think I now understand the consequences of the build and run traits (and their combinations).
I admit GCC (or any other compiler) is a special package here, since you need the bins and libs during compilation, but also at least the libs at runtime (and they should probably be the same, to avoid weird conflicts/bugs). I would claim it is not uncommon to want to try different compilers and compiler versions, which (in theory) should be simple with a package manager like Conan. In the real world there are lots of reasons why you might be stuck with an old system compiler and still want to build your application independently of it. And with newer compilers introducing newer libstdc++ and GLIBCXX versions, one cannot simply rely on "everything will already be there on the system", because on older systems it won't.
My runtime context seems pretty similar to @Samev's use case. Currently I build my main application with a custom cmake toolchain (no Conan build involved), but I depend on Conan for fetching all the dependencies (conan install) and making them available through the respective environments for building/runtime. I handle different compilers via different Conan profiles which specify the tool_requires, e.g. gcc or llvm. Now, I could add an explicit runtime dependency like self.requires("gcc/11.4.0@me/stable", run=True), but that would need to be specified in basically every single recipe, since profiles can only declare tool_requires (i.e. no plain requires), right? So there is no way via a profile or similar to keep the tool_requires and the corresponding "run requires" in sync. This problem would also persist if one split the package into a "build-time gcc" and a "run-time gcc" package.
Is exchanging compilers via Conan really uncommon in your opinion? I thought it is a pretty big advantage to be able to do so and not be stuck with your system compiler (also for testing purposes). Fun fact: the gcc recipe from conan-center-index has also been broken for at least 9 months due to conflicting dependencies.
Now, I could add an explicit runtime dependency like self.requires("gcc/11.4.0@me/stable", run=True), but that would need to be specified in basically every single recipe, since profiles can only declare tool_requires (i.e. no plain requires), right?
You might, with a small cheat, avoid that: keep the tool_requires in the profile, and just run a conan install --requires=gcc/... -g VirtualRunEnv to get the extra env script that injects that runtime? Might need some exploration.
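For illustration, the suggested workaround might look like this on Linux (the full gcc reference and the application name are placeholders):

conan install --requires=gcc/11.4.0@me/testing -g VirtualRunEnv
source conanrun.sh             # prepends the gcc package libdirs to LD_LIBRARY_PATH
./my_app                       # now resolves the packaged libstdc++.so
source deactivate_conanrun.sh  # restores the previous environment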
In any case, we are investigating the "toolchain-requires" concept again, trying to improve this use case a bit more. The discussion is just starting, but at least we have the issue in mind and we will be trying to improve it.
Is exchanging compilers via Conan really uncommon in your opinion? I thought it is a pretty big advantage to be able to do so and not be stuck with your system compiler (also for testing purposes).
Not uncommon at all. We know there are many users packaging tools including compilers in their packages to be used as tool_requires.
Fun fact: the gcc recipe from conan-center-index has also been broken for at least 9 months due to conflicting dependencies.
Yes, we are aware of the challenges of that package. Packaging a compiler is one thing; bootstrapping the build of a compiler from source is a different one. ConanCenter should probably not have that compiler package in the first place, at least in its current form. It is also very different to support a package for your organization than to provide a package for the wider Conan community.
You might, with a small cheat, avoid that: keep the tool_requires in the profile, and just run a conan install --requires=gcc/... -g VirtualRunEnv to get the extra env script that injects that runtime? Might need some exploration.
Thank you @memsharded ! This might in fact be a workaround for now. I will give it a try.
Thank you also for giving the "toolchain-requires" concept some more thought. I believe this is a fundamental issue in the C/C++ world and Conan needs to address it in a more user-friendly way.
I didn't mean to be rude or anything with that comment about the broken gcc package. It just startled me that this package is apparently barely used. Or probably most people just take that as a starting point and "fix" the official recipe locally to their custom needs (like I did)...
You might, with a small cheat, avoid that: keep the tool_requires in the profile, and just run a conan install --requires=gcc/... -g VirtualRunEnv to get the extra env script that injects that runtime? Might need some exploration.
This works for providing the appropriate runtime environment for my final end product. But it does not fix the problem in the test_packages of Conan packages, which try to build and execute code that needs the libs from gcc at runtime. I don't see a way to inject requires there without hardcoding them in the conanfile of the test_package.
Related feature request: https://github.com/conan-io/conan/issues/13533
@memsharded Firstly, apologies for resurrecting an old thread, but it's identical to an issue I'm trying to solve.
It seems it's been almost 2 years since this discussion started. We have recently run into this exact issue: we are trying to use tool_requires in profiles to specify our toolchain, but now have a shared library path issue.
Do you happen to know whether work on a toolchain-requires type feature is planned? Specifying a toolchain via profiles is extremely convenient and allows us to leverage Conan repositories to store all these different toolchains and the respective builds made with them. It seems several other people also use Conan this way, so nicer support for such cases would be warranted.
This works well in Conan 1 with [build_requires] and has been the way we have managed our toolchains thus far; it's a shame that it doesn't behave well in Conan 2, which is causing us issues in migrating.
Would be good to hear your thoughts on this.
Hi all,
There are a couple of aspects I'd like to highlight regarding the behavior and potential issues.
- When a package is a tool_requires of another package, its binary is not always necessary, and it sometimes will not be retrieved
- For example, if a package myapp/0.1 contains a self.tool_requires("cmake/3.31.0"), the cmake/3.31.0 package binary will not even be retrieved unless the consumer package needs to be built from source
- That means a regular installation of myapp/0.1, followed by trying to execute it, will fail no matter what the VirtualRunEnv does, because the tool_requires cmake/3.31.0 has not even been downloaded from the server! If running the application depended on any shared libraries inside the cmake/3.31.0 package, it would fail
I think this behavior is expected in general: users really don't want installing some application executable (just a few MBs of binary) to force downloading the full toolchain binaries (typically very large, many hundreds of MBs) just because the application depends on some shared library inside the toolchain.
This works well in Conan 1 with [build_requires] and has been the way we have managed our toolchains thus far; it's a shame that it doesn't behave well in Conan 2, which is causing us issues in migrating.
This might not be as much a Conan1->Conan2 migration as a "1 profile"->"2 profiles" migration, which was introduced in Conan 1 a long time ago and recommended since then. Were you using 2 profiles in Conan 1? That is what introduces the separation of the "build" and "host" contexts and changes how the environment is managed. Having everything in 1 profile didn't have this issue, but it also had many limitations that were fixed by the 2-profiles model, which has proven over the last years to work far better in the vast majority of cases.
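For reference, the 2-profiles model just means passing an explicit host and build profile on every command (the profile names here are placeholders):

conan create . --profile:host=linux-gcc11 --profile:build=default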
We have wanted to explore the concept of "toolchain" packages since then, but unfortunately it hasn't been possible to prioritize it. Maybe it will be possible in the near future.
Still, the major challenge is that it is not clear what should be done. Toolchains come in many forms and are used in many ways. It is not only about the runtime, but also about what information, if any, should be propagated at build time regarding libraries.
I'll try to push the toolchain concept a bit in the next iterations, let's see what is possible. Thanks for the feedback!