
[question] full_mode on embedded deps

Open lo1ol opened this issue 1 month ago • 10 comments

What is your question?

Hi!

We ran into a problem.

In newer versions of our custom conanfile extension (a python_requires_extend base class) we stopped setting package_id_mode manually for dependencies declared with visible=False. In the old versions it was something like:

# Hash only the recipe for non-visible requires
visible = kwargs.get("visible", True)
if "package_id_mode" not in kwargs and not visible:
    kwargs["package_id_mode"] = "revision_mode"

Now we face the following problem:

  1. We have Debug and Release versions of pkg_a (a static-library).
  2. We have a Release version of pkg_b (an application). It requires pkg_a.
  3. I try to build a Debug version of pkg_c. It requires pkg_b. I have a compatibility setting that allows using the Release version of pkg_b instead of the Debug one:
cat > conanfileA.py <<EOF
from conan import ConanFile

class conanfileA(ConanFile):
    name="pkg_a"
    version="0.0.1"
    package_type="static-library"

    settings = "os", "compiler", "arch", "build_type"
EOF

cat > conanfileB.py <<EOF
from conan import ConanFile

class conanfileB(ConanFile):
    name="pkg_b"
    version="0.0.1"
    package_type="application"

    settings = "os", "compiler", "arch", "build_type"

    def requirements(self):
        self.requires("pkg_a/0.0.1", visible=False)
EOF

cat > conanfileC.py <<EOF
from conan import ConanFile

class conanfileC(ConanFile):
    name="pkg_c"
    version="0.0.1"

    settings = "os", "compiler", "arch"

    def requirements(self):
        self.requires("pkg_b/0.0.1")
EOF

conan export-pkg conanfileA.py -s build_type=Debug
conan export-pkg conanfileA.py -s build_type=Release
conan export-pkg conanfileB.py -s build_type=Release

# -s pkg_b/*:build_type=Release -- emulate compatibility
conan install -s build_type=Debug -s "pkg_b/*:build_type=Release" conanfileC.py

With the default package_id_embed_mode=full_mode, pkg_b couldn't be found because its hash was computed against the Debug version of pkg_a.

We don't want to change the default embed mode in the global config, because it affects the package_id of existing packages that don't specify package_id_mode manually.

Could you recommend what to do in this case? It looks like we would need to rebuild all our new packages that don't set package_id_mode explicitly.

Is there a better option in this case? Or at least something that would help us in the future?

We are thinking about adding package_id_embed_mode = package_id_non_embed_mode = "revision_mode" to our base conanfile, to stop depending on the default values in the global config.
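Following the style of the reproduction scripts below, that base-class idea could be sketched like this (the file name base_conanfile.py and class name BaseConanFile are made up for illustration; package_id_embed_mode and package_id_non_embed_mode are real Conan 2 recipe attributes):

```shell
cat > base_conanfile.py <<EOF
from conan import ConanFile

class BaseConanFile(ConanFile):
    # Pin both modes in the base class so package_id no longer
    # depends on the defaults from the global config
    package_id_embed_mode = "revision_mode"
    package_id_non_embed_mode = "revision_mode"
EOF
```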

conan 2.22

Have you read the CONTRIBUTING guide?

  • [x] I've read the CONTRIBUTING guide

lo1ol avatar Nov 13 '25 15:11 lo1ol

Hi @lo1ol

Thanks for your question.

Let me try to summarize to see if I understood the scenario correctly:

When the package for pkg_b is built with

conan export-pkg conanfileB.py -s build_type=Release

that creates a package with the executable app_b inside, which has statically linked the pkg_a static library lib_a into it, this lib_a also being in Release mode.

Now we are doing the command:

conan install -s build_type=Debug -s "pkg_b/*:build_type=Release" conanfileC.py

This command means:

  • I want all dependencies, including pkg_a to be in Debug mode
  • I want specifically the application inside pkg_b to be in Release mode.
  • So I want an app_b application executable that statically links a Debug lib_a from pkg_a

If that is what you want, then it is expected to give a "missing binary" for pkg_b, because that binary for pkg_b has not been built yet. That would require creating pkg_b like:

conan export-pkg conanfileB.py -s build_type=Release -s pkg_a/*:build_type=Debug

Isn't this what you want?

Then, there are different possibilities depending on the intent. What is the relationship of pkg_c and pkg_b with a regular requires()? In general this is not defined from a C/C++ building point of view, because applications are not artifacts that can be included or linked in a C/C++ build. A different story is if you do a tool_requires("pkg_b/version") in pkg_c, because that would automatically put pkg_b in the "build" context and use the "build" settings.
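Following the heredoc style used above, that tool_requires alternative could be sketched as (the file name conanfileC_tool.py is made up for illustration):

```shell
cat > conanfileC_tool.py <<EOF
from conan import ConanFile

class conanfileC(ConanFile):
    name="pkg_c"
    version="0.0.1"

    settings = "os", "compiler", "arch"

    def build_requirements(self):
        # tool_requires puts pkg_b in the "build" context, so it
        # resolves against the build profile settings instead of
        # pkg_c's host settings
        self.tool_requires("pkg_b/0.0.1")
EOF
```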

Another approach would be to explicitly tell it:

conan install -s build_type=Debug -s "pkg_b/*:build_type=Release" -s "pkg_a/*:build_type=Release" conanfileC.py

Or invert the logic with:

conan install -s build_type=Release -s "&:build_type=Debug" conanfileC.py

That means that only pkg_c is in Debug and all the other dependencies are in Release.

But that totally depends on the use case: what is being modeled by pkg_c -> pkg_b if pkg_b is an application?

memsharded avatar Nov 13 '25 18:11 memsharded

So I want an app_b application executable that statically links a Debug lib_a from pkg_a

No, I want a prebuilt Release executable app_b. And I don't care how it was linked with pkg_a. But of course I assume that Release applications are linked with Release versions of static libs.

What is the relationship of pkg_c and pkg_b

pkg_c is a multi-service project. It contains not just one application; it includes several applications. One of them is the prebuilt application from pkg_b.

lo1ol avatar Nov 14 '25 06:11 lo1ol

Also, you can reproduce this if pkg_b is a shared library. That is a more common case.

lo1ol avatar Nov 14 '25 07:11 lo1ol

No, I want a prebuilt Release executable app_b. And I don't care how it was linked with pkg_a. But of course I assume that Release applications are linked with Release versions of static libs

I see. So if you "suppose" that, it is because you do have a preference (to be clear, in some cases, for example MSVC, it is usually not even possible to link a debug library into a release executable). In general, as Conan is a development package manager focused on managing C/C++ build artifacts such as libraries, the build type of the dependencies matters, and Conan does what it is told to do.

If you do:

conan install -s build_type=Debug -s "pkg_b/*:build_type=Release" conanfileC.py

That doesn't mean "I don't care what pkg_a's type is"; it means that pkg_a will be a Debug library, and pkg_b will be Release (app or shared library, it doesn't really matter). If you want pkg_a to be a Release library instead, it has to be told somehow, like -s pkg_a/*:build_type=Release or similar.

If what you want is a fully independent and isolated dependency graph that fully decouples how pkg_b is built, what its dependencies are, etc., then you might want to use the vendor=True feature. With that, pkg_a becomes a real internal implementation detail once the package binary for pkg_b is built. See:

  • https://docs.conan.io/2/devops/vendoring.html
  • https://blog.conan.io/2024/07/09/Introducing-vendoring-packages.html

Note that if pkg_b needs to be built from source, the definition of pkg_a build type will be relevant and necessary again, and it will matter if pkg_a is Debug or Release. The full isolation happens when pkg_b has already built a binary.
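Using the same heredoc style as the reproduction above, the vendoring variant of pkg_b could be sketched as (the file name conanfileB_vendor.py is made up for illustration; vendor is a real Conan 2 recipe attribute):

```shell
cat > conanfileB_vendor.py <<EOF
from conan import ConanFile

class conanfileB(ConanFile):
    name="pkg_b"
    version="0.0.1"
    package_type="application"
    # vendor=True: once a binary for pkg_b exists, pkg_a becomes
    # an internal implementation detail of that binary
    vendor = True

    settings = "os", "compiler", "arch", "build_type"

    def requirements(self):
        self.requires("pkg_a/0.0.1", visible=False)
EOF
```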

memsharded avatar Nov 14 '25 09:11 memsharded

Hello again. Sorry for the late answer.

I got your idea about vendored deps. We discussed it in our team, and we are thinking about two methods of fixing it:

  1. Mark shared libs and applications as vendor=True
  2. Set package_id_mode="revision_mode" on deps of shared libs and applications

Both methods fix our problem, but we don't know which one to choose.

I like the vendor=True method, because it completely hides unnecessary deps from consumers. But we have situations when they should be discoverable. For example, SBOM analysis: we need to know the whole graph to look up vulnerabilities. I know that -c tools.graph:vendor=build should force Conan to show the whole dependency graph, but we are still not sure that this behaviour won't change in the future.

The package_id_mode="revision_mode" method has no problem with SBOM, but it forces users to download unnecessary conanfiles. We have some packages that depend on each other (as test packages). The ability to skip_test makes downloading their conanfiles unnecessary, but I still don't like that unused conanfiles could still affect the graph somehow.

What method do you think is better in our case?

Btw, we have some packages that can be assembled as shared or static libs. If we use the vendor method, then the vendor property should change depending on the shared option. Is it OK to set it this way?

def requirements(self):
    self.vendor=self.options.shared

All examples that I found set the vendor property directly at class level.

lo1ol avatar Nov 18 '25 11:11 lo1ol

What method do you think is better in our case?

The vendor=True feature is intended for fully decoupling the sub-dependencies, but it completely erases the need to re-build the vendoring package when its dependencies change; that re-build needs to be explicitly invoked by users. This method is designed mostly for distribution across organization boundaries, when you want to release a "final" product to other teams or organizations that never expect to have to build it from source, and you might not want to share the transitive recipes with them. For teams using this and also having SBOM needs, the vendoring package should package its own SBOM about its dependencies, and then there should be some mechanism downstream able to gather this SBOM and incorporate it.

That doesn't sound like your use case at the moment.

The other approach, revision_mode, seems a bit more aligned with your intention and needs, but it might be a bit risky if not managed carefully. It means that no binary variance of the dependency will ever be included in the consumer (shared library or application) package_id. In other words, no configuration change in the dependency, not even build_type=Debug/Release, will ever affect the shared library or application, which will never be automatically re-built when the dependencies' configuration changes.

Imagine that your pkg_a has different options such as optimized=True/False, or use_extra_precision=True/False, etc. Then you do a conan install ... -o pkg_a/*:optimized=True -o pkg_a/*:use_extra_precision=True, expecting your software to run with that configuration, but that will never happen, because Conan can find a binary for pkg_b that was built with whatever other options for pkg_a, and revision_mode basically erased that effect.

This is a real and important risk, waiting there to bite at some point. It will require a lot of discipline and care to avoid these potential issues. I am doing a proof of concept of being able to model which options of dependencies are factored into consumers for some similar cases, but that would still be a partial solution, because there might be other input configurations, like settings, that could also have an effect but are also ignored by revision_mode.

Given these risks, I am still not sure this is a solution I would recommend either. I think it is better to model the dependency graph configurations explicitly in profiles, and use controlled variability. For example, -s "&:build_type=Debug" -s build_type=Release will work very well if you only want your current project under development in Debug, with dependencies (including transitive dependencies) in Release; if you want to debug more than one package, define it explicitly with -s "mypkg/*:build_type=Debug". Or maybe just build Debug variants for your packages by default; that would probably be simpler and cheaper, with no need to mix Debug and Release artifacts, using the whole graph with the same -s build_type=Debug/Release.
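The "model configurations explicitly in profiles" idea could be sketched as a profile file (the profile name debug_dev is made up for illustration; `&` is the Conan pattern matching only the consumer package):

```shell
cat > debug_dev <<EOF
include(default)

[settings]
# everything in Release by default...
build_type=Release
# ...but the current project under development in Debug
&:build_type=Debug
EOF
```

It would then be used as, e.g., `conan install conanfileC.py -pr ./debug_dev`.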

memsharded avatar Nov 18 '25 13:11 memsharded

Thank you for such a detailed response!

I don't get when a package with different options could be used. I thought Conan calculates the options of deps before discovering the requires of packages and downloading their conanfiles.

I couldn't reproduce the problem you mentioned:

Example
cat > conanfileA.py <<EOF
from conan import ConanFile

class conanfileA(ConanFile):
    name="pkg_a"
    version="0.0.1"
    package_type="static-library"

    settings = "os", "compiler", "arch", "build_type"
EOF

cat > conanfileB.py <<EOF
from conan import ConanFile

class conanfileB(ConanFile):
    name="pkg_b"
    version="0.0.1"
    package_type="static-library"

    settings = "os", "compiler", "arch", "build_type"
    options = {
        "with_pkg_a": [ True, False ]
    }

    default_options = {
        "with_pkg_a": True
    }

    def requirements(self):
        if self.options.with_pkg_a:
            self.requires("pkg_a/0.0.1")
EOF

cat > conanfileC.py <<EOF
from conan import ConanFile

class conanfileC(ConanFile):
    name="pkg_c"
    version="0.0.1"
    package_type="shared-library"

    settings = "os", "compiler", "arch"

    def requirements(self):
        self.requires("pkg_b/0.0.1", visible=False, package_id_mode="revision_mode")
EOF

cat > conanfileD.py <<EOF
from conan import ConanFile

class conanfileD(ConanFile):
    name="pkg_d"
    version="0.0.1"
    package_type="application"

    settings = "os", "compiler", "arch"

    def requirements(self):
        self.requires("pkg_c/0.0.1")
        self.requires("pkg_b/0.0.1", options={"with_pkg_a": False})
EOF

conan rtexport-pkg conanfileA.py
conan rtexport-pkg conanfileB.py -o with_pkg_a=True
conan rtexport-pkg conanfileC.py

conan remove --confirm "pkg_b/0.0.1"
conan rtexport-pkg conanfileB.py -o with_pkg_a=False

conan rtinstall conanfileD.py

We think that revision_mode is preferable for us. The problem you mentioned could happen when Conan builds packages, but we use only prebuilt packages in our organisation.

Correct me if I'm wrong, please.

Also, to ensure that we use package_id_mode="revision_mode" for private libraries, we will set package_id_mode this way:

    def custom_requires(self, *args: Any, **kwargs: Any) -> None:
        if "package_id_mode" not in kwargs:
            if not kwargs.get("visible", True):
                # Hash only the version and recipe revision for non-visible
                # requires. These deps can still be seen by SBOM tooling
                kwargs["package_id_mode"] = "revision_mode"

        self.requires(*args, **kwargs)

lo1ol avatar Nov 18 '25 18:11 lo1ol

I don't get when a package with different options could be used. I thought Conan calculates the options of deps before discovering the requires of packages and downloading their conanfiles

There are 2 different effects of options:

  • One effect is when they are used to compute the dependency graph, for example your usage that uses options to define conditional dependencies, which are used or not depending on an option value
  • But they can also be used to model different pure binary variants, and those binary variants might have an effect on the binaries of the consumers.

I'll try to give a full example. Let's imagine that pkg_a contains an option such as:

from conan import ConanFile
from conan.tools.cmake import CMakeToolchain

class conanfileA(ConanFile):
    name="pkg_a"
    version="0.0.1"
    package_type="static-library"

    settings = "os", "compiler", "arch", "build_type"
    options = {"use_fast_approx": [False, True]}
    default_options = {"use_fast_approx": False}

    def generate(self):
        tc = CMakeToolchain(self)
        tc.cache_variables["USE_FAST_APPROX"] = self.options.use_fast_approx
        tc.generate()

Then, you can create 2 different variants of that pkg_a, one for each value of the option. One of them will do some computations faster but in an approximate way, while the other will be slower but more accurate. The interface and API are exactly the same; only the internal implementation detail changes.

When you have some other package that "embeds" it, like:

from conan import ConanFile

class conanfileC(ConanFile):
    name="pkg_c"
    version="0.0.1"
    package_type="shared-library"

    settings = "os", "compiler", "arch"

    def requirements(self):
        self.requires("pkg_a/0.0.1", visible=False, package_id_mode="revision_mode")

(It doesn't matter much if pkg_a is a direct or a transitive dependency in this regard, I have used a direct dependency here for simplicity).

Now, let's say that you create a package for pkg_c using the default options. You will get a pkg_c/0.0.1:package_id1 binary for those options, which embeds a copy of the pkg_a binary that is accurate and slow.

Now you try to build your application and do something like:

conan install . ...... -o pkg_a/*:use_fast_approx=True

Then, the system will compute the existing pkg_c/0.0.1:package_id1, because package_id1 doesn't depend on the pkg_a options at all. So it will accept that binary, which was built with the accurate and slow computation, as good, and use it.

Now the user thinks that they are using use_fast_approx=True, because they requested it, but that is not true, because the necessary information was removed by revision_mode.

With the default package-id modes, which for "embed" cases is full_mode, Conan will safely say that it needs to build a specific binary for pkg_c against the use_fast_approx=True variant of pkg_a, so the final user application uses that.

It seems that you are not seeing the issue yet because you don't have any other binary variability over the static libraries: they don't define options besides the conditional-requirement one, which already introduces variability because of the extra dependency. So it might luckily work and not cause issues, but this is what I meant about it being risky: it is likely that some package will eventually use some option, or some other combination of settings or the like, and this will happen.

I hope this example clarifies the risks.

memsharded avatar Nov 19 '25 00:11 memsharded

Thank you for the explanation. Now I have reproduced your example :)

Example
cat > conanfileA.py <<EOF
from conan import ConanFile

class conanfileA(ConanFile):
    name="pkg_a"
    version="0.0.1"
    package_type="static-library"

    options = {"use_fast_approx": [False, True]}
    default_options = {"use_fast_approx": False}

    settings = "os", "compiler", "arch", "build_type"
EOF

cat > conanfileC.py <<EOF
from conan import ConanFile

class conanfileC(ConanFile):
    name="pkg_c"
    version="0.0.1"
    package_type="shared-library"

    settings = "os", "compiler", "arch", "build_type"

    def requirements(self):
        self.requires("pkg_a/0.0.1", visible=False, package_id_mode="revision_mode")
EOF

cat > conanfileD.py <<EOF
from conan import ConanFile

class conanfileD(ConanFile):
    name="pkg_d"
    version="0.0.1"
    package_type="application"

    settings = "os", "compiler", "arch"

    def requirements(self):
        self.requires("pkg_c/0.0.1")
EOF

conan rtexport-pkg conanfileA.py
conan rtexport-pkg conanfileC.py

conan rtinstall conanfileD.py -o "pkg_a/*:use_fast_approx=True"

So, as long as we don't pass sub-dependency options indirectly to packages with revision package_id_mode, we shouldn't face this. Our team will take this as a rule.

lo1ol avatar Nov 19 '25 08:11 lo1ol

And a final question.

You recommended setting dependency settings directly, but in our organisation we have really complicated compatibility rules.

For example:

  1. Release packages can be used as dependencies during a Debug build
  2. static-library packages built with gcc11 can be used as dependencies for gcc12 builds
  3. fat arm64+x86_64 packages can be used for arm64 and x86_64

We have to use them because we have a lot of toolchains in our projects, and from time to time they are upgraded and differ.

Conan has compatibility plugins to solve this problem. But it sounds like you don't recommend using them, and instead always specify the differences in settings explicitly.
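For reference, rule 1 above (a Release binary can satisfy a Debug build) could be sketched as a Conan 2 compatibility plugin, normally placed at ~/.conan2/extensions/plugins/compatibility/compatibility.py (this is a minimal sketch, not our actual rules):

```python
# Sketch of a Conan 2 compatibility plugin. Conan calls compatibility()
# for each package whose binary is missing, and tries the returned
# configurations as fallbacks.
def compatibility(conanfile):
    configs = []
    if conanfile.settings.get_safe("build_type") == "Debug":
        # Offer the Release binary as a fallback for Debug builds
        configs.append({"settings": [("build_type", "Release")]})
    return configs
```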

I think it is better to model the dependency graph configurations explicitly in profiles, and use controlled variability

It may be a really good idea to control differences in dependency settings explicitly, but it could turn into real hell in our case.

Doing it this way looks inconvenient:

sub_settings='-s &:build_type=Release'
if [[ -n "$is_gcc13" ]]; then
    sub_settings="$sub_settings -s pkg_a/*:compiler.version=11 -s pkg_b/*:compiler.version=11 ..."
fi

if [[ -n "$is_macos" ]]; then
    sub_settings="$sub_settings -s pkg_a/*:arch=x86_64+arm64 ..."
fi

conan install . $sub_settings

It might look better if compatibility settings could be specified in the conanfile directly, but as far as I know that's not possible :(

What do you think about it?

lo1ol avatar Nov 19 '25 09:11 lo1ol