[question] Is there a way to upload all packages to their corresponding remotes
Atm I am uploading all packages like this:
conan upload "*" --confirm --parallel -r my_remote
However, this will upload all packages to one specific remote, my_remote.
Is there a way I can upload all packages to their corresponding remotes (binary_remote), i.e. the remote each recipe was downloaded from?
If not, how can I achieve this easily?
Hi @derived-coder
With revisions, the concept of "this recipe was downloaded from this server" is not that critical anymore, because as long as the revision is the same, different binaries can be spread across different servers too.
So Conan doesn't have that behavior anymore (in 1.X it sometimes allowed it, if the tracking information had not been dropped, but Conan 2.0 has completely removed that tracking). So the upload will always go to the remote explicitly defined in the argument, and if something different is intended, it should be implemented by the user (via custom commands and the PythonAPI). In any case, it seems an anti-pattern to upload different packages to different repos at once and indiscriminately for the whole cache. Instead, the typical operation is to get the information from the dependency graph and upload only the packages that have been built locally, for example as sketched below.
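A minimal sketch of that "upload only what was built" flow with the Conan 2 CLI (the remote name my_remote comes from the question above; the JSON file names are arbitrary placeholders):

# build and capture the resulting dependency graph as JSON
conan create . --format=json > build.json
# reduce the graph to a package list containing only the binaries built locally
conan list --graph=build.json --graph-binaries=build --format=json > built.json
# upload exactly that package list to the chosen remote
conan upload --list=built.json -r=my_remote --confirm

Anything more elaborate, such as routing different packages to different remotes, would be a custom command built on top of the PythonAPI.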
We have an access pattern established in our Artifactory: multiple Conan repos with different access possibilities. Imagine a public one for FOSS recipes and a private one for in-house software.
When uploading the packages, we would like to avoid the private packages being uploaded to the publicly accessible one.
Just a short explanation of our use case: imagine it is possible to recompile all our recipes (with all the configs, options, etc.) for a new compiler with one simple command. Now we would like to upload the result to the Conan repos in our Artifactory again.
> We have an access pattern established in our Artifactory: multiple Conan repos with different access possibilities. Imagine a public one for FOSS recipes and a private one for in-house software.
This is where a channel makes sense, to clearly differentiate between FOSS recipes and private ones. Relying only on the tracked remote is risky. Actually, the best-practices process might be a bit different: you don't want your product to depend on the public external-facing repo in production. Instead, you upload everything to your production repo, including FOSS binaries. Then you decide, with a promotion (could be based on channel, or other criteria), to copy some of those artifacts to the public-facing repo, as sketched below.

This kind of thing tends to be automated on the server side rather than the client side. Artifactory has virtual repos that can aggregate multiple local repos, and it can define the distribution of packages in a consistent way, without relying on the client side, which can easily break things by just using the wrong -r=remote argument.
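For illustration, a minimal sketch of such a promotion done client-side with package lists (the repo names production-repo and public-repo are placeholders, and promote.json would be a package list prepared beforehand, e.g. with conan list --format=json):

# download the curated package list from the private production repo
conan download --list=promote.json -r=production-repo --format=json > downloaded.json
# re-upload exactly those packages to the public facing repo
conan upload --list=downloaded.json -r=public-repo --confirm

The same promotion can also be done purely server-side with Artifactory tooling, which avoids giving the client write access to both repos.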
We have one Artifactory server with multiple Conan repositories. A lot of people have access to the Artifactory server, including external companies; that is why we need these access rights. Additionally, we also distinguish there with the channel identifier.
I hope that clears it up.
As far as I understand, your answer now is: you should only use one Conan repo per Artifactory instance?
> As far as I understand, your answer now is: you should only use one Conan repo per Artifactory instance?
No, I am not saying that. You can have as many Conan repos as you want per Artifactory instance, and that is a good and recommended practice. I am saying that you should leverage it: have multiple repos, and use the Artifactory automation mechanisms (see the sketch below) instead of trying to upload to different repos in one single instruction from the Conan client, which is fragile and insecure by design.
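As a purely hypothetical sketch of such server-side automation, the JFrog CLI can copy artifacts between two Conan repos directly in Artifactory, without the Conan client being involved at all (the repo names and the path pattern are placeholders; the actual path depends on your repository layout):

# copy a package from the private repo to the public facing repo on the server itself
jf rt cp "private-conan/mycompany/mylib/**" public-conan/ --flat=false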
Closing this question, as it was answered some time ago.