Requests for comments: how does opam-repository scale?
I've observed over the years that there's a sentiment that "no package shall be removed from opam-repository" (I still don't quite understand the reasoning behind it -- maybe it is lock files that fail to record the git commit of opam-repository? Maybe it is some platforms that require the opam-repository for removing opam packages?).
So, I'd like to raise the question why this sentiment exists. Answers (or pointers to arguments) are highly welcome.
Why am I asking this? Well, several years back Louis worked on "removing packages that aren't ever being installed", with the reasoning that if foo is available both in version 6.0.0 and in version 6.0.1 (with the very same dependencies), 6.0.1 is always chosen -- which makes sense, since it may very well be a bugfix release.
Now, I also observe that only rarely, really rarely, do old package releases get their minor version bumped -- i.e. "long term support" of opam packages does not really exist (happy if you prove me wrong): bug fixes are incorporated together with new features and API changes, and so new (major) versions are released.
Taking a step back, it is pretty clear that collecting more and more packages leads to a larger amount of work for the solver (which needs all the packages to be parsed before it can find a solution). This means the solver needs to improve speed-wise roughly every year. This is rather challenging, and in the end leads to opam not being very usable on smaller computers (a Raspberry Pi, or even tinier computers...).
Also with the carbon footprint in mind, I fail to see why opam-repository may not delete packages. In the end, it is a git repository -- if you wish to install a compiler at an arcane version, it is fine to roll back your opam-repository checkout to an arcane commit that fits nicely. The amount of work for the opam-repo CI could as well be minimized by removing old opam packages (since a revdep run will be much smaller).
Please bear with me if you've already answered this and have written up the design rationale (and maybe the answer to how opam-repository will scale). Comments and feedback are welcome. Once I understand the reasoning, it'll be much easier for me to figure out how to move forward. Thanks a lot. //cc @AltGr @avsm @kit-ty-kate @mseri @samoht
As a first observation: the opam repository is a git repository, so anyone who wants to install old releases can always use an old commit.
One idea I discussed with @mro is to add an expiration date to each opam package when publishing, i.e. "this is being maintained for 3 months". I was initially skeptical, but now think this is a fine idea, since various contributors (authors) and companies have different release cycles; for me personally, code that I don't use or touch for several years should either be released as stable (with a 1.x version and an infinite expiration), or it will bitrot.
And for reference, I found these PRs and their discussion pretty insightful:
- https://github.com/ocaml/opam-repository/pull/11194
- https://github.com/ocaml/opam-repository/pull/11400
- https://github.com/ocaml/opam-repository/pull/11559
Especially the comment https://github.com/ocaml/opam-repository/pull/11400#issuecomment-365243680 is worth reading, where different people have different opinions (with respect to whether it "is good for reproducibility to remove packages"). And a note to myself: there's opam admin check --obsolete.
Looks like nobody else cares (and Louis cared earlier, as outlined in closed issues and pull request). Closing.
I do actually care, but I don't have a good solution yet and, worse, there is not yet a real consensus on what to do. I hope we will settle on some periodic stable releases and an archive repository where old packages go to rest, or some other compromise that will make things slimmer.
If you ask me, we should leave this open as an opportunity to get more ideas. There was also a Discuss thread (or even more than one) where we had a discussion about this. If I can find it, I'll link it.
I also care.
But what I would like to see before deleting packages is a process to move these deprecated/unmaintained packages to a separate repository. Keeping them around is really helpful when you need to test large-scale language features (for a new release, but also to check language statistics for old releases).
I think the main blocker for this right now is to have the appropriate tooling (for instance to split an existing opam-repo easily).
I doubt this is a tooling issue to be honest, since there's opam admin check --obsolete around (surprise: it may need some improvements), as well as opam admin check --installability.
It feels more like a policy decision about what opam-repository should achieve -- together with what compilers to support (see the related issues https://github.com/ocaml/opam-repository/pull/24868 https://github.com/ocaml/opam/issues/5748 -- but also https://github.com/ocaml/infrastructure/issues/87 https://github.com/ocaml/infrastructure/issues/48).
But what I would like to see before deleting packages is a process to move these deprecated/unmaintained packages to a separate repository.
What prevents you from adding a tag before deleting anything in this repository, and establishing a (e.g.) quarterly routine for what to delete (i.e. have a quarterly tag, 2023Q1..Q4)?
Maybe it is just me, using a ~10 year old laptop, who suffers from this issue -- but it looks like the "opam.ocaml.org update job" is having quite some trouble (related to the number of packages).
To have a concrete proposal that can be discussed:
- Let's have a quarterly delete job. Each deletion will be preceded by a git tag, so someone interested in earlier releases can just opam repo add old-stuff https://github.com/ocaml/opam-repository.git#2023Q4.
- For Q4/2023, there will be a tag, and only OCaml 4.08 (see https://discuss.ocaml.org/t/raising-the-minimum-tested-version-of-ocaml-to-4-05-or-4-08-from-4-02/12464) and opam 2.1 will be supported (what the minimum opam version should be is to be discussed, certainly at least 2.0).
- Also, all "available: false" packages will be removed.
- opam admin check --installability will be executed, and it will be evaluated whether the output list should be removed. The upside is: smaller repository, fewer packages -> less work on all clients.
- Additionally, for each package, only the latest patch version will be retained: i.e. if there's a package with versions 0.1.0, 0.1.1, 0.2.0, 0.2.1, 1.0.0, 1.0.1, 1.0.2, 1.1.0, 1.1.1 -- only 0.1.1, 0.2.1, 1.0.2, 1.1.1 are kept.
- Exemptions from the above rule are the following set of packages: ocamlformat.
- After removal, the output of opam admin check --installability will be evaluated. If there's breakage, old packages can be revived (and added to the above set of exemptions).
- Since we now have a path for how packages are removed, any maintainer/author can opt in to add their package (+version) to the set of "to-be-deleted" packages (at the end of each quarter).
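The "latest patch version only" rule from the proposal can be sketched as follows. This is a hypothetical helper, not actual opam admin tooling, and it assumes plain "X.Y.Z" version strings (real opam versions follow opam's own version-ordering rules):

```python
def latest_patches(versions):
    """Keep only the highest patch release per (major, minor) series.

    Assumes plain "X.Y.Z" version strings -- a simplification; real
    opam version ordering is more involved.
    """
    best = {}
    for v in versions:
        major, minor, patch = (int(p) for p in v.split("."))
        key = (major, minor)
        # Remember the highest patch level seen for this series.
        if key not in best or patch > best[key][0]:
            best[key] = (patch, v)
    return sorted(v for _, v in best.values())

# The example from the proposal above:
versions = ["0.1.0", "0.1.1", "0.2.0", "0.2.1",
            "1.0.0", "1.0.1", "1.0.2", "1.1.0", "1.1.1"]
print(latest_patches(versions))  # ['0.1.1', '0.2.1', '1.0.2', '1.1.1']
```

The exemption set from the proposal would simply be checked before a package's versions are fed into such a filter.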
Yes, this will break "opam lockfiles" -- in theory at least. In practice, feel free to point to hyperlinks of lockfiles that are broken by such a mechanism (and yes, I really think these lockfiles need to be fixed to include the repository commit so that they are sustainable).
How can we "discuss" such a policy? Is this issue good for it? How can we reach something that we then act on? Or will we just sit down and wait until it becomes so annoying that nobody uses it anymore?
On opam's side we're thinking about adding a compressed format for the repository after 2.2.0 (https://github.com/ocaml/opam/issues/5648). Such a format would eliminate the issue for opam update at least.
For the solver/clutter/git-growth/… though, the problem still remains, and removing packages could be a solution. It is also worth noting that we currently have nobody knowledgeable enough to work on the solver on a regular basis.
What I do not really like about tags, and about saying that you can always look back at the history to install old packages, is that we are constantly improving the quality of the metadata of opam-repository. So when you use an old opam-repo, you lose these improvements unless you have a (custom) merge of this old repo with the live repo.
My ideal workflow would be to have layered repositories with a well-defined merge strategy - so our automated system could easily test old packages with recent metadata fixes.
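Such a layered merge could look like the following minimal sketch. The data model here is hypothetical (each repository layer maps a "name.version" key to a metadata dict); the point is only that higher layers override lower ones, so archived packages pick up current metadata fixes:

```python
def merge_layers(*layers):
    """Overlay repository layers: later layers win per package version.

    Hypothetical model -- each layer is a dict from "name.version" to
    a metadata dict. Put the archive layer first and the live repo
    last, so live metadata fixes shadow stale archived entries.
    """
    merged = {}
    for layer in layers:
        for pkg, metadata in layer.items():
            merged[pkg] = metadata  # later layer overrides earlier one
    return merged

archive = {"foo.1.0.0": {"depends": ["bar"]},
           "baz.0.9.0": {"depends": []}}
# A live-repo fix tightening an upper bound on an archived package:
live_fixes = {"foo.1.0.0": {"depends": ["bar (< 2.0)"]}}
merged = merge_layers(archive, live_fixes)
print(merged["foo.1.0.0"])  # {'depends': ['bar (< 2.0)']}
```

A real merge strategy would need rules for conflicts beyond "last layer wins" (e.g. per-field merging), which is exactly the part that would have to be well-defined.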
So, a few more details on what I have in mind. I'm describing a hypothetical workflow -- we can get there incrementally or not.
- We have 3 official opam repositories instead of one: unstable/stable/archive.
- Optionally, there's a new opam field x-opam-repository-status that contains unstable/stable/archive (if we want the opam client to do something clever about those).
- Optionally, there's a new opam field x-opam-repository-expiry with an expiry date for a package.
- Every opam repository has a different policy:
  - unstable: no gatekeepers -- a very light CI ("it lints and it builds on ubuntu + latest OCaml release, no x-platform checks, no revdeps"). Strong encouragement to have an expiration date for the packages.
  - stable: same as today but with all the unstable/old/deprecated packages removed. The CI runs all of today's checks, but just on the stable repo.
  - archive: all the old/deprecated packages. No (or very few) CI checks -- archive packages are supposed to always build.
- The lifecycle of a package could be:
  - people can submit new packages to unstable and stable (these correspond to different levels of CI/process checks)
  - optionally, there's a weekly/monthly bulk build of unstable packages with extra CI checks -- the ones that pass the extra CI checks (including revdeps with stable packages) are submitted in bulk to the stable repository
  - there's a weekly/monthly bulk build of packages "on the deprecation list" that we want to delete from stable and submit to archive. The CI runs with stable + archive. We need to fix the constraints before adding anything into archive (i.e. archive packages should always build). This is super important when there's a new OCaml release (we should check that it doesn't break archive packages and, if yes, add the new constraints). When the packages are added to archive, they are removed from stable.
- We could host the 3 repositories as different branches in ocaml/opam-repository.
What I am trying to get is some kind of guarantee that unstable/old/deprecated package metadata continues to be somehow maintained, but without putting too much hassle on our CI infrastructure and on opam-repo gatekeepers. And I'm also keen to have a system where "submitting a simple package for fun" doesn't add more load to our system/process.
What do you think?
I am new here and started volunteer work on opam administration a few weeks ago. I haven't done much yet except attending a few weekly meetings; I shall have more time from this week on. I am very glad to see this issue discussing the scaling problem. Here are some topics appearing in your discussions, and my opinions.
Retiring old packages
I don't agree with retiring old packages directly. The immediate reason is that we cannot guarantee that packages observe the plausible invariant from semantic versioning, so that e.g. projects or libraries depending on 6.0.0 will always run without problems after changing the dependency to 6.0.1. Reproducibility is such a painful problem.
However, I do agree that we should suggest a better alternative to a package version -- whether when users try installing a package at a non-optimal version, or when opam (.opam) detects that a non-optimal version is specified. But does that require the user to always have the most recent view of the latest opam-repo?
So my point is that package versioning suggests a good package choice, but is not reliable. That's the reason lockfiles are widely used.
Using GitHub as the backend
Another point unclear to me: when evaluating a design suggestion above, shall we assume that the opam repo will stick to using the GitHub (git) backend, or is that also up for discussion? It will be helpful to consider whether a suggestion is mainly about a better design or a better implementation.
We have 3 official opam repositories instead of one: unstable/stable/archive
It has some Debian flavor, but I have to say that using opam, where there is just one official central repo, is much easier than using Ubuntu, with which I have had to manually tune the source list many times. And there are some popular repos, e.g. https://coq.inria.fr/opam/released. Does the repo kind (unstable/stable/archive) also apply to other repos, or is it actually package metadata?
CI burden, Package author burden, Opam repo admin burden
I would like to observe for some more time and take on more work to figure out where the CI burden and repo admin burden come from.
I think x-opam-repository-expiry may increase the package author burden, because it's too complicated and rarely possible to anticipate. I may be missing some context or experience here.
I agree with the fact that CI sometimes runs slowly and blocks things. However, in my very limited observation, some CI issues are due to the service itself and could be improved (accidental errors). Some CI errors are caused by package incompatibility (essential errors), and those rescue users from suffering from the problem. That's one big achievement of the current admin team, and the opam repo workflow outperforms many other package management tools I have used, in my opinion. I don't think changing the CI arrangement can reduce this problem. Or maybe it helps: if more packages stay in unstable without moving into stable, then the overall CI checking can be reduced.
If we can offload some checking on the publishing side, some burdens are transferred from the CI/admin side to the author side.
p.s. To recap, I have been very interested in this problem for a long time. I have some plans to survey (and have surveyed a little, though not sorted it out yet) more package managers, especially ones for programming languages.
My take on scaling cares about growth and reliability.
Growth
There is no such thing as eternal growth, so if we design for a liveable future, we should reach saturation in size. IMO this is simple via expiration. It can be 10 years for "LTS" versions (e.g. the last one before a major/minor jump), but it has to be less than "eternity". There is no maintenance promised during these 10 years, just mere existence: an HTTP 200 instead of a 404. The eol date should be in the metadata inside the tarball (and an HTTP header).
Whoever needs it longer (we come to that in a second) should be able to easily mirror/vendor individual version tarballs.
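Deciding on removal would then be a trivial check against today's date. A minimal sketch, borrowing the x-opam-repository-expiry field name proposed earlier in this thread (the field shape is an assumption; the source only proposes that an eol date live in the metadata):

```python
from datetime import date

def is_expired(metadata, today):
    """Return True when a package's eol date (if any) has passed.

    Hypothetical metadata shape: {"x-opam-repository-expiry": "YYYY-MM-DD"}.
    A missing field is treated as "no expiry", i.e. kept forever.
    """
    eol = metadata.get("x-opam-repository-expiry")
    if eol is None:
        return False
    return date.fromisoformat(eol) < today

pkg = {"name": "foo.1.0.0", "x-opam-repository-expiry": "2033-12-31"}
print(is_expired(pkg, date(2040, 1, 1)))   # True: past its eol date
print(is_expired({"name": "bar.0.1.0"}, date(2040, 1, 1)))  # False: no expiry
```

A periodic job could run such a check over the whole repository and move expired packages out (or redirect their tarball URLs, as suggested above).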
Reliability
IMO we should embrace reproducible builds. This means we need reliably retrievable sources, incl. the toolchain. Certain versions may do; it doesn't have to be every version. But once declared such (think 'best before Jan 1st 2040'), it must be available under the same url (or redirected). No need for five-nines availability, and neither bandwidth nor latency should matter too much, as you can easily cache/download stuff.
Maybe signing the tarballs may be useful. (Web of trust, uh)
In the end it's a url registry with signed, expiring entries.
For reliability (legal, technical, moral) we shouldn't tie ourselves to any third party concerning the build toolchain. GitHub may have been fashionable recently, but IMO the opam repository should be consumable without touching billionaire-run infrastructure, let alone namespaces. I would want my project to be buildable by peers in Tehran, Moscow, Paris, Beijing and New York alike. And if not, it should be our decision (ocaml.org) and not a billionaire-clerk's.
Monitoring may be helpful.
Hope this makes sense, Marcus
@arbipher you say "lockfiles are widely used" -- would you mind elaborating on that? From what I can see, some people use lockfiles. Do you have specific insight into which domain "lockfiles are widely used" in?
There are some loud voices on "lockfiles" that prevent any progress on pressing issues since "lockfiles may break".
FWIW, if you're keen on reproducibility -- lockfiles don't achieve this at all. You'll need to put in some more effort (and include all the metadata in your build information, together with the environment variables and system packages that have been used). If you're fine with "sometimes you may be able to reproduce the same version / behaviour", obviously lockfiles are fine.
@hannesm Ah, I realize I was vague. I mean lockfiles are widely used for projects within other package managers, e.g. npm, rubygems, cabal, etc. I don't observe them being widely used in opam.
I'm not concerned about reproducibility most of the time, except for some manual instructions for building from source (in preparing research artifacts). I do agree with your saying that lockfiles are far from ensuring reproducibility; they are just a freeze of explicit package dependencies.
Surely it must be possible to comb through opam-repository.git to get a maximal opam-repository with the latest and greatest metadata so your lock files don't break? Given that this problem is solvable I think it shouldn't block removing packages.
Personally I have at least two packages I would like removed. They are let-if and ssh-agent-unix. The former was a fun experiment to show that you can get something similar to the "if let" construct from e.g. Rust, but it is not something I would use or recommend anyone using, and I have not heard of or found any usage of this library. The latter was accidentally published due to dune-release by default publishing all opam packages in a repository. The initially published package was of very poor quality (I didn't intend on publishing it), I don't think anyone uses it and I don't really want to maintain that package (though I am happy to maintain ssh-agent). I am sure there are many other packages that are in a similar poor state.
Given the many, many CPU cycles, bandwidth and energy being wasted on this due to the number of users I think it is important to take a decision sooner rather than later. I believe this would make a great impact towards the carbon footprint policy, too.
Personally I have at least two packages I would like removed. They are let-if and ssh-agent-unix.
Even though this is a partial answer, the removal can already be achieved with available: false. Removing the packages themselves breaks all macos switches due to an issue with macos patch. It should be fixed in opam 2.2
PS I have not yet added a comment here since I don't yet have a clear vision on how to make it work properly, but I am following the thread and I would like to reach a good and scalable solution
Even though this is a partial answer, the removal can already be achieved with available: false. Removing the packages themselves breaks all macos switches due to an issue with macos patch. It should be fixed in opam 2.2
That makes me sad. I thought that was a thing of the past, and I would hope that such a bugfix would be ported to the 2.1 series and released as soon as possible.
While available: false "marks it as not available", the opam file is still around, needs to be created on disk, and needs to be parsed. Thus, it is a rather CO2-intense "solution".
It should be fixed in opam 2.2
It is not currently slotted for 2.2.0 (though that could change) but could be slotted for 2.2.1. This is https://github.com/ocaml/opam/pull/5400 and it needs a review from @dra27 for Windows who is not available at the moment
TL;DR I suggest approaching package management much more like Debian does, and providing a community repo which is entirely maintained by package authors.
I think there's some tension around opam being held to the expectations of popular language package managers like npm while opam's philosophy is much closer to that of debian's package manager.
As an npm user I expect old packages to always be available (possibly with loud warnings on install) and I expect to release my own packages with very little effort to myself or npm's maintainers. Further, I don't expect npm to handle my package's CI, and I accept the risk that I might accidentally install a malicious package from npm. All the maintenance burden is placed on package authors so the human cost of repo maintenance is distributed in a way that scales as the repo grows.
As a debian user I expect that every package in its repo has been vetted and tested by the debian repo maintainers (whom I implicitly trust as a debian user), but I don't expect to be able to install old versions of a package. Since the repo maintainers choose which packages are in the repo, and there are fewer versions of each package in the repo, the human and CO2 cost of maintaining the repo is limited.
The debian repo isn't expected to grow very quickly, as new software that's both useful and mature enough to consider including doesn't appear that frequently; however, as OCaml becomes more popular, it's likely we'll see the rate of new package releases go up. Opam attempts to have the security benefits of debian, but it's expensive to scale, since random people can propose new packages via a PR, which creates work for the human maintainers, and every new package in the repo adds an ongoing maintenance cost. There's no blessed alternative way to release OCaml packages, and so all the new users who want to make their hobby projects available by releasing them will be inadvertently adding to the repo maintenance burden.
I propose three things:
- Lean into the debian-ness of opam to reduce the repo maintenance cost. Include a smaller set of packages and don't keep old versions of packages in the repo. To allow reproducible builds, encourage users to use lockfiles either with opam-monorepo or dune (once it gets lockfiles). Lockfiles need to include the source urls of the packages they download since the opam repo won't include old versions of packages.
- Every 6 months or so cut a new stable release of the repo, possibly adding/removing packages and including major updates to packages. The only other times a package will be updated is for bugfixes. Maybe do LTS releases less frequently that are maintained for longer.
- Make a blessed community repo with no human checks or centralised CI. Users who just want to release a hobby project but don't expect anyone but themselves to use it will still be able to release their package without incurring any maintenance cost. As with npm, packages are released instantly (no review process), old packages are never removed, maintenance is done entirely by package authors, and users install packages at their own risk. Even if we just did this and not the previous two steps, I expect this to reduce much of the maintenance burden as new users come to the language.
@gridbugs what you propose is similar to what I am saying in https://github.com/ocaml/opam-repository/issues/23789#issuecomment-1846846781. Any specific difference you want to highlight?
This is also compatible with the short-term goal of trimming the central repository.
So, I propose we aim for the following final state:
- 3 branches in ocaml/opam-repository: bazaar, main and archive
- a straightforward CI set-up for bazaar (just run one distro + the latest OCaml version, no rev-deps) and archive; no change to the CI for main

To get there:
- we move all the packages suggested by @hannesm from main to archive (and we document that policy, ideally with some automatic tools to help do this more automatically in the future)
- we make one bulk build for all the newly archived packages to check if there are easy build failures that we can fix. We mark everything else as "allowed to fail" or something like that. I'm happy to help triage/fix this, as I did a bunch of bulk build fixes in the past.
- we make sure dune-release and opam-publish can easily publish on the bazaar branch
- we make sure running release-readiness checks works (or any large-scale tests when we want to test for runtime changes) when main + archive are used together
WDYT?
That generally sounds good to me.
A small thing, but let's not call it bazaar, as I doubt many people will get the reference. What about community or unstable? Maybe also rename main to stable or core or something, since main is just the default name and its semantics in the context of the opam repos is unclear.
we make sure dune-release and opam-publish can easily publish on the bazaar branch
I really like this idea! I'm a huge fan of the commands in npm and cargo that let you publish a new package or version with a single command and no additional steps. I'd love to have this for OCaml.
@samoht What do you think about the idea of cutting official releases of the stable repo semi-regularly and reserving breaking changes for those releases? Kinda like how nixos has two big releases a year but also has an unstable branch you can follow if you want to subscribe to the rolling release model.
@gridbugs https://github.com/ocaml/opam-repository/issues/23789#issuecomment-1882107221
I hope you're not suggesting that, if a user publishes a new version of their package one day, they might have to wait months before having it available on opam?
If it's only a cadence for removing old releases that are subsumed, I imagine it can be useful, though.
I hope you're not suggesting that, if a user publishes a new version of their package one day, they might have to wait months before having it available on opam?
Only if they want to release their package into the stable repo. I'm proposing to use the unstable/community repo for quick releases and the stable repo for slow, debian-esque releases.
There are some loud voices on "lockfiles" that prevent any progress on pressing issues since "lockfiles may break".
As someone using lockfiles, I'd say that lockfiles done right are not an aspect that creates additional problems. True, if you just have a list of packages and versions (opam export style), pulling a package will break, but from experience I saw plenty of "locked" builds break because of conf-package changes.
The approach in both Dune and opam-monorepo lockfiles doesn't have a problem with packages disappearing from the repo because once locked neither of them need access to the opam-repository. opam-monorepo doesn't because it assumes all packages can be built with dune, Dune doesn't because it copies the build instructions from the packages at lock-time.
Of course, creating a new, updated lockfile for a project that depends on potentially removed packages is still a problem, but it is the same problem as you'd have with opam install.
I disagree with the approach of splitting "opam-repository" into three branches. My intuition is that, similar to Debian, this mainly adds confusion for newcomers and tardiness in getting software out.
What I value about the opam-repository is the quickness of updates (bugfixes, new packages), how easy it is to install a package released yesterday, and also the impressively high quality (thanks to both manual checking and CI systems). I don't want to lose any of these properties.
From a package authors perspective, I'd be confused whether to pick the "bazaar"/"community" or "stable" branch to submit something. How would I decide? As an opam user, how would I decide which branch to use? As a "zero-configuration CI system", which branch would be used (why stable? why community?)?
If we're moving to a repository where it is fine to remove (or archive) packages, it is fine to submit to the "stable" branch, or am I misunderstanding something?
Goal
The goal I have in mind, since the start of this issue, is to reduce the repository size. The impact is manifold:
- less CI work for the opam-repo CI (fewer dependencies / reverse dependencies),
- less work for the opam.ocaml.org automation (which does something for each package),
- less work for the documentation generation on ocaml.org,
- less data to be transmitted (from opam.ocaml.org to each client) and preserved (on the client disks),
- this also means less work for the opam client (fewer files to read and decode, less work for the solver),
- and, importantly, fewer human hours spent on "adjusting upper bounds".
And all of that while keeping the user experience (and workflows/tooling) for (a) package authors and (b) OCaml users the same (for 99.9% of users).
Even though I don't see much value in an "archive" branch, I'm fine with having it around and, instead of "removing packages", keeping them in the archive branch (so people can do bulk builds and fix packages). The workflows could be:
- a package deletion is scheduled, and done on the main branch - this doesn't impact the archive branch
- a package is added/modified on the main branch -> the diff is applied to the archive branch automatically (if it cannot be applied, a GitHub issue is opened with the failure output); this can be merged manually or automatically, or collected and merged in bulk after a monthly bulk build.
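The second workflow above can be sketched with plain git: replay the addition commits from the main branch onto the archive branch and simply skip the deletion commits, so the archive keeps accumulating everything. This is a minimal illustration in a throwaway repository; the branch names ("master", "archive"), the bot identity, and the package names are assumptions, not anything decided in this thread.

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git symbolic-ref HEAD refs/heads/master
git config user.email ci@example.org
git config user.name "archive-bot"

# Initial state shared by both branches: one released package.
mkdir -p packages/foo/foo.1.0.0
echo 'opam-version: "2.0"' > packages/foo/foo.1.0.0/opam
git add . && git commit -q -m "foo.1.0.0"
git branch archive

# A new release lands on master...
mkdir -p packages/foo/foo.1.0.1
echo 'opam-version: "2.0"' > packages/foo/foo.1.0.1/opam
git add . && git commit -q -m "foo.1.0.1"

# ...and an old release is deleted from master only.
git rm -rq packages/foo/foo.1.0.0
git commit -q -m "remove foo.1.0.0"

# The automation replays only the addition onto the archive branch;
# the deletion commit is never applied there.
add_commit=$(git log --format=%H --diff-filter=A -1 -- packages/foo/foo.1.0.1)
git checkout -q archive
git cherry-pick -x "$add_commit" >/dev/null

# The archive branch now has both versions; master only the newest.
ls packages/foo
```

If the cherry-pick fails (a non-trivially-applicable diff), that is exactly the point where the proposed automation would open a GitHub issue with the failure output instead.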
Things needed
This also means that not much needs to be changed:
- a decision needs to be made about the archive branch (should it be kept up to date for each merged PR, or every month? who's in charge and has the capacity and hours to take care of it?) -- likely best those people who use that branch and want to keep it around
- the system (maybe a GitHub action?) that kicks in once a PR is merged to apply the diff to the archive branch needs to be developed (once the semantics are settled)
- the bulk build (AFAIU these are invoked manually) is triggered periodically for the archive branch (and someone - likely the same people who care about the first item - should observe the output and decide on changes needed or when to merge)
There's explicitly nothing needed in terms of "adding things to dune-release/opam-publish" or "revising documentation for package authors / opam users".
Please let me know what you think, and which goal(s) you have in mind that are not covered above. Thanks for reading.
There are some loud voices on "lockfiles" that prevent any progress on pressing issues since "lockfiles may break".
As someone using lockfiles, I'd say that lockfiles done right are not an aspect that creates additional problems. True, if you just have a list of packages and versions (opam export style), pulling a package will break, but from experience I saw plenty of "locked" builds break because of conf-package changes.
I'm not sure what "opam export" is, at least with opam 2.1.5 this is not a valid subcommand. Are these 'plenty of "locked" builds' publicly available? Or are these locked builds locked away from the free and open source community?
The approach in both Dune and opam-monorepo lockfiles doesn't have a problem with packages disappearing from the repo because, once locked, neither of them needs access to the opam-repository: opam-monorepo doesn't because it assumes all packages can be built with dune, and Dune doesn't because it copies the build instructions from the packages at lock time.
Maybe you want to look into "opam switch export --full --freeze" as well - here you get something independent of opam-repository, including all patch files, everything versioned.
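For concreteness, here is a minimal command-line sketch of that suggestion; the switch name, export file name, and new switch name are placeholders, and this of course requires a working opam (2.1+) installation.

```shell
# Export the current switch. --freeze pins dev/pinned packages to their
# exact source revisions; --full embeds each package's opam metadata
# (build instructions, patches, extra files) into the export file, so
# the result no longer depends on opam-repository contents.
opam switch export --full --freeze myproject.export

# Later, restore into a fresh switch, even if the packages have since
# been removed from opam-repository:
opam switch create new-switch --empty
opam switch import myproject.export
```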
From my experience, opam-monorepo does some intermediate step where it uses "opam lock" (maybe the "opam-monorepo lock" command?); thus if you consider the "lock" and "pull" steps separately (and exchange the output of "lock"), you can get stuck.
I've no clue about "dune package management" and their lockfiles. But I trust you to have a good sense of what is needed in there to be independent of opam-repository changes.
I'm not sure what "opam export" is, at least with opam 2.1.5 this is not a valid subcommand.
Excuse me, I misremembered the command, I meant opam switch export.
Are these 'plenty of "locked" builds' publicly available? Or are these locked builds locked away from the free and open source community?
These locked builds are locked away from the FLOSS community because the software in question was a proprietary CRUD backend. However, I don't think that this is an issue, I believe we should also have the OCaml ecosystem work for proprietary software. And selling OCaml to your managers is hard if you need to fix a supposedly locked docker build every 2 weeks because some (well-meant!) change in opam-repository made the docker build fall over.
Maybe you want to look into "opam switch export --full --freeze" as well - here you get something independent of opam-repository, including all patch files, everything versioned.
Ah, yes that's nice indeed! Unfortunately also well hidden.
In any case, I don't want to derail this into a discussion about lockfiles; I just wanted to point out that, for lockfiles and locked builds, the way opam-repository scales does not make much of a difference in my opinion.
From a package author's perspective, I'd be confused whether to pick the "bazaar"/"community" or "stable" branch to submit something. How would I decide? As an opam user, how would I decide which branch to use? As a "zero-configuration CI system", which branch would be used (why stable? why community?)?
This "community" bazaar would just be for people to publish packages that they don't intend to maintain for long (exactly the case of @reynir in https://discuss.ocaml.org/t/ann-marking-let-if-and-ssh-agent-unix-unavailable/13767) with a minimal CI (i.e. much of the "quality burden" would lie on the package author). For these packages, building rev-deps and testing on many distributions does not make sense. The goal is to ease code sharing. I've heard from a few users who are used to npm-like repositories, where it's much easier to publish packages than with opam, who would like this.
I would expect most people still interested in submitting high-quality packages (and to maintain that code) to keep submitting to the "stable"/"main" branch like it is today. For those packages, the "quality burden" lies on the repository gatekeeper (with help from CI tools and package authors). This is similar to today.
I also want to keep a fast iteration loop, so I'm not super attracted to the Debian-style (cabal-style) bulk release idea. But maybe we can run a more extensive CI test every quarter (or even align with the OCaml compiler release) and tag the repository after we've fixed the results (extending the "release readiness" scope, maybe?)
@mseri (in his capacity as main opam-repository maintainer) and I (in my capacity as opam developer) would like to invite everyone interested in the subject to a public meeting on Jitsi to discuss this issue.
Would Wednesday the 24th of January at 14:00 London time work for everyone interested?
Would Wednesday the 24th of January at 14:00 London time work for everyone interested?
I can participate on that date and time.
I can participate on that date too :)