add ipython output _ipython_tests
Guidelines for marking packages as broken:
- We prefer to patch the repo data (see here) instead of marking packages as broken. This alternative workflow makes environments more reproducible.
- Packages with requirements/metadata that are too strict but otherwise work are not technically broken and should not be marked as such.
- Packages with missing metadata can be marked as broken on a temporary basis but should be patched in the repo data and be marked unbroken later.
- In some cases where the number of users of a package is small or it is used by the maintainers only, we can allow packages to be marked broken more liberally.
- We (@conda-forge/core) try to make a decision on these requests within 24 hours.
What will happen when a package is marked broken?
- Our bots will add the `broken` label to the package. The `main` label will remain on the package and this is normal.
- Our bots will rebuild our repodata patches to remove this package from the repodata.
- A few hours after the anaconda.org CDN picks up the new patches, you will no longer be able to install the package from the `main` channel.
Checklist:
- [ ] I want to mark a package as broken (or not broken):
  - [ ] Added a description of the problem with the package in the PR description.
  - [ ] Pinged the team for the package for their input.
- [ ] I want to archive a feedstock:
  - [ ] Pinged the team for that feedstock for their input.
  - [ ] Opened an issue on the feedstock explaining why it is being archived.
  - [ ] Linked that issue in this PR description.
  - [ ] Added links to any other relevant issues/PRs in the PR description.
- [ ] I want to request (or revoke) access to an opt-in CI resource:
  - [ ] Pinged the relevant feedstock team(s).
  - [ ] Added a small description explaining why access is needed.
- [ ] I want to copy an artifact following CFEP-3:
  - [ ] Pinged the relevant feedstock team(s).
  - [ ] Added a reference to the original PR.
  - [ ] Posted a link to the conda artifacts.
  - [ ] Posted a link to the build logs.
- [x] I want to add a package output to a feedstock:
  - [x] Pinged the relevant feedstock team(s).
  - [x] Added a small description of why the output is being added.
ping @conda-forge/ipython
This adds an `_ipython_tests` output to `ipython`, introduced in https://github.com/conda-forge/ipython-feedstock/pull/231.
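For context, a rough sketch of the split, with output names taken from this PR; the actual recipe lives in the feedstock PR linked above:

```yaml
# Sketch only -- see conda-forge/ipython-feedstock#231 for the real recipe.
outputs:
  - name: ipython          # the runtime package, with tests stripped out
  - name: _ipython_tests   # the unvendored test files
    requirements:
      run:
        - ipython          # assumed to depend back on the matching ipython
```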
@conda-forge/core ready for review
A separate output just for testing seems wasteful. Is there a reason it needs to be an output?
> needs to be an output?
Nope, guess it doesn't have to. If the answer is no (and that's fine) I can put it back in, but...
> seems wasteful
Right: wasteful for whom? It's pretty much the same amount of storage, maybe a little worse from a compression point of view. Certainly this will make another two packages per release, so the rate of repodata growth will increase. However, this cuts 30% off the download/install size for ipython users.
After spending a couple of years trying to get the tests out of the PyPI wheel (mostly for in-browser consumption), I was trying to realize some gains here as well, and had seen some other suites that create unvendored test packages.
I guess some other ways forward:
- download the tests at test time (ick)
- lobby for a more configurable, per-output `channel_targets`, such that it could be uploaded to e.g. `label/ipython_tests` (sketched below)
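Purely hypothetical, but the second option might look something like this; today `channel_targets` is a single global variant in `conda_build_config.yaml` (e.g. `conda-forge main`), and nothing per-output exists:

```yaml
# Hypothetical: per-output channel_targets is not a real conda-build feature;
# this key placement is invented for illustration only.
outputs:
  - name: ipython             # would still be uploaded to conda-forge main
  - name: _ipython_tests
    channel_targets:
      - conda-forge ipython_tests
```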
Thanks again for the review: hoisted this up to Zulip, and will roll back the split on the feedstock PR; maybe we can get some more discussion.
I've been meaning to add a feature that allows folks to skip uploads for some packages via a config option in the `conda-forge.yml`.
That seems like it'd be a straightforward solution here.
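As a sketch of what that might look like; no such key exists in `conda-forge.yml` today, and the name is invented for illustration:

```yaml
# Hypothetical conda-forge.yml option -- the feature described above has not
# been implemented; this key name is made up.
skip_upload_outputs:
  - _ipython_tests
```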
Downloading the tests at test time seems fine to me fwiw. Idk why that option isn't ok either?
> Downloading the tests at test time seems fine
Welp, I try to think about "what happens in conda-forge stays in conda-forge": while not frequently exercised, the ability to run tests for some package you have, offline, given a reproducible test environment description and a source of test dependencies, is a really strong feature of conda. Once a test starts talking to the internet it feels bad... and sure, I have a bunch of tests that do need internet, but I often end up having to skip those, or cache what they would have used.
> skip uploads for some packages
Building but not shipping an output is certainly an excellent option for many cases, and has been proposed as a way to avoid special-casing intermediates without bizarre edge cases, e.g. `cache`.
In the case of tests, however, I would like to see a package go somewhere, so that at least part of the above holds, provided those channels are discoverable, and even better follow some naming convention. Today, `channel_targets` will let you say anything... but with some "domain" labels and linter/inspect conventions like:

- `conda-forge/label/tests` (this conversation)
  - only accepts packages that populate e.g. `/opt/test/{feedstock-name}`
- `conda-forge/label/documentation` (related conversations, for building offline-readable HTML docs)
  - only accepts packages that populate e.g. `/var/www/{feedstock-name}`

... one could start getting a lot of bang for the buck on a build, pushing very optimized builds to `main`, optionally with strong ties between `main` and the other ones, without introducing new features to `(meta|recipe).yaml`.
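As a sketch of what such a linter convention could check, assuming the hypothetical `conda-forge/label/tests` channel above; `files` here is the existing conda-build output key for selecting which files an output ships:

```yaml
# Hypothetical convention: an output destined for conda-forge/label/tests
# would only be allowed to ship files under the agreed prefix.
outputs:
  - name: _ipython_tests
    files:
      - opt/test/ipython/**   # nothing outside /opt/test/{feedstock-name}
```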
My long-term concerns are about storage costs / size for these kinds of uses, for both the index and the packages. Having the tests on the internet is no worse than having the package source there, which we also do not store.
Reframed some thoughts here: https://github.com/conda-forge/conda-smithy/issues/2263