Pyston workflow
This PR adds a basic Pyston workflow.
Something that is still not working is pytest, since it seems to use Python rather than Pyston to run the tests.
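One quick way to check which interpreter pytest is actually running under is to report it from a conftest hook. This is only a diagnostic sketch (it assumes a top-level conftest.py and is not part of this PR):

# conftest.py (diagnostic sketch): show which interpreter runs the tests.
import sys


def pytest_report_header(config):
    # If this prints the stock CPython binary rather than the Pyston one,
    # pytest was launched with the wrong interpreter; invoking it as
    # "pyston -m pytest ..." (or via Pyston's python binary) avoids that.
    return f"interpreter: {sys.executable} ({sys.version.split()[0]})"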
@mmatera All of this is good and I'd like to see this in, but it really should be done in 3 separate, smaller PRs:
- revise the mathics doc handling
- changes to Builtin functions that allow them to work when scikit or other dependencies don't exist
- a CI workflow for Pyston
Doing this makes it easier for someone like me to test the pieces independently, and in the future it makes it easier to track activity.
- revise the mathics doc handling
- changes to Builtin functions that allow them to work when scikit or other dependencies don't exist
- a CI workflow for Pyston
Actually, I tried to do that: #362 contains the first two bullet points, and this PR adds the third one.
As for #362, the concerns there weren't addressed. When I tried it, it wasn't working, and I gave a traceback showing the results I got.
Given this, please split off the builtin changes to calculus.py and algorithm/optimizers.py, which are independent of the doctest changes, and let's get that in.
Then, with the narrowed #362, I will try testing again to see what went wrong. It could be something in my setup, but it also might not be.
After that, rebase this PR on the parts that don't include #362.
I understand the desire and tendency to mix separable (but causally related) and sometimes unrelated things in a single PR. I tend to do this too, to try to move things along faster. However, as soon as there is a problem with any of it, one needs to go back and split things up. Thanks.
@rocky, here you can see the two pieces composing #362:
- https://github.com/mmatera/mathics-core/pull/10
- https://github.com/mmatera/mathics-core/pull/11
which fail in different places. Together, tested against the basic installation, they pass all the tests. On top of that, I added the Pyston workflow that you can see here.
On the other hand, I was not able to reproduce your traceback. This is why I introduced the ubuntu-minimal workflow.
@rocky, here you can see the two pieces composing #362:
Neither of these addresses the changes to NIntegrate in isolation: expanding its testing and removing the hard requirement on scipy.
I have that split out into https://github.com/Mathics3/mathics-core/pull/374
If we can agree on this, then let's merge that in. It is a well-defined piece of work that stands on its own.
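For reference, the kind of change being discussed there - letting NIntegrate work without scipy - typically amounts to guarding the optional import. The sketch below is purely illustrative (the names are made up and it is not the actual mathics-core code):

# Guarded optional import so the module still loads when scipy is absent.
try:
    import scipy.integrate  # optional dependency
    HAVE_SCIPY = True
except ImportError:
    HAVE_SCIPY = False


def available_nintegrate_methods():
    # Illustrative only: always offer a default method, and register
    # scipy-backed quadrature methods only when scipy is importable.
    methods = ["Automatic"]
    if HAVE_SCIPY:
        methods += ["Romberg", "NQuadrature"]
    return methods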
which fail in different places. Together, tested against the basic installation, they pass all the tests. On top of that, I added the Pyston workflow that you can see here.
On the other hand, I was not able to reproduce your traceback. This is why I introduced the ubuntu-minimal workflow.
Ok - but it is sad that we have to go through all of this discussion and additional PRs before you report that you were not able to reproduce the traceback.
It is likely a problem in my environment or on my side. But it is something I'd like to understand and resolve. It may also be that more work is needed in a particular area and we decide to do that in a different PR. But again, I'd like to understand what's up here.
So I want to try again, but with a smaller and more isolated PR. After #374 goes through, let's do just the check_requires part in doctest, moved to core, by itself - no need to add CI testing with Pyston to that part. That can be a separate PR.
Neither of these addresses the changes to NIntegrate in isolation: expanding its testing and removing the hard requirement on scipy.
I have that split out into #374
@rocky, thanks for this.
If we can agree on this, then let's merge that in. It is a well-defined piece of work that stands on its own.
Good. My problem was that I couldn't test this in a fresh environment without adding the workflow. But OK, let's merge it as it is in #374.
Ok - but it is sad that we have to go through all of this discussion and additional PRs before you report that you were not able to reproduce the traceback.
I thought that adding the ubuntu-minimal.yml workflow was enough to show that. In any case, I am not sure what is failing in your environment.
So I want to try again, but with a smaller and more isolated PR. After #374 goes through, let's do just the check_requires part in doctest, moved to core, by itself - no need to add CI testing with Pyston to that part. That can be a separate PR.
OK, but that part was already isolated. Once the other parts are ready, we can just add the workflow for Pyston.
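For context, the check_requires idea mentioned above boils down to verifying that a doctest's optional modules can be imported before running it. A rough sketch of such a helper follows; the name and signature here are assumptions for illustration and may not match the actual mathics-core API:

from importlib.util import find_spec
from typing import Iterable


def check_requires_list(requires: Iterable[str]) -> bool:
    # True only if every named module is importable, so callers can skip
    # doctests (or builtins) whose optional dependencies are missing.
    return all(find_spec(name) is not None for name in requires)


# Example: decide whether scipy-dependent doctests can run in this installation.
can_run_scipy_doctests = check_requires_list(["scipy"])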
Good. My problem was that I couldn't test this in a fresh environment without adding the workflow. But OK, let's merge it as it is in #374.
All I did was run the tests under Python 3.8.13, remove scipy, and run them again:
$ pytest -svx test/builtin/numbers/test_nintegrate.py
==================================================================================================== test session starts =====================================================================================================
platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /home/rocky/.pyenv/versions/3.8.13/bin/python3.8
cachedir: .pytest_cache
rootdir: /src/external-vcs/github/Mathics3/mathics-core
plugins: forked-1.4.0, xdist-2.5.0
collected 8 items
test/builtin/numbers/test_nintegrate.py::test_nintegrate[NIntegrate[x^2, {x,0,1}, Method->Automatic ]-1/3.-] PASSED
test/builtin/numbers/test_nintegrate.py::test_nintegrate[NIntegrate[x^2 y^2, {y,0,1}, {x,0,1}, Method->Automatic ]-1/9.-] PASSED
test/builtin/numbers/test_nintegrate.py::test_nintegrate[NIntegrate[x^2, {x,0,1}, Method->Romberg ]-1/3.-] PASSED
test/builtin/numbers/test_nintegrate.py::test_nintegrate[NIntegrate[x^2 y^2, {y,0,1}, {x,0,1}, Method->Romberg ]-1/9.-] PASSED
test/builtin/numbers/test_nintegrate.py::test_nintegrate[NIntegrate[x^2, {x,0,1}, Method->Internal ]-1/3.-] PASSED
test/builtin/numbers/test_nintegrate.py::test_nintegrate[NIntegrate[x^2 y^2, {y,0,1}, {x,0,1}, Method->Internal ]-1/9.-] PASSED
test/builtin/numbers/test_nintegrate.py::test_nintegrate[NIntegrate[x^2, {x,0,1}, Method->NQuadrature ]-1/3.-] PASSED
test/builtin/numbers/test_nintegrate.py::test_nintegrate[NIntegrate[x^2 y^2, {y,0,1}, {x,0,1}, Method->NQuadrature ]-1/9.-] PASSED
$ pip uninstall scipy
Found existing installation: scipy 1.8.0
Uninstalling scipy-1.8.0:
...
Proceed (Y/n)? y
Successfully uninstalled scipy-1.8.0
$ pytest -svx test/builtin/numbers/test_nintegrate.py
==================================================================================================== test session starts =====================================================================================================
platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0 -- /home/rocky/.pyenv/versions/3.8.13/bin/python3.8
cachedir: .pytest_cache
rootdir: /src/external-vcs/github/Mathics3/mathics-core
plugins: forked-1.4.0, xdist-2.5.0
collected 2 items
test/builtin/numbers/test_nintegrate.py::test_nintegrate[NIntegrate[x^2, {x,0,1}]-1/3.-] PASSED
test/builtin/numbers/test_nintegrate.py::test_nintegrate[NIntegrate[x^2 y^2, {y,0,1}, {x,0,1}]-1/9.-] PASSED
====================================================================================================== warnings summary ======================================================================================================
...
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=============================================================================================== 2 passed, 2 warnings in 3.99s ================================================================================================
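The drop from 8 collected tests to 2 is consistent with the Method-specific cases being generated only when scipy is importable. A hedged sketch of how such conditional parametrization can be written in pytest (not the literal contents of test_nintegrate.py):

import importlib.util

import pytest

HAVE_SCIPY = importlib.util.find_spec("scipy") is not None

# Cases that should pass in any installation.
CASES = [
    ("NIntegrate[x^2, {x,0,1}]", "1/3."),
    ("NIntegrate[x^2 y^2, {y,0,1}, {x,0,1}]", "1/9."),
]

# Method-specific cases are added only when scipy is available.
if HAVE_SCIPY:
    for method in ("Automatic", "Romberg", "Internal", "NQuadrature"):
        CASES.append((f"NIntegrate[x^2, {{x,0,1}}, Method->{method}]", "1/3."))
        CASES.append((f"NIntegrate[x^2 y^2, {{y,0,1}}, {{x,0,1}}, Method->{method}]", "1/9."))


@pytest.mark.parametrize("expr, expected", CASES)
def test_nintegrate(expr, expected):
    ...  # evaluate expr with mathics and compare against expected (omitted in this sketch)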
@mmatera look at https://github.com/Mathics3/mathics-omnibus/tree/5.0.0.dev0/docker/src for pre-built wheels that can be used to speed up the Pyston CI, which currently takes so long. I don't know right now what the best way to manage the Pyston wheels is, though.
@mmatera what is the status of this PR?
@mmatera what is the status of this PR?
It needs to be redone using prebuilt packages so that running CI does not take a long time.
It needs to be redone using prebuilt packages so that running CI does not take a long time.
How can that be done?
It needs to be redone using prebuilt packages so that running CI does not take a long time.
How can that be done?
There is supposed to be a clue here: https://github.com/Mathics3/mathics-core/pull/368#issuecomment-1192525282, but I haven't tried to dig into it yet.
@rocky
I managed to pick up the pre-built wheels from mathics-omnibus and install them during CI. However, for some reason, the scipy wheel fails to install.
I tried to install it locally and I get this error:
pip -v -v -v install scipy-1.9.3-pyston38-pyston_23_x86_64_linux_gnu-linux_x86_64.whl
Using pip 23.2 from /home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip (python 3.8)
Non-user install because site-packages writeable
Created temporary directory: /tmp/pip-build-tracker-8rwyf93c
Initialized build tracking at /tmp/pip-build-tracker-8rwyf93c
Created build tracker: /tmp/pip-build-tracker-8rwyf93c
Entered build tracker: /tmp/pip-build-tracker-8rwyf93c
Created temporary directory: /tmp/pip-install-jpwha13a
Created temporary directory: /tmp/pip-ephem-wheel-cache-ve265a8e
Processing ./scipy-1.9.3-pyston38-pyston_23_x86_64_linux_gnu-linux_x86_64.whl
Added scipy==1.9.3 from file:///tmp/scipy-1.9.3-pyston38-pyston_23_x86_64_linux_gnu-linux_x86_64.whl to build tracker '/tmp/pip-build-tracker-8rwyf93c'
Removed scipy==1.9.3 from file:///tmp/scipy-1.9.3-pyston38-pyston_23_x86_64_linux_gnu-linux_x86_64.whl from build tracker '/tmp/pip-build-tracker-8rwyf93c'
ERROR: Wheel 'scipy' located at /tmp/scipy-1.9.3-pyston38-pyston_23_x86_64_linux_gnu-linux_x86_64.whl is invalid.
Exception information:
Traceback (most recent call last):
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/metadata/pkg_resources.py", line 115, in from_wheel
with wheel.as_zipfile() as zf:
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/metadata/base.py", line 679, in as_zipfile
return zipfile.ZipFile(self.location, allowZip64=True)
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/zipfile.py", line 1269, in __init__
self._RealGetContents()
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/zipfile.py", line 1336, in _RealGetContents
raise BadZipFile("File is not a zip file")
zipfile.BadZipFile: File is not a zip file
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/cli/base_command.py", line 180, in exc_logging_wrapper
status = run_func(*args)
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/cli/req_command.py", line 248, in wrapper
return func(self, options, args)
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/commands/install.py", line 377, in run
requirement_set = resolver.resolve(
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/resolution/resolvelib/resolver.py", line 73, in resolve
collected = self.factory.collect_root_requirements(root_reqs)
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 491, in collect_root_requirements
req = self._make_requirement_from_install_req(
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 453, in _make_requirement_from_install_req
cand = self._make_candidate_from_link(
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 206, in _make_candidate_from_link
self._link_candidate_cache[link] = LinkCandidate(
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 293, in __init__
super().__init__(
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 156, in __init__
self.dist = self._prepare()
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 225, in _prepare
dist = self._prepare_distribution()
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 304, in _prepare_distribution
return preparer.prepare_linked_requirement(self._ireq, parallel_builds=True)
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/operations/prepare.py", line 529, in prepare_linked_requirement
return self._prepare_linked_requirement(req, parallel_builds)
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/operations/prepare.py", line 644, in _prepare_linked_requirement
dist = _get_prepared_distribution(
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/operations/prepare.py", line 72, in _get_prepared_distribution
return abstract_dist.get_metadata_distribution()
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/distributions/wheel.py", line 26, in get_metadata_distribution
return get_wheel_distribution(wheel, canonicalize_name(self.req.name))
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/metadata/__init__.py", line 105, in get_wheel_distribution
return select_backend().Distribution.from_wheel(wheel, canonical_name)
File "/home/mauricio/.conda/envs/pyston2/lib/python3.8-pyston2.3/site-packages/pip/_internal/metadata/pkg_resources.py", line 123, in from_wheel
raise InvalidWheel(wheel.location, name) from e
pip._internal.exceptions.InvalidWheel: Wheel 'scipy' located at /tmp/scipy-1.9.3-pyston38-pyston_23_x86_64_linux_gnu-linux_x86_64.whl is invalid.
Remote version of pip: 23.2
Local version of pip: 23.2
Was pip installed by pip? False
Removed build tracker: '/tmp/pip-build-tracker-8rwyf93c'
When you have time, could you please check if there is a problem with this file?
@rocky, I had to hack an older version of scipy a little bit, but now it works. The next time you build mathics-omnibus, we can change the link to the wheel.
@rocky
I managed to pick up the pre-built wheels from mathics-omnibus and install them during CI. However, for some reason, the scipy wheel fails to install. [...]
I think I ran into the same thing when reworking the Dockerfile for more current sources. I have since removed that wheel, and for simplicity I now just stick to one version - Python 3.10.12 - for everything.
OSX is taking over 7 minutes to run and Windows is taking over 11 minutes.
These times are too long to be used in our normal CI workflow.
OSX is taking over 7 minutes to run and Windows is taking over 11 minutes.
These times are too long to be used in our normal CI workflow.
Again, this is a reason to split mathics-core into a true "core" and move specialized builtins to different external modules. In any case, it is worth looking at what is taking so much time.
OSX is taking over 7 minutes to run and Windows is taking over 11 minutes. These times are too long to be used in our normal CI workflow.
Again, this is a reason to split mathics-core into a true "core" and move specialized builtins to different external modules. In any case, it is worth looking at what is taking so much time.
That is a possibility, but it is not the only one. There are other things that can be done that require less effort.
Currently, 1-2 minutes is spent just installing OS dependencies and packages here, so not having to build packages over and over again helps.
But most of the time is in testing, so running tests in parallel, split by module, can speed things up. Come to think of it, though, that option is available now via options to docpipeline and pytest.
However, once autoloading is done, we will have a more transparent way to separate out WMA builtins that rely on special libraries, like the image libraries.
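To make that idea concrete: a requirements-aware autoload step could skip builtin modules whose optional libraries are absent. The mapping and module names below are hypothetical, purely to illustrate the shape of such a loader:

import importlib
from importlib.util import find_spec

# Hypothetical mapping of optional builtin modules to the libraries they need.
OPTIONAL_BUILTIN_MODULES = {
    "mathics.builtin.image": ["PIL", "skimage"],
    "mathics.builtin.numbers.integration": ["scipy"],
}


def autoload_optional_builtins():
    loaded = []
    for module_name, requirements in OPTIONAL_BUILTIN_MODULES.items():
        # Skip any builtin module whose optional dependencies are not installed.
        if all(find_spec(req) is not None for req in requirements):
            loaded.append(importlib.import_module(module_name))
    return loaded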
While, yes, splitting up core would be nice, the remark that it is going to fix the testing problems is glib.
I mentioned that getting this done, and done properly, is going to take time. But that aside, the overall compute time will be the same (actually, probably more, since there is duplication in setup), though we'll be able to do more in parallel. Having more CI runs increases complexity a little.
I believe that on the free CI tier the overall CPU time is counted, and once we go over that, new CI jobs are not allowed to run. So again, parallelizing testing, while it speeds things up in elapsed time, does not address the problem as directly as, say, reducing the OS and package setup time.
@mmatera I see that I was confused about things. If the other OSes were slow, then that slowness is in master as well! The CI code is the same as in master. My apologies here.
I am okay with merging this now (even if I am not enthusiastic about it).
@mmatera merge whenever you are satisfied.