djangocms-picture
Django 4.2/DjangoCMS 3.11 support
If I run `python manage.py makemigrations` after updating to Django 4.2/DjangoCMS 3.11, new migrations are generated.
Housekeeping
Supported versions:
py311-django42-cms311
py310-django42-cms311
py310-django32-cms{38, 39, 310}
py39-django32-cms{38, 39, 310}
py38-django32-cms{38, 39, 310}
- [x] Update `setup.py` for supported versions
- [x] Add `pyproject.toml`
- [x] Update `CHANGELOG.rst`
- [x] Add `.pre-commit-config.yaml` (not sure how to use this)
Testing
- [x] Update `tox.ini` for testing and update the supported versions
- [x] Autogenerate `tests/requirements/*` with `compile.py`
Pre-commit linting
- [x] Update `.github/workflows/lint.yml`: switch linting from `flake8` & `isort` to `ruff`
- [x] Add `.github/workflows/lint-pr.yml` for conventional commits
Code Coverage
- [x] Add `.github/workflows/test.yml`.
PyPI publishing pipeline
- [x] Update `.github/workflows/publish-to-test-pypi.yml`
- [x] Update `.github/workflows/publish-to-live-pypi.yml`
Add migrations
There are already two pull requests for adding migrations:
- #118
- #120
We should pick one, rebase it onto master if needed, and merge it. The other pull request should be closed.
It would be best to include updates based on https://github.com/django-cms/djangocms-audio/pull/32
I have edited the top comment into a to-do list (more or less).
I have opened pull request #123. It addresses the tasks of this issue and complements #120.
Question: How do I prepare the environment so that I can run `tests/requirements/compile.py`? Not every Python version is available in my package manager.
I managed to run `compile.py` with pyenv.
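Roughly like this (just a sketch; the patch releases below are only examples, and I'm assuming the script picks up the various pythonX.Y interpreters from PATH):
pyenv install 3.8.18
pyenv install 3.9.18
pyenv install 3.10.13
pyenv install 3.11.7
# `pyenv local` accepts several versions, so python3.8 … python3.11 all end up on PATH
pyenv local 3.8.18 3.9.18 3.10.13 3.11.7
python tests/requirements/compile.py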
I think we should consider moving a tool like `compile.py` into a separate tooling repository.
Can someone help me fix the CI? I tried to configure the .y(a)ml files, but they are quite confusing and there is a lot of tooling involved.
> I think we should consider moving a tool like `compile.py` into a separate tooling repository.
Because it's a script used to generate the test requirements and not a generic module, I'm not sure it'd work as a separate tool. Although with the tweaks that @fsbraun made to it recently, it could be turned into something that could be packaged up.
@marksweb Maybe it's a candidate for a manually triggered GitHub Action. It would require a runner with multiple Python versions installed and would prevent contributors from drowning their hard drives in countless Python versions. This might not even be difficult: https://github.com/actions/setup-python/blob/main/docs/advanced-usage.md#specifying-multiple-pythonpypy-version (see the sketch after the list below)
- Running the action would need to create a PR that updates the test requirements.
- This would go through normal review.
- We might need to think of a mechanism that, once a PR is created, just updates it, to avoid having countless PRs for test requirements
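Something along these lines might already do it (an untested sketch; the file name, the version list and the PR action are assumptions, nothing that exists in this repo yet):

```yaml
# .github/workflows/update-requirements.yml (hypothetical)
name: Update test requirements

on:
  workflow_dispatch:  # run manually from the Actions tab

jobs:
  compile-requirements:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          # setup-python can install several versions at once;
          # the last one listed becomes the default `python`
          python-version: |
            3.8
            3.9
            3.10
            3.11
      - name: Recompile the pinned test requirements
        run: python tests/requirements/compile.py
      - name: Open or update a PR with the changes
        uses: peter-evans/create-pull-request@v5
        with:
          branch: update-test-requirements
          title: Update test requirements
          commit-message: Update test requirements
```

Reusing the same branch should make repeated runs update the existing PR instead of opening new ones.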
@fsbraun yes, love this idea. That'd make it really easy to upgrade requirements as well.
@marksweb @fsbraun
> Although with the tweaks that @fsbraun made to it recently, it could be turned into something that could be packaged up.
What tweaks? Where is the most recent version? I did some tweaks myself: https://github.com/CERES-RUB/djangocms-picture/blob/feat-django-4-2-support/tests/requirements/compile.py
The problem with having the same script in every repository is that the copies drift apart and it is hard to keep track of them.
It is certainly possible to turn this into a somewhat generic tool/module that reads the path of `requirements.in` from an environment variable or a command-line parameter.
But I understand if that is not a priority right now.
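Just to illustrate the idea, something in this direction (a sketch only; the option name, the environment variable and the plain pip-compile call are made up for illustration, the real `compile.py` does more than this):

```python
# Hypothetical sketch: resolve the requirements.in path from a CLI flag or an
# environment variable, then hand the actual pinning off to pip-compile.
import argparse
import os
import pathlib
import subprocess
import sys


def main() -> int:
    parser = argparse.ArgumentParser(description="Compile test requirements.")
    parser.add_argument(
        "--requirements-in",
        default=os.environ.get("REQUIREMENTS_IN", "tests/requirements/requirements.in"),
        help="Path to the requirements.in file (also read from $REQUIREMENTS_IN).",
    )
    args = parser.parse_args()

    requirements_in = pathlib.Path(args.requirements_in)
    if not requirements_in.is_file():
        print(f"Not found: {requirements_in}", file=sys.stderr)
        return 1

    # Write the pinned output next to the input file, e.g. requirements.txt.
    output = requirements_in.with_suffix(".txt")
    return subprocess.call(
        ["pip-compile", "--output-file", str(output), str(requirements_in)]
    )


if __name__ == "__main__":
    raise SystemExit(main())
```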
@mogoh I have tracked down the changes Fabian made to djangocms-alias:
https://github.com/django-cms/djangocms-alias/blob/master/tests/requirements/compile.py
Looks like you were thinking along the same lines with your changes.
If we can develop this into an action, then that solves the issue of the script drifting between repos.
@marksweb Ah, thanks! I don't know about making it an action, because actions are still complicated to me. But if that solves the problem, I am fine with that.
Here is another problem:
- My pull request #123 requires migrations for the tests to be successful.
- @vasekch's pull request #120 requires tests in order to be merged successfully.
So the two pull requests depend on each other. Proposed solutions:
- Create a feature branch, merge both into the feature branch and then merge that feature branch.
- Integrate one pull request into the other and delete the original pull request. Then merge both.
> Integrate one pull request into the other and delete the original pull request. Then merge both.
I'm happy with closing my PR in favor of #123; mine is one independent file, so it will be easy to integrate.
@mogoh feel free to close #120 when done.
> I'm happy with closing my PR in favor of #123; mine is one independent file, so it will be easy to integrate.
@vasekch Thanks, I will integrate your file. But I am not a maintainer, so someone else will have to close your pull request once it is done.
I can't satisfy the coverage CI.
Here it reports 99%: https://github.com/CERES-RUB/djangocms-picture/actions/runs/5669526899/job/15362500833 But in the individual jobs it always reports 100%. I don't know why.
https://github.com/CERES-RUB/djangocms-picture/actions/runs/5669526899/job/15362423893 https://github.com/CERES-RUB/djangocms-picture/actions/runs/5669526899/job/15362424081 https://github.com/CERES-RUB/djangocms-picture/actions/runs/5669526899/job/15362424300 https://github.com/CERES-RUB/djangocms-picture/actions/runs/5669526899/job/15362424467
If I run
tox run -f py38
tox run -f py311
python -m coverage report --fail-under=100
I get a coverage report of 100%.
If, however, I run
tox run -f py311
tox run -f py38
python -m coverage report --fail-under=100
I get a coverage report of 99%.
This suggests to me that our coverage report is misconfigured and only works for the latest tox run.
I think the reason why coverage varies between Python versions might be the coverage of decorator functions. If coverage varies between Python versions, this might also be the cause: https://github.com/nedbat/coveragepy/issues/866
I don't know how to achieve 100% coverage and help would be appreciated.
@mogoh Yeah I've never really liked configuring coverage and often found similar results.
There is the `combine` command (docs). It combines multiple data files, the idea being that you run it before running the report.
So what happens if you run:
tox run -f py311
tox run -f py38
python -m coverage combine
python -m coverage html --skip-covered --skip-empty
python -m coverage report --fail-under=100
So, I learned that in order to combine, one needs to run `coverage run --parallel-mode`, so that the `.coverage` files are created as `.coverage.<something>`.
So I did that.
But the problem did not go away. I have no idea why.
Here is what I did:
coverage erase
tox run -f py311
tox run -f py38
coverage combine
coverage html
coverage report --fail-under=100
This leads to 100% coverage.
However, this does not:
coverage erase
tox run -f py38
coverage combine
coverage html
coverage report --fail-under=100
Only 99% coverage.
If no one can find a solution, I suggest removing `--fail-under=100` and accepting less than 100% coverage.
Did you look into the different coverage reports? I would assume that some code is only run by py311...?
Here are some reports:
coverage erase
tox run -f py311
coverage combine
coverage html
yields one report (screenshot), while
coverage erase
tox run -f py38
coverage combine
coverage html
yields a slightly different report (screenshot).
(BTW: coverage doesn't generate plain-text reports, only short summaries, right? Hence the screenshotted HTML reports.)
Hm, that seems more like a difference in how Python versions count lines... Since it works with the newer versions, I'd leave the target. We all know that 99% is OK.
Sorry for taking so long to answer.
@fsbraun I am not sure if I understand you correctly.
If we don't change the 100% code coverage target, we won't be able to merge the change into the master branch, right?
> Since it works with the newer versions, I'd leave the target.
What do you mean by that? Leave the target in place (do not change it)? Or leave the target behind and change it?
The way I understand you is that we should change it, but then the rest does not fit. Please clarify.
@mogoh If the 100% requirement is the problem, just remove it from the report command.
Should this issue stay open until the new support is released, or should we close it? The other pull requests (#118 and #120) should be closed in any case.