codecov-action
Unable to find coverage xml file in current working directory
After upgrading to v4, the upload step consistently fails with this message:
debug - 2024-02-07 17:18:18,054 -- Running preparation plugin: <class 'codecov_cli.plugins.pycoverage.Pycoverage'>
info - 2024-02-07 17:18:18,054 -- Generating coverage.xml report in /home/runner/work/tm_devices/tm_devices
debug - 2024-02-07 17:18:18,146 -- Collecting relevant files
warning - 2024-02-07 17:18:18,156 -- Some files being explicitly added are found in the list of excluded files for upload. --- {"files": [".coverage_tests.xml"]}
warning - 2024-02-07 17:18:18,175 -- Some files were not found --- {"not_found_files": [".coverage_tests.xml"]}
info - 2024-02-07 17:18:18,193 -- Found 0 coverage files to upload
Error: No coverage reports found. Please make sure you're generating reports successfully.
See https://github.com/tektronix/tm_devices/actions/runs/7818613874/job/21329253422 for details.
What needs to be done for v4 of this action to be able to find the xml file that does exist at that location?
This is currently blocking https://github.com/tektronix/tm_devices/pull/140
This is what the contents of the working directory are:
ls -la
total 10920
drwxr-xr-x 13 runner docker 4096 Feb 7 17:18 .
drwxr-xr-x 3 runner docker 4096 Feb 7 17:16 ..
-rw-r--r-- 1 runner docker 10817536 Feb 7 17:18 .coverage
-rw-r--r-- 1 runner docker 210820 Feb 7 17:18 .coverage_tests.xml
drwxr-xr-x 8 runner docker 4096 Feb 7 17:16 .git
-rw-r--r-- 1 runner docker 459 Feb 7 17:16 .gitattributes
drwxr-xr-x 4 runner docker 4096 Feb 7 17:16 .github
-rw-r--r-- 1 runner docker 1288 Feb 7 17:16 .gitignore
-rw-r--r-- 1 runner docker 4111 Feb 7 17:16 .pre-commit-config.yaml
-rw-r--r-- 1 runner docker 454 Feb 7 17:16 .readthedocs.yml
drwxr-xr-x 3 runner docker 4096 Feb 7 17:18 .results_tests
drwxr-xr-x 3 runner docker 4096 Feb 7 17:17 .ruff_cache
drwxr-xr-x 6 runner docker 4096 Feb 7 17:17 .tox
-rw-r--r-- 1 runner docker 15334 Feb 7 17:16 CHANGELOG.md
-rw-r--r-- 1 runner docker 3276 Feb 7 17:16 CODE_OF_CONDUCT.md
-rw-r--r-- 1 runner docker 6503 Feb 7 17:16 CONTRIBUTING.md
-rw-r--r-- 1 runner docker 10126 Feb 7 17:16 LICENSE.md
-rw-r--r-- 1 runner docker 11485 Feb 7 17:16 README.rst
-rw-r--r-- 1 runner docker 320 Feb 7 17:16 SECURITY.md
-rw-r--r-- 1 runner docker 164 Feb 7 17:16 codecov.yml
drwxr-xr-x 7 runner docker 4096 Feb 7 17:16 docs
drwxr-xr-x 6 runner docker 4096 Feb 7 17:16 examples
-rw-r--r-- 1 runner docker 13897 Feb 7 17:16 pyproject.toml
drwxr-xr-x 2 runner docker 4096 Feb 7 17:16 python_semantic_release_templates
drwxr-xr-x 2 runner docker 4096 Feb 7 17:16 scripts
drwxr-xr-x 3 runner docker 4096 Feb 7 17:16 src
drwxr-xr-x 5 runner docker 4096 Feb 7 17:17 tests
I have the very same issue, and I can confirm that this is a regression compared to v3.
Hi @nfelt14, thanks for raising this. From the error message, it looks like some files that were explicitly specified for upload are located in an excluded path. This is indeed a regression and is related to what others have reported (see https://github.com/codecov/feedback/issues/265).
We're actively working on a fix for this, you can follow along https://github.com/codecov/engineering-team/issues/1143
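In the meantime, one possible workaround is to point the action at the report explicitly rather than relying on auto-discovery. This is only a sketch: it assumes the v4 `files` input accepts the report path shown in the directory listing above, and that a `CODECOV_TOKEN` secret is configured.

```yaml
# Hedged workaround sketch: pass the coverage report path explicitly
# via the `files` input instead of relying on file discovery.
- name: Upload coverage to Codecov
  uses: codecov/codecov-action@v4
  with:
    files: ./.coverage_tests.xml
    token: ${{ secrets.CODECOV_TOKEN }}
```

Whether this bypasses the exclusion-list regression depends on how the fix lands, so treat it as a temporary measure.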
@nfelt14 @dokempf I pushed a fix that may resolve your issue. Would you be able to try again and see if it's working now?
@thomasrockhu-codecov Is there a specific version of the action I should use? Re-running the jobs as-is didn't make a difference.
@rohan-at-sentry, is there any update on this? I saw the linked issue was fixed, but this issue does not appear to be resolved.
@nfelt14 - We merged the fix for the linked issue 2 days ago but haven't triggered a release yet, which is probably why your runs still fail (I noticed v0.4.8 as the CLI version in a failing run on your repo 30 minutes ago).
I'll post back here once we've released the CLI.
@rohan-at-sentry, any update on when a new release will be made?
@rohan-at-sentry any update on a timeline for a release that will fix this issue?
@nfelt14 We're aiming for Thursday next week (28th) ... I'll update here when it's done
@nfelt14 can you please try again (the CLI has a new minor version that should help with this issue - no upgrade of the GH action should be necessary)
I already tried it and it works as expected. Thanks!