
422 Client Error: Unprocessable Entity

Open Mulugruntz opened this issue 4 years ago • 0 comments

Okay, I have the infamous issue. I've spent a few hours trying to make this work and I feel like I'm very close to it. But now I've got this 422 Client Error. Nothing that I could find online actually helped. So maybe it's an actual new bug (or maybe I'm just blind).

I've tried many different ways to do it.

This version is (mostly) a copy-paste of the GitHub Actions setup documented at https://coveralls-python.readthedocs.io/en/latest/usage/configuration.html#github-actions-support

Here is what happens on a non-debug run: https://github.com/Mulugruntz/aiosubprocess/runs/2657803772?check_suite_focus=true#step:9:1

coveralls.exception.CoverallsException: Could not submit coverage: 422 Client Error: Unprocessable Entity for url: https://coveralls.io/api/v1/jobs
Error: Process completed with exit code 1.

Whereas here is the output of a debug run: https://github.com/Mulugruntz/aiosubprocess/runs/2657448470?check_suite_focus=true#step:9:1

Run coveralls debug --service=github
  coveralls debug --service=github
  shell: /usr/bin/bash -e {0}
  env:
    pythonLocation: /opt/hostedtoolcache/Python/3.6.13/x64
    COVERALLS_REPO_TOKEN: ***
    GITHUB_TOKEN: ***
    COVERALLS_FLAG_NAME: run-ubuntu-latest-py3.6
    COVERALLS_PARALLEL: true
Missing .coveralls.yml file. Using only env variables.
Testing coveralls-python...
{"source_files": [{"name": "aiosubprocess.py", "source": "\"\"\"Boilerplate for asyncio applications\"\"\"\n\nimport asyncio\nimport logging\nfrom asyncio import AbstractEventLoop\nfrom asyncio import create_subprocess_exec, create_subprocess_shell\nfrom asyncio.subprocess import PIPE, Process as asyncioProcess\nfrom functools import partial\nfrom itertools import count\nfrom typing import Callable, Tuple, Optional, Coroutine, Any\n\n__all__ = [\"Process\"]\n__version__ = \"2021.05.24\"\nlogger = logging.getLogger(\"aiosubprocess\")\n\nSLEEP_RESOLUTION = 0.1\n\n\nclass Process:\n    name_count = count()\n\n    def __init__(\n        self,\n        *command: str,\n        loop: Optional[AbstractEventLoop] = None,\n        name: Optional[str] = None,\n        expected_returncode: int = 0,\n        stdout: Callable[[str], None] = logger.info,\n        stderr: Callable[[str], None] = logger.error,\n        with_prefix: bool = True,\n        sleep_resolution: float = SLEEP_RESOLUTION,\n    ) -> None:\n        \"\"\"An async subprocess that keeps on getting stdout and stderr.\n\n        :param command: The shell command.\n        :param loop: An asyncio event loop.\n        :param name: Optional name of the subprocess (useful when logging).\n                    If not provided, will assign one automatically.\n        :param expected_returncode: Which error code is considered a success?\n        :param stdout: A callback called for every line of stdout.\n                    Useful for printing and logging.\n        :param stderr: A callback called for every line of stderr.\n                    Useful for printing and logging.\n        :param with_prefix: Should a prefix (based on `name`) be added before\n                    each stdout/stderr?\n        :param sleep_resolution: The minimum time resolution at which stdout/stderr\n                    will be checked. This is not guaranteed, as there might be\n                    long blocking operations somewhere else in the loop.\n        \"\"\"\n        self._run_command: Tuple[str, ...] = command\n        self.loop: AbstractEventLoop = loop or asyncio.get_event_loop()\n        self.child: Optional[asyncioProcess] = None\n        self.name = name or f\"AIO Subprocess-{next(Process.name_count)}\"\n        self.expected_returncode = expected_returncode\n        self.__prefix = f\"[{self.name}] \"\n        self.__stdout = stdout\n        self.__stderr = stderr\n        self.with_prefix = with_prefix\n        self.sleep_resolution = sleep_resolution\n\n    async def exec(self) -> bool:\n        return await self._run(partial(create_subprocess_exec, *self._run_command))\n\n    async def shell(self) -> bool:\n        return await self._run(\n            partial(create_subprocess_shell, \" \".join(self._run_command))\n        )\n\n    async def _run(\n        self, create_subprocess_function: Callable[..., Coroutine[Any, Any, asyncioProcess]]\n    ) -> bool:\n        logger.info(\"%sAbout to start %s\", self.__prefix, \" \".join(self._run_command))\n        self.child = await create_subprocess_function(stdout=PIPE, stderr=PIPE)\n        while True:\n            retcode = self.child.returncode\n            if retcode is not None:  # Process finished\n                await self._check_io()\n                break\n            else:\n                await asyncio.sleep(self.sleep_resolution)\n            await self._check_io()\n\n        if retcode != self.expected_returncode:\n            logger.error(\n                \"%sError! 
The subprocess terminated with a non-0 return code: %s\",\n                self.__prefix,\n                retcode,\n            )\n            return False\n\n        logger.info(\n            \"%sSuccess! The subprocess completed.\",\n            self.__prefix,\n        )\n        return True\n\n    async def _check_io(self):\n        out = asyncio.ensure_future(self._pipe_stdout(), loop=self.loop)\n        err = asyncio.ensure_future(self._pipe_stderr(), loop=self.loop)\n        await asyncio.gather(out, err, loop=self.loop)\n\n    async def _pipe_stdout(self):\n        while True:\n            output = (\n                (await self.child.stdout.readline())\n                .decode(errors=\"backslashreplace\")\n                .rstrip()\n            )\n            if not output:\n                break\n            if self.with_prefix:\n                self.__stdout(f\"{self.__prefix}{output}\")\n            else:\n                self.__stdout(output)\n\n    async def _pipe_stderr(self):\n        while True:\n            error = (\n                (await self.child.stderr.readline())\n                .decode(errors=\"backslashreplace\")\n                .rstrip()\n            )\n            if not error:\n                break\n            if self.with_prefix:\n                self.__stderr(f\"{self.__prefix}{error}\")\n            else:\n                self.__stderr(error)\n", "coverage": [null, null, 1, 1, 1, 1, 1, 1, 1, 1, null, 1, 1, 1, null, 1, null, null, 1, 1, null, 1, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, null, 1, 1, null, 1, 1, null, null, null, 1, null, null, 1, 1, 1, 1, 1, 1, 1, null, 1, 1, null, 1, 1, null, null, null, null, 1, null, 1, null, null, null, 1, null, 1, 1, 1, 1, null, 1, 1, 1, null, null, null, null, 1, 1, 1, 1, null, 1, null, 1, 1, 1, null, null, null, null, 1, 1, 1, 1, null, 1], "branches": [76, 0, 77, 1, 76, 0, 80, 1, 83, 0, 84, 1, 83, 0, 91, 1, 109, 0, 110, 1, 109, 0, 111, 1, 111, 0, 112, 1, 111, 0, 114, 1, 123, 0, 124, 1, 123, 0, 125, 1, 125, 0, 126, 1, 125, 0, 128, 1]}], "git": {"branch": "master", "head": {"id": "9b69ec3dcc5c922d637d95e8a407d6268cc7e501", "author_name": "Samuel Giffard", "author_email": "[email protected]", "committer_name": "Samuel Giffard", "committer_email": "[email protected]", "message": "(WIP) Fixing Code coverage (coveralls.io)."}, "remotes": [{"name": "origin", "url": "https://github.com/Mulugruntz/aiosubprocess"}]}, "service_job_id": null, "repo_token": "[secure]", "service_name": "github", "service_number": "871918197", "parallel": true, "flag_name": "run-ubuntu-latest-py3.6", "config_file": ".coveragerc"}
==
Reporting 1 files
==

aiosubprocess.py - 64/128
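
To eyeball that payload more easily, I can dump which of its top-level fields are empty with a throwaway sketch like this (it assumes the JSON block printed above is saved to payload.json, a filename of my own choosing, not something coveralls writes):

```python
# Throwaway sketch: print which top-level fields of the debug payload are
# empty, to spot anything Coveralls might reject.
# Assumes the JSON printed by `coveralls debug` was saved to payload.json.
import json

with open("payload.json") as fh:
    payload = json.load(fh)

for key, value in payload.items():
    if key == "source_files":
        print(f"source_files: {len(value)} file(s)")
    elif value in (None, "", [], {}):
        print(f"{key}: <empty>")
    else:
        print(f"{key}: {value!r}")
```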

All the data seems to be there, yet the submission fails as soon as I drop debug from the command.
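
One thing I could still try is to re-POST that same payload by hand and read the 422 response body, which should explain what Coveralls rejected. A minimal sketch, assuming the payload is saved to payload.json and that coveralls-python uploads it as a multipart form field named json_file (an assumption I have not verified against the Coveralls API docs):

```python
# Minimal sketch: re-submit the payload printed by `coveralls debug` and dump
# the response body of the 422, which normally says why the job was rejected.
# Assumptions (not verified): the payload sits in payload.json, and it is
# uploaded as a multipart form field named "json_file".
import json
import requests

with open("payload.json") as fh:
    payload = json.load(fh)

response = requests.post(
    "https://coveralls.io/api/v1/jobs",
    files={"json_file": json.dumps(payload)},
)
print(response.status_code)
print(response.text)  # on a 422, this body should name the offending field
```

If the response body points at a specific field, that would narrow things down a lot.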

Any idea?

Mulugruntz · May 24 '21 17:05