SSLError when running grapher steps
Problem
While working on this branch and running, e.g., `etl run faostat --grapher`, I sometimes get the following error:
```
botocore.exceptions.SSLError: SSL validation failed for https://078fcdfed9955087315dd86792e71a7e.r2.cloudflarestorage.com/owid-api-staging/prosado/v1/indicators/831789.data.json [SSL: SSLV3_ALERT_BAD_RECORD_MAC] ssl/tls alert bad record mac (_ssl.c:2580)
```
After a few retries, the step finished successfully.
Impact
- It appears to happen on arm64 Macs but not in production, so it's a dev-only issue
- Even before we were checking for these errors, Lars was aggressively mirroring our API data daily and did not encounter holes in the data
Technical notes
- The issue also did not happen on the staging PR server; the grapher steps executed successfully there.
- Previously we had futures that were never awaited
- That meant errors like this happened silently and we didn't know about them
- Now that we await them, we can see that they happen (see the sketch below)
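To illustrate the last point, here is a minimal sketch (not the actual ETL code): exceptions raised inside a `ThreadPoolExecutor` worker are stored on the future and only re-raised when `.result()` is called, so un-awaited futures fail silently.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def upload(variable_id: int) -> None:
    # Stand-in for the real S3 upload; simulates a transient SSL failure.
    raise RuntimeError(f"upload failed for variable {variable_id}")

with ThreadPoolExecutor(max_workers=4) as executor:
    futures = [executor.submit(upload, i) for i in (1, 2, 3)]
    # Without the line below, the RuntimeErrors are stored on the futures
    # and never surface. Awaiting each result re-raises them in the caller.
    results = [f.result() for f in as_completed(futures)]
```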
Logs:
```
--- 2. grapher://grapher/faostat/2024-03-14/faostat_lc...
2024-03-20 11:00:29 [info ] upsert_dataset.verify_namespace namespace=faostat
2024-03-20 11:00:29 [info ] upsert_dataset.upsert_dataset.start short_name=faostat_lc
2024-03-20 11:00:29 [info ] upsert_dataset.upsert_dataset.end id=6417 short_name=faostat_lc
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=1360 variable_id=831767
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=536 variable_id=831768
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=7873 variable_id=831766
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=5717 variable_id=831769
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=1360 variable_id=831770
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=1360 variable_id=831772
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=536 variable_id=831773
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=7873 variable_id=831771
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=5717 variable_id=831774
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=7873 variable_id=831775
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=1360 variable_id=831776
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=536 variable_id=831781
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=5717 variable_id=831782
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=536 variable_id=831778
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=7873 variable_id=831779
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=1360 variable_id=831780
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=5717 variable_id=831777
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=7873 variable_id=831783
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=5717 variable_id=831784
2024-03-20 11:00:30 [info ] upsert_table.uploaded_to_s3 size=536 variable_id=831785
2024-03-20 11:00:40 [info ] upsert_dataset.continue_without_logging
2024-03-20 11:00:40,071 - [bugsnag] WARNING - No API key configured, couldn't notify
Traceback (most recent call last):
  File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/connectionpool.py", line 467, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/httpsession.py", line 448, in send
    urllib_response = conn.urlopen(
                      ^^^^^^^^^^^^^
  File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/connectionpool.py", line 799, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/util/retry.py", line 525, in increment
    raise six.reraise(type(error), error, _stacktrace)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/packages/six.py", line 769, in reraise
    raise value.with_traceback(tb)
  File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/connectionpool.py", line 467, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/prosado/Documents/owid/repos/etl/.venv/bin/etl", line 6, in <module>
```
I couldn't replicate it, but I added the SSLError exception to the list of errors that should be retried. Fingers crossed...
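For reference, a minimal sketch of that kind of retry logic using tenacity (which the tracebacks below show `upload_gzip_dict` in `datasync.py` already uses); the function name and retry parameters here are illustrative, not the actual ETL code:

```python
# Illustrative sketch only; not the actual datasync.py code.
from botocore.exceptions import SSLError
from tenacity import Retrying, retry_if_exception_type, stop_after_attempt, wait_exponential

def upload_with_retries(client, bucket: str, key: str, body: bytes) -> None:
    # Retry transient SSL failures a few times with exponential backoff;
    # any other exception is raised immediately.
    for attempt in Retrying(
        stop=stop_after_attempt(5),
        wait=wait_exponential(multiplier=1, max=10),
        retry=retry_if_exception_type(SSLError),
    ):
        with attempt:
            client.put_object(Bucket=bucket, Key=key, Body=body)
```

Once all attempts are exhausted, tenacity raises the `RetryError` that shows up later in this thread.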
Pablo's intuition is that longer steps are more likely to break.
Things we could try:
- [x] Moar retries
- [ ] A newer version of botocore
- [ ] Decreasing parallelism (different from `--workers=N`)
- [ ] Rebuilding Python on Pablo's machine with a different/newer OpenSSL (if the issue only affects Pablo)
An offline datalog would also avoid this issue 😃
@pabloarosado could you try setting the GRAPHER_WORKERS env variable to a value lower than 10? Something like 5 could be fast enough.
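A sketch of what that setting would change, assuming the uploader sizes its thread pool from the variable (the default of 10 and the helper below are assumptions, not the actual code):

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Assumed default of 10; lowering GRAPHER_WORKERS means fewer concurrent
# TLS connections to R2, which is the knob being suggested here.
WORKERS = int(os.environ.get("GRAPHER_WORKERS", "10"))

def upsert_variables(variables, upsert_fn):
    with ThreadPoolExecutor(max_workers=WORKERS) as executor:
        futures = [executor.submit(upsert_fn, v) for v in variables]
        return [f.result() for f in futures]
```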
Even using GRAPHER_WORKERS=5, I still get `tenacity.RetryError: RetryError[<Future at 0x1665dedd0 state=finished raised SSLError>]`. So I keep running the steps multiple times until they all finish. This is annoying when working on FAOSTAT in local grapher (which I'm doing at the moment), but it may not affect other datasets or work on staging.
I suggest pairing with @Marigold early next week to try to resolve it.
@pabloarosado Are you still getting this issue? If not, I think we should close it.
I just experienced this error again on local grapher:
```
--- 7. grapher://grapher/emdat/2024-04-11/natural_disasters (1m 13.8s)
2024-05-07 16:42:23 [info ] upsert_dataset.verify_namespace namespace=emdat
2024-05-07 16:42:23 [info ] upsert_dataset.upsert_dataset.start short_name=natural_disasters
2024-05-07 16:42:23 [info ] upsert_dataset.upsert_dataset.end id=6477 short_name=natural_disasters
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899577
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899579
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899589
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=900639
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=900640
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=900641
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899581
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899580
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899575
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899587
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899585
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899588
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=900642
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899584
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899576
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899578
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899582
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899586
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899583
2024-05-07 16:42:25 [info ] upsert_table.uploaded_to_s3 size=3068 variable_id=899590
2024-05-07 16:42:30 [warning ] yield_wide_table.null_variable column=insured_damages_per_gdp_fog_yearly dim_dict={}
2024-05-07 16:42:32 [warning ] yield_wide_table.null_variable column=reconstruction_costs_per_gdp_fog_yearly dim_dict={}
2024-05-07 16:42:33 [warning ] yield_wide_table.null_variable column=total_damages_per_gdp_fog_yearly dim_dict={}
2024-05-07 16:42:33 [info ] upsert_dataset.continue_without_logging
2024-05-07 16:44:45,197 - [bugsnag] WARNING - No API key configured, couldn't notify
Traceback (most recent call last):
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/connectionpool.py", line 715, in urlopen
httplib_response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/connectionpool.py", line 467, in _make_request
six.raise_from(e, None)
File "<string>", line 3, in raise_from
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/connectionpool.py", line 462, in _make_request
httplib_response = conn.getresponse()
^^^^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/http/client.py", line 1378, in getresponse
response.begin()
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/http/client.py", line 318, in begin
version, status, reason = self._read_status()
^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/http/client.py", line 279, in _read_status
line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/socket.py", line 706, in readinto
return self._sock.recv_into(b)
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/ssl.py", line 1311, in recv_into
return self.read(nbytes, buffer)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/ssl.py", line 1167, in read
return self._sslobj.read(len, buffer)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ssl.SSLError: [SSL: SSLV3_ALERT_BAD_RECORD_MAC] ssl/tls alert bad record mac (_ssl.c:2580)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/httpsession.py", line 448, in send
urllib_response = conn.urlopen(
^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/connectionpool.py", line 799, in urlopen
retries = retries.increment(
^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/util/retry.py", line 525, in increment
raise six.reraise(type(error), error, _stacktrace)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/packages/six.py", line 769, in reraise
raise value.with_traceback(tb)
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/connectionpool.py", line 715, in urlopen
httplib_response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/connectionpool.py", line 467, in _make_request
six.raise_from(e, None)
File "<string>", line 3, in raise_from
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/urllib3/connectionpool.py", line 462, in _make_request
httplib_response = conn.getresponse()
^^^^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/http/client.py", line 1378, in getresponse
response.begin()
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/http/client.py", line 318, in begin
version, status, reason = self._read_status()
^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/http/client.py", line 279, in _read_status
line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/socket.py", line 706, in readinto
return self._sock.recv_into(b)
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/ssl.py", line 1311, in recv_into
return self.read(nbytes, buffer)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/ssl.py", line 1167, in read
return self._sslobj.read(len, buffer)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.SSLError: [SSL: SSLV3_ALERT_BAD_RECORD_MAC] ssl/tls alert bad record mac (_ssl.c:2580)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/prosado/Documents/owid/repos/etl/apps/backport/datasync/datasync.py", line 37, in upload_gzip_dict
client.put_object(
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/client.py", line 508, in _api_call
return self._make_api_call(operation_name, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/client.py", line 898, in _make_api_call
http, parsed_response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/client.py", line 921, in _make_request
return self._endpoint.make_request(operation_model, request_dict)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/endpoint.py", line 119, in make_request
return self._send_request(request_dict, operation_model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/endpoint.py", line 202, in _send_request
while self._needs_retry(
^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/endpoint.py", line 354, in _needs_retry
responses = self._event_emitter.emit(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/hooks.py", line 412, in emit
return self._emitter.emit(aliased_event_name, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/hooks.py", line 256, in emit
return self._emit(event_name, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/hooks.py", line 239, in _emit
response = handler(**kwargs)
^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/retryhandler.py", line 207, in __call__
if self._checker(**checker_kwargs):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/retryhandler.py", line 284, in __call__
should_retry = self._should_retry(
^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/retryhandler.py", line 320, in _should_retry
return self._checker(attempt_number, response, caught_exception)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/retryhandler.py", line 363, in __call__
checker_response = checker(
^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/retryhandler.py", line 247, in __call__
return self._check_caught_exception(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/retryhandler.py", line 416, in _check_caught_exception
raise caught_exception
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/endpoint.py", line 281, in _do_get_response
http_response = self._send(request)
^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/endpoint.py", line 377, in _send
return self.http_session.send(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/botocore/httpsession.py", line 475, in send
raise SSLError(endpoint_url=request.url, error=e)
botocore.exceptions.SSLError: SSL validation failed for https://078fcdfed9955087315dd86792e71a7e.r2.cloudflarestorage.com/owid-api-staging/prosado/v1/indicators/900684.data.json [SSL: SSLV3_ALERT_BAD_RECORD_MAC] ssl/tls alert bad record mac (_ssl.c:2580)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/prosado/Documents/owid/repos/etl/.venv/bin/etl", line 6, in <module>
sys.exit(cli())
^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
return self.main(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/rich_click/rich_command.py", line 126, in main
rv = self.invoke(ctx)
^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/click/core.py", line 1688, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/click/core.py", line 783, in invoke
return __callback(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/etl/command.py", line 217, in main_cli
main(**kwargs) # type: ignore
^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/etl/command.py", line 245, in main
run_dag(
File "/Users/prosado/Documents/owid/repos/etl/etl/command.py", line 362, in run_dag
return exec_steps(steps, strict=strict)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/etl/command.py", line 378, in exec_steps
time_taken = timed_run(lambda: step.run())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/etl/command.py", line 582, in timed_run
f()
File "/Users/prosado/Documents/owid/repos/etl/etl/command.py", line 378, in <lambda>
time_taken = timed_run(lambda: step.run())
^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/etl/steps/__init__.py", line 854, in run
variable_upsert_results = [future.result() for future in as_completed(futures)]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/etl/steps/__init__.py", line 854, in <listcomp>
variable_upsert_results = [future.result() for future in as_completed(futures)]
^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/etl/grapher_import.py", line 320, in upsert_table
[f.result() for f in futures]
File "/Users/prosado/Documents/owid/repos/etl/etl/grapher_import.py", line 320, in <listcomp>
[f.result() for f in futures]
^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/concurrent/futures/_base.py", line 456, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/Users/prosado/.pyenv/versions/3.11.5/lib/python3.11/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/apps/backport/datasync/datasync.py", line 31, in upload_gzip_dict
for attempt in Retrying(
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/tenacity/__init__.py", line 347, in __iter__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/prosado/Documents/owid/repos/etl/.venv/lib/python3.11/site-packages/tenacity/__init__.py", line 326, in iter
raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x16ec8b3d0 state=finished raised SSLError>]
```
I also have the impression that grapher steps hang at the end for a few seconds, sometimes over a minute, before finishing.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Looks like this is not as much of an issue now.