SSL Error when syncing ingredients
### Priority/Impact

Medium (affects some functionality)

### Description
When running `sync-ingredients-async`, the process runs for a few days, then crashes with the following error:
```
[2025-03-31 15:00:44,001: INFO/ForkPoolWorker-8] updated ingredient b5431217-8795-4d01-82fb-cf5f256ea822 - Fine Chopped Garlic In Pure Olive Oil, Pure Olive Oil
[2025-03-31 15:03:21,969: ERROR/ForkPoolWorker-8] Task wger.nutrition.tasks.sync_all_ingredients_task[65793efa-4752-4f36-b025-73c9bcce2e5e] raised unexpected: SSLError(MaxRetryError("HTTPSConnectionPool(host='wger.de', port=443): Max retries exceeded with url: /api/v2/ingredient/?limit=999&offset=864135 (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1000)')))"))
Traceback (most recent call last):
  File "/home/wger/.local/lib/python3.12/site-packages/urllib3/connectionpool.py", line 464, in _make_request
    self._validate_conn(conn)
  File "/home/wger/.local/lib/python3.12/site-packages/urllib3/connectionpool.py", line 1093, in _validate_conn
    conn.connect()
  File "/home/wger/.local/lib/python3.12/site-packages/urllib3/connection.py", line 741, in connect
    sock_and_verified = _ssl_wrap_socket_and_match_hostname(
  File "/home/wger/.local/lib/python3.12/site-packages/urllib3/connection.py", line 920, in _ssl_wrap_socket_and_match_hostname
    ssl_sock = ssl_wrap_socket(
  File "/home/wger/.local/lib/python3.12/site-packages/urllib3/util/ssl_.py", line 460, in ssl_wrap_socket
    ssl_sock = _ssl_wrap_socket_impl(sock, context, tls_in_tls, server_hostname)
  File "/home/wger/.local/lib/python3.12/site-packages/urllib3/util/ssl_.py", line 504, in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
  File "/usr/lib/python3.12/ssl.py", line 455, in wrap_socket
    return self.sslsocket_class._create(
  File "/usr/lib/python3.12/ssl.py", line 1042, in _create
    self.do_handshake()
  File "/usr/lib/python3.12/ssl.py", line 1320, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLEOFError: [SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1000)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/wger/.local/lib/python3.12/site-packages/urllib3/connectionpool.py", line 787, in urlopen
    response = self._make_request(
  File "/home/wger/.local/lib/python3.12/site-packages/urllib3/connectionpool.py", line 488, in _make_request
    raise new_e
urllib3.exceptions.SSLError: [SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1000)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/wger/.local/lib/python3.12/site-packages/requests/adapters.py", line 667, in send
    resp = conn.urlopen(
  File "/home/wger/.local/lib/python3.12/site-packages/urllib3/connectionpool.py", line 841, in urlopen
    retries = retries.increment(
  File "/home/wger/.local/lib/python3.12/site-packages/urllib3/util/retry.py", line 519, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='wger.de', port=443): Max retries exceeded with url: /api/v2/ingredient/?limit=999&offset=864135 (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1000)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/wger/.local/lib/python3.12/site-packages/celery/app/trace.py", line 453, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/home/wger/.local/lib/python3.12/site-packages/celery/app/trace.py", line 736, in __protected_call__
    return self.run(*args, **kwargs)
  File "/home/wger/src/wger/nutrition/tasks.py", line 67, in sync_all_ingredients_task
    sync_ingredients(logger.info)
  File "/home/wger/src/wger/nutrition/sync.py", line 305, in sync_ingredients
    _sync_ingredients()
  File "/home/wger/src/wger/nutrition/sync.py", line 256, in _sync_ingredients
    for data in get_paginated(url, headers=wger_headers()):
  File "/home/wger/src/wger/utils/requests.py", line 64, in get_paginated
    response = requests.get(url, headers=headers).json()
  File "/home/wger/.local/lib/python3.12/site-packages/requests/api.py", line 73, in get
    return request("get", url, params=params, **kwargs)
  File "/home/wger/.local/lib/python3.12/site-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
  File "/home/wger/.local/lib/python3.12/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/wger/.local/lib/python3.12/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/home/wger/.local/lib/python3.12/site-packages/requests/adapters.py", line 698, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='wger.de', port=443): Max retries exceeded with url: /api/v2/ingredient/?limit=999&offset=864135 (Caused by SSLError(SSLEOFError(8, '[SSL: UNEXPECTED_EOF_WHILE_READING] EOF occurred in violation of protocol (_ssl.c:1000)')))
```
### Server version
_No response_
### Mobile app version
_No response_
Some background: if for whatever reason (*cough* AI crawlers hitting your server like there's no tomorrow *cough*) the server becomes unresponsive, the sync process breaks down.
- simple workaround -> allow the user to pass an offset, so an interrupted sync can resume near where it stopped
- better workaround -> better server config so it doesn't become unresponsive, and/or configure requests to handle transient failures better
- real solution -> don't sync by iterating over the API. We could create a dump of the ingredient data server-side at regular intervals and import that, like we can do with the OFF one (just a JSONL file that can be processed line by line). This one obviously needs a bit more thought, but it's the one we want long term
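For the offset workaround, a minimal sketch of what a resumable pager could look like. The `get_paginated` name comes from the traceback, but this signature (and the `start_offset` parameter in particular) is hypothetical, not the actual wger implementation:

```python
import requests


def get_paginated(url: str, headers: dict | None = None, start_offset: int = 0):
    """Yield result entries page by page from a DRF-style paginated endpoint.

    start_offset is hypothetical: it lets a crashed sync resume near where
    it stopped instead of starting over from offset 0.
    """
    next_url = f'{url}?limit=999&offset={start_offset}' if start_offset else url
    while next_url:
        payload = requests.get(next_url, headers=headers, timeout=30).json()
        yield from payload['results']
        next_url = payload['next']  # DRF sets this to None on the last page
```

After a crash like the one above, the sync could then be restarted at (or slightly before) the offset shown in the failing URL rather than from zero.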
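One reading of "configure requests to handle this better": mount an `HTTPAdapter` with urllib3's `Retry` so transient connection drops are retried with backoff instead of killing the whole task. This is a sketch, not code that exists in wger today; the function name and parameter values are illustrative:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry


def make_resilient_session(total: int = 5, backoff: float = 2.0) -> requests.Session:
    """Build a Session that retries failed requests with exponential backoff."""
    retry = Retry(
        total=total,
        backoff_factor=backoff,                      # sleep 2s, 4s, 8s, ... between attempts
        status_forcelist=(429, 500, 502, 503, 504),  # also retry these HTTP status codes
        allowed_methods=frozenset({'GET'}),          # the sync only issues GETs
    )
    adapter = HTTPAdapter(max_retries=retry)
    session = requests.Session()
    session.mount('https://', adapter)
    session.mount('http://', adapter)
    return session
```

The traceback shows module-level `requests.get(...)` calls, which use requests' default of zero retries; routing the sync through a session like this would absorb short outages, though it can't help if the server stays down for hours.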
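For the long-term JSONL idea, the consumer side could be as simple as the following sketch. The function name is made up, and the record layout of a future dump is an open question; the point is only that line-by-line processing keeps memory flat regardless of dump size:

```python
import json
from typing import Iterator


def iter_ingredient_dump(path: str) -> Iterator[dict]:
    """Yield one ingredient record per line of a JSONL dump.

    Reading line by line means the full dump is never held in memory,
    unlike parsing one giant JSON document.
    """
    with open(path, encoding='utf-8') as handle:
        for line in handle:
            line = line.strip()
            if line:  # tolerate blank lines between records
                yield json.loads(line)
```

Each yielded record could then be upserted the same way the per-page API results are today, without any HTTP round trips during the import itself.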
Issue to improve the sync task: #2056. Closing here.