
Failed to POST Large Result to DB/Catalog

Open · m4dsc1 opened this issue 2 years ago · 1 comment

Is this a request for help?:

Yes

Is this a BUG REPORT or a FEATURE REQUEST? (choose one):

BUG REPORT

```
docker-compose exec api anchore-cli system status
Service simplequeue (anchore-quickstart, http://queue:8228): up
Service catalog (anchore-quickstart, http://catalog:8228): up
Service policy_engine (anchore-quickstart, http://policy-engine:8228): up
Service analyzer (anchore-quickstart, http://analyzer:8228): up
Service apiext (anchore-quickstart, http://api:8228): up

Engine DB Version: 0.0.13
Engine Code Version: 0.8.2
```

What happened:

While processing a ~3 GB container image, the scan fails to complete. The messages in anchore-worker.log (shown below) indicate that the analysis itself finishes successfully, but the result cannot be recorded due to a Postgres error: 'invalid memory alloc request size 1073741824'.

What did you expect to happen:

The analysis result should have been recorded successfully.

Any relevant log output from /var/log/anchore:

anchore-worker.log:

```
2021-10-18 16:24:31+0000 [-] [Thread-85] [anchore_engine.services.analyzer/perform_analyze_nodocker()] [INFO] performing analysis on image complete: <path-ecr/container-name@sha256:hash>
2021-10-18 16:24:31+0000 [-] [Thread-85] [anchore_engine.services.analyzer/process_analyzer_job()] [INFO] adding image analysis data to catalog: userid=admin imageId=<hash> imageDigest=<hash>
2021-10-18 16:24:42+0000 [-] "127.0.0.1" - - [18/Oct/2021:16:24:41 +0000] "GET /health HTTP/1.1" 200 5 "-" "curl/7.61.1"
2021-10-18 16:25:22+0000 [-] [Thread-85] [anchore_engine.clients.services.internal/dispatch()] [ERROR] Failed client call to service catalog for url: http://catalog:8228/v1/objects/analysis_data/<hash> Response: {'httpcode': 500, 'anchore_error_raw': '<same 500 body as anchore_error_json, double-escaped; elided>', 'anchore_error_json': {'detail': {'error_codes': []}, 'httpcode': 500, 'message': '(psycopg2.errors.InternalError_) invalid memory alloc request size 1073741824\n\n[SQL: INSERT INTO archive_document (bucket, "archiveId", "userId", "documentName", created_at, last_updated, record_state_key, record_state_val, jsondata, b64_encoded) VALUES (%(bucket)s, %(archiveId)s, %(userId)s, %(documentName)s, %(created_at)s, %(last_updated)s, %(record_state_key)s, %(record_state_val)s, %(jsondata)s, %(b64_encoded)s)]\n[parameters: {\'bucket\': \'analysis_data\', \'archiveId\': \'<hash>\', \'userId\': \'admin\', \'documentName\': \'<hash>.json\', \'created_at\': 1634574296, \'last_updated\': 1634574296, \'record_state_key\': \'active\', \'record_state_val\': None, \'jsondata\': \'{"document": [{"image": {"imageId": "<hash>", "imagedata": {"analyzer_manifest": {}, "analy ... (775495736 characters truncated) ... <hash>"], "RepoTags": ["<path-ecr/container-name:tag>"]}}}}}]}\', \'b64_encoded\': False}]\n(Background on this error at: http://sqlalche.me/e/2j85)'}}
2021-10-18 16:25:22+0000 [-] Traceback (most recent call last):
2021-10-18 16:25:22+0000 [-]   File "/usr/local/lib/python3.6/site-packages/anchore_engine/services/analyzer/__init__.py", line 145, in process_analyzer_job
2021-10-18 16:25:22+0000 [-]     rc = catalog_client.put_document('analysis_data', imageDigest, image_data)
2021-10-18 16:25:22+0000 [-]   File "/usr/local/lib/python3.6/site-packages/anchore_engine/clients/services/catalog.py", line 179, in put_document
2021-10-18 16:25:22+0000 [-]     return self.call_api(http.anchy_post, 'objects/{bucket}/{name}', path_params={'bucket': bucket, 'name': name}, body=json.dumps(payload))
2021-10-18 16:25:22+0000 [-]   File "/usr/local/lib/python3.6/site-packages/anchore_engine/clients/services/internal.py", line 179, in call_api
2021-10-18 16:25:22+0000 [-]     return self.dispatch(base_url, method, path, path_params, query_params, extra_headers, body, connect_timeout, read_timeout, files)
2021-10-18 16:25:22+0000 [-]   File "/usr/local/lib/python3.6/site-packages/anchore_engine/clients/services/internal.py", line 237, in dispatch
2021-10-18 16:25:22+0000 [-]     raise e
2021-10-18 16:25:22+0000 [-]   File "/usr/local/lib/python3.6/site-packages/anchore_engine/clients/services/internal.py", line 234, in dispatch
2021-10-18 16:25:22+0000 [-]     return method(url=final_url, headers=request_headers, data=body, auth=auth, params=filtered_qry_params, verify=self.verify_ssl, files=files)
2021-10-18 16:25:22+0000 [-]   File "/usr/local/lib/python3.6/site-packages/anchore_engine/clients/services/http.py", line 269, in anchy_post
2021-10-18 16:25:22+0000 [-]     raise e
2021-10-18 16:25:22+0000 [-] Exception: failed post url=http://catalog:8228/v1/objects/analysis_data/<hash>
2021-10-18 16:25:22+0000 [-]
2021-10-18 16:25:22+0000 [-] During handling of the above exception, another exception occurred:
2021-10-18 16:25:22+0000 [-]
2021-10-18 16:25:22+0000 [-] Traceback (most recent call last):
2021-10-18 16:25:22+0000 [-]   File "/usr/local/lib/python3.6/site-packages/anchore_engine/services/analyzer/__init__.py", line 150, in process_analyzer_job
2021-10-18 16:25:22+0000 [-]     raise err
2021-10-18 16:25:22+0000 [-] anchore_engine.services.analyzer.CatalogClientError: Failed to upload analysis data to catalog - exception: failed post url=http://catalog:8228/v1/objects/analysis_data/<hash>
2021-10-18 16:25:22+0000 [-] [Thread-85] [anchore_engine.services.analyzer/process_analyzer_job()] [ERROR] problem analyzing image - exception: Failed to upload analysis data to catalog - exception: failed post url=http://catalog:8228/v1/objects/analysis_data/<hash>
```
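For reference, the traceback shows the analyzer serializing the entire analysis document with json.dumps() before POSTing it to the catalog, so the failing size can be estimated up front. A minimal sketch of such an estimate (this is not anchore-engine code; the {"document": ...} envelope is copied from the jsondata parameter in the log above, everything else is illustrative):

```python
# Illustrative only: approximate the number of bytes the analyzer will try to
# store as a single archive_document row, mirroring the json.dumps() call seen
# in the traceback. The {"document": ...} wrapper matches the jsondata value
# in the log; this helper is not part of anchore-engine.
import json

POSTGRES_ALLOC_LIMIT = 1024 ** 3  # single allocations must stay below 1 GiB


def estimate_payload_bytes(image_data) -> int:
    return len(json.dumps({"document": image_data}).encode("utf-8"))


def will_fit_in_db(image_data) -> bool:
    # The log shows ~775 MB of JSON already blowing past the limit once
    # escaping/encoding overhead is added, so leave generous headroom.
    return estimate_payload_bytes(image_data) < POSTGRES_ALLOC_LIMIT
```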

What docker images are you using:

  • https://github.com/overleaf/overleaf

How to reproduce the issue:

  1. Add a large container image (likely one with a large number of vulnerabilities).
  2. Wait for analysis to complete (a scripted version of these steps is sketched below):
  • docker-compose exec api anchore-cli image wait path-to-ecr/container-name:tag
  3. The image fails to analyze and the following error is reported:
  • Error: Requested image not found in system
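A hedged sketch of the steps above as a script: it assumes the quickstart docker-compose deployment shown in the status output, and path-to-ecr/container-name:tag is a placeholder, not a real image.

```python
# Sketch only: drives the same anchore-cli commands as the manual repro steps.
# "path-to-ecr/container-name:tag" is a placeholder from the report.
import subprocess

IMAGE = "path-to-ecr/container-name:tag"


def cli(*args: str) -> None:
    subprocess.run(
        ["docker-compose", "exec", "-T", "api", "anchore-cli", *args],
        check=True,
    )


cli("image", "add", IMAGE)   # step 1: submit the large image for analysis
cli("image", "wait", IMAGE)  # step 2: block until analysis completes;
                             # with this bug it eventually errors out instead
```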

Anything else we need to know:

Based on searches for this error, it appears to be a known Postgres limitation on the size of a single field value: individual memory allocations must stay below 1 GB, and 1073741824 bytes (exactly 1 GiB) is the allocation Postgres refuses, so the ~775 MB analysis JSON plus escaping/encoding overhead hits the cap.
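The cap can be demonstrated outside Anchore. A minimal sketch, assuming a throwaway local Postgres (the connection string is a placeholder) and the psycopg2 package; inserting a text value whose allocation reaches 1 GiB should reproduce the same class of error:

```python
# Standalone demonstration of Postgres's per-allocation cap, independent of
# Anchore. Connection parameters are placeholders for a scratch database.
import psycopg2

conn = psycopg2.connect("dbname=scratch user=postgres host=localhost")
conn.autocommit = True
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS big_doc (jsondata text)")

payload = "x" * (1024 ** 3)  # 1 GiB string, comparable to the ~775 MB
                             # analysis JSON plus escaping overhead
try:
    cur.execute("INSERT INTO big_doc (jsondata) VALUES (%s)", (payload,))
except psycopg2.Error as exc:
    # Expected: invalid memory alloc request size 1073741824
    print(exc)
```

Since the analysis document is stored as a single archive_document row when the object store is backed by the database, any image whose analysis JSON approaches 1 GB will hit this cap regardless of Postgres tuning.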

m4dsc1 (Oct 18, 2021)

Hi @m4dsc1, we'll take a look. The insertion you're referring to is the save of the analysis document (a single large JSON object) into the DB, so its size is a function of the number of things found in the image.

zhill (Dec 2, 2021)