sentry-generic-metrics-consumer and sentry-metrics-consumer crashed
Problem Statement
django.db.utils.OperationalError: server closed the connection unexpectedly
    This probably means the server terminated abnormally before or while processing the request.
server closed the connection unexpectedly
    This probably means the server terminated abnormally before or while processing the request.
SQL: SELECT "sentry_stringindexer"."id", "sentry_stringindexer"."string", "sentry_stringindexer"."organization_id", "sentry_stringindexer"."date_added", "sentry_stringindexer"."last_seen", "sentry_stringindexer"."retention_days" FROM "sentry_stringindexer" WHERE (("sentry_stringindexer"."organization_id" = %s AND "sentry_stringindexer"."string" = %s) OR ("sentry_stringindexer"."organization_id" = %s AND "sentry_stringindexer"."string" = %s) OR ("sentry_stringindexer"."organization_id" = %s AND "sentry_stringindexer"."string" = %s))
07:37:09 [ERROR] arroyo.processing.processor: Caught exception, shutting down...
07:37:09 [INFO] arroyo.processing.strategies.run_task_with_multiprocessing: Terminating <arroyo.processing.strategies.run_task_with_multiprocessing.MultiprocessingPool object at 0x7fb44f162120>...
07:37:09 [INFO] arroyo.processing.strategies.run_task_with_multiprocessing: Shutting down <multiprocessing.managers.SharedMemoryManager object at 0x7fb4553252b0>...
07:37:09 [INFO] arroyo.processing.strategies.run_task_with_multiprocessing: Terminating <sentry.sentry_metrics.consumers.indexer.parallel.Unbatcher object at 0x7fb455325010>...
07:37:09 [INFO] arroyo.processing.processor: Closing <arroyo.backends.kafka.consumer.KafkaConsumer object at 0x7fb44f1634d0>...
07:37:09 [INFO] arroyo.processing.processor: Partitions to revoke: [Partition(topic=Topic(name='ingest-metrics'), index=0)]
07:37:09 [INFO] arroyo.processing.processor: Partition revocation complete.
07:37:09 [INFO] arroyo.backends.kafka.consumer: Paused partitions after revocation: set()
07:37:09 [INFO] arroyo.processing.processor: Processor terminated
multiprocessing.pool.RemoteTraceback:
Solution Brainstorm
No response
Can you take a look at your system resources? Were CPU / RAM maxed out during that time?
CPU and memory limits were not set. For the moment I've set a memory limit of 1Gi and a request of 512Mi.
It didn't help. Only these two services keep restarting, up to 25 times within 24 hours.
Can you extract more logs from that container?
08:31:00 [INFO] arroyo.processing.processor: New partitions assigned: {Partition(topic=Topic(name='ingest-performance-metrics'), index=0): 9609087}
08:31:00 [INFO] arroyo.processing.processor: Member id: 'rdkafka-fd08ffb3-6f16-45f8-958a-c08b59f3d40f'
08:31:00 [INFO] arroyo.backends.kafka.consumer: Paused partitions after assignment: set()
Traceback (most recent call last):
  File "/.venv/lib/python3.13/site-packages/django/db/backends/utils.py", line 105, in _execute
    return self.cursor.execute(sql, params)
  File "/usr/src/sentry/src/sentry/db/postgres/decorators.py", line 16, in inner
    return func(self, *args, **kwargs)
  File "/usr/src/sentry/src/sentry/db/postgres/base.py", line 95, in execute
    return self.cursor.execute(sql, params)
psycopg2.OperationalError: server closed the connection unexpectedly
    This probably means the server terminated abnormally before or while processing the request.
server closed the connection unexpectedly
    This probably means the server terminated abnormally before or while processing the request.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/.venv/lib/python3.13/site-packages/arroyo/processing/strategies/run_task_with_multiprocessing.py", line 252, in parallel_run_task_worker_apply
    function(cast(Message[TStrategyPayload], message))
  File "/usr/src/sentry/src/sentry/sentry_metrics/consumers/indexer/processing.py", line 85, in process_messages
    return self._process_messages_impl(outer_message)
  File "/usr/src/sentry/src/sentry/sentry_metrics/consumers/indexer/processing.py", line 134, in _process_messages_impl
    record_result = self._indexer.bulk_record(extracted_strings)
  File "/usr/src/sentry/src/sentry/sentry_metrics/indexer/strings.py", line 300, in bulk_record
    indexer_results = self.indexer.bulk_record( { ...<2 lines>... } )
  File "/usr/src/sentry/src/sentry/sentry_metrics/indexer/cache.py", line 252, in bulk_record
    db_record_key_results = self.indexer.bulk_record( { ...<2 lines>... } )
  File "/usr/src/sentry/src/sentry/sentry_metrics/indexer/postgres/postgres_v2.py", line 244, in bulk_record
    return self._bulk_record(strings)
  File "/usr/src/sentry/src/sentry/sentry_metrics/indexer/postgres/postgres_v2.py", line 141, in _bulk_record
    for db_obj in self._get_db_records(db_read_keys)
  File "/.venv/lib/python3.13/site-packages/django/db/models/query.py", line 384, in __iter__
    self._fetch_all()
  File "/.venv/lib/python3.13/site-packages/django/db/models/query.py", line 1945, in _fetch_all
    self._result_cache = list(self._iterable_class(self))
  File "/.venv/lib/python3.13/site-packages/django/db/models/query.py", line 91, in __iter__
    results = compiler.execute_sql( chunked_fetch=self.chunked_fetch, chunk_size=self.chunk_size )
  File "/.venv/lib/python3.13/site-packages/django/db/models/sql/compiler.py", line 1623, in execute_sql
    cursor.execute(sql, params)
  File "/.venv/lib/python3.13/site-packages/sentry_sdk/utils.py", line 1816, in runner
    return sentry_patched_function(*args, **kwargs)
  File "/.venv/lib/python3.13/site-packages/sentry_sdk/integrations/django/__init__.py", line 652, in execute
    result = real_execute(self, sql, params)
  File "/.venv/lib/python3.13/site-packages/django/db/backends/utils.py", line 79, in execute
    return self._execute_with_wrappers( sql, params, many=False, executor=self._execute )
  File "/.venv/lib/python3.13/site-packages/django/db/backends/utils.py", line 92, in _execute_with_wrappers
    return executor(sql, params, many, context)
  File "/usr/src/sentry/src/sentry/db/postgres/base.py", line 70, in _execute__include_sql_in_error
    return execute(sql, params, many, context)
  File "/usr/src/sentry/src/sentry/db/postgres/base.py", line 58, in _execute__clean_params
    return execute(sql, clean_bad_params(params), many, context)
  File "/.venv/lib/python3.13/site-packages/django/db/backends/utils.py", line 100, in _execute
    with self.db.wrap_database_errors:
  File "/.venv/lib/python3.13/site-packages/django/db/utils.py", line 91, in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
  File "/.venv/lib/python3.13/site-packages/django/db/backends/utils.py", line 105, in _execute
    return self.cursor.execute(sql, params)
  File "/usr/src/sentry/src/sentry/db/postgres/decorators.py", line 16, in inner
    return func(self, *args, **kwargs)
  File "/usr/src/sentry/src/sentry/db/postgres/base.py", line 95, in execute
    return self.cursor.execute(sql, params)
django.db.utils.OperationalError: server closed the connection unexpectedly
    This probably means the server terminated abnormally before or while processing the request.
server closed the connection unexpectedly
    This probably means the server terminated abnormally before or while processing the request.
SQL: SELECT "sentry_perfstringindexer"."id", "sentry_perfstringindexer"."string", "sentry_perfstringindexer"."organization_id", "sentry_perfstringindexer"."date_added", "sentry_perfstringindexer"."last_seen", "sentry_perfstringindexer"."retention_days", "sentry_perfstringindexer"."use_case_id" FROM "sentry_perfstringindexer" WHERE (("sentry_perfstringindexer"."organization_id" = %s AND "sentry_perfstringindexer"."string" = %s AND "sentry_perfstringindexer"."use_case_id" = %s) OR ("sentry_perfstringindexer"."organization_id" = %s AND "sentry_perfstringindexer"."string" = %s AND "sentry_perfstringindexer"."use_case_id" = %s))
08:48:13 [WARNING] arroyo.processing.strategies.run_task_with_multiprocessing: Caught exception while applying <bound method MessageProcessor.process_messages of <sentry.sentry_metrics.consumers.indexer.processing.MessageProcessor object at 0x7fb497f63e30>> to Message({Partition(topic=Topic(name='ingest-performance-metrics'), index=0): 9612187})!
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
  File "/.venv/lib/python3.13/site-packages/django/db/backends/utils.py", line 105, in _execute
    return self.cursor.execute(sql, params)
  File "/usr/src/sentry/src/sentry/db/postgres/decorators.py", line 16, in inner
    return func(self, *args, **kwargs)
  File "/usr/src/sentry/src/sentry/db/postgres/base.py", line 95, in execute
    return self.cursor.execute(sql, params)
psycopg2.OperationalError: server closed the connection unexpectedly
    This probably means the server terminated abnormally before or while processing the request.
server closed the connection unexpectedly
    This probably means the server terminated abnormally before or while processing the request.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.13/multiprocessing/pool.py", line 125, in worker
    result = (True, func(*args, **kwds))
  File "/.venv/lib/python3.13/site-packages/arroyo/processing/strategies/run_task_with_multiprocessing.py", line 252, in parallel_run_task_worker_apply
    function(cast(Message[TStrategyPayload], message))
  File "/usr/src/sentry/src/sentry/sentry_metrics/consumers/indexer/processing.py", line 85, in process_messages
    return self._process_messages_impl(outer_message)
  File "/usr/src/sentry/src/sentry/sentry_metrics/consumers/indexer/processing.py", line 134, in _process_messages_impl
    record_result = self._indexer.bulk_record(extracted_strings)
  File "/usr/src/sentry/src/sentry/sentry_metrics/indexer/strings.py", line 300, in bulk_record
    indexer_results = self.indexer.bulk_record( { ...<2 lines>... } )
  File "/usr/src/sentry/src/sentry/sentry_metrics/indexer/cache.py", line 252, in bulk_record
    db_record_key_results = self.indexer.bulk_record( { ...<2 lines>... } )
  File "/usr/src/sentry/src/sentry/sentry_metrics/indexer/postgres/postgres_v2.py", line 244, in bulk_record
    return self._bulk_record(strings)
  File "/usr/src/sentry/src/sentry/sentry_metrics/indexer/postgres/postgres_v2.py", line 141, in _bulk_record
    for db_obj in self._get_db_records(db_read_keys)
  File "/.venv/lib/python3.13/site-packages/django/db/models/query.py", line 384, in __iter__
    self._fetch_all()
  File "/.venv/lib/python3.13/site-packages/django/db/models/query.py", line 1945, in _fetch_all
    self._result_cache = list(self._iterable_class(self))
  File "/.venv/lib/python3.13/site-packages/django/db/models/query.py", line 91, in __iter__
    results = compiler.execute_sql( chunked_fetch=self.chunked_fetch, chunk_size=self.chunk_size )
  File "/.venv/lib/python3.13/site-packages/django/db/models/sql/compiler.py", line 1623, in execute_sql
    cursor.execute(sql, params)
  File "/.venv/lib/python3.13/site-packages/sentry_sdk/utils.py", line 1816, in runner
    return sentry_patched_function(*args, **kwargs)
  File "/.venv/lib/python3.13/site-packages/sentry_sdk/integrations/django/__init__.py", line 652, in execute
    result = real_execute(self, sql, params)
  File "/.venv/lib/python3.13/site-packages/django/db/backends/utils.py", line 79, in execute
    return self._execute_with_wrappers( sql, params, many=False, executor=self._execute )
  File "/.venv/lib/python3.13/site-packages/django/db/backends/utils.py", line 92, in _execute_with_wrappers
    return executor(sql, params, many, context)
  File "/usr/src/sentry/src/sentry/db/postgres/base.py", line 70, in _execute__include_sql_in_error
    return execute(sql, params, many, context)
  File "/usr/src/sentry/src/sentry/db/postgres/base.py", line 58, in _execute__clean_params
    return execute(sql, clean_bad_params(params), many, context)
  File "/.venv/lib/python3.13/site-packages/django/db/backends/utils.py", line 100, in _execute
    with self.db.wrap_database_errors:
  File "/.venv/lib/python3.13/site-packages/django/db/utils.py", line 91, in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
  File "/.venv/lib/python3.13/site-packages/django/db/backends/utils.py", line 105, in _execute
    return self.cursor.execute(sql, params)
  File "/usr/src/sentry/src/sentry/db/postgres/decorators.py", line 16, in inner
    return func(self, *args, **kwargs)
  File "/usr/src/sentry/src/sentry/db/postgres/base.py", line 95, in execute
    return self.cursor.execute(sql, params)
django.db.utils.OperationalError: server closed the connection unexpectedly
    This probably means the server terminated abnormally before or while processing the request.
server closed the connection unexpectedly
    This probably means the server terminated abnormally before or while processing the request.
SQL: SELECT "sentry_perfstringindexer"."id", "sentry_perfstringindexer"."string", "sentry_perfstringindexer"."organization_id", "sentry_perfstringindexer"."date_added", "sentry_perfstringindexer"."last_seen", "sentry_perfstringindexer"."retention_days", "sentry_perfstringindexer"."use_case_id" FROM "sentry_perfstringindexer" WHERE (("sentry_perfstringindexer"."organization_id" = %s AND "sentry_perfstringindexer"."string" = %s AND "sentry_perfstringindexer"."use_case_id" = %s) OR ("sentry_perfstringindexer"."organization_id" = %s AND "sentry_perfstringindexer"."string" = %s AND "sentry_perfstringindexer"."use_case_id" = %s))
"""
Your postgres container is problematic. What's your current self-hosted Sentry version? Do you have pgbouncer as the postgres host as seen here? https://github.com/getsentry/self-hosted/blob/fe477b41d937b838146f83a48bbbe705ec959f49/sentry/sentry.conf.example.py#L51
Sentry 25.9.0, pgbouncer disabled, around 100 connections out of a Postgres max_connections of 500.
And this trouble is only with these two services. (I just added topic partitions and increased the replica count for these services; they had consumer lag.)
Can you upgrade to 25.10.0, and enable the pgbouncer? I believe that would help with the postgres connection issue.
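For anyone following along: enabling pgbouncer mostly comes down to pointing Sentry's database host at the pgbouncer service in sentry.conf.py, as in the example file linked above. A minimal sketch (service name, credentials, and port are assumptions; adjust them to your deployment):

    # sentry.conf.py (sketch): route Sentry's Postgres connections through pgbouncer
    # so the metrics-indexer consumers reuse pooled connections instead of having
    # the database server drop them under load.
    DATABASES = {
        "default": {
            "ENGINE": "sentry.db.postgres",
            "NAME": "postgres",
            "USER": "postgres",
            "PASSWORD": "",       # assumption: fill in your real credentials
            "HOST": "pgbouncer",  # assumption: pgbouncer service name, instead of "postgres"
            "PORT": "",
        }
    }

In the linked sentry.conf.example.py these values come from environment variables; the important part is simply that the host resolves to pgbouncer rather than directly to Postgres.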
We tried updating to 25.10.0, but it requires adding a new service, so we get errors with the new gRPC setup. We use Helm chart sentry-27.6.0.
So we're waiting for a new chart for 25.10.0. The current Helm chart doesn't work if we just replace the image version.
Ah, you're running self-hosted Sentry with the Helm chart. Yeah, we (at Sentry) don't support that. You'll have to file an issue on their repository (https://github.com/sentry-kubernetes/charts/issues).
This issue has gone three weeks without activity. In another week, I will close it.
But! If you comment or otherwise update it, I will reset the clock, and if you remove the label Waiting for: Community, I will leave it alone ... forever!
"A weed is but an unloved flower." ― Ella Wheeler Wilcox 🥀