core
Duplicate key value violates unique constraint "state_attributes_pkey"
The problem
At some point recently (I'm not sure exactly which upgrade), the following errors started popping up in the logs by the thousands. It seems to only affect certain entities. Basically, any time this error happens, a piece of event data is not stored.
Unhandled database error while processing task StatesContextIDMigrationTask():
(raised as a result of Query-invoked autoflush; consider using a session.no_autoflush block if this flush is occurring prematurely)
(psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "state_attributes_pkey" DETAIL: Key (attributes_id)=(5729439) already exists.
Unhandled database error while processing task StatesContextIDMigrationTask():
(raised as a result of Query-invoked autoflush; consider using a session.no_autoflush block if this flush is occurring prematurely)
(psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "state_attributes_pkey" DETAIL: Key (attributes_id)=(5729440) already exists.
psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5729440) already exists.
How might I begin to debug/fix this?
Thanks!
What version of Home Assistant Core has the issue?
core-2023.4.6
What was the last working version of Home Assistant Core?
No response
What type of installation are you running?
Home Assistant Container
Integration causing the issue
recorder
Link to integration documentation on our website
https://www.home-assistant.io/integrations/recorder/
Diagnostics information
No response
Example YAML snippet
No response
Anything in the logs that might be useful for us?
Logger: homeassistant.components.recorder.entity_registry
Source: components/recorder/entity_registry.py:56
Integration: Recorder
First occurred: April 30, 2023 at 10:49:44 PM (3 occurrences)
Last logged: April 30, 2023 at 10:51:02 PM
Cannot rename entity_id `sensor.chinese_air_pollution_level` to `sensor.nyc_air_pollution_level` because the states meta manager is not yet active
Cannot rename entity_id `sensor.chinese_air_quality_index` to `sensor.nyc_air_quality_index` because the states meta manager is not yet active
Cannot rename entity_id `sensor.chinese_main_pollutant` to `sensor.nyc_main_pollutant` because the states meta manager is not yet active
Additional information
No response
Hey there @home-assistant/core, mind taking a look at this issue as it has been labeled with an integration (recorder) you are listed as a code owner for? Thanks!
Code owner commands
Code owners of recorder can trigger bot actions by commenting:
- @home-assistant close: Closes the issue.
- @home-assistant rename Awesome new title: Renames the issue.
- @home-assistant reopen: Reopens the issue.
- @home-assistant unassign recorder: Removes the current integration label and assignees on the issue; add the integration domain after the command.
(message by CodeOwnersMention)
recorder documentation recorder source (message by IssueLinks)
Found https://github.com/home-assistant/core/issues/92214 and tried a REINDEX DATABASE home_assistant; and restarted, but I still get these issues.
When looking at the database state_attributes table, I notice that the attributes_id key errors are for IDs that are well back in the past; current IDs are around 6645640.
Any ideas for how I can remedy this?
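Since the failing IDs sit well below the current maximum, one diagnostic worth running is to compare the position of the sequence that generates attributes_id with the table's actual maximum. This is a sketch for PostgreSQL 10+ only; it assumes attributes_id is backed by a serial/identity sequence (pg_get_serial_sequence and pg_sequence_last_value are standard catalog helpers):

```sql
-- If seq_last_value is far below max_attributes_id, every future insert
-- draws an id that may already exist, which matches the errors above.
SELECT pg_sequence_last_value(
           pg_get_serial_sequence('state_attributes', 'attributes_id')
       ) AS seq_last_value,
       MAX(attributes_id) AS max_attributes_id
FROM state_attributes;
```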
Repair the state_attributes table by removing all the duplicate primary keys. Start with 5729440, and remove any state that is linked to it.
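Concretely, that could look like the following. This is a hedged sketch, not an official procedure: it assumes the recorder schema where states.attributes_id is a foreign key into state_attributes, uses 5729440 as the offending id, and permanently deletes history for the affected states, so take a database backup first:

```sql
-- Inspect which states rows still reference the suspect attributes row.
SELECT state_id
FROM states
WHERE attributes_id = 5729440;

-- After confirming those rows are expendable, remove them, then the
-- attributes row itself (child rows first, to satisfy the foreign key).
DELETE FROM states WHERE attributes_id = 5729440;
DELETE FROM state_attributes WHERE attributes_id = 5729440;
```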
The state_attributes table has no duplicate primary keys. Running SELECT COUNT(*) - COUNT(DISTINCT attributes_id) AS duplicate_count FROM state_attributes; yields duplicate_count = 0.
When you mention removing any state that is linked to it, do you mean linked in the states table?
Can you post a log with debug logging turned on for sqlalchemy?
Also, it would be better to upgrade to 2023.5.0 first, since it uses a newer version of sqlalchemy, which will give better debug information.
Also, the first rule if corruption is suspected:
https://wiki.postgresql.org/wiki/Corruption#VITALLY_IMPORTANT_FIRST_RESPONSE
I upgraded to 2023.5.0. The log is massive and very verbose. I'm not sure exactly what you're looking for, but there are no errors other than the following types:
2023-05-04 01:03:55.815 ERROR (Recorder) [homeassistant.components.recorder.core] Unhandled database error while processing task EventsContextIDMigrationTask(): (raised as a result of Query-invoked autoflush; consider using a session.no_autoflush block if this flush is occurring prematurely)
(psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5792071) already exists.
[SQL: INSERT INTO state_attributes (hash, shared_attrs) VALUES (%(hash)s, %(shared_attrs)s) RETURNING state_attributes.attributes_id]
[parameters: {'hash': 137352028, 'shared_attrs': '{"source_type":"gps","battery_level":54,"latitude":<latitude>,"longitude":<longitude>,"gps_accuracy":200,"velocity":0,"tid":"l7","battery_status":1,"friendly_name":"j"}'}]
(Background on this error at: https://sqlalche.me/e/20/gkpj)
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1963, in _exec_single_context
self.dialect.do_execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 920, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5792071) already exists.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/components/recorder/core.py", line 889, in _process_one_task_or_recover
return task.run(self)
File "/usr/src/homeassistant/homeassistant/components/recorder/tasks.py", line 399, in run
not instance._migrate_events_context_ids() # pylint: disable=[protected-access]
File "/usr/src/homeassistant/homeassistant/components/recorder/core.py", line 1234, in _migrate_events_context_ids
return migration.migrate_events_context_ids(self)
File "/usr/src/homeassistant/homeassistant/components/recorder/util.py", line 608, in wrapper
return job(instance, *args, **kwargs)
File "/usr/src/homeassistant/homeassistant/components/recorder/migration.py", line 1420, in migrate_events_context_ids
if events := session.execute(find_events_context_ids_to_migrate()).all():
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2231, in execute
return self._execute_internal(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2105, in _execute_internal
) = compile_state_cls.orm_pre_session_exec(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/context.py", line 499, in orm_pre_session_exec
session._autoflush()
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2908, in _autoflush
raise e.with_traceback(sys.exc_info()[2])
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2897, in _autoflush
self.flush()
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 4153, in flush
self._flush(objects)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 4289, in _flush
with util.safe_reraise():
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/langhelpers.py", line 147, in __exit__
raise exc_value.with_traceback(exc_tb)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 4250, in _flush
flush_context.execute()
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 464, in execute
n.execute_aggregate(self, set_)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 569, in execute_aggregate
self.execute(uow)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 644, in execute
util.preloaded.orm_persistence.save_obj(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 93, in save_obj
_emit_insert_statements(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 1223, in _emit_insert_statements
result = connection.execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1413, in execute
return meth(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/sql/elements.py", line 483, in _execute_on_connection
return connection._execute_clauseelement(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1637, in _execute_clauseelement
ret = self._execute_context(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1841, in _execute_context
return self._exec_single_context(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1982, in _exec_single_context
self._handle_dbapi_exception(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 2339, in _handle_dbapi_exception
raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1963, in _exec_single_context
self.dialect.do_execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 920, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.IntegrityError: (raised as a result of Query-invoked autoflush; consider using a session.no_autoflush block if this flush is occurring prematurely)
(psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5792071) already exists.
2023-05-04 01:03:55.783 ERROR (Recorder) [homeassistant.components.recorder.core] Unhandled database error while processing task StatesContextIDMigrationTask(): (raised as a result of Query-invoked autoflush; consider using a session.no_autoflush block if this flush is occurring prematurely)
(psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5792070) already exists.
[SQL: INSERT INTO state_attributes (hash, shared_attrs) VALUES (%(hash)s, %(shared_attrs)s) RETURNING state_attributes.attributes_id]
[parameters: {'hash': 716663640, 'shared_attrs': '{"state_class":"measurement","uuid":"7427f3da-b644-4589-8f0c-860eaf059935","major":0,"minor":38940,"source":"<mac address>","unit_of_measurement":"ft","device_class":"distance","icon":"mdi:signal-distance-variant","friendly_name":"9E7C Estimated Distance"}'}]
(Background on this error at: https://sqlalche.me/e/20/gkpj)
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1963, in _exec_single_context
self.dialect.do_execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 920, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5792070) already exists.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/components/recorder/core.py", line 889, in _process_one_task_or_recover
return task.run(self)
File "/usr/src/homeassistant/homeassistant/components/recorder/tasks.py", line 384, in run
not instance._migrate_states_context_ids() # pylint: disable=[protected-access]
File "/usr/src/homeassistant/homeassistant/components/recorder/core.py", line 1230, in _migrate_states_context_ids
return migration.migrate_states_context_ids(self)
File "/usr/src/homeassistant/homeassistant/components/recorder/util.py", line 608, in wrapper
return job(instance, *args, **kwargs)
File "/usr/src/homeassistant/homeassistant/components/recorder/migration.py", line 1385, in migrate_states_context_ids
if states := session.execute(find_states_context_ids_to_migrate()).all():
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2231, in execute
return self._execute_internal(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2105, in _execute_internal
) = compile_state_cls.orm_pre_session_exec(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/context.py", line 499, in orm_pre_session_exec
session._autoflush()
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2908, in _autoflush
raise e.with_traceback(sys.exc_info()[2])
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2897, in _autoflush
self.flush()
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 4153, in flush
self._flush(objects)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 4289, in _flush
with util.safe_reraise():
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/langhelpers.py", line 147, in __exit__
raise exc_value.with_traceback(exc_tb)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 4250, in _flush
flush_context.execute()
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 467, in execute
rec.execute(self)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 644, in execute
util.preloaded.orm_persistence.save_obj(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 93, in save_obj
_emit_insert_statements(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 1223, in _emit_insert_statements
result = connection.execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1413, in execute
return meth(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/sql/elements.py", line 483, in _execute_on_connection
return connection._execute_clauseelement(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1637, in _execute_clauseelement
ret = self._execute_context(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1841, in _execute_context
return self._exec_single_context(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1982, in _exec_single_context
self._handle_dbapi_exception(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 2339, in _handle_dbapi_exception
raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1963, in _exec_single_context
self.dialect.do_execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 920, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.IntegrityError: (raised as a result of Query-invoked autoflush; consider using a session.no_autoflush block if this flush is occurring prematurely)
(psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5792070) already exists.
2023-05-04 01:03:55.806 INFO (Recorder) [sqlalchemy.engine.Engine] INSERT INTO state_attributes (hash, shared_attrs) VALUES (%(hash)s, %(shared_attrs)s) RETURNING state_attributes.attributes_id
2023-05-04 01:03:55.806 INFO (Recorder) [sqlalchemy.engine.Engine] [cached since 0.03759s ago] {'hash': 137352028, 'shared_attrs': '{"source_type":"gps","battery_level":54,"latitude":<latitude>,"longitude":<longitude>,"gps_accuracy":200,"velocity":0,"tid":"l7","battery_status":1,"friendly_name":"j"}'}
2023-05-04 01:03:55.808 INFO (Recorder) [sqlalchemy.engine.Engine] ROLLBACK
2023-05-04 01:03:55.809 ERROR (Recorder) [homeassistant.components.recorder.util] Error executing query: (raised as a result of Query-invoked autoflush; consider using a session.no_autoflush block if this flush is occurring prematurely)
(psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5792071) already exists.
[SQL: INSERT INTO state_attributes (hash, shared_attrs) VALUES (%(hash)s, %(shared_attrs)s) RETURNING state_attributes.attributes_id]
[parameters: {'hash': 137352028, 'shared_attrs': '{"source_type":"gps","battery_level":54,"latitude":<latitude>,"longitude":<longitude>,"gps_accuracy":200,"velocity":0,"tid":"l7","battery_status":1,"friendly_name":"j"}'}]
(Background on this error at: https://sqlalche.me/e/20/gkpj)
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1963, in _exec_single_context
self.dialect.do_execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 920, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5792071) already exists.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/components/recorder/util.py", line 130, in session_scope
yield session
File "/usr/src/homeassistant/homeassistant/components/recorder/migration.py", line 1420, in migrate_events_context_ids
if events := session.execute(find_events_context_ids_to_migrate()).all():
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2231, in execute
return self._execute_internal(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2105, in _execute_internal
) = compile_state_cls.orm_pre_session_exec(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/context.py", line 499, in orm_pre_session_exec
session._autoflush()
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2908, in _autoflush
raise e.with_traceback(sys.exc_info()[2])
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2897, in _autoflush
self.flush()
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 4153, in flush
self._flush(objects)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 4289, in _flush
with util.safe_reraise():
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/langhelpers.py", line 147, in __exit__
raise exc_value.with_traceback(exc_tb)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 4250, in _flush
flush_context.execute()
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 464, in execute
n.execute_aggregate(self, set_)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 569, in execute_aggregate
self.execute(uow)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 644, in execute
util.preloaded.orm_persistence.save_obj(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 93, in save_obj
_emit_insert_statements(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 1223, in _emit_insert_statements
result = connection.execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1413, in execute
return meth(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/sql/elements.py", line 483, in _execute_on_connection
return connection._execute_clauseelement(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1637, in _execute_clauseelement
ret = self._execute_context(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1841, in _execute_context
return self._exec_single_context(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1982, in _exec_single_context
self._handle_dbapi_exception(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 2339, in _handle_dbapi_exception
raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1963, in _exec_single_context
self.dialect.do_execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 920, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.IntegrityError: (raised as a result of Query-invoked autoflush; consider using a session.no_autoflush block if this flush is occurring prematurely)
(psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5792071) already exists.
[SQL: INSERT INTO state_attributes (hash, shared_attrs) VALUES (%(hash)s, %(shared_attrs)s) RETURNING state_attributes.attributes_id]
[parameters: {'hash': 137352028, 'shared_attrs': '{"source_type":"gps","battery_level":54,"latitude":<latitude>,"longitude":<longitude>,"gps_accuracy":200,"velocity":0,"tid":"l7","battery_status":1,"friendly_name":"j"}'}]
(Background on this error at: https://sqlalche.me/e/20/gkpj)
2023-05-04 01:03:55.815 DEBUG (Recorder) [sqlalchemy.pool.impl.QueuePool] Connection <connection object at 0x7ff65a38f600; dsn: 'user=postgres password=xxx dbname=home_assistant host=home-assistant-db port=5432', closed: 0> being returned to pool
2023-05-04 01:03:55.815 DEBUG (Recorder) [sqlalchemy.pool.impl.QueuePool] Connection <connection object at 0x7ff65a38f600; dsn: 'user=postgres password=xxx dbname=home_assistant host=home-assistant-db port=5432', closed: 0> rollback-on-return
2023-05-04 01:03:55.815 ERROR (Recorder) [homeassistant.components.recorder.core] Unhandled database error while processing task EventsContextIDMigrationTask(): (raised as a result of Query-invoked autoflush; consider using a session.no_autoflush block if this flush is occurring prematurely)
(psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5792071) already exists.
[SQL: INSERT INTO state_attributes (hash, shared_attrs) VALUES (%(hash)s, %(shared_attrs)s) RETURNING state_attributes.attributes_id]
[parameters: {'hash': 137352028, 'shared_attrs': '{"source_type":"gps","battery_level":54,"latitude":<latitude>,"longitude":<longitude>,"gps_accuracy":200,"velocity":0,"tid":"l7","battery_status":1,"friendly_name":"j"}'}]
(Background on this error at: https://sqlalche.me/e/20/gkpj)
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1963, in _exec_single_context
self.dialect.do_execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 920, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5792071) already exists.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/components/recorder/core.py", line 889, in _process_one_task_or_recover
return task.run(self)
File "/usr/src/homeassistant/homeassistant/components/recorder/tasks.py", line 399, in run
not instance._migrate_events_context_ids() # pylint: disable=[protected-access]
File "/usr/src/homeassistant/homeassistant/components/recorder/core.py", line 1234, in _migrate_events_context_ids
return migration.migrate_events_context_ids(self)
File "/usr/src/homeassistant/homeassistant/components/recorder/util.py", line 608, in wrapper
return job(instance, *args, **kwargs)
File "/usr/src/homeassistant/homeassistant/components/recorder/migration.py", line 1420, in migrate_events_context_ids
if events := session.execute(find_events_context_ids_to_migrate()).all():
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2231, in execute
return self._execute_internal(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2105, in _execute_internal
) = compile_state_cls.orm_pre_session_exec(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/context.py", line 499, in orm_pre_session_exec
session._autoflush()
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2908, in _autoflush
raise e.with_traceback(sys.exc_info()[2])
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 2897, in _autoflush
self.flush()
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 4153, in flush
self._flush(objects)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 4289, in _flush
with util.safe_reraise():
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/util/langhelpers.py", line 147, in __exit__
raise exc_value.with_traceback(exc_tb)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/session.py", line 4250, in _flush
flush_context.execute()
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 464, in execute
n.execute_aggregate(self, set_)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 569, in execute_aggregate
self.execute(uow)
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/unitofwork.py", line 644, in execute
util.preloaded.orm_persistence.save_obj(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 93, in save_obj
_emit_insert_statements(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/orm/persistence.py", line 1223, in _emit_insert_statements
result = connection.execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1413, in execute
return meth(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/sql/elements.py", line 483, in _execute_on_connection
return connection._execute_clauseelement(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1637, in _execute_clauseelement
ret = self._execute_context(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1841, in _execute_context
return self._exec_single_context(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1982, in _exec_single_context
self._handle_dbapi_exception(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 2339, in _handle_dbapi_exception
raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/base.py", line 1963, in _exec_single_context
self.dialect.do_execute(
File "/usr/local/lib/python3.10/site-packages/sqlalchemy/engine/default.py", line 920, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.IntegrityError: (raised as a result of Query-invoked autoflush; consider using a session.no_autoflush block if this flush is occurring prematurely)
(psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "state_attributes_pkey"
DETAIL: Key (attributes_id)=(5792071) already exists.
Is that from 2023.4.x or 2023.5.x?
2023.5.x
Interesting that it's happening with EventsContextIDMigrationTask as well.
I'm guessing that is a symptom of a failed migration, or of some corruption that caused a failed migration?
Currently not very many stats are being recorded into the database. Is there a reasonable path out of this situation, or am I better off starting over and trying to back-fill a new database or something like that?
I'm really not sure what's going on, as state_attributes isn't a recent change. Maybe there is something wrong with the state_attributes schema.
I'm getting a similar error some days after migrating from SQLite to Postgres. During the migration, some records may have failed to insert into Postgres, and for whatever reason the serial sequence was left pointing at an old record; after the sequence grows for some time, the generated id eventually collides with an existing row, and the duplication happens.
I fixed it by resetting each table's serial sequence to max(id) by running SELECT pg_catalog.setval(pg_get_serial_sequence('table_name', 'id_column'), MAX(id_column)) FROM table_name;
Hope this could be helpful.
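The one-liner above can be applied table by table. As a sketch, here it is expanded across the recorder's auto-increment tables; the table and column names below are my reading of the Home Assistant recorder schema and should be verified against your own database before running (and note that setval fails with NULL on an empty table, so skip tables with no rows):

```sql
-- Re-align each serial sequence with the highest id actually stored, so
-- newly generated ids no longer collide with rows that were bulk-copied
-- during the SQLite -> Postgres migration.
SELECT setval(pg_get_serial_sequence('states', 'state_id'),
              (SELECT MAX(state_id) FROM states));
SELECT setval(pg_get_serial_sequence('state_attributes', 'attributes_id'),
              (SELECT MAX(attributes_id) FROM state_attributes));
SELECT setval(pg_get_serial_sequence('events', 'event_id'),
              (SELECT MAX(event_id) FROM events));
SELECT setval(pg_get_serial_sequence('event_data', 'data_id'),
              (SELECT MAX(data_id) FROM event_data));
SELECT setval(pg_get_serial_sequence('statistics', 'id'),
              (SELECT MAX(id) FROM statistics));
SELECT setval(pg_get_serial_sequence('statistics_short_term', 'id'),
              (SELECT MAX(id) FROM statistics_short_term));
```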
Had the same issue; this seems to have fixed it. Migrated with pgloader. Thanks a lot!
There hasn't been any activity on this issue recently. Due to the high number of incoming GitHub notifications, we have to clean some of the old issues, as many of them have already been resolved with the latest updates. Please make sure to update to the latest Home Assistant version and check if that solves the issue. Let us know if that works for you by adding a comment 👍 This issue has now been marked as stale and will be closed if no further activity occurs. Thank you for your contributions.
Same as @OverDodo: just migrated to Postgres, and this resolved my issues. :) Thanks!