OpenCTI server fails to start after update: "some shards are not available" despite green cluster health
## Prerequisites
- [x] I read the Deployment and Setup section of the OpenCTI documentation as well as the Troubleshooting page and didn't find anything relevant to my problem.
- [x] I went through old GitHub issues and couldn't find anything relevant
- [x] I googled the issue and didn't find anything relevant
## Description
I updated OpenCTI. Everything was healthy up to that point, but since the update the platform has been throwing the error below for about an hour.
There is no problem with the cluster health; all shards are assigned (full outputs under "Additional information" below).

```
2025-12-12T10:58:55.310Z hh5wn | {"category":"APP","cause":{"attributes":{"cause":{"code":"DATABASE_ERROR","message":"Engine execution fail, some shards are not available, please check your engine status","name":"DATABASE_ERROR","stack":"GraphQLError: Engine execution fail, some shards are not available, please check your engine status\n at error (/opt/opencti/build/back.js:1652:2275)\n at EngineShardsError (/opt/opencti/build/back.js:1652:4949)\n at /opt/opencti/build/back.js:1722:14022\n at process.processTicksAndRejections (node:internal/process/task_queues:105:5)\n at async elPaginate (/opt/opencti/build/back.js:1845:4639)\n at async elRepaginate (/opt/opencti/build/back.js:1845:6187)\n at async elList (/opt/opencti/build/back.js:1845:7236)\n at async loadElementMetaDependencies (/opt/opencti/build/back.js:2847:101516)"},"genre":"TECHNICAL","http_status":500},"code":"DATABASE_ERROR","message":"Fail to execute engine pagination","name":"DATABASE_ERROR","stack":"GraphQLError: Fail to execute engine pagination\n at error (/opt/opencti/build/back.js:1652:2275)\n at DatabaseError (/opt/opencti/build/back.js:1652:3705)\n at elPaginate (/opt/opencti/build/back.js:1845:5587)\n at process.processTicksAndRejections (node:internal/process/task_queues:105:5)\n at async elRepaginate (/opt/opencti/build/back.js:1845:6187)\n at async elList (/opt/opencti/build/back.js:1845:7236)\n at async loadElementMetaDependencies (/opt/opencti/build/back.js:2847:101516)"},"level":"error","message":"[OPENCTI] Platform default initialization failed","source":"backend","timestamp":"2025-12-12T10:58:55.309Z","version":"6.9.0"}
```

## Environment
- OS (where OpenCTI server runs): Debian 12
- OpenCTI version: 6.9.0
- OpenCTI client: 6.9.0
- Other environment details: Kubernetes deployment managed via Helm
## Reproducible Steps
It occurs during the startup of the new pods after the update; startup never gets past the "[INIT] Existing platform detected, initialization..." stage.
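The hang can be observed on any newly started server pod; a minimal check, assuming placeholder resource names for this deployment:

```bash
# Follow the logs of a freshly started OpenCTI server pod
# (deployment name and namespace are placeholders);
# startup stalls right after the line:
#   [INIT] Existing platform detected, initialization...
kubectl logs -f deployment/opencti-server -n opencti
```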
## Additional information
The status of the OpenCTI shards is as follows:

```json
[
{
"index": "opencti_stix_cyber_observables-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "3820505",
"store": "5.3gb",
"dataset": "5.3gb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_stix_sighting_relationships-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "0",
"store": "248b",
"dataset": "248b",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_deleted_objects-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "36712",
"store": "17.5mb",
"dataset": "17.5mb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_history-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "66822569",
"store": "50gb",
"dataset": "50gb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_internal_relationships-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "426",
"store": "210.9kb",
"dataset": "210.9kb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_stix_meta_objects-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "3648124",
"store": "4.4gb",
"dataset": "4.4gb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_files-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "0",
"store": "250b",
"dataset": "250b",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_stix_meta_relationships-000003",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "74821350",
"store": "15.9gb",
"dataset": "15.9gb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_internal_objects-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "24378",
"store": "81.2mb",
"dataset": "81.2mb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_history-000002",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "48436269",
"store": "55gb",
"dataset": "55gb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_inferred_relationships-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "0",
"store": "250b",
"dataset": "250b",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_stix_core_relationships-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "13741155",
"store": "6.2gb",
"dataset": "6.2gb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_stix_meta_relationships-000002",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "74208243",
"store": "16.9gb",
"dataset": "16.9gb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_stix_meta_relationships-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "72130299",
"store": "22gb",
"dataset": "22gb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_inferred_entities-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "0",
"store": "250b",
"dataset": "250b",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_history-000003",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "109199",
"store": "200mb",
"dataset": "200mb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_stix_meta_relationships-000004",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "74477262",
"store": "16.8gb",
"dataset": "16.8gb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_draft_objects-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "0",
"store": "250b",
"dataset": "250b",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_stix_meta_relationships-000006",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "186237",
"store": "144.5mb",
"dataset": "144.5mb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_stix_meta_relationships-000005",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "77797341",
"store": "19.1gb",
"dataset": "19.1gb",
"ip": "127.0.0.1",
"node": "elasticsearch"
},
{
"index": "opencti_stix_domain_objects-000001",
"shard": "0",
"prirep": "p",
"state": "STARTED",
"docs": "27755924",
"store": "46.7gb",
"dataset": "46.7gb",
"ip": "127.0.0.1",
"node": "elasticsearch"
}
]
```
The health status of the cluster is as follows:

```json
{
"cluster_name": "elasticsearch",
"status": "green",
"timed_out": false,
"number_of_nodes": 1,
"number_of_data_nodes": 1,
"active_primary_shards": 76,
"active_shards": 76,
"relocating_shards": 0,
"initializing_shards": 0,
"unassigned_shards": 0,
"unassigned_primary_shards": 0,
"delayed_unassigned_shards": 0,
"number_of_pending_tasks": 0,
"number_of_in_flight_fetch": 0,
"task_max_waiting_in_queue_millis": 0,
"active_shards_percent_as_number": 100
}
```
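For reference, both outputs above correspond to the standard Elasticsearch APIs, along these lines (default local endpoint assumed):

```bash
# Per-shard status of the OpenCTI indices (the JSON list above)
curl -s 'http://127.0.0.1:9200/_cat/shards/opencti*?format=json'
# Overall cluster health (the JSON object above)
curl -s 'http://127.0.0.1:9200/_cluster/health?pretty'
```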
I enabled debug logging on Elasticsearch and found the cause: the `connections` field is not declared as `nested` in the mapping. But how can this field be missing on an OpenCTI platform that I have been running since version 6.7 and regularly updating with Helm? Something may have gone wrong during the 6.9 migration: the field somehow disappeared, and I don't understand why, since we haven't changed anything on the Elasticsearch side in a long time.
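This can be double-checked directly against the mapping; a minimal check, assuming the local endpoint and using the index name from the debug log below:

```bash
# Dump the index mapping and inspect the "connections" property;
# on a healthy OpenCTI index it should be declared with "type": "nested"
curl -s 'http://127.0.0.1:9200/opencti_stix_meta_relationships-000006/_mapping?pretty' \
  | grep -A 2 '"connections"'
```

The Elasticsearch debug output: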
```
[2025-12-12T12:20:49,117][DEBUG][o.e.a.s.TransportSearchAction] [elasticsearch] [kCd51D1NSjGRjlLpDYNI1Q][opencti_stix_meta_relationships-000006][0]: Failed to execute [SearchRequest{searchType=QUERY_THEN_FETCH, indices=[opencti_inferred_relationships-000001, opencti_inferred_relationships, opencti_stix_meta_relationships-000001, opencti_stix_meta_relationships-000006, opencti_stix_meta_relationships-000005, opencti_stix_meta_relationships-000004, opencti_stix_meta_relationships-000003, opencti_stix_meta_relationships, opencti_stix_meta_relationships-000002], indicesOptions=IndicesOptions[ignore_unavailable=false, allow_no_indices=true, expand_wildcards_open=true, expand_wildcards_closed=false, expand_wildcards_hidden=false, allow_aliases_to_multiple_indices=true, forbid_closed_indices=true, ignore_aliases=false, ignore_throttled=true, allow_selectors=true, include_failure_indices=false], routing='null', preference='null', requestCache=null, scroll=null, maxConcurrentShardRequests=0, batchedReduceSize=512, preFilterShardSize=null, allowPartialSearchResults=true, localClusterAlias=null, getOrCreateAbsoluteStartMillis=-1, ccsMinimizeRoundtrips=true, source={"size":500,"query":{"bool":{"must":[{"bool":{"should":[{"bool":{"should":[{"nested":{"query":{"bool":{"must":[{"bool":{"should":[{"terms":{"connections.internal_id.keyword":["88ec0c6a-13ce-5e39-b486-354fe4a7084f"],"boost":1.0}}],"minimum_should_match":"1","boost":1.0}},{"bool":{"should":[{"query_string":{"query":"*_from","fields":["connections.role^1.0"]}}],"minimum_should_match":"1","boost":1.0}}],"boost":1.0}},"path":"connections","ignore_unmapped":false,"score_mode":"avg","boost":1.0}}],"minimum_should_match":"1","boost":1.0}},{"bool":{"should":[{"multi_match":{"query":"stix-meta-relationship","fields":["entity_type.keyword^1.0","parent_types.keyword^1.0"]}},{"multi_match":{"query":"stix-cyber-observable-relationship","fields":["entity_type.keyword^1.0","parent_types.keyword^1.0"]}},{"multi_match":{"query":"stix-ref-relationship","fields":["entity_type.keyword^1.0","parent_types.keyword^1.0"]}}],"minimum_should_match":"1","boost":1.0}}],"minimum_should_match":"2","boost":1.0}}],"boost":1.0}},"_source":{"includes":["_index","id","internal_id","standard_id","sort","base_type","entity_type","connections","first_seen","last_seen","start_time","stop_time","restricted_members"],"excludes":["rel_*"]},"docvalue_fields":[{"field":"rel_object-marking*.keyword"},{"field":"rel_granted*.keyword"},{"field":"rel_object-covered*.keyword"},{"field":"rel_created-by*.keyword"},{"field":"rel_object-label*.keyword"},{"field":"rel_object-participant*.keyword"},{"field":"rel_object-assignee*.keyword"},{"field":"rel_kill-chain-phase*.keyword"},{"field":"rel_born-in*.keyword"},{"field":"rel_of-ethnicity*.keyword"},{"field":"rel_sample*.keyword"},{"field":"rel_participate-to*.keyword"},{"field":"rel_in-pir*.keyword"}],"sort":[{"standard_id.keyword":{"order":"asc"}}],"track_total_hits":2147483647}}] lastShard [true]
org.elasticsearch.transport.RemoteTransportException: [elasticsearch][127.0.0.1:9300][indices:data/read/search[phase/query]]
Caused by: org.elasticsearch.index.query.QueryShardException: failed to create query: [nested] failed to find nested object under path [connections]
Caused by: java.lang.IllegalStateException: [nested] failed to find nested object under path [connections]
```