Significant delay or missing messages on 4G in community (without re-login or app closure)
Follow up: https://github.com/status-im/status-mobile/pull/20730
Problem
Users experience delayed or missing community messages on both iOS and Android devices when reconnecting from offline to online (4G).
On iOS: there is a noticeable delay in receiving messages when switching from offline to online (4G). On Android: messages either do not arrive at all (waited up to 10 minutes) or only a portion of the messages is received immediately. In 6 tests conducted: 4 cases of missing messages and 2 cases of partial delivery.
Reproduction
- Connect the device to a 4G network (setting: Wi-Fi only for sync/backup)
- Put the app in the background.
- Go offline for a period of time.
- User 1 goes back online and brings the app to the foreground.
- User 2 sends messages to User 1 in real time (both are online)
- Observe the delay in receiving the messages.
Expected behavior
Messages should be received promptly after the user goes back online, even if the app was in the background.
Actual behavior
iOS: Messages are delayed upon reconnecting to the 4G network after being offline. Android: Either no messages are received or only a subset of messages arrives instantly. The issue was observed in 4 out of 6 tests.
Additional Information
- Status version: nightly 05.09.2024 or app version: 2.29.0 (20240905020836)
- Operating System: Android, iOS
Comments:
Examples of tests on Android (4G enabled):
13:53 - device is offline
13:55 - device returns online
13:55 - 10 messages are sent from the desktop app
14:05 - no messages have arrived in the mobile app
14:06 - device is offline
14:07 - device returns online
14:09 - 10 messages are sent from the desktop app
14:09 - 5 out of 10 messages are received
14:21 - the remaining 5 messages still haven't arrived
Android:
Mobile log: logcat.zip
Desktop log: geth.zip
iOS:
https://github.com/user-attachments/assets/7cb0b6dc-ba79-4908-8f13-29ec09906c0b
Mobile log: logs.zip
Desktop log: geth.log
I've been looking into this issue and thought I should share some notes on what I've been seeing.
Summary
- I was able to reproduce the glitch when running the app on an iOS device (not a simulator)
- My setup involved me using two devices: one desktop client and one mobile client
- I joined a new open community with both accounts and started sending messages
- I used the Xcode network utilities app to artificially constrain the iOS device to use a 4G connection
- I then put the mobile app into the background by navigating to the iOS home screen, and then waited (roughly 5-10 minutes)
- I then returned to the mobile app and sent a message from the desktop client into the community chat
- I was also able to reproduce this issue when disabling my network connection, backgrounding the app for 5-10 minutes, then returning to the app and sending a message from the desktop client.
- It seems that new messages can take a long time to appear in the app
- When examining the logs, I can see that the app attempts to re-subscribe with a filter but fails; it then seems to try to re-connect to a peer and subscribe again, but it possibly cannot reach the peer (?)
Logs
Notable logs that happen in sequence
Failed to subscribe → failed to resubscribe for filter → subscribing again → handleFilterSubscribeOptions → handleFilterSubscribeOptions exit → peer selection → dialing peer → Failed to subscribe
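For readability, the failing sequence above roughly corresponds to a retry loop like the following sketch. The types and helpers here are hypothetical stand-ins, not the actual go-waku API; they only mirror the order of the log entries.

```go
// Hypothetical sketch of the retry loop the log sequence suggests: peer selection ->
// dialing peer -> Failed to subscribe -> failed to resubscribe -> retry with the bad
// peer excluded.
package main

import (
	"errors"
	"fmt"
	"time"
)

type contentFilter struct {
	pubsubTopic   string
	contentTopics []string
}

// dialAndSubscribe stands in for the filter subscribe call that fails in the logs
// with "failed to dial ...: all dials failed" while the chosen peer is unreachable.
func dialAndSubscribe(peerID string, cf contentFilter) error {
	return errors.New("failed to dial " + peerID + ": all dials failed")
}

// resubscribe keeps picking a not-yet-excluded peer, tries to subscribe, and on
// failure excludes that peer and tries the next one.
func resubscribe(cf contentFilter, candidates []string) error {
	excluded := map[string]bool{}
	for range candidates {
		var peer string
		for _, p := range candidates { // "peer selection"
			if !excluded[p] {
				peer = p
				break
			}
		}
		if peer == "" {
			break
		}
		if err := dialAndSubscribe(peer, cf); err != nil { // "dialing peer" / "Failed to subscribe"
			fmt.Println("failed to resubscribe for filter:", err)
			excluded[peer] = true
			time.Sleep(100 * time.Millisecond)
			continue
		}
		return nil
	}
	return errors.New("subscriptions failed for contentTopics: " + fmt.Sprint(cf.contentTopics))
}

func main() {
	cf := contentFilter{
		pubsubTopic:   "/waku/2/rs/16/32",
		contentTopics: []string{"/waku/1/0xc3384567/rfc26"},
	}
	fmt.Println(resubscribe(cf, []string{"peer-A", "peer-B"}))
}
```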
Here's a snippet of the kind of logs that I'm seeing in the app geth.log
Log Snippet
ERROR[10-11|10:12:28.020|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:425] Failed to subscribe pubSubTopic=/waku/2/rs/16/32 contentTopics="[/waku/1/0xc3384567/rfc26 /waku/1/0x365d0749/rfc26 /waku/1/0xad134f3a/rfc26]" error="failed to dial: failed to dial 16Uiu2HAm3BU3SiRt2Evt96KnbCKW9WRYpoGyqYJDMex3e3mFXTTv: all dials failed\n * [/ip4/151.64.178.110/tcp/56006] dial backoff"
DEBUG[10-11|10:12:28.020|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:177] failed to resubscribe for filter apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" error="subscriptions failed for contentTopics: /waku/1/0xc3384567/rfc26,/waku/1/0x365d0749/rfc26,/waku/1/0xad134f3a/rfc26"
DEBUG[10-11|10:12:28.020|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:143] subscription status apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" sub-count=1 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }"
DEBUG[10-11|10:12:28.020|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:167] subscribing again apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" num-peers=1
DEBUG[10-11|10:12:28.020|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:192] subscribing with peers to exclude apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" excluded-peers=[16Uiu2HAmPqUttg6kDmmk1nyqzq2En3i5BTiye1rTYHooHAeE3RNv]
DEBUG[10-11|10:12:28.020|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:337] handleFilterSubscribeOptions peerCount=1 excludePeersLen=1
DEBUG[10-11|10:12:28.020|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:193] Select Peers selectionCriteria="{\"selectionType\":0,\"protocolId\":\"/vac/waku/filter-subscribe/2.0.0-beta1\",\"pubsubTopics\":[\"/waku/2/rs/16/32\"],\"specificPeers\":null,\"maxPeerCount\":1,\"excludePeers\":{\"\\u0000%\\u0008\\u0002\\u0012!\\u0003\\ufffd \\ufffdڿ\\ufffd\\u001f\\ufffd8\\ufffdJ\\ufffd\\ufffd\\ufffd \\ufffdO\\ufffd\\ufffd^ui\\u000e̓'\\ufffdV\\ufffd\\ufffdp\\ufffd\":{}}}" excludedPeers=16Uiu2HAmPqUttg6kDmmk1nyqzq2En3i5BTiye1rTYHooHAeE3RNv
DEBUG[10-11|10:12:28.021|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:354] handleFilterSubscribeOptions exit selectedPeerCount=1
DEBUG[10-11|10:12:28.021|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:379] peer selection params.maxPeers=1
DEBUG[10-11|10:12:28.021|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] host 16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV dialing 16Uiu2HAmCMLcxjaurcqfbsaGWM7fn8mvr8jFJigy9qJirP9u4usC
DEBUG[10-11|10:12:28.021|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:251] dialing peer from=16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV to=16Uiu2HAmCMLcxjaurcqfbsaGWM7fn8mvr8jFJigy9qJirP9u4usC
DEBUG[10-11|10:12:28.022|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] network for 16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV finished dialing 16Uiu2HAmCMLcxjaurcqfbsaGWM7fn8mvr8jFJigy9qJirP9u4usC
ERROR[10-11|10:12:28.022|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:425] Failed to subscribe pubSubTopic=/waku/2/rs/16/32 contentTopics="[/waku/1/0xc3384567/rfc26 /waku/1/0x365d0749/rfc26 /waku/1/0xad134f3a/rfc26]" error="failed to dial: failed to dial 16Uiu2HAmCMLcxjaurcqfbsaGWM7fn8mvr8jFJigy9qJirP9u4usC: no addresses"
DEBUG[10-11|10:12:28.022|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:177] failed to resubscribe for filter apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" error="subscriptions failed for contentTopics: /waku/1/0xc3384567/rfc26,/waku/1/0x365d0749/rfc26,/waku/1/0xad134f3a/rfc26"
DEBUG[10-11|10:12:28.022|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:143] subscription status apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xad134f3a/rfc26/waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26 ] }" sub-count=1 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }"
DEBUG[10-11|10:12:28.022|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:167] subscribing again apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xad134f3a/rfc26/waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26 ] }" num-peers=1
DEBUG[10-11|10:12:28.022|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:192] subscribing with peers to exclude apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" excluded-peers=[16Uiu2HAmPqUttg6kDmmk1nyqzq2En3i5BTiye1rTYHooHAeE3RNv]
DEBUG[10-11|10:12:28.022|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:337] handleFilterSubscribeOptions peerCount=1 excludePeersLen=1
DEBUG[10-11|10:12:28.022|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:193] Select Peers selectionCriteria="{\"selectionType\":0,\"protocolId\":\"/vac/waku/filter-subscribe/2.0.0-beta1\",\"pubsubTopics\":[\"/waku/2/rs/16/32\"],\"specificPeers\":null,\"maxPeerCount\":1,\"excludePeers\":{\"\\u0000%\\u0008\\u0002\\u0012!\\u0003\\ufffd \\ufffdڿ\\ufffd\\u001f\\ufffd8\\ufffdJ\\ufffd\\ufffd\\ufffd \\ufffdO\\ufffd\\ufffd^ui\\u000e̓'\\ufffdV\\ufffd\\ufffdp\\ufffd\":{}}}" excludedPeers=16Uiu2HAmPqUttg6kDmmk1nyqzq2En3i5BTiye1rTYHooHAeE3RNv
DEBUG[10-11|10:12:28.023|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:354] handleFilterSubscribeOptions exit selectedPeerCount=1
DEBUG[10-11|10:12:28.023|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:379] peer selection params.maxPeers=1
DEBUG[10-11|10:12:28.023|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] host 16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV dialing 16Uiu2HAmE8FeXNJnqppRq6Rt3PTfGLWUHtxjBUmHzrDFeFPoPkLM
DEBUG[10-11|10:12:28.023|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:251] dialing peer from=16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV to=16Uiu2HAmE8FeXNJnqppRq6Rt3PTfGLWUHtxjBUmHzrDFeFPoPkLM
DEBUG[10-11|10:12:28.023|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] network for 16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV finished dialing 16Uiu2HAkypMfCMupzqGz1c56goDxTwMpQ3nyz7sxJU4PhAT6JNSL
ERROR[10-11|10:12:28.023|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:425] Failed to subscribe pubSubTopic=/waku/2/rs/16/64 contentTopics="[/waku/1/0x28259c00/rfc26 /waku/1/0x397f6567/rfc26]" error="failed to dial: failed to dial 16Uiu2HAkypMfCMupzqGz1c56goDxTwMpQ3nyz7sxJU4PhAT6JNSL: all dials failed\n * [/ip4/212.41.7.173/tcp/51312] dial backoff"
DEBUG[10-11|10:12:28.024|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:177] failed to resubscribe for filter apisub-id=c8df0caa-2b49-42dd-be89-c1b046bb22f4 content-filter="{ pubsubTopic: /waku/2/rs/16/64, contentTopics: [ /waku/1/0x28259c00/rfc26/waku/1/0x397f6567/rfc26 ] }" error="subscriptions failed for contentTopics: /waku/1/0x28259c00/rfc26,/waku/1/0x397f6567/rfc26"
DEBUG[10-11|10:12:28.024|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:143] subscription status apisub-id=c8df0caa-2b49-42dd-be89-c1b046bb22f4 content-filter="{ pubsubTopic: /waku/2/rs/16/64, contentTopics: [ /waku/1/0x28259c00/rfc26/waku/1/0x397f6567/rfc26 ] }" sub-count=1 content-filter="{ pubsubTopic: /waku/2/rs/16/64, contentTopics: [ /waku/1/0x28259c00/rfc26/waku/1/0x397f6567/rfc26 ] }"
DEBUG[10-11|10:12:28.024|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:167] subscribing again apisub-id=c8df0caa-2b49-42dd-be89-c1b046bb22f4 content-filter="{ pubsubTopic: /waku/2/rs/16/64, contentTopics: [ /waku/1/0x28259c00/rfc26/waku/1/0x397f6567/rfc26 ] }" num-peers=1
DEBUG[10-11|10:12:28.024|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:192] subscribing with peers to exclude apisub-id=c8df0caa-2b49-42dd-be89-c1b046bb22f4 content-filter="{ pubsubTopic: /waku/2/rs/16/64, contentTopics: [ /waku/1/0x28259c00/rfc26/waku/1/0x397f6567/rfc26 ] }" excluded-peers=[16Uiu2HAmPqUttg6kDmmk1nyqzq2En3i5BTiye1rTYHooHAeE3RNv]
DEBUG[10-11|10:12:28.024|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:337] handleFilterSubscribeOptions peerCount=1 excludePeersLen=1
DEBUG[10-11|10:12:28.024|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:193] Select Peers selectionCriteria="{\"selectionType\":0,\"protocolId\":\"/vac/waku/filter-subscribe/2.0.0-beta1\",\"pubsubTopics\":[\"/waku/2/rs/16/64\"],\"specificPeers\":null,\"maxPeerCount\":1,\"excludePeers\":{\"\\u0000%\\u0008\\u0002\\u0012!\\u0003\\ufffd \\ufffdڿ\\ufffd\\u001f\\ufffd8\\ufffdJ\\ufffd\\ufffd\\ufffd \\ufffdO\\ufffd\\ufffd^ui\\u000e̓'\\ufffdV\\ufffd\\ufffdp\\ufffd\":{}}}" excludedPeers=16Uiu2HAmPqUttg6kDmmk1nyqzq2En3i5BTiye1rTYHooHAeE3RNv
DEBUG[10-11|10:12:28.024|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:354] handleFilterSubscribeOptions exit selectedPeerCount=1
DEBUG[10-11|10:12:28.024|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:379] peer selection params.maxPeers=1
DEBUG[10-11|10:12:28.024|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] host 16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV dialing 16Uiu2HAm463fLnSYYQjCtk5p1kAh1SeTT8BHH2ia3sFAVdqkcD9k
DEBUG[10-11|10:12:28.025|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:251] dialing peer from=16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV to=16Uiu2HAm463fLnSYYQjCtk5p1kAh1SeTT8BHH2ia3sFAVdqkcD9k
DEBUG[10-11|10:12:28.025|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] clearing all peer dials: 16Uiu2HAm3BU3SiRt2Evt96KnbCKW9WRYpoGyqYJDMex3e3mFXTTv
DEBUG[10-11|10:12:28.025|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] clearing all peer dials: 16Uiu2HAmCMLcxjaurcqfbsaGWM7fn8mvr8jFJigy9qJirP9u4usC
DEBUG[10-11|10:12:28.025|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] network for 16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV finished dialing 16Uiu2HAmE8FeXNJnqppRq6Rt3PTfGLWUHtxjBUmHzrDFeFPoPkLM
DEBUG[10-11|10:12:28.025|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] clearing all peer dials: 16Uiu2HAkypMfCMupzqGz1c56goDxTwMpQ3nyz7sxJU4PhAT6JNSL
DEBUG[10-11|10:12:28.025|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] clearing all peer dials: 16Uiu2HAmE8FeXNJnqppRq6Rt3PTfGLWUHtxjBUmHzrDFeFPoPkLM
DEBUG[10-11|10:12:28.025|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] clearing all peer dials: 16Uiu2HAm463fLnSYYQjCtk5p1kAh1SeTT8BHH2ia3sFAVdqkcD9k
DEBUG[10-11|10:12:28.025|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] network for 16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV finished dialing 16Uiu2HAm463fLnSYYQjCtk5p1kAh1SeTT8BHH2ia3sFAVdqkcD9k
ERROR[10-11|10:12:28.025|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:425] Failed to subscribe pubSubTopic=/waku/2/rs/16/64 contentTopics="[/waku/1/0x28259c00/rfc26 /waku/1/0x397f6567/rfc26]" error="failed to dial: failed to dial 16Uiu2HAm463fLnSYYQjCtk5p1kAh1SeTT8BHH2ia3sFAVdqkcD9k: all dials failed\n * [/ip4/138.0.200.202/tcp/62515] dial backoff"
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:177] failed to resubscribe for filter apisub-id=c8df0caa-2b49-42dd-be89-c1b046bb22f4 content-filter="{ pubsubTopic: /waku/2/rs/16/64, contentTopics: [ /waku/1/0x28259c00/rfc26/waku/1/0x397f6567/rfc26 ] }" error="subscriptions failed for contentTopics: /waku/1/0x28259c00/rfc26,/waku/1/0x397f6567/rfc26"
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:143] subscription status apisub-id=c8df0caa-2b49-42dd-be89-c1b046bb22f4 content-filter="{ pubsubTopic: /waku/2/rs/16/64, contentTopics: [ /waku/1/0x28259c00/rfc26/waku/1/0x397f6567/rfc26 ] }" sub-count=1 content-filter="{ pubsubTopic: /waku/2/rs/16/64, contentTopics: [ /waku/1/0x28259c00/rfc26/waku/1/0x397f6567/rfc26 ] }"
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:167] subscribing again apisub-id=c8df0caa-2b49-42dd-be89-c1b046bb22f4 content-filter="{ pubsubTopic: /waku/2/rs/16/64, contentTopics: [ /waku/1/0x28259c00/rfc26/waku/1/0x397f6567/rfc26 ] }" num-peers=1
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:192] subscribing with peers to exclude apisub-id=c8df0caa-2b49-42dd-be89-c1b046bb22f4 content-filter="{ pubsubTopic: /waku/2/rs/16/64, contentTopics: [ /waku/1/0x28259c00/rfc26/waku/1/0x397f6567/rfc26 ] }" excluded-peers=[16Uiu2HAmPqUttg6kDmmk1nyqzq2En3i5BTiye1rTYHooHAeE3RNv]
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:337] handleFilterSubscribeOptions peerCount=1 excludePeersLen=1
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:193] Select Peers selectionCriteria="{\"selectionType\":0,\"protocolId\":\"/vac/waku/filter-subscribe/2.0.0-beta1\",\"pubsubTopics\":[\"/waku/2/rs/16/64\"],\"specificPeers\":null,\"maxPeerCount\":1,\"excludePeers\":{\"\\u0000%\\u0008\\u0002\\u0012!\\u0003\\ufffd \\ufffdڿ\\ufffd\\u001f\\ufffd8\\ufffdJ\\ufffd\\ufffd\\ufffd \\ufffdO\\ufffd\\ufffd^ui\\u000e̓'\\ufffdV\\ufffd\\ufffdp\\ufffd\":{}}}" excludedPeers=16Uiu2HAmPqUttg6kDmmk1nyqzq2En3i5BTiye1rTYHooHAeE3RNv
ERROR[10-11|10:12:28.025|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:425] Failed to subscribe pubSubTopic=/waku/2/rs/16/32 contentTopics="[/waku/1/0xc3384567/rfc26 /waku/1/0x365d0749/rfc26 /waku/1/0xad134f3a/rfc26]" error="failed to dial: failed to dial 16Uiu2HAmE8FeXNJnqppRq6Rt3PTfGLWUHtxjBUmHzrDFeFPoPkLM: all dials failed\n * [/ip4/103.227.96.201/tcp/40345] dial backoff"
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:354] handleFilterSubscribeOptions exit selectedPeerCount=1
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:177] failed to resubscribe for filter apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26/waku/1/0xc3384567/rfc26 ] }" error="subscriptions failed for contentTopics: /waku/1/0xc3384567/rfc26,/waku/1/0x365d0749/rfc26,/waku/1/0xad134f3a/rfc26"
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:143] subscription status apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" sub-count=1 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }"
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:167] subscribing again apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" num-peers=1
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:192] subscribing with peers to exclude apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" excluded-peers=[16Uiu2HAmPqUttg6kDmmk1nyqzq2En3i5BTiye1rTYHooHAeE3RNv]
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:337] handleFilterSubscribeOptions peerCount=1 excludePeersLen=1
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:193] Select Peers selectionCriteria="{\"selectionType\":0,\"protocolId\":\"/vac/waku/filter-subscribe/2.0.0-beta1\",\"pubsubTopics\":[\"/waku/2/rs/16/32\"],\"specificPeers\":null,\"maxPeerCount\":1,\"excludePeers\":{\"\\u0000%\\u0008\\u0002\\u0012!\\u0003\\ufffd \\ufffdڿ\\ufffd\\u001f\\ufffd8\\ufffdJ\\ufffd\\ufffd\\ufffd \\ufffdO\\ufffd\\ufffd^ui\\u000e̓'\\ufffdV\\ufffd\\ufffdp\\ufffd\":{}}}" excludedPeers=16Uiu2HAmPqUttg6kDmmk1nyqzq2En3i5BTiye1rTYHooHAeE3RNv
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:354] handleFilterSubscribeOptions exit selectedPeerCount=1
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:379] peer selection params.maxPeers=1
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] host 16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV dialing 16Uiu2HAmCb3p2uuxh2ncooR1qxNheLNmatj7ePAZQEKsd3cTRTnd
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:251] dialing peer from=16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV to=16Uiu2HAmCb3p2uuxh2ncooR1qxNheLNmatj7ePAZQEKsd3cTRTnd
DEBUG[10-11|10:12:28.026|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] network for 16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV finished dialing 16Uiu2HAmCb3p2uuxh2ncooR1qxNheLNmatj7ePAZQEKsd3cTRTnd
ERROR[10-11|10:12:28.026|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:425] Failed to subscribe pubSubTopic=/waku/2/rs/16/32 contentTopics="[/waku/1/0xc3384567/rfc26 /waku/1/0x365d0749/rfc26 /waku/1/0xad134f3a/rfc26]" error="failed to dial: failed to dial 16Uiu2HAmCb3p2uuxh2ncooR1qxNheLNmatj7ePAZQEKsd3cTRTnd: no addresses"
DEBUG[10-11|10:12:28.027|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:177] failed to resubscribe for filter apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" error="subscriptions failed for contentTopics: /waku/1/0xc3384567/rfc26,/waku/1/0x365d0749/rfc26,/waku/1/0xad134f3a/rfc26"
DEBUG[10-11|10:12:28.027|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:143] subscription status apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" sub-count=1 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }"
DEBUG[10-11|10:12:28.027|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:167] subscribing again apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" num-peers=1
DEBUG[10-11|10:12:28.027|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter.go:192] subscribing with peers to exclude apisub-id=154a6d67-b3e5-41bd-9c58-50edb950f248 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0xc3384567/rfc26/waku/1/0x365d0749/rfc26/waku/1/0xad134f3a/rfc26 ] }" excluded-peers=[16Uiu2HAmPqUttg6kDmmk1nyqzq2En3i5BTiye1rTYHooHAeE3RNv]
DEBUG[10-11|10:12:28.027|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:337] handleFilterSubscribeOptions peerCount=1 excludePeersLen=1
DEBUG[10-11|10:12:28.027|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:193] Select Peers selectionCriteria="{\"selectionType\":0,\"protocolId\":\"/vac/waku/filter-subscribe/2.0.0-beta1\",\"pubsubTopics\":[\"/waku/2/rs/16/32\"],\"specificPeers\":null,\"maxPeerCount\":1,\"excludePeers\":{\"\\u0000%\\u0008\\u0002\\u0012!\\u0003\\ufffd \\ufffdڿ\\ufffd\\u001f\\ufffd8\\ufffdJ\\ufffd\\ufffd\\ufffd \\ufffdO\\ufffd\\ufffd^ui\\u000e̓'\\ufffdV\\ufffd\\ufffdp\\ufffd\":{}}}" excludedPeers=16Uiu2HAmPqUttg6kDmmk1nyqzq2En3i5BTiye1rTYHooHAeE3RNv
DEBUG[10-11|10:12:28.027|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:354] handleFilterSubscribeOptions exit selectedPeerCount=1
DEBUG[10-11|10:12:28.027|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:379] peer selection params.maxPeers=1
DEBUG[10-11|10:12:28.027|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] host 16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV dialing 16Uiu2HAmULXfYgvnAhAzBwFGSJoJ2roPudAsAJ3LnYLNtwfR7HGJ
DEBUG[10-11|10:12:28.027|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:251] dialing peer from=16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV to=16Uiu2HAmULXfYgvnAhAzBwFGSJoJ2roPudAsAJ3LnYLNtwfR7HGJ
DEBUG[10-11|10:12:28.027|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] network for 16Uiu2HAm7GY3UyS7vCfBiCqAzVhJG4j1nGys4mfQDzJ8rwPgHkQV finished dialing 16Uiu2HAmULXfYgvnAhAzBwFGSJoJ2roPudAsAJ3LnYLNtwfR7HGJ
updating after some analysis:
- I have noticed in the logs that some of the filters have been uninstalled, which should not be the case even if the app goes into background. Or maybe there is a flow which just uninstalls filters and that is normal (it would be good to know what causes the filter removal).
- Note that `failed to subscribe` and trying to resubscribe is expected, because desktop peers also support filter and mobile may be trying to subscribe to a desktop peer which may be offline, hence the dial errors you can see. The idea is that in case of failure it should recover as subscriptions are retried.
- i have noticed a sudden drop in filter subscriptions because the peers are returning `404` for filter ping. need to analyze why that could be happening. this could lead to delay in messages being received. Wondering if the app going into background is causing peers to be disconnected and pings to not happen? any idea of the behaviour cc @seanstrom
i see drop in peer-counts at 10-10|16:22:56.820, 10-10|16:24:51.155, 10-10|16:28:30.978, 10-10|16:42:45.776 and 10-11|08:34:21.105. it would help if you can give some timeline of events so that i can understand the logs better.
- Even if filter fails, there should be periodic store queries happening every 1 minute to check for missing messages and query them from the store node. is this disabled when the user is on 4G? cc @seanstrom Because right now we depend on store for reliability, and if this is disabled it might be causing issues. Will analyze further, as i also see the issues noted below. Will update further after analysis. (a rough sketch of such a periodic check is included at the end of this comment)
- ~~some weird logs which should not be there, probably due to hitting some sort of race condition between resubscribe and unsubscribe: ERROR[10-10|15:40:18.671|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:425] Failed to subscribe pubSubTopic=/waku/2/rs/16/32 contentTopics=[] error="at least one contenttopic should be specified"~~ this is not a concern as i see this resolved automatically.
- there was a period where resubscribe did not happen even though there are subscriptions. Looks like this is not the case, just the logs were misleading due to the way filter stats are displayed. Will make a small fix for this.
it would help to know which messages were missed. if you have any messageID, contentTopic, time and date, i can look in more detail at what exactly is happening.
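For reference on the point about periodic store queries above, here is a minimal sketch of what such a 1-minute missing-message check could look like. The `queryStoreNode` helper and the wiring are assumptions for illustration only; only the 1-minute cadence comes from the comment above.

```go
// A minimal sketch of a periodic missing-message check against a store node.
package main

import (
	"context"
	"fmt"
	"time"
)

// queryStoreNode stands in for a store query returning message hashes for the given
// content topics and time window.
func queryStoreNode(ctx context.Context, contentTopics []string, from, to time.Time) ([]string, error) {
	return []string{"0xaaa", "0xbbb"}, nil
}

func runMissingMessageLoop(ctx context.Context, interval time.Duration, contentTopics []string, seen map[string]bool) {
	ticker := time.NewTicker(interval)
	defer ticker.Stop()
	last := time.Now().Add(-interval)
	for {
		select {
		case <-ctx.Done():
			return
		case now := <-ticker.C:
			hashes, err := queryStoreNode(ctx, contentTopics, last, now)
			if err != nil {
				// mirrors "could not query storenode"; retried on the next tick
				fmt.Println("could not query storenode:", err)
				continue
			}
			for _, h := range hashes {
				if !seen[h] {
					fmt.Println("missing message, fetching from store:", h)
					seen[h] = true
				}
			}
			last = now
		}
	}
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 3*time.Second)
	defer cancel()
	// the comment above mentions a 1 minute interval; shortened here so the sketch runs quickly
	runMissingMessageLoop(ctx, time.Second, []string{"/waku/1/0x28259c00/rfc26"}, map[string]bool{})
}
```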
Hey @chaitanyaprem thanks for analysis, great stuff 🙌
I've attempted to reproduce this glitch again, but this time I reproduced the glitch using an iOS simulator sending messages to another iOS device. This was done because it was easier for me to access the logs of the simulator and I was having some issues running the dev build of the desktop app. If it's important to reproduce this glitch with the desktop client, lmk and I can try again.
That being said, it did take me a while to finally capture the same glitch, but I was able to get a copy of the logs from the sending device and the receiver device so we can compare them.
More details here: https://github.com/status-im/status-mobile/issues/21172#issuecomment-2420816752
To respond to your question about the disconnecting peers when backgrounding the app:
i have noticed a sudden drop in filter subscriptions because the peers are returning 404 for filter ping. need to analyze why that could be happening. this could lead to delay in messages being received. Wondering if app going into background is causing peers to be disconnected and pings to not happen? any idea of the behaviour
Supposedly when we background the application there could be some behaviour where the operating system will "idle" the application while it's waiting in the background. This could have an effect on network activity or network resources, but I cannot say for certain. Perhaps @ilmotta or @igor-sirotin may have some more context about this behaviour.
Though what I've been able to notice is that this issue seems to mainly occur when the app returns to the foreground (the app was reopened) and then the network reconnects. So potentially on a low or missing data connection (mobile LTE) the app may go through a connection state change. When that happens, perhaps there's some logic that handles resources based on whether the connection is offline or online, but I'm not sure, so I'd need to do some more digging through the code to find the logic related to connection state.
I do know that there is logic for checking the connection state when the app goes from the background to the foreground, but I'm not sure if there's any logic for re-examining the connection state when the app is already running. This seems to be an important detail, since I'm mainly able to reproduce this issue when reconnecting to the network after opening the app.
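To make the idea concrete, here's a hedged sketch of re-checking connectivity both on foregrounding and on a network state change while the app is already running. The event names and types are hypothetical, not the actual status-go API.

```go
// Sketch: treat "network changed while the app is open" the same as
// "app came back to the foreground" for connectivity re-checks.
package main

import "fmt"

type event int

const (
	appForegrounded event = iota
	appBackgrounded
	networkChanged // e.g. offline -> 4G while the app is already open
)

type node struct{}

// recheckConnection would re-evaluate connectivity and, if we are back online,
// trigger filter resubscription plus a store query for the offline window.
func (n *node) recheckConnection(reason string) {
	fmt.Println("re-checking connection because:", reason)
}

func (n *node) handle(e event) {
	switch e {
	case appForegrounded:
		n.recheckConnection("app returned to foreground") // this path seems to exist today
	case networkChanged:
		// the case this issue keeps hitting: the app is open and the network flips
		// offline -> online, which should also force a re-check
		n.recheckConnection("network state changed while app is running")
	case appBackgrounded:
		// nothing to do in this sketch
	}
}

func main() {
	n := &node{}
	n.handle(appForegrounded)
	n.handle(networkChanged)
}
```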
To respond to your question about missing messages:
Even if filter fails, there should be periodic store queries happening every 1 minute to check for missing messages and query from store node. is this disabled when user is on 4G? cc @seanstrom Because right now we depend on store for reliability and if this is disabled, might be causing issues.
I do notice from the logs that we are attempting to retrieve missing messages, and it seems we've configured status-go to retrieve these missing messages, but I haven't found any conditions for skipping that retrieval based on the connection state atm. I'll need to look deeper into the codebase to verify what's happening, but I think we're set up to retrieve missing messages.
Here's some updated logs and context from reproducing this glitch:
- It was difficult to reproduce this glitch, so it took me multiple attempts of disconnecting the iOS device from the network and reconnecting it after backgrounding the mobile app and reopening it to the foreground. I also needed to wait roughly 5 minutes after backgrounding the app before bringing it to the foreground again.
- This glitch is a little peculiar, since I was able to receive one message after reconnecting, but an additional message sent after that was not received.
- For context, both devices are on the same WiFi network and the iOS simulator remains online while the other device goes offline.
- From the logs of the iOS Simulator (the device sending a message):
INFO [10-17|21:10:54.912|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/peer_exchange/client.go:156] connecting to newly discovered peers count=20
INFO [10-17|21:10:56.232|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/publish/message_sender.go:115] publishing message via lightpush envelopeHash=0xd91e8399546bfecf15c1ff688d5a0856fad7c6bef920eb4a4fda4af33c6d8595 pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0xe88187f2/rfc26 timestamp=1,729,199,456,232,070,000
INFO [10-17|21:10:56.525|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:214] received message push peerID=16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ hash=0xd91e8399546bfecf15c1ff688d5a0856fad7c6bef920eb4a4fda4af33c6d8595 pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0xe88187f2/rfc26 timestamp=1,729,199,456,232,070,000
INFO [10-17|21:10:56.812|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:214] received message push peerID=16Uiu2HAmAR24Mbb6VuzoyUiGx42UenDkshENVDj4qnmmbabLvo31 hash=0xd91e8399546bfecf15c1ff688d5a0856fad7c6bef920eb4a4fda4af33c6d8595 pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0xe88187f2/rfc26 timestamp=1,729,199,456,232,070,000
INFO [10-17|21:10:56.932|github.com/status-im/status-go/protocol/messenger.go:3585] processing message site=RetrieveAll hash=0xd91e8399546bfecf15c1ff688d5a0856fad7c6bef920eb4a4fda4af33c6d8595 message-id=0x7e48aaa26c5b8c3247479417b133ad55b321821de66e81e937642386a0ebc16f type=CHAT_MESSAGE senderID=0x046bf754f6eecd4a6d73557b4795eef4e824f13c63d4343356f6ec1d6597c9ba02f09dc75809f208283f370d60f0a76555dc132fdc17970460a1f93c87d7d01490
WARN [10-17|21:10:56.932|github.com/status-im/status-go/protocol/messenger_maps.go:65] contacts map: loading own identity contactID=0x046bf754f6eecd4a6d73557b4795eef4e824f13c63d4343356f6ec1d6597c9ba02f09dc75809f208283f370d60f0a76555dc132fdc17970460a1f93c87d7d01490
  - contentTopic=/waku/1/0xe88187f2/rfc26
  - message-id=0x7e48aaa26c5b8c3247479417b133ad55b321821de66e81e937642386a0ebc16f
  - timestamp=1,729,199,456,232,070,000
  - 10-17|21:10:56.232
- From the logs of the iOS Device (the device receiving a message):
WARN [10-17|21:10:56.699|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:207] received messagepush with invalid subscription parameters peerID=16Uiu2HAmNTpGnyZ8W1BK2sXEmgSCNWiyDKgRU3NBR2DXST2HzxRU hash=0xd91e8399546bfecf15c1ff688d5a0856fad7c6bef920eb4a4fda4af33c6d8595 pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0xe88187f2/rfc26 timestamp=1,729,199,456,232,070,000
DEBUG[10-17|21:10:57.568|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving history to check for missing messages peerID=16Uiu2HAm9CQhsuwPR54q27kNj9iaQVfyRzTGKrhFmr94oD8ujU6P contentTopics="[/waku/1/0x7e6732ec/rfc26 /waku/1/0x5759bbff/rfc26 /waku/1/0xacfae801/rfc26 /waku/1/0x696bdb86/rfc26 /waku/1/0xa35b9bb3/rfc26 /waku/1/0xb69e0a57/rfc26 /waku/1/0x0b65ce26/rfc26 /waku/1/0xe5eb41b8/rfc26 /waku/1/0xa9a5e604/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1729192525821337000 to=1729192778950497000 attempt=1
ERROR[10-17|21:10:57.600|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:428] Failed to subscribe pubSubTopic=/waku/2/rs/16/32 contentTopics="[/waku/1/0x6c561c0d/rfc26 /waku/1/0x8396b7a1/rfc26 /waku/1/0xe88187f2/rfc26]" error="failed to dial: failed to dial 16Uiu2HAmBPzatVXrBnv6bJVdUNDXTUJ5y9JRGAbtcHdhzhjm6Ssk: all dials failed\n * [/ip4/152.58.243.206/tcp/37331] dial tcp4 0.0.0.0:49443->152.58.243.206:37331: i/o timeout"
- Here I noticed that we do receive a message, but it has a warning of:
received messagepush with invalid subscription parameters
- Missing messages do seem to be attempted to be retrieved but it doesn’t seem to be happening for the identified contentTopic, but in this case there weren’t any missing messages to be retrieved because we had received all the existing messages and were only missing the latest message
- There are more logs for failing to resubscribe to certain filters too, but those may be unrelated to the specific community channel that was expecting a message
Logs from the device: ios-device-logs.zip
Logs from the simulator: ios-simulator-logs.log.zip
Unfortunately I was only able to extract the geth.log from the iOS simulator, but hopefully that's enough for now.
Some follow-up notes:
- What’s interesting is that the message does make it through to the physical iOS device, but it seems to be invalid because of received messagepush with invalid subscription parameters
  - I’m not sure how subscriptions are coordinated between the two devices, but perhaps when the physical device goes offline and reconnects, maybe the subscriptions are cleared in some way which could cause the message to be seen as an undesired message (?)
  - from previous experiments, it seems that the issue is self-corrected when the app switches to background and returns to the foreground. So perhaps there’s some other logic that will re-register subscriptions?
- I do notice that the missing message verification does happen after the reconnection happens and the message is not delivered, but the contentTopic /waku/1/0xe88187f2/rfc26 seems to be missing from the list of topics that would be retrieved.
DEBUG[10-17|21:11:51.613|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving history to check for missing messages peerID=16Uiu2HAm9CQhsuwPR54q27kNj9iaQVfyRzTGKrhFmr94oD8ujU6P contentTopics="[/waku/1/0x7e6732ec/rfc26 /waku/1/0x5759bbff/rfc26 /waku/1/0xacfae801/rfc26 /waku/1/0x696bdb86/rfc26 /waku/1/0xa35b9bb3/rfc26 /waku/1/0xb69e0a57/rfc26 /waku/1/0x0b65ce26/rfc26 /waku/1/0xe5eb41b8/rfc26 /waku/1/0xa9a5e604/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1729192525821337000 to=1729192778950497000 attempt=1
DEBUG[10-17|21:11:51.614|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAm9CQhsuwPR54q27kNj9iaQVfyRzTGKrhFmr94oD8ujU6P requestId=37316536333764356134346664326263646135303038373463656566373765376165353464323833313932306163306430383564303462646634316563633136
ERROR[10-17|21:11:51.614|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:190] could not query storenode peerID=16Uiu2HAm9CQhsuwPR54q27kNj9iaQVfyRzTGKrhFmr94oD8ujU6P contentTopics="[/waku/1/0x7e6732ec/rfc26 /waku/1/0x5759bbff/rfc26 /waku/1/0xacfae801/rfc26 /waku/1/0x696bdb86/rfc26 /waku/1/0xa35b9bb3/rfc26 /waku/1/0xb69e0a57/rfc26 /waku/1/0x0b65ce26/rfc26 /waku/1/0xe5eb41b8/rfc26 /waku/1/0xa9a5e604/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1729192525821337000 to=1729192778950497000 error="context canceled" attempt=1
  - In this case, it seems we’re unable to contact the store node for the missing messages, but it still shows that the contentTopic was not included in the request.
- For comparison, we can see that the topic is included when the messages are retrieved. Note that these logs are from a different attempt to reproduce the glitch.
DEBUG[10-17|23:55:24.468|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving missing messages peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics="[/waku/1/0xa9a5e604/rfc26 /waku/1/0x5759bbff/rfc26 /waku/1/0x696bdb86/rfc26 /waku/1/0xacfae801/rfc26 /waku/1/0x7e6732ec/rfc26 /waku/1/0xe5eb41b8/rfc26 /waku/1/0x8396b7a1/rfc26 /waku/1/0x6c561c0d/rfc26 /waku/1/0xf473d4e0/rfc26 /waku/1/0xe88187f2/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1729209264462294000 to=1729209324215529000 attempt=1
DEBUG[10-17|23:55:24.468|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ requestId=30393030326432323130633531306439333939663733326233373833656462323631646336396438376361333064383537323135343331666265383430663834
@seanstrom FYI, could be related: https://github.com/status-im/status-go/issues/5659
Thanks a lot for detailed tests and analysis, it helped me look at specific areas in the logs.
I’m not sure how subscriptions are coordinated between the two devices, but perhaps when the physical device goes offline and reconnects, maybe the subscriptions are cleared in some way which could cause the message to be seen as an undesired message (?)
this could be a reason why invalid subscription params is noticed. while going through the logs i had noticed that the app lost network connectivity at 21:10:34 and got connectivity back almost immediately at 21:10:50.
when connectivity is lost the client removes all subscriptions, and once it is back it will try to subscribe again. but the server is not aware the client has removed its subscription and is still sending for a filter it registered before the network was lost. this may not happen if the app is in the background for a longer period of time (i.e. the time between disconnection and reconnection is more than 5/10 minutes). Note that this is fine and expected to happen.
INFO [10-17|21:10:34.328|github.com/status-im/status-go/api/geth_backend.go:2439] Network state change package=status-go/api.GethStatusBackend old=wifi new=offline
....
INFO [10-17|21:10:50.168|github.com/status-im/status-go/api/geth_backend.go:2439] Network state change package=status-go/api.GethStatusBackend old=offline new=wifi
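To illustrate the behaviour described above, here's a rough sketch (hypothetical helpers, not the actual status-go/go-waku code) of the client-side teardown and resubscribe on a network state change, with a comment marking why the server can still push on a subscription the client has already forgotten:

```go
// Sketch: client drops local filter subscriptions on offline and re-creates them on
// online; the server-side filter service still holds the old subscription until its
// own ping/cleanup runs, which is what shows up as "invalid subscription parameters".
package main

import "fmt"

type filterClient struct {
	subs map[string][]string // locally known subscriptions, keyed by subscription id
}

func (c *filterClient) onNetworkStateChange(oldState, newState string) {
	fmt.Printf("Network state change old=%s new=%s\n", oldState, newState)
	if newState == "offline" {
		// client-side cleanup only: the server is NOT told, so it may still push
		// messages for these filters right after we reconnect
		c.subs = map[string][]string{}
		return
	}
	// back online: subscribe again for the filters the app still cares about
	c.subs["apisub-1"] = []string{"/waku/1/0xe88187f2/rfc26"}
}

func main() {
	c := &filterClient{subs: map[string][]string{"apisub-1": {"/waku/1/0xe88187f2/rfc26"}}}
	c.onNetworkStateChange("wifi", "offline")
	c.onNetworkStateChange("offline", "wifi")
	fmt.Println("subscriptions after reconnect:", c.subs)
}
```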
- Here I noticed that we do receive a message, but it has a warning of:
received messagepush with invalid subscription parameters
while going through the logs, i noticed that this could happen because filter subscriptions are not successful due to peers that are not reachable, and it takes time to stabilize. will analyze further to see if there is some other issue.
but in the meantime i have made a simple change to remove bad peers if dial fails twice to them and included it in https://github.com/status-im/status-mobile/pull/21458. can you try to simulate the issue with this version?
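Roughly, the idea is something like the following sketch (hypothetical types, not the PR's actual code): count dial failures per peer and drop a peer from the local peerstore once dialing it has failed twice, so we stop retrying dead peers and hitting "dial backoff".

```go
// Sketch of bad-peer removal after repeated dial failures.
package main

import "fmt"

const maxDialFailures = 2

type peerManager struct {
	failures  map[string]int
	peerstore map[string]bool
}

func (pm *peerManager) onDialFailure(peerID string) {
	pm.failures[peerID]++
	if pm.failures[peerID] >= maxDialFailures {
		// corresponds to "removing bad peer due to recurring dial failures" in the logs
		delete(pm.peerstore, peerID)
		delete(pm.failures, peerID)
		fmt.Println("removing bad peer due to recurring dial failures:", peerID)
	}
}

func main() {
	peer := "16Uiu2HAmSijA4xVLfXuEhvU126psZbVzPRUhZT4BDtiophCsrcQz"
	pm := &peerManager{
		failures:  map[string]int{},
		peerstore: map[string]bool{peer: true},
	}
	pm.onDialFailure(peer)
	pm.onDialFailure(peer)
	fmt.Println("peers left in peerstore:", len(pm.peerstore))
}
```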
- Missing messages do seem to be attempted to be retrieved but it doesn’t seem to be happening for the identified contentTopic, but in this case there weren’t any missing messages to be retrieved because we had received all the existing messages and were only missing the latest message
this is odd, wondering what might be the reason for this content-topic not being chosen for missing messages. maybe @richard-ramos has an idea.
@chaitanyaprem is the filter for this content topic ephemeral? those are ignored for retrieving missing messages: https://github.com/status-im/status-go/blob/e611b1e5131fff706151661d15cc2904c20a7f71/protocol/transport/transport.go#L699C19-L699C24
@chaitanyaprem is the filter for this content topic ephemeral? those are ignored for retrieving missing messages: https://github.com/status-im/status-go/blob/e611b1e5131fff706151661d15cc2904c20a7f71/protocol/transport/transport.go#L699C19-L699C24
ah, that might make sense...i have no idea though. maybe @seanstrom can help
It seems that a contentTopic filter is marked as Ephemeral here: https://github.com/status-im/status-go/blob/3179532b645549c103266e007694d2c81a7091b4/protocol/messenger_store_node_request_manager.go#L284
I've checked around and added some debug logs for when a contentTopic is Ephemeral, and it doesn't seem that in this case the contentTopic for the community channel is considered ephemeral.
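To show the kind of check being discussed, here's a small sketch (field names assumed, not the exact status-go types) of how ephemeral filters would be excluded from the missing-message query:

```go
// Sketch: when building the content-topic list for the missing-message store query,
// filters marked as ephemeral are skipped. If a community channel's filter were ever
// flagged ephemeral, its topic would silently never be queried.
package main

import "fmt"

type filter struct {
	ContentTopic string
	Ephemeral    bool
}

// topicsForStoreQuery returns only the non-ephemeral content topics.
func topicsForStoreQuery(filters []filter) []string {
	var topics []string
	for _, f := range filters {
		if f.Ephemeral {
			continue // ephemeral filters are ignored for missing-message retrieval
		}
		topics = append(topics, f.ContentTopic)
	}
	return topics
}

func main() {
	fs := []filter{
		{ContentTopic: "/waku/1/0xe88187f2/rfc26", Ephemeral: false},
		{ContentTopic: "/waku/1/0x12345678/rfc26", Ephemeral: true}, // would be skipped
	}
	fmt.Println(topicsForStoreQuery(fs))
}
```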
cc @richard-ramos @chaitanyaprem
I've reproduced the issue again and this time the logs will focus on:
- contentTopic is 0x479e6e55
- missing messageID 0x850e6ac49c6f715d17fab609cbcf4faf813f51f601321256184caecade741a00
So far it seems like there are some issues with reconnecting to the peer after the network connection is changed from offline to online.
- I’m noticing peers are not available: dial attempts fail with either a backoff status or a timeout status
- This seems to affect the message-verification retrieval for the contentTopic, because the tasks for retrieving the messages are being cancelled.
From the ios-device logs (the receiver device)
- The logs show a timeout when attempting to connect to peer and subscribe:
WARN [10-21|18:14:21.601|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_manager.go:740] connecting to peer peerID=16Uiu2HAmSijA4xVLfXuEhvU126psZbVzPRUhZT4BDtiophCsrcQz error="failed to dial: failed to dial 16Uiu2HAmSijA4xVLfXuEhvU126psZbVzPRUhZT4BDtiophCsrcQz: all dials failed\n * [/ip4/156.213.241.94/tcp/46379] dial tcp4 0.0.0.0:54600->156.213.241.94:46379: i/o timeout"
ERROR[10-21|18:14:21.601|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:428] Failed to subscribe pubSubTopic=/waku/2/rs/16/32 contentTopics="[/waku/1/0x6c042192/rfc26 /waku/1/0x479e6e55/rfc26 /waku/1/0x9e634129/rfc26]" error="failed to dial: failed to dial 16Uiu2HAmSijA4xVLfXuEhvU126psZbVzPRUhZT4BDtiophCsrcQz: all dials failed\n * [/ip4/156.213.241.94/tcp/46379] dial tcp4 0.0.0.0:54600->156.213.241.94:46379: i/o timeout"
- The logs show a backoff when attempting to connect to peer and subscribe:
WARN [10-21|18:15:57.508|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_manager.go:740] connecting to peer peerID=16Uiu2HAmKXqamc4aSTkUEgFU3coCydiVCaKoDyHtsYdCUaxqKhUv error="failed to dial: failed to dial 16Uiu2HAmKXqamc4aSTkUEgFU3coCydiVCaKoDyHtsYdCUaxqKhUv: all dials failed\n * [/ip4/79.110.55.4/tcp/53585] dial backoff"
ERROR[10-21|18:15:57.508|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:428] Failed to subscribe pubSubTopic=/waku/2/rs/16/32 contentTopics="[/waku/1/0xd1b725db/rfc26 /waku/1/0x1ef3b8eb/rfc26]" error="failed to dial: failed to dial 16Uiu2HAmKXqamc4aSTkUEgFU3coCydiVCaKoDyHtsYdCUaxqKhUv: all dials failed\n * [/ip4/79.110.55.4/tcp/53585] dial backoff"
- The logs show the message-verification store-node requests being cancelled:
DEBUG[10-21|18:17:23.501|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving history to check for missing messages peerID=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM contentTopics="[/waku/1/0x0b10486b/rfc26 /waku/1/0x1ca1ac8f/rfc26]" pubsubTopic=/waku/2/rs/16/64 from=1729534036339416000 to=1729534096331553000 attempt=1
DEBUG[10-21|18:17:23.501|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving history to check for missing messages peerID=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM contentTopics="[/waku/1/0xae688b3b/rfc26 /waku/1/0x479e6e55/rfc26 /waku/1/0xb5753763/rfc26 /waku/1/0x92e8c9ee/rfc26 /waku/1/0x55cd0a44/rfc26 /waku/1/0x277bd875/rfc26 /waku/1/0x48b1bff9/rfc26 /waku/1/0x44204c9b/rfc26 /waku/1/0x17c556f6/rfc26 /waku/1/0x1ef3b8eb/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1729534036577581000 to=1729534096331662000 attempt=1
DEBUG[10-21|18:17:23.502|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM requestId=65393335653535353465653761306336646635303835393032626466353138393331616634646333323633353765343430306236376161666433323565306538
DEBUG[10-21|18:17:23.502|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM requestId=66363732363262383966663539346663623666366632623738333561653937616430633639333363383338633735353131366239643566316633653030386665
ERROR[10-21|18:17:23.502|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:190] could not query storenode peerID=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM contentTopics="[/waku/1/0x0b10486b/rfc26 /waku/1/0x1ca1ac8f/rfc26]" pubsubTopic=/waku/2/rs/16/64 from=1729534036339416000 to=1729534096331553000 error="context canceled" attempt=1
ERROR[10-21|18:17:23.502|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:190] could not query storenode peerID=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM contentTopics="[/waku/1/0xae688b3b/rfc26 /waku/1/0x479e6e55/rfc26 /waku/1/0xb5753763/rfc26 /waku/1/0x92e8c9ee/rfc26 /waku/1/0x55cd0a44/rfc26 /waku/1/0x277bd875/rfc26 /waku/1/0x48b1bff9/rfc26 /waku/1/0x44204c9b/rfc26 /waku/1/0x17c556f6/rfc26 /waku/1/0x1ef3b8eb/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1729534036577581000 to=1729534096331662000 error="context canceled" attempt=1
It seems that a contentTopic filter is marked as Ephemeral here: https://github.com/status-im/status-go/blob/3179532b645549c103266e007694d2c81a7091b4/protocol/messenger_store_node_request_manager.go#L284
I've checked around and added some debug logs for when a contentTopic is Ephemeral, and it doesn't seem that in this case the contentTopic for the community channel is considered ephemeral.
oh, that is interesting. @richard-ramos do you think setting this default to true could be a cause of some content-topics not being considered for the store-node query? wondering if there is some sort of race condition during initialization that can cause the filter to be created/overridden for storeNodeRequest?
So far it seems like there are some issues with reconnecting to the peer after the network connection is changed from offline to online.
I don't think the issue is specific to peer connectivity when connection state changes from offline to online.
Looks like some of the peers are not reachable. i just ran a wakucanary check on peers to which dial failed in your logs and it seems like the dial fails even now. This happens because status-desktop instances may come and go in the network, and peer-exchange (the discovery mechanism used in lightclients) would still return these peers even if they are not online. So, it is expected to take some time to find peers which are available.
The change i did in #21458 should help mitigate the dial backoff scenario, because a peer to which dials have failed more than twice would be removed from the local peerstore. Can you try to repro the issue with this change?
In the meantime i will think if there is something else that can be done to speed-up recovery after disconnection.
./build/wakucanary -c=16 -s=32 -a="/ip4/79.110.55.4/tcp/53585/p2p/16Uiu2HAmKXqamc4aSTkUEgFU3coCydiVCaKoDyHtsYdCUaxqKhUv"
INF 2024-10-23 15:00:40.588+05:30 Cli flags tid=46812462 file=wakucanary.nim:161 address=/ip4/79.110.55.4/tcp/53585/p2p/16Uiu2HAmKXqamc4aSTkUEgFU3coCydiVCaKoDyHtsYdCUaxqKhUv timeout=10s protocols=@[] logLevel=INFO
INF 2024-10-23 15:00:40.589+05:30 Initializing networking tid=46812462 file=waku_node.nim:157 addrs=@[/ip4/127.0.0.1/tcp/60000]
.............................
INF 2024-10-23 15:00:40.589+05:30 Dialing multiple peers topics="waku node peer_manager" tid=46812462 file=peer_manager.nim:348 numOfPeers=1 nodes=@[16Uiu2HAmKXqamc4aSTkUEgFU3coCydiVCaKoDyHtsYdCUaxqKhUv]
ERR 2024-10-23 15:00:50.591+05:30 Timedout after tid=46812462 file=wakucanary.nim:254 timeout=10s
ERR 2024-10-23 15:00:50.591+05:30 The node has some problems (see logs) tid=46812462 file=wakucanary.nim:280
./build/wakucanary -c=16 -s=32 -a="/ip4/156.213.241.94/tcp/46379/p2p/16Uiu2HAmSijA4xVLfXuEhvU126psZbVzPRUhZT4BDtiophCsrcQz"
INF 2024-10-23 15:05:16.276+05:30 Cli flags tid=46818515 file=wakucanary.nim:161 address=/ip4/156.213.241.94/tcp/46379/p2p/16Uiu2HAmSijA4xVLfXuEhvU126psZbVzPRUhZT4BDtiophCsrcQz timeout=10s protocols=@[] logLevel=INFO
INF 2024-10-23 15:05:16.279+05:30 Initializing networking tid=46818515 file=waku_node.nim:157 addrs=@[/ip4/127.0.0.1/tcp/60000]
...................
INF 2024-10-23 15:05:16.280+05:30 Dialing multiple peers topics="waku node peer_manager" tid=46818515 file=peer_manager.nim:348 numOfPeers=1 nodes=@[16Uiu2HAmSijA4xVLfXuEhvU126psZbVzPRUhZT4BDtiophCsrcQz]
ERR 2024-10-23 15:05:26.281+05:30 Timedout after tid=46818515 file=wakucanary.nim:254 timeout=10s
ERR 2024-10-23 15:05:26.281+05:30 The node has some problems (see logs) tid=46818515 file=wakucanary.nim:280
- The logs show the message-verification store-node requests being cancelled
This doesn't look like it is related to network connectivity of the simulator; rather, the store node must have been slow in responding, which caused a request timeout at the client side.
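As a plain-Go illustration of that failure mode (not the waku client itself), a store query run under a context deadline surfaces a slow responder as a locally cancelled request, which the missing-message loop then retries on its next attempt:

```go
// Sketch: a store query aborted locally because the node answered too slowly.
package main

import (
	"context"
	"fmt"
	"time"
)

// queryStore pretends to be a store request against a node that answers too slowly.
func queryStore(ctx context.Context) error {
	select {
	case <-time.After(3 * time.Second): // simulated slow store-node response
		return nil
	case <-ctx.Done():
		return ctx.Err() // what ends up being logged as the query error
	}
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 500*time.Millisecond)
	defer cancel()
	if err := queryStore(ctx); err != nil {
		fmt.Println("could not query storenode:", err)
	}
}
```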
@seanstrom can you test with the updated changes pushed to https://github.com/status-im/status-mobile/pull/21458? I have added a strategy where fleet nodes are used as one of the filter peers to subscribe to, which would mean subscriptions should be created faster.
I would like to know the feedback, based on which i will make this change for lightpush as well.
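Roughly, the strategy would look something like this sketch (the selection code is hypothetical; only the idea of preferring a fleet node comes from the comment above):

```go
// Sketch: always include a known fleet node among the filter peers, alongside whatever
// discovery returned, so a subscription can be established quickly even while
// discovered peers are still unreachable.
package main

import "fmt"

// selectFilterPeers prefers a fleet peer and tops up with discovered peers.
func selectFilterPeers(discovered, fleet []string, max int) []string {
	selected := []string{}
	if len(fleet) > 0 {
		selected = append(selected, fleet[0]) // reliable fleet peer first
	}
	for _, p := range discovered {
		if len(selected) >= max {
			break
		}
		selected = append(selected, p)
	}
	return selected
}

func main() {
	discovered := []string{"16Uiu2HAm...discovered-1", "16Uiu2HAm...discovered-2"}
	fleet := []string{"16Uiu2HAm...fleet-boot-01"}
	fmt.Println(selectFilterPeers(discovered, fleet, 2))
}
```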
@chaitanyaprem yup yup I'll test those changes, sorry for the delay, should be able to try them out and post some results soon.
@chaitanyaprem yup yup I'll test those changes, sorry for the delay, should be able to try them out and post some results soon.
No hurry, just wanted to indicate that i had included additional strategy which might help speed up recovery of mobile app during disconnections.
Okie dokie, I've run the experiment again with both the iOS simulator and the iOS device running the patched code in your branch. I did need to rebase your changes onto the latest status-go, though there didn't seem to be any conflicting or similar changes recently.
- Overview
  - I noticed this time I don’t receive an invalid subscription error for the missing message; it just seems it was undelivered (for roughly 20 minutes) until I put the app in the background and then back into the foreground (which causes a message history retrieval).
  - This time I had to disconnect and reconnect the device multiple times in a row to cause the issue to appear.
  - And it seems that the receiver device does attempt to reestablish connections with peers and receive messages, but for some reason they continue to fail in a loop after reconnecting. Perhaps this is due to multiple attempts being made while disconnected that are not cancelled when the connection changes (?) (see the sketch after this list)
  - Eventually it seems the peers are re-connected, but for some reason the missing-message retrieval and the peers did not send or retrieve the missing message.
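Here's a hedged sketch of one way the cancellation idea from the overview above could work (hypothetical code, not the current implementation): tie every dial/retry loop to a context that is cancelled whenever the network state flips, so attempts started while offline don't keep looping after reconnection.

```go
// Sketch: cancel in-flight dial retries when the network state changes.
package main

import (
	"context"
	"fmt"
	"time"
)

type dialer struct {
	cancel context.CancelFunc
}

// onNetworkChange aborts any in-flight dial attempts and hands out a fresh context
// bound to the new network state.
func (d *dialer) onNetworkChange() context.Context {
	if d.cancel != nil {
		d.cancel()
	}
	ctx, cancel := context.WithCancel(context.Background())
	d.cancel = cancel
	return ctx
}

// dialLoop keeps retrying a peer until its context is cancelled.
func dialLoop(ctx context.Context, peer string) {
	for {
		select {
		case <-ctx.Done():
			fmt.Println("dial loop for", peer, "stopped:", ctx.Err())
			return
		case <-time.After(200 * time.Millisecond):
			fmt.Println("retrying dial to", peer)
		}
	}
}

func main() {
	d := &dialer{}
	ctx := d.onNetworkChange() // offline: retries start under this context
	go dialLoop(ctx, "16Uiu2HAmExamplePeer")
	time.Sleep(500 * time.Millisecond)
	d.onNetworkChange() // network came back: the old retries are cancelled
	time.Sleep(100 * time.Millisecond)
}
```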
- Details
  - Content Topic: contentTopic=/waku/1/0x0a0df437/rfc26
  - Message ID: 0xf2f6bcebfef8d9c0bac510e14320d608d88541457663acb41a946524782e1099
  - Timestamp for sending message in sender logs: 20:33:05.917
- From the iOS simulator (sender device):
INFO [10-28|20:33:05.917|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/publish/message_sender.go:115] publishing message via lightpush envelopeHash=0x3c81d4b10eae7a3c2f7aee2cb25e53e2f1640f54b69ddd2e723fbe63cfcdad6a pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x0a0df437/rfc26 timestamp=1,730,147,585,917,426,000
From the iOS device (receiver device)
We can see that the missing-messages service attempts to retrieve the topic, but no other retrieval attempts are logged after this one.
DEBUG[10-28|20:33:13.311|github.com/status-im/status-go/wakuv2/waku.go:1044] store.query requestID=0x6a31b54cd18b55c1822923c919d019c02af05744f539d4f63e1cb0e367fb095f peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ startTime=1730147508000000000 endTime=1730147591000000000 contentTopics="[/waku/1/0xee8eabda/rfc26 /waku/1/0x0a0df437/rfc26 /waku/1/0x10674ece/rfc26 /waku/1/0xfc7b344b/rfc26 /waku/1/0xe9685ca3/rfc26 /waku/1/0xbd21033f/rfc26 /waku/1/0xeea9ef15/rfc26 /waku/1/0x2b3fa53e/rfc26]" pubsubTopic=/waku/2/rs/16/32 cursor=0x -
We can see that the new code is running, because it removes a bad peer:
INFO [10-28|20:33:23.725|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_manager.go:767] removing bad peer due to recurring dial failures peerID=16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ -
And we can see that the receiver device does receive the message that was sent before the missing one
DEBUG[10-28|20:33:14.148|github.com/status-im/status-go/protocol/transport/transport.go:551] confirming message processed namespace=Transport ids=[0xfc0ef7973ee250b30c6e6909011532b80e8873b912dfc88cdaaf12b1e6bf9ed0] timestamp=1,730,147,594,148 -
And shortly after, the device seems to start looping through connecting to and disconnecting from peers:
INFO [10-28|20:33:14.460|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/node/connectedness.go:59] peer connected peer=16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ direction=Outbound DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/wakuv2/waku.go:1253] peer stats peersCount=3 stats="{IsOnline:true Peers:map[16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ:{Protocols:[/vac/waku/filter-subscribe/2.0.0-beta1] Addresses:[/dns4/boot-01.gc-us-central1-a.status.prod.status.im/tcp/30303/p2p/16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ /ip4/34.135.13.87/tcp/30303/p2p/16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ /dns4/boot-01.gc-us-central1-a.status.prod.status.im/tcp/443/wss/p2p/16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ /ip4/34.135.13.87/tcp/443/tls/sni/boot-01.gc-us-central1-a.status.prod.status.im/ws/p2p/16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ]} 16Uiu2HAmKBd6crqQNZ6nKCSCpHCAwUPn3DUDmkcPSWUTyVXpxKsW:{Protocols:[/vac/waku/relay/2.0.0 /rendezvous/1.0.0 /ipfs/id/1.0.0 /libp2p/autonat/1.0.0 /libp2p/circuit/relay/0.2.0/hop /vac/waku/metadata/1.0.0 /ipfs/ping/1.0.0 /vac/waku/store/2.0.0-beta4 /vac/waku/store-query/3.0.0] Addresses:[/ip4/34.72.140.183/tcp/30303/p2p/16Uiu2HAmKBd6crqQNZ6nKCSCpHCAwUPn3DUDmkcPSWUTyVXpxKsW /dns4/store-02.gc-us-central1-a.status.staging.status.im/tcp/30303/p2p/16Uiu2HAmKBd6crqQNZ6nKCSCpHCAwUPn3DUDmkcPSWUTyVXpxKsW /dns4/store-02.gc-us-central1-a.status.staging.status.im/tcp/443/wss/p2p/16Uiu2HAmKBd6crqQNZ6nKCSCpHCAwUPn3DUDmkcPSWUTyVXpxKsW]} 16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ:{Protocols:[/ipfs/id/1.0.0 /libp2p/autonat/1.0.0 /libp2p/circuit/relay/0.2.0/hop /ipfs/ping/1.0.0 /vac/waku/store/2.0.0-beta4 /vac/waku/store-query/3.0.0 /vac/waku/metadata/1.0.0 /vac/waku/relay/2.0.0 /rendezvous/1.0.0] Addresses:[/ip4/34.170.192.39/tcp/30303/p2p/16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ /dns4/store-01.gc-us-central1-a.status.prod.status.im/tcp/30303/p2p/16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ /dns4/store-01.gc-us-central1-a.status.prod.status.im/tcp/443/wss/p2p/16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ]}]}" DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] clearing all peer dials: 16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ INFO [10-28|20:33:14.460|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/node/connectedness.go:79] peer disconnected peer=16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/metadata/waku_metadata.go:230] peer connected peer=16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] host 16Uiu2HAm8goi4QpW2k4G99zMvmc7gLSpWc2nE8nCBJjN9z1RSPDJ dialing 16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:251] dialing peer from=16Uiu2HAm8goi4QpW2k4G99zMvmc7gLSpWc2nE8nCBJjN9z1RSPDJ to=16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] freeing FD token; waiting: 0; consuming: 6 DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/wakuv2/waku.go:1253] peer stats peersCount=2 stats="{IsOnline:true 
Peers:map[16Uiu2HAmKBd6crqQNZ6nKCSCpHCAwUPn3DUDmkcPSWUTyVXpxKsW:{Protocols:[/vac/waku/store-query/3.0.0 /ipfs/id/1.0.0 /libp2p/autonat/1.0.0 /libp2p/circuit/relay/0.2.0/hop /vac/waku/metadata/1.0.0 /ipfs/ping/1.0.0 /vac/waku/store/2.0.0-beta4 /vac/waku/relay/2.0.0 /rendezvous/1.0.0] Addresses:[/dns4/store-02.gc-us-central1-a.status.staging.status.im/tcp/443/wss/p2p/16Uiu2HAmKBd6crqQNZ6nKCSCpHCAwUPn3DUDmkcPSWUTyVXpxKsW /ip4/34.72.140.183/tcp/30303/p2p/16Uiu2HAmKBd6crqQNZ6nKCSCpHCAwUPn3DUDmkcPSWUTyVXpxKsW /dns4/store-02.gc-us-central1-a.status.staging.status.im/tcp/30303/p2p/16Uiu2HAmKBd6crqQNZ6nKCSCpHCAwUPn3DUDmkcPSWUTyVXpxKsW]} 16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ:{Protocols:[/rendezvous/1.0.0 /vac/waku/metadata/1.0.0 /vac/waku/relay/2.0.0 /libp2p/circuit/relay/0.2.0/hop /ipfs/ping/1.0.0 /vac/waku/store/2.0.0-beta4 /vac/waku/store-query/3.0.0 /ipfs/id/1.0.0 /libp2p/autonat/1.0.0] Addresses:[/dns4/store-01.gc-us-central1-a.status.prod.status.im/tcp/30303/p2p/16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ /dns4/store-01.gc-us-central1-a.status.prod.status.im/tcp/443/wss/p2p/16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ /ip4/34.170.192.39/tcp/30303/p2p/16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ]}]}" DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] freeing peer token; peer 16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ; addr: /ip4/34.135.13.87/tcp/443/tls/sni/boot-01.gc-us-central1-a.status.prod.status.im/ws; active for peer: 1; waiting on peer limit: 0 DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:251] error opening identify stream peer=16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ error=EOF DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:197] inside on connection status change new-status=true agg filters count=3 filter subs count=0 WARN [10-28|20:33:14.460|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:210] failed to identify 16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ: EOF DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:197] inside on connection status change new-status=true agg filters count=3 filter subs count=0 DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] host 16Uiu2HAm8goi4QpW2k4G99zMvmc7gLSpWc2nE8nCBJjN9z1RSPDJ finished dialing 16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ DEBUG[10-28|20:33:14.460|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [16Uiu2HAm8goi4QpW2k4G99zMvmc7gLSpWc2nE8nCBJjN9z1RSPDJ] opening stream to peer [16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ] ERROR[10-28|20:33:14.460|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/metadata/waku_metadata.go:107] creating stream to peer peer=16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ error="connection failed" ERROR[10-28|20:33:14.460|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/metadata/waku_metadata.go:219] disconnecting from peer peerID=16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ error="connection failed" DEBUG[10-28|20:33:14.462|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] adding a dial job through limiter: 
/ip4/34.135.13.87/tcp/443/tls/sni/boot-01.gc-us-central1-a.status.prod.status.im/ws DEBUG[10-28|20:33:14.462|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] taking FD token: peer: 16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ; addr: /ip4/34.135.13.87/tcp/443/tls/sni/boot-01.gc-us-central1-a.status.prod.status.im/ws; prev consuming: 5 DEBUG[10-28|20:33:14.462|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] executing dial; peer: 16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ; addr: /ip4/34.135.13.87/tcp/443/tls/sni/boot-01.gc-us-central1-a.status.prod.status.im/ws; FD consuming: 6; waiting: 0 DEBUG[10-28|20:33:14.462|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] 16Uiu2HAm8goi4QpW2k4G99zMvmc7gLSpWc2nE8nCBJjN9z1RSPDJ swarm dialing 16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ /ip4/34.135.13.87/tcp/443/tls/sni/boot-01.gc-us-central1-a.status.prod.status.im/ws DEBUG[10-28|20:33:14.536|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:114] filter stats agg filters count=3 filter subs count=0 DEBUG[10-28|20:33:14.602|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] freeing FD token; waiting: 0; consuming: 6 INFO [10-28|20:33:14.602|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/node/connectedness.go:59] peer connected peer=16Uiu2HAmAR24Mbb6VuzoyUiGx42UenDkshENVDj4qnmmbabLvo31 direction=Outbound DEBUG[10-28|20:33:14.602|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:251] error opening identify stream peer=16Uiu2HAmAR24Mbb6VuzoyUiGx42UenDkshENVDj4qnmmbabLvo31 error=EOF WARN [10-28|20:33:14.602|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:210] failed to identify 16Uiu2HAmAR24Mbb6VuzoyUiGx42UenDkshENVDj4qnmmbabLvo31: EOF DEBUG[10-28|20:33:14.602|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] freeing peer token; peer 16Uiu2HAmAR24Mbb6VuzoyUiGx42UenDkshENVDj4qnmmbabLvo31; addr: /ip4/167.99.19.47/tcp/30303; active for peer: 2; waiting on peer limit: 0 -
And here we can see the subscription for the topic has failed
ERROR[10-28|20:36:39.265|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:442] Failed to subscribe pubSubTopic=/waku/2/rs/16/32 contentTopics="[/waku/1/0xc7b4fb3e/rfc26 /waku/1/0x1bc69c24/rfc26 /waku/1/0x2b3fa53e/rfc26 /waku/1/0xbd21033f/rfc26 /waku/1/0x5b2eda55/rfc26 /waku/1/0xeea9ef15/rfc26 /waku/1/0xe9685ca3/rfc26 /waku/1/0x10674ece/rfc26 /waku/1/0x0a0df437/rfc26 /waku/1/0xee8eabda/rfc26 /waku/1/0xfc7b344b/rfc26]" error="failed to dial: failed to dial 16Uiu2HAm6unKKTB9eGQJYdgExv6RKMFKg5yJqNv3JuPxKDEhtpmr: all dials failed\n * [/ip4/37.19.221.176/tcp/49196] dial tcp4 0.0.0.0:61206->37.19.221.176:49196: i/o timeout" -
And eventually the disconnecting and re-connecting of peers seems to settle down after a while
INFO [10-28|20:41:59.532|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/node/connectedness.go:59] peer connected peer=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM direction=Outbound DEBUG[10-28|20:41:59.532|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] freeing peer token; peer 16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM; addr: /ip4/34.170.154.2/tcp/30303; active for peer: 2; waiting on peer limit: 0 DEBUG[10-28|20:41:59.533|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/metadata/waku_metadata.go:230] peer connected peer=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM DEBUG[10-28|20:41:59.533|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [16Uiu2HAm8goi4QpW2k4G99zMvmc7gLSpWc2nE8nCBJjN9z1RSPDJ] opening stream to peer [16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM] DEBUG[10-28|20:41:59.533|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] clearing all peer dials: 16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM DEBUG[10-28|20:41:59.533|github.com/status-im/status-go/wakuv2/waku.go:1253] peer stats peersCount=18 stats="{IsOnline:true Peers:map[16Uiu2HAkuy4F2XhfWWvpwQ8VcHhE2P27YyVW3zHfMXYzSm1pdDV8:{Protocols:[/floodsub/1.0.0 /meshsub/1.1.0 /vac/waku/peer-exchange/2.0.0-alpha1 /ipfs/id/1.0.0 /ipfs/ping/1.0.0 /vac/waku/filter-subscribe/2.0.0-beta1 /vac/waku/lightpush/2.0.0-beta1 /vac/waku/metadata/1.0.0 /libp2p/circuit/relay/0.2.0/stop /libp2p/dcutr /meshsub/1.0.0 /ipfs/id/push/1.0.0 /libp2p/autonat/1.0.0 /vac/waku/relay/2.0.0] Addresses:[/ip4/84.115.227.228/tcp/9841/p2p/16Uiu2HAkuy4F2XhfWWvpwQ8VcHhE2P27YyVW3zHfMXYzSm1pdDV8]} 16Uiu2HAkvUAmpAQ8xE1Vjp8uTn8geu39MFjD96aRywXYygzGUkG6:{Protocols:[/floodsub/1.0.0 /ipfs/id/push/1.0.0 /libp2p/autonat/1.0.0 /meshsub/1.0.0 /meshsub/1.1.0 /vac/waku/filter-subscribe/2.0.0-beta1 /vac/waku/relay/2.0.0 /ipfs/id/1.0.0 /vac/waku/lightpush/2.0.0-beta1 /vac/waku/peer-exchange/2.0.0-alpha1 /ipfs/ping/1.0.0 /libp2p/circuit/relay/0.2.0/stop /libp2p/dcutr /vac/waku/metadata/1.0.0] Addresses:[/ip4/75.155.65.160/tcp/60608/p2p/16Uiu2HAkvUAmpAQ8xE1Vjp8uTn8geu39MFjD96aRywXYygzGUkG6 /ip4/75.155.65.160/tcp/55115/p2p/16Uiu2HAkvUAmpAQ8xE1Vjp8uTn8geu39MFjD96aRywXYygzGUkG6]} 16Uiu2HAm2M7xs7cLPc3jamawkEqbr7cUJX11uvY7LxQ6WFUdUKUT:{Protocols:[] Addresses:[/ip4/8.218.74.73/tcp/30303/p2p/16Uiu2HAm2M7xs7cLPc3jamawkEqbr7cUJX11uvY7LxQ6WFUdUKUT /dns4/store-01.ac-cn-hongkong-c.status.prod.status.im/tcp/30303/p2p/16Uiu2HAm2M7xs7cLPc3jamawkEqbr7cUJX11uvY7LxQ6WFUdUKUT /dns4/store-01.ac-cn-hongkong-c.status.prod.status.im/tcp/443/wss/p2p/16Uiu2HAm2M7xs7cLPc3jamawkEqbr7cUJX11uvY7LxQ6WFUdUKUT]} 16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ:{Protocols:[/vac/waku/filter-subscribe/2.0.0-beta1] Addresses:[/dns4/boot-01.gc-us-central1-a.status.prod.status.im/tcp/30303/p2p/16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ /ip4/34.135.13.87/tcp/30303/p2p/16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ /dns4/boot-01.gc-us-central1-a.status.prod.status.im/tcp/443/wss/p2p/16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ]} 16Uiu2HAm9CQhsuwPR54q27kNj9iaQVfyRzTGKrhFmr94oD8ujU6P:{Protocols:[/vac/waku/store-query/3.0.0] Addresses:[/ip4/8.218.121.232/tcp/30303/p2p/16Uiu2HAm9CQhsuwPR54q27kNj9iaQVfyRzTGKrhFmr94oD8ujU6P /dns4/store-02.ac-cn-hongkong-c.status.prod.status.im/tcp/30303/p2p/16Uiu2HAm9CQhsuwPR54q27kNj9iaQVfyRzTGKrhFmr94oD8ujU6P 
/dns4/store-02.ac-cn-hongkong-c.status.prod.status.im/tcp/443/wss/p2p/16Uiu2HAm9CQhsuwPR54q27kNj9iaQVfyRzTGKrhFmr94oD8ujU6P]} 16Uiu2HAm9aDJPkhGxc2SFcEACTFdZ91Q5TJjp76qZEhq9iF59x7R:{Protocols:[/libp2p/autonat/1.0.0 /libp2p/circuit/relay/0.2.0/hop /rendezvous/1.0.0 /vac/waku/store-query/3.0.0 /ipfs/id/1.0.0 /vac/waku/metadata/1.0.0 /vac/waku/relay/2.0.0 /ipfs/ping/1.0.0 /vac/waku/store/2.0.0-beta4] Addresses:[/dns4/store-02.do-ams3.status.prod.status.im/tcp/30303/p2p/16Uiu2HAm9aDJPkhGxc2SFcEACTFdZ91Q5TJjp76qZEhq9iF59x7R /dns4/store-02.do-ams3.status.prod.status.im/tcp/443/wss/p2p/16Uiu2HAm9aDJPkhGxc2SFcEACTFdZ91Q5TJjp76qZEhq9iF59x7R /ip4/159.223.242.154/tcp/30303/p2p/16Uiu2HAm9aDJPkhGxc2SFcEACTFdZ91Q5TJjp76qZEhq9iF59x7R]} 16Uiu2HAmAR24Mbb6VuzoyUiGx42UenDkshENVDj4qnmmbabLvo31:{Protocols:[/vac/waku/filter-subscribe/2.0.0-beta1 /libp2p/circuit/relay/0.2.0/hop] Addresses:[/dns4/boot-01.do-ams3.status.prod.status.im/tcp/30303/p2p/16Uiu2HAmAR24Mbb6VuzoyUiGx42UenDkshENVDj4qnmmbabLvo31 /ip4/167.99.19.47/tcp/30303/p2p/16Uiu2HAmAR24Mbb6VuzoyUiGx42UenDkshENVDj4qnmmbabLvo31 /dns4/boot-01.do-ams3.status.prod.status.im/tcp/443/wss/p2p/16Uiu2HAmAR24Mbb6VuzoyUiGx42UenDkshENVDj4qnmmbabLvo31]} 16Uiu2HAmAUdrQ3uwzuE4Gy4D56hX6uLKEeerJAnhKEHZ3DxF1EfT:{Protocols:[/vac/waku/store-query/3.0.0 /libp2p/circuit/relay/0.2.0/hop] Addresses:[/ip4/159.223.242.94/tcp/30303/p2p/16Uiu2HAmAUdrQ3uwzuE4Gy4D56hX6uLKEeerJAnhKEHZ3DxF1EfT /dns4/store-01.do-ams3.status.prod.status.im/tcp/30303/p2p/16Uiu2HAmAUdrQ3uwzuE4Gy4D56hX6uLKEeerJAnhKEHZ3DxF1EfT /dns4/store-01.do-ams3.status.prod.status.im/tcp/443/wss/p2p/16Uiu2HAmAUdrQ3uwzuE4Gy4D56hX6uLKEeerJAnhKEHZ3DxF1EfT]} 16Uiu2HAmB7Ur9HQqo3cWDPovRQjo57fxWWDaQx27WxSzDGhN4JKg:{Protocols:[/ipfs/id/1.0.0 /libp2p/circuit/relay/0.2.0/hop /vac/waku/relay/2.0.0 /rendezvous/1.0.0 /ipfs/ping/1.0.0 /vac/waku/store/2.0.0-beta4 /libp2p/autonat/1.0.0 /vac/waku/metadata/1.0.0 /vac/waku/store-query/3.0.0] Addresses:[/ip4/35.224.231.209/tcp/30303/p2p/16Uiu2HAmB7Ur9HQqo3cWDPovRQjo57fxWWDaQx27WxSzDGhN4JKg /dns4/store-01.gc-us-central1-a.status.staging.status.im/tcp/30303/p2p/16Uiu2HAmB7Ur9HQqo3cWDPovRQjo57fxWWDaQx27WxSzDGhN4JKg /dns4/store-01.gc-us-central1-a.status.staging.status.im/tcp/443/wss/p2p/16Uiu2HAmB7Ur9HQqo3cWDPovRQjo57fxWWDaQx27WxSzDGhN4JKg]} 16Uiu2HAmBu5zRFzBGAzzMAuGWhaxN2EwcbW7CzibELQELzisf192:{Protocols:[/rendezvous/1.0.0 /vac/waku/lightpush/2.0.0-beta1 /vac/waku/peer-exchange/2.0.0-alpha1 /ipfs/ping/1.0.0 /vac/waku/filter-subscribe/2.0.0-beta1 /ipfs/id/1.0.0 /libp2p/autonat/1.0.0 /libp2p/circuit/relay/0.2.0/hop /vac/waku/metadata/1.0.0 /vac/waku/relay/2.0.0] Addresses:[/dns4/vps-aaa00d52.vps.ovh.ca/tcp/8000/wss/p2p/16Uiu2HAmBu5zRFzBGAzzMAuGWhaxN2EwcbW7CzibELQELzisf192 /ip4/51.161.131.247/tcp/30304/p2p/16Uiu2HAmBu5zRFzBGAzzMAuGWhaxN2EwcbW7CzibELQELzisf192 /dns4/vps-aaa00d52.vps.ovh.ca/tcp/30304/p2p/16Uiu2HAmBu5zRFzBGAzzMAuGWhaxN2EwcbW7CzibELQELzisf192]} 16Uiu2HAmGAA54bBTE78MYidSy3P7Q9yAWFNTAEReJYD69VRvtL5r:{Protocols:[/vac/waku/lightpush/2.0.0-beta1 /vac/waku/filter-subscribe/2.0.0-beta1 /vac/waku/peer-exchange/2.0.0-alpha1 /libp2p/circuit/relay/0.2.0/hop /vac/waku/relay/2.0.0 /rendezvous/1.0.0 /ipfs/ping/1.0.0 /ipfs/id/1.0.0 /libp2p/autonat/1.0.0 /vac/waku/metadata/1.0.0] Addresses:[/ip4/104.197.5.96/tcp/30303/p2p/16Uiu2HAmGAA54bBTE78MYidSy3P7Q9yAWFNTAEReJYD69VRvtL5r /dns4/boot-01.gc-us-central1-a.status.staging.status.im/tcp/30303/p2p/16Uiu2HAmGAA54bBTE78MYidSy3P7Q9yAWFNTAEReJYD69VRvtL5r 
/dns4/boot-01.gc-us-central1-a.status.staging.status.im/tcp/443/wss/p2p/16Uiu2HAmGAA54bBTE78MYidSy3P7Q9yAWFNTAEReJYD69VRvtL5r]} 16Uiu2HAmGwcE8v7gmJNEWFtZtojYpPMTHy2jBLL6xRk33qgDxFWX:{Protocols:[/vac/waku/filter-subscribe/2.0.0-beta1] Addresses:[/dns4/boot-01.ac-cn-hongkong-c.status.prod.statusim.net/tcp/30303/p2p/16Uiu2HAmGwcE8v7gmJNEWFtZtojYpPMTHy2jBLL6xRk33qgDxFWX /ip4/8.218.23.76/tcp/30303/p2p/16Uiu2HAmGwcE8v7gmJNEWFtZtojYpPMTHy2jBLL6xRk33qgDxFWX /dns4/boot-01.ac-cn-hongkong-c.status.prod.status.im/tcp/30303/p2p/16Uiu2HAmGwcE8v7gmJNEWFtZtojYpPMTHy2jBLL6xRk33qgDxFWX /dns4/boot-01.ac-cn-hongkong-c.status.prod.status.im/tcp/443/wss/p2p/16Uiu2HAmGwcE8v7gmJNEWFtZtojYpPMTHy2jBLL6xRk33qgDxFWX]} 16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM:{Protocols:[/vac/waku/store-query/3.0.0] Addresses:[/ip4/34.170.154.2/tcp/30303/p2p/16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM /dns4/store-02.gc-us-central1-a.status.prod.status.im/tcp/30303/p2p/16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM /dns4/store-02.gc-us-central1-a.status.prod.status.im/tcp/443/wss/p2p/16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM /ip4/34.170.154.2/tcp/443/tls/sni/store-02.gc-us-central1-a.status.prod.status.im/ws/p2p/16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM]} 16Uiu2HAmKBd6crqQNZ6nKCSCpHCAwUPn3DUDmkcPSWUTyVXpxKsW:{Protocols:[/libp2p/circuit/relay/0.2.0/hop /vac/waku/metadata/1.0.0 /ipfs/ping/1.0.0 /vac/waku/store/2.0.0-beta4 /vac/waku/store-query/3.0.0 /ipfs/id/1.0.0 /libp2p/autonat/1.0.0 /vac/waku/relay/2.0.0 /rendezvous/1.0.0] Addresses:[/ip4/34.72.140.183/tcp/30303/p2p/16Uiu2HAmKBd6crqQNZ6nKCSCpHCAwUPn3DUDmkcPSWUTyVXpxKsW /dns4/store-02.gc-us-central1-a.status.staging.status.im/tcp/30303/p2p/16Uiu2HAmKBd6crqQNZ6nKCSCpHCAwUPn3DUDmkcPSWUTyVXpxKsW /dns4/store-02.gc-us-central1-a.status.staging.status.im/tcp/443/wss/p2p/16Uiu2HAmKBd6crqQNZ6nKCSCpHCAwUPn3DUDmkcPSWUTyVXpxKsW]} 16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ:{Protocols:[/ipfs/id/1.0.0 /libp2p/autonat/1.0.0 /libp2p/circuit/relay/0.2.0/hop /ipfs/ping/1.0.0 /vac/waku/store/2.0.0-beta4 /vac/waku/store-query/3.0.0 /vac/waku/metadata/1.0.0 /vac/waku/relay/2.0.0 /rendezvous/1.0.0] Addresses:[/ip4/34.170.192.39/tcp/30303/p2p/16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ /dns4/store-01.gc-us-central1-a.status.prod.status.im/tcp/30303/p2p/16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ /dns4/store-01.gc-us-central1-a.status.prod.status.im/tcp/443/wss/p2p/16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ]} 16Uiu2HAmMU7Y29oL6DmoJfBFv8J4JhYzYgazPL7nGKJFBV3qcj2E:{Protocols:[/ipfs/id/1.0.0 /libp2p/autonat/1.0.0 /vac/waku/metadata/1.0.0 /rendezvous/1.0.0 /vac/waku/store/2.0.0-beta4 /vac/waku/store-query/3.0.0 /libp2p/circuit/relay/0.2.0/hop /vac/waku/relay/2.0.0 /ipfs/ping/1.0.0] Addresses:[/ip4/8.218.206.134/tcp/30303/p2p/16Uiu2HAmMU7Y29oL6DmoJfBFv8J4JhYzYgazPL7nGKJFBV3qcj2E /dns4/store-01.ac-cn-hongkong-c.status.staging.status.im/tcp/30303/p2p/16Uiu2HAmMU7Y29oL6DmoJfBFv8J4JhYzYgazPL7nGKJFBV3qcj2E /dns4/store-01.ac-cn-hongkong-c.status.staging.status.im/tcp/443/wss/p2p/16Uiu2HAmMU7Y29oL6DmoJfBFv8J4JhYzYgazPL7nGKJFBV3qcj2E]} 16Uiu2HAmNTpGnyZ8W1BK2sXEmgSCNWiyDKgRU3NBR2DXST2HzxRU:{Protocols:[/ipfs/id/1.0.0 /libp2p/autonat/1.0.0 /vac/waku/filter-subscribe/2.0.0-beta1 /vac/waku/peer-exchange/2.0.0-alpha1 /libp2p/circuit/relay/0.2.0/hop /vac/waku/metadata/1.0.0 /vac/waku/relay/2.0.0 /rendezvous/1.0.0 /ipfs/ping/1.0.0 /vac/waku/lightpush/2.0.0-beta1] 
Addresses:[/ip4/47.76.168.186/tcp/30303/p2p/16Uiu2HAmNTpGnyZ8W1BK2sXEmgSCNWiyDKgRU3NBR2DXST2HzxRU /dns4/boot-01.ac-cn-hongkong-c.status.staging.status.im/tcp/30303/p2p/16Uiu2HAmNTpGnyZ8W1BK2sXEmgSCNWiyDKgRU3NBR2DXST2HzxRU /dns4/boot-01.ac-cn-hongkong-c.status.staging.status.im/tcp/443/wss/p2p/16Uiu2HAmNTpGnyZ8W1BK2sXEmgSCNWiyDKgRU3NBR2DXST2HzxRU]} 16Uiu2HAmQE7FXQc6iZHdBzYfw3qCSDa9dLc1wsBJKoP4aZvztq2d:{Protocols:[/vac/waku/filter-subscribe/2.0.0-beta1] Addresses:[/ip4/143.198.250.233/tcp/30303/p2p/16Uiu2HAmQE7FXQc6iZHdBzYfw3qCSDa9dLc1wsBJKoP4aZvztq2d /dns4/boot-01.do-ams3.status.staging.status.im/tcp/30303/p2p/16Uiu2HAmQE7FXQc6iZHdBzYfw3qCSDa9dLc1wsBJKoP4aZvztq2d /dns4/boot-01.do-ams3.status.staging.status.im/tcp/443/wss/p2p/16Uiu2HAmQE7FXQc6iZHdBzYfw3qCSDa9dLc1wsBJKoP4aZvztq2d]}]}" DEBUG[10-28|20:41:59.533|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] freeing FD token; waiting: 0; consuming: 1 DEBUG[10-28|20:41:59.534|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] negotiated: /ipfs/id/1.0.0 (took 61.417µs) DEBUG[10-28|20:41:59.536|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:251] sending snapshot seq=25 protocols="[/ipfs/id/1.0.0 /ipfs/id/push/1.0.0 /ipfs/ping/1.0.0 /libp2p/autonat/1.0.0 /libp2p/circuit/relay/0.2.0/stop /libp2p/dcutr /vac/waku/filter-push/2.0.0-beta1 /vac/waku/metadata/1.0.0]" addrs="[/ip4/68.108.152.224/tcp/61206 /ip4/127.0.0.1/tcp/61206 /ip4/192.168.1.86/tcp/61206]" DEBUG[10-28|20:41:59.536|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] /ipfs/id/1.0.0 sending message to 16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM /ip4/34.170.154.2/tcp/30303 DEBUG[10-28|20:41:59.535|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:114] filter stats agg filters count=3 filter subs count=6 DEBUG[10-28|20:41:59.536|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [limiter] freeing peer token; peer 16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM; addr: /ip4/34.170.154.2/tcp/443/tls/sni/store-02.gc-us-central1-a.status.prod.status.im/ws; active for peer: 1; waiting on peer limit: 0 DEBUG[10-28|20:41:59.539|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:197] inside on connection status change new-status=true agg filters count=3 filter subs count=6 DEBUG[10-28|20:41:59.603|github.com/status-im/status-go/vendor/github.com/status-im/mvds/node/node.go:221] Epoch processing namespace=mvds node=04cabd81 epoch=15959 DEBUG[10-28|20:41:59.624|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] /ipfs/id/1.0.0 received message from 16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM /ip4/34.170.154.2/tcp/30303 DEBUG[10-28|20:41:59.624|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] 16Uiu2HAm8goi4QpW2k4G99zMvmc7gLSpWc2nE8nCBJjN9z1RSPDJ received listen addrs for 16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM: [/dns4/store-02.gc-us-central1-a.status.prod.status.im/tcp/30303 /dns4/store-02.gc-us-central1-a.status.prod.status.im/tcp/443/wss] DEBUG[10-28|20:41:59.624|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:251] added own observed listen addr observed=/ip4/68.108.152.224/tcp/61206 DEBUG[10-28|20:41:59.624|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] host 16Uiu2HAm8goi4QpW2k4G99zMvmc7gLSpWc2nE8nCBJjN9z1RSPDJ finished dialing 16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM 
DEBUG[10-28|20:41:59.624|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [16Uiu2HAm8goi4QpW2k4G99zMvmc7gLSpWc2nE8nCBJjN9z1RSPDJ] opening stream to peer [16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM] DEBUG[10-28|20:41:59.624|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [16Uiu2HAm8goi4QpW2k4G99zMvmc7gLSpWc2nE8nCBJjN9z1RSPDJ] opening stream to peer [16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM] DEBUG[10-28|20:41:59.624|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/metadata/waku_metadata.go:128] sending metadata request peer=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM DEBUG[10-28|20:41:59.625|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/metadata/waku_metadata.go:138] sent metadata request peer=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM DEBUG[10-28|20:41:59.761|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/metadata/waku_metadata.go:151] received metadata response peer=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM DEBUG[10-28|20:41:59.762|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/metadata/waku_metadata.go:250] relay peer checking cluster and shards DEBUG[10-28|20:41:59.762|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/metadata/waku_metadata.go:266] getting remote cluster and shards -
Although I do see this error occur (could be unrelated though)
WARN [10-28|20:41:59.761|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_manager.go:758] connecting to peer peerID=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM error="failed to negotiate protocol: protocols not supported: [/vac/waku/peer-exchange/2.0.0-alpha1]" INFO [10-28|20:41:59.762|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_manager.go:767] removing bad peer due to recurring dial failures peerID=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM ERROR[10-28|20:41:59.762|github.com/status-im/status-go/wakuv2/waku.go:625] couldnt request peers via peer exchange error="failed to negotiate protocol: protocols not supported: [/vac/waku/peer-exchange/2.0.0-alpha1]" -
And we do see that the missing-messages service attempts to retrieve messages for the topic
DEBUG[10-28|20:47:24.070|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving history to check for missing messages peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics=[/waku/1/0x0a0df437/rfc26] pubsubTopic=/waku/2/rs/16/32 from=1730148384236730000 to=1730148444070854000 attempt=1
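For reference, the from/to values in this query (like the envelope timestamps elsewhere in the logs) are Unix timestamps in nanoseconds, so they can be converted with the standard library to compare against the wall-clock log times; the window here works out to roughly one minute:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// from/to values copied from the missing-messages query above.
	from := int64(1730148384236730000)
	to := int64(1730148444070854000)

	fmt.Println("from:", time.Unix(0, from).UTC())
	fmt.Println("to:  ", time.Unix(0, to).UTC())
	fmt.Println("window:", time.Duration(to-from)) // roughly one minute
}
```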
ios-device-filter-bad-peers.zip ios-simulator-filter-bad-peers.log.zip
- I noticed that this time I don't receive an invalid-subscription error for the missing message; it simply went undelivered (for roughly 20 minutes) until I put the app in the background and then back into the foreground (which triggers a message history retrieval).
That is odd, because the periodic store query should have been able to fetch the missing message. Can you point me to the messageID of the message so that I can trace it in the logs?
- And it seems that the receiver device does attempt to re-establish connections with peers and receive messages, but for some reason the attempts keep failing in a loop after reconnecting. Perhaps this is because multiple attempts are queued while disconnected and are not cancelled when the connection state changes (?)
Once the network state changes to offline, filter subscriptions are not attempted, which means the reconnect must be getting triggered from somewhere else. It is odd that you are noticing this. I will take a look at the logs and try to figure it out.
- Eventually the peers do seem to re-connect, but for some reason neither the peers nor the missing-message retrieval delivered the missing message.
This could just be a case of the peers that the node is trying to connect to being down at that time, or not reachable because they are behind a strict NAT.
We can see that the new code is running, because it removes a bad peer
This is odd, as this is a fleet peer. I am noticing more such logs even for bootnodes; they should not be removed, as that might affect other functionality. I will take a more detailed look at the logs to see what is happening and try to update the code to handle such cases.
And here we can see the subscription for the topic has failed
This is expected, as not all discovered peers will be online or reachable (they may be behind a NAT), which causes dial failures. As long as subscriptions recover, that is good enough.
Although I do see this error occur (could be unrelated though)
We can ignore this error, as this is a fleet storenode which doesn't have peer-exchange configured. Since Waku doesn't have a mechanism for peers to indicate whether this protocol is enabled, it only gets detected after connecting to the peer.
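Since a peer's advertised protocols are known once it has been connected and identified, one way to avoid the noisy error would be to guard the peer-exchange request with a protocol check. A minimal sketch, using a made-up ProtocolLister interface rather than the real peerstore API (whose exact signature varies across go-libp2p versions):

```go
package main

import "fmt"

const peerExchangeID = "/vac/waku/peer-exchange/2.0.0-alpha1"

// ProtocolLister is a stand-in for the peerstore's protocol lookup; treat it
// as an assumption, not the real go-libp2p interface.
type ProtocolLister interface {
	GetProtocols(peerID string) ([]string, error)
}

// supportsPeerExchange reports whether a connected (and identified) peer
// advertises the peer-exchange protocol, so the request can be skipped otherwise.
func supportsPeerExchange(ps ProtocolLister, peerID string) (bool, error) {
	protos, err := ps.GetProtocols(peerID)
	if err != nil {
		return false, err
	}
	for _, p := range protos {
		if p == peerExchangeID {
			return true, nil
		}
	}
	return false, nil
}

// staticPeerstore is a toy implementation for the example below.
type staticPeerstore map[string][]string

func (s staticPeerstore) GetProtocols(peerID string) ([]string, error) {
	return s[peerID], nil
}

func main() {
	ps := staticPeerstore{
		// The storenode from the error above only advertises store protocols.
		"16Uiu2HAmJnVR...": {"/vac/waku/store-query/3.0.0"},
	}
	ok, _ := supportsPeerExchange(ps, "16Uiu2HAmJnVR...")
	fmt.Println("supports peer exchange:", ok)
}
```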
@seanstrom I have identified some side effects of the bad-peer removal changes from your logs. I have updated the way it works to be more precise, so it most probably should not have any side effects now. https://github.com/status-im/status-mobile/pull/21458
Can I request you to rerun the test and share results/logs?
Thanks a lot for your patience and help with testing this so far!
@chaitanyaprem Yup yup, I'll re-run my tests and share soon
Can you point me to the messageID of the message so that I can trace it in the logs?
In case you still need this, the missing message ID for the previous logs is: 0xf2f6bcebfef8d9c0bac510e14320d608d88541457663acb41a946524782e1099
Okay I've re-run my tests with the updated changes in your status-go branch
Overview
- This time I was able to reproduce the issue by disconnecting the network, waiting for a few seconds, and then reconnecting. However, it only happened after multiple attempts.
- One thing I noticed is that messages were still being sent and received after disconnecting and reconnecting multiple times, but they were delayed by a minute or so, so it seemed that some messages were being retrieved by the missing-messages-verification service.
- It does seem that the chat messages were received instantly at the beginning of the tests, but eventually they became delayed, and finally one message was not received at all.
Details
- Content Topic: contentTopic=/waku/1/0x6cb9a69b/rfc26
- Missing Message ID: 0x936be016203399f910210ac84aed4e2f99c0b2199c640df25954670672368d34
- Missing Message Envelope Hash: 0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee
Sender Logs
We can see the iOS simulator device (sender) sends the second to last message:
DEBUG[10-29|21:52:06.416|github.com/status-im/status-go/protocol/common/message_sender.go:805] sent-message: public message recipient=[] messageID=0x27511e1ca9633a375599150847dc279630aa59fe78c9ed2a7ca4fbc4a8e259d4 contentType=CHAT_MESSAGE messageType=public hashes=[0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da] INFO [10-29|21:52:06.416|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/publish/message_sender.go:115] publishing message via lightpush envelopeHash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x6cb9a69b/rfc26 timestamp=1,730,238,726,415,937,000 DEBUG[10-29|21:52:06.416|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:731] handling message sent DEBUG[10-29|21:52:06.416|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:890] handling public messages messageID="[39 81 30 28 169 99 58 55 85 153 21 8 71 220 39 150 48 170 89 254 120 201 237 42 124 164 251 196 168 226 89 212]" DEBUG[10-29|21:52:06.416|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:194] Select Peers selectionCriteria="{\"selectionType\":0,\"protocolId\":\"/vac/waku/lightpush/2.0.0-beta1\",\"pubsubTopics\":[\"/waku/2/rs/16/32\"],\"specificPeers\":null,\"maxPeerCount\":2,\"excludePeers\":null}" excludedPeers= DEBUG[10-29|21:52:06.416|github.com/status-im/status-go/wakuv2/common/filter.go:256] processing message: decrypted envelopeHash=904f39..8506da DEBUG[10-29|21:52:06.416|github.com/status-im/status-go/wakuv2/waku.go:1596] filters did match envelopeHash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x6cb9a69b/rfc26 timestamp=1,730,238,726,415,937,000 DEBUG[10-29|21:52:06.416|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:148] discovering peers by pubsubTopic pubsubTopics=[/waku/2/rs/16/32] DEBUG[10-29|21:52:06.417|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:159] could not retrieve random peer from slot DEBUG[10-29|21:52:06.417|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:900] messageID tracked messageID="[39 81 30 28 169 99 58 55 85 153 21 8 71 220 39 150 48 170 89 254 120 201 237 42 124 164 251 196 168 226 89 212]" DEBUG[10-29|21:52:06.417|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/lightpush/waku_lightpush.go:334] publishing message hash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x6cb9a69b/rfc26 timestamp=1,730,238,726,415,937,000 peers="[16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ 16Uiu2HAmGAA54bBTE78MYidSy3P7Q9yAWFNTAEReJYD69VRvtL5r]" -
We can see the iOS simulator device (sender) sends a message (missing message)
DEBUG[10-29|21:53:01.723|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:114] filter stats agg filters count=9 filter subs count=18 INFO [10-29|21:53:01.745|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:217] received message push peerID=16Uiu2HAmNTpGnyZ8W1BK2sXEmgSCNWiyDKgRU3NBR2DXST2HzxRU hash=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x6cb9a69b/rfc26 timestamp=1,730,238,781,340,769,000 DEBUG[10-29|21:53:01.746|github.com/status-im/status-go/wakuv2/waku.go:1487] received new envelope messageType=relay envelopeHash=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x6cb9a69b/rfc26 timestamp=1730238781340769000 DEBUG[10-29|21:53:01.746|github.com/status-im/status-go/wakuv2/waku.go:1528] w envelope already cached envelopeHash=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee DEBUG[10-29|21:53:01.975|github.com/status-im/status-go/protocol/messenger_mailserver_cycle.go:492] wakuV2 storenode status verification DEBUG[10-29|21:53:01.978|github.com/status-im/status-go/protocol/transport/transport.go:264] message not cached namespace=Transport site=retrieveRawAll hash=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee DEBUG[10-29|21:53:01.981|github.com/status-im/status-go/protocol/messenger.go:3604] storing waku message site=RetrieveAll hash=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee DEBUG[10-29|21:53:01.983|github.com/status-im/status-go/protocol/common/message_segmentation.go:237] handling message segment site=handleSegmentationLayerV2 hash=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee EntireMessageHash=0x Index=0 SegmentsCount=0 ParitySegmentIndex=0 ParitySegmentsCount=0 DEBUG[10-29|21:53:01.983|github.com/status-im/status-go/protocol/common/message_sender.go:969] failed to handle segmentation layer message site=handleMessage hash=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee error="invalid segments count" DEBUG[10-29|21:53:01.984|github.com/status-im/status-go/protocol/encryption/protocol.go:756] received a protocol message namespace=Protocol site=HandleMessage sender-public-key=0x0460a47d0bde2d675dda6759d8df21ba3dc14217f1ab520bd69811e658e4a473ca93e73d4737ad172069e0d8aa513cd55f7be270f81742e3a3d02f2bb77c4c5536 my-installation-id=41bb8c7a-e8f2-4b13-a225-e7ba4b500810 messageID=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee DEBUG[10-29|21:53:01.984|github.com/status-im/status-go/protocol/common/message_sender.go:994] failed to handle datasync message site=handleMessage hash=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee error="handling non-datasync message" DEBUG[10-29|21:53:01.985|github.com/status-im/status-go/protocol/v1/status_message.go:170] calculated ID for envelope envelopeHash=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee messageId=0x936be016203399f910210ac84aed4e2f99c0b2199c640df25954670672368d34 DEBUG[10-29|21:53:01.985|github.com/status-im/status-go/protocol/messenger.go:3639] processing messages further site=RetrieveAll hash=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee count=1 INFO [10-29|21:53:01.985|github.com/status-im/status-go/protocol/messenger.go:3655] processing message site=RetrieveAll 
hash=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee message-id=0x936be016203399f910210ac84aed4e2f99c0b2199c640df25954670672368d34 type=CHAT_MESSAGE senderID=0x0460a47d0bde2d675dda6759d8df21ba3dc14217f1ab520bd69811e658e4a473ca93e73d4737ad172069e0d8aa513cd55f7be270f81742e3a3d02f2bb77c4c5536 WARN [10-29|21:53:01.985|github.com/status-im/status-go/protocol/messenger_maps.go:65] contacts map: loading own identity contactID=0x0460a47d0bde2d675dda6759d8df21ba3dc14217f1ab520bd69811e658e4a473ca93e73d4737ad172069e0d8aa513cd55f7be270f81742e3a3d02f2bb77c4c5536 DEBUG[10-29|21:53:01.986|github.com/status-im/status-go/protocol/messenger.go:3678] skipping duplicate site=RetrieveAll hash=0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee message-id=0x936be016203399f910210ac84aed4e2f99c0b2199c640df25954670672368d34 messageID=0x936be016203399f910210ac84aed4e2f99c0b2199c640df25954670672368d34 DEBUG[10-29|21:53:01.986|github.com/status-im/status-go/protocol/transport/transport.go:551] confirming message processed namespace=Transport ids=[0x69136708ab10b26dcebd78464e90cfb38c77ee98d9f0993366ba4c0daf4666ee] timestamp=1,730,238,781,986
Receiver Logs
We can see the iOS physical device (receiver) receives the second to last message, but it seems to be received by the missing-messages-verification service.
DEBUG[10-29|21:52:55.300|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving next page peerID=16Uiu2HAm9aDJPkhGxc2SFcEACTFdZ91Q5TJjp76qZEhq9iF59x7R contentTopics="[/waku/1/0x96e4e168/rfc26 /waku/1/0x361f77a6/rfc26 /waku/1/0x717fa142/rfc26 /waku/1/0x394889b3/rfc26 /waku/1/0xfc7abb97/rfc26 /waku/1/0xe97b921c/rfc26 /waku/1/0x6cb9a69b/rfc26 /waku/1/0xb22caef5/rfc26 /waku/1/0xcc441b95/rfc26 /waku/1/0x8ad6a8b7/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1730238654761600000 to=1730238774565838000 cursor= attempt=1 DEBUG[10-29|21:52:55.300|github.com/status-im/status-go/wakuv2/waku.go:1487] received new envelope messageType=missing envelopeHash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x6cb9a69b/rfc26 timestamp=1730238726415937000 DEBUG[10-29|21:52:55.300|github.com/status-im/status-go/wakuv2/waku.go:1531] cached w envelope envelopeHash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da DEBUG[10-29|21:52:55.301|github.com/status-im/status-go/wakuv2/waku.go:1541] posting event envelopeHash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da DEBUG[10-29|21:52:55.301|github.com/status-im/status-go/wakuv2/common/filter.go:256] processing message: decrypted envelopeHash=904f39..8506da DEBUG[10-29|21:52:55.302|github.com/status-im/status-go/wakuv2/waku.go:1596] filters did match envelopeHash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x6cb9a69b/rfc26 timestamp=1,730,238,726,415,937,000 DEBUG[10-29|21:52:55.593|github.com/status-im/status-go/vendor/github.com/status-im/mvds/node/node.go:221] Epoch processing namespace=mvds node=047b8919 epoch=1746 DEBUG[10-29|21:52:55.898|github.com/status-im/status-go/vendor/github.com/status-im/mvds/node/node.go:221] Epoch processing namespace=mvds node=047b8919 epoch=1747 DEBUG[10-29|21:52:56.203|github.com/status-im/status-go/vendor/github.com/status-im/mvds/node/node.go:221] Epoch processing namespace=mvds node=047b8919 epoch=1748 DEBUG[10-29|21:52:56.278|github.com/status-im/status-go/protocol/messenger_mailserver_cycle.go:492] wakuV2 storenode status verification DEBUG[10-29|21:52:56.281|github.com/status-im/status-go/protocol/transport/transport.go:264] message not cached namespace=Transport site=retrieveRawAll hash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da DEBUG[10-29|21:52:56.286|github.com/status-im/status-go/protocol/common/message_segmentation.go:237] handling message segment site=handleSegmentationLayerV2 hash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da EntireMessageHash=0x Index=0 SegmentsCount=0 ParitySegmentIndex=0 ParitySegmentsCount=0 DEBUG[10-29|21:52:56.287|github.com/status-im/status-go/protocol/common/message_sender.go:969] failed to handle segmentation layer message site=handleMessage hash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da error="invalid segments count" DEBUG[10-29|21:52:56.287|github.com/status-im/status-go/protocol/encryption/protocol.go:756] received a protocol message namespace=Protocol site=HandleMessage sender-public-key=0x0460a47d0bde2d675dda6759d8df21ba3dc14217f1ab520bd69811e658e4a473ca93e73d4737ad172069e0d8aa513cd55f7be270f81742e3a3d02f2bb77c4c5536 my-installation-id=b4280369-eedd-4681-9ab1-8de9a4b36f17 messageID=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da 
DEBUG[10-29|21:52:56.294|github.com/status-im/status-go/protocol/common/message_sender.go:994] failed to handle datasync message site=handleMessage hash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da error="handling non-datasync message" DEBUG[10-29|21:52:56.295|github.com/status-im/status-go/protocol/v1/status_message.go:170] calculated ID for envelope envelopeHash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da messageId=0x27511e1ca9633a375599150847dc279630aa59fe78c9ed2a7ca4fbc4a8e259d4 DEBUG[10-29|21:52:56.295|github.com/status-im/status-go/protocol/messenger.go:3639] processing messages further site=RetrieveAll hash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da count=1 INFO [10-29|21:52:56.295|github.com/status-im/status-go/protocol/messenger.go:3655] processing message site=RetrieveAll hash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da message-id=0x27511e1ca9633a375599150847dc279630aa59fe78c9ed2a7ca4fbc4a8e259d4 type=CHAT_MESSAGE senderID=0x0460a47d0bde2d675dda6759d8df21ba3dc14217f1ab520bd69811e658e4a473ca93e73d4737ad172069e0d8aa513cd55f7be270f81742e3a3d02f2bb77c4c5536 INFO [10-29|21:52:56.298|github.com/status-im/status-go/protocol/messenger_handlers.go:275] handling ChatMessage DEBUG[10-29|21:52:56.311|github.com/status-im/status-go/protocol/messenger.go:3713] Handled parsed message site=RetrieveAll hash=0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da message-id=0x27511e1ca9633a375599150847dc279630aa59fe78c9ed2a7ca4fbc4a8e259d4 DEBUG[10-29|21:52:56.311|github.com/status-im/status-go/protocol/transport/transport.go:551] confirming message processed namespace=Transport ids=[0x904f3960b67f0fd417d73c844a5a15ecb9ea7155474e98c44c782a0e198506da] timestamp=1,730,238,776,311 -
We can see some logs about not being able to reach a peer and the NAT status being unknown. These logs recur and can be seen again roughly ten minutes later.
DEBUG[10-29|21:54:10.098|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] Dialback through peer 16Uiu2HAm3xVDaz6SRJ6kErwC21zBJEZjavVXg7VSkoWzaV1aMA3F completed: err: i/o deadline reached DEBUG[10-29|21:54:10.098|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] NAT status is unknownDEBUG[10-29|22:03:10.109|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] Dialback through peer 16Uiu2HAmCDSnT8oNpMR9HH6uipD71KstYuDCAQGpek9XDAVmqdEr completed: err: i/o deadline reached DEBUG[10-29|22:03:10.109|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] NAT status is unknown
receiver-logs-filter-bad-peers-v2-extended.zip sender-geth-filter-bad-peers-v2-extended.log.zip
Hey @chaitanyaprem 👋 Is there anything else I can do to help with this issue? 🙏
Hey @seanstrom, I was working on something else and lost track of this.
But it looks like I have found a potential issue in the approach the filter uses to handle the offline-to-online state switch, which I have now fixed and updated in the same PR. Can you please rerun your tests and share the results? It might help debug further what is going on. I have also removed the option of using fleet nodes as preferred peers, so I would like to see the behaviour.
- This time I was able to reproduce the issue by disconnecting the network, waiting for a few seconds, and then reconnecting. However, it only happened after multiple attempts.
This could just be because a specific condition in the code is getting hit, which may not happen every time.
One thing I noticed is that messages were still being sent and received after disconnecting and reconnecting multiple times, but they were delayed by a minute or so, so it seemed that some messages were being retrieved by the missing-messages-verification service.
Yep, a delay of 1 minute means they are retrieved by the missing-message service, which is a good sign at least.
It does seem that the chat messages were received instantly at the beginning of the tests, but eventually they became delayed, and finally one message was not received at all.
That doesn't seem right; it could mean that the missing-message service itself could not retrieve it (this needs analysis). I will analyze this and get back to you.
- We can see some logs about not being able to reach a peer and the NAT status being unknown. These logs recur and can be seen again roughly ten minutes later.
These can be ignored, as this is libp2p trying to determine the NAT status, which is not significant for a mobile device. It is important for a desktop node acting as a relay, though.
- We can see the iOS physical device (receiver) receives the second to last message, but it seems to be received by the missing-messages-verification service.
After going through the logs on the receiver, it looks like the filter goroutines that handle connection status changes and stats are blocked on something, because I cannot see any logs from them after 10-29|21:46:10.413.
And the last message was sent by the sender at around the 10-29|21:53:01.691 timestamp, which is why it was not received by the receiver.
One thing I still don't get is why the missing-messages service could not retrieve this message, as I can see that it is trying to fetch from store nodes. The store node that the receiver uses to retrieve messages has also archived this message, as per the attached screenshot.
It looks like the missing_messages goroutine also did not fire around 10-29|21:53:54; the last time it ran is the log below. I'm wondering if goroutines are getting paused on the receiver. I'm not sure what the reason could be, but almost 3 of the routines seem to be paused for some reason.
DEBUG[10-29|21:52:55.300|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving next page peerID=16Uiu2HAm9aDJPkhGxc2SFcEACTFdZ91Q5TJjp76qZEhq9iF59x7R contentTopics="[/waku/1/0x96e4e168/rfc26 /waku/1/0x361f77a6/rfc26 /waku/1/0x717fa142/rfc26 /waku/1/0x394889b3/rfc26 /waku/1/0xfc7abb97/rfc26 /waku/1/0xe97b921c/rfc26 /waku/1/0x6cb9a69b/rfc26 /waku/1/0xb22caef5/rfc26 /waku/1/0xcc441b95/rfc26 /waku/1/0x8ad6a8b7/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1730238654761600000 to=1730238774565838000 cursor= attempt=1
But then again, I could see that the peer-exchange loop was still running. Will do more investigation and update.
DEBUG[10-29|21:52:59.565|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:194] Select Peers selectionCriteria="{\"selectionType\":0,\"protocolId\":\"/vac/waku/peer-exchange/2.0.0-alpha1\",\"pubsubTopics\":[\"/waku/2/rs/16/32\"],\"specificPeers\":[\"16Uiu2HAmAR24Mbb6VuzoyUiGx42UenDkshENVDj4qnmmbabLvo31\",\"16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ\",\"16Uiu2HAmGwcE8v7gmJNEWFtZtojYpPMTHy2jBLL6xRk33qgDxFWX\"],\"maxPeerCount\":1,\"excludePeers\":null}" excludedPeers=
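If specific goroutines really are stalling while others (like the peer-exchange loop) keep running, a heartbeat-style watchdog around each periodic loop would make the stall visible in the logs. A minimal sketch, assuming a generic ticker loop rather than the actual status-go routines:

```go
package main

import (
	"log"
	"sync/atomic"
	"time"
)

func main() {
	var lastRun atomic.Int64
	lastRun.Store(time.Now().UnixNano())

	// Stand-in for a periodic loop such as the missing-messages check.
	go func() {
		ticker := time.NewTicker(1 * time.Second)
		defer ticker.Stop()
		for range ticker.C {
			lastRun.Store(time.Now().UnixNano())
			log.Println("periodic check ran")
		}
	}()

	// Watchdog in its own goroutine context: if the periodic loop stops
	// reporting in, log it loudly so the stall shows up in the logs.
	for i := 0; i < 5; i++ {
		time.Sleep(3 * time.Second)
		gap := time.Since(time.Unix(0, lastRun.Load()))
		if gap > 2*time.Second {
			log.Printf("WARN: periodic check has not run for %s", gap)
		}
	}
}
```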
Hey @chaitanyaprem 👋
I've re-run the tests with the latest changes, and I think I'm still running into the issue, but it seems more difficult to reproduce.
- In this case, I ended up backgrounding the apps for the receiver and sender device for a while.
- The results were that I could not immediately receive messages when both devices began running the apps in the foreground.
- However, messages were retrieved by the "missing messages verification", and eventually it seemed that I was able to receive messages immediately after the devices were left running for a few minutes.
- The only problem that seems to persist is that one message remained missing.
Log Identifiers
- contentTopic: contentTopic=/waku/1/0x35f53ee3/rfc26
- missing message ID: 0x0faa0b5372bf27672719fba258aa66636be3bdcb825f2d913e44ab235fba382a
Log Details
- For example, I sent a message on the sender device (iOS Simulator):
DEBUG[11-07|20:50:27.363|github.com/status-im/status-go/protocol/messenger.go:5638] sending community chat message site=Send chatID=0x0232936b0d051b28c55dbbcf6c348ffde8511c3139f5348dc7019ac03060ba4a673bab53ac-c5cf-4e3a-9a6b-1fa4cf6e3699 chatName=general
DEBUG[11-07|20:50:27.365|github.com/status-im/status-go/protocol/messenger.go:2272] sending community chat message site=dispatchMessage chatID=0x0232936b0d051b28c55dbbcf6c348ffde8511c3139f5348dc7019ac03060ba4a673bab53ac-c5cf-4e3a-9a6b-1fa4cf6e3699 chatName=general
DEBUG[11-07|20:50:27.369|github.com/status-im/status-go/protocol/common/message_segmentation.go:47] message segmented segments=1
DEBUG[11-07|20:50:27.369|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:726] handling message scheduled
DEBUG[11-07|20:50:27.372|github.com/status-im/status-go/protocol/common/message_sender.go:805] sent-message: public message recipient=[] messageID=0x7f1b7e0452e8493b952c3be8f14fdce89e4be80c3169746030cc24f262ceb9e8 contentType=CHAT_MESSAGE messageType=public hashes=[0xeb744d47cd325dae41b25e7320a76a8a27bd1f3e1acd31f813af17c097dc8f11]
DEBUG[11-07|20:50:27.372|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:731] handling message sent
DEBUG[11-07|20:50:27.372|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:890] handling public messages messageID="[127 27 126 4 82 232 73 59 149 44 59 232 241 79 220 232 158 75 232 12 49 105 116 96 48 204 36 242 98 206 185 232]"
INFO [11-07|20:50:27.372|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/publish/message_sender.go:115] publishing message via lightpush envelopeHash=0xeb744d47cd325dae41b25e7320a76a8a27bd1f3e1acd31f813af17c097dc8f11 pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x35f53ee3/rfc26 timestamp=1,731,012,627,372,180,000
- This message was delayed by roughly 1 minute, and I assume the "missing message verification" retrieved this message. Here are the logs for the receiver device (iOS physical):
ERROR[11-07|20:50:43.428|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/filter/client.go:447] Failed to subscribe pubSubTopic=/waku/2/rs/16/32 contentTopics=[/waku/1/0xa4383181/rfc26] error="failed to dial: failed to dial 16Uiu2HAmMReKQMXbfrwg4bTTx4PkTt3i37eHfPgWVqDWYDco9k77: all dials failed\n * [/ip4/45.30.95.214/tcp/57600] dial tcp4 45.30.95.214:57600: connect: connection refused\n * [/ip4/24.144.78.120/tcp/30303/p2p/16Uiu2HAmCDSnT8oNpMR9HH6uipD71KstYuDCAQGpek9XDAVmqdEr/p2p-circuit] error opening relay circuit: NO_RESERVATION (204)\n * [/ip4/159.223.242.94/tcp/30303/p2p/16Uiu2HAmAUdrQ3uwzuE4Gy4D56hX6uLKEeerJAnhKEHZ3DxF1EfT/p2p-circuit] error opening relay circuit: NO_RESERVATION (204)"
DEBUG[11-07|20:50:43.524|github.com/status-im/status-go/protocol/messenger_mailserver_cycle.go:492] wakuV2 storenode status verification
DEBUG[11-07|20:50:43.527|github.com/status-im/status-go/protocol/transport/transport.go:264] message not cached namespace=Transport site=retrieveRawAll hash=0xeb744d47cd325dae41b25e7320a76a8a27bd1f3e1acd31f813af17c097dc8f11
DEBUG[11-07|20:50:43.529|github.com/status-im/status-go/protocol/common/message_segmentation.go:237] handling message segment site=handleSegmentationLayerV2 hash=0xeb744d47cd325dae41b25e7320a76a8a27bd1f3e1acd31f813af17c097dc8f11 EntireMessageHash=0x Index=0 SegmentsCount=0 ParitySegmentIndex=0 ParitySegmentsCount=0
DEBUG[11-07|20:50:43.529|github.com/status-im/status-go/protocol/common/message_sender.go:969] failed to handle segmentation layer message site=handleMessage hash=0xeb744d47cd325dae41b25e7320a76a8a27bd1f3e1acd31f813af17c097dc8f11 error="invalid segments count"
DEBUG[11-07|20:50:43.529|github.com/status-im/status-go/protocol/encryption/protocol.go:756] received a protocol message namespace=Protocol site=HandleMessage sender-public-key=0x04035e0735303bb9a08ecb09145ece682eee59863e04bf6b1634a3ef5b66e3c6e728bd09d94763b7036e74e2af01e2e94b9a4dff30e3cca0677ef8c57acd6dcae2 my-installation-id=e8b026da-1bf0-47c5-9be9-970f940fdae7 messageID=0xeb744d47cd325dae41b25e7320a76a8a27bd1f3e1acd31f813af17c097dc8f11
DEBUG[11-07|20:50:43.539|github.com/status-im/status-go/protocol/common/message_sender.go:994] failed to handle datasync message site=handleMessage hash=0xeb744d47cd325dae41b25e7320a76a8a27bd1f3e1acd31f813af17c097dc8f11 error="handling non-datasync message"
DEBUG[11-07|20:50:43.540|github.com/status-im/status-go/protocol/v1/status_message.go:170] calculated ID for envelope envelopeHash=0xeb744d47cd325dae41b25e7320a76a8a27bd1f3e1acd31f813af17c097dc8f11 messageId=0x7f1b7e0452e8493b952c3be8f14fdce89e4be80c3169746030cc24f262ceb9e8
DEBUG[11-07|20:50:43.540|github.com/status-im/status-go/protocol/messenger.go:3639] processing messages further site=RetrieveAll hash=0xeb744d47cd325dae41b25e7320a76a8a27bd1f3e1acd31f813af17c097dc8f11 count=1
INFO [11-07|20:50:43.540|github.com/status-im/status-go/protocol/messenger.go:3655] processing message site=RetrieveAll hash=0xeb744d47cd325dae41b25e7320a76a8a27bd1f3e1acd31f813af17c097dc8f11 message-id=0x7f1b7e0452e8493b952c3be8f14fdce89e4be80c3169746030cc24f262ceb9e8 type=CHAT_MESSAGE senderID=0x04035e0735303bb9a08ecb09145ece682eee59863e04bf6b1634a3ef5b66e3c6e728bd09d94763b7036e74e2af01e2e94b9a4dff30e3cca0677ef8c57acd6dcae2
INFO [11-07|20:50:43.543|github.com/status-im/status-go/protocol/messenger_handlers.go:275] handling ChatMessage
DEBUG[11-07|20:50:43.559|github.com/status-im/status-go/protocol/messenger.go:3713] Handled parsed message site=RetrieveAll hash=0xeb744d47cd325dae41b25e7320a76a8a27bd1f3e1acd31f813af17c097dc8f11 message-id=0x7f1b7e0452e8493b952c3be8f14fdce89e4be80c3169746030cc24f262ceb9e8
DEBUG[11-07|20:50:43.559|github.com/status-im/status-go/protocol/transport/transport.go:551] confirming message processed namespace=Transport ids=[0xeb744d47cd325dae41b25e7320a76a8a27bd1f3e1acd31f813af17c097dc8f11] timestamp=1,731,012,643,559
- Once I noticed the message was received, I sent another message to see if it would arrive immediately. This time the message was delayed again, and even after waiting more than a minute it was not received or retrieved. Here are the logs from the sender device:
DEBUG[11-07|20:50:48.400|github.com/status-im/status-go/protocol/messenger.go:5638] sending community chat message site=Send chatID=0x0232936b0d051b28c55dbbcf6c348ffde8511c3139f5348dc7019ac03060ba4a673bab53ac-c5cf-4e3a-9a6b-1fa4cf6e3699 chatName=general
DEBUG[11-07|20:50:48.402|github.com/status-im/status-go/protocol/messenger.go:2272] sending community chat message site=dispatchMessage chatID=0x0232936b0d051b28c55dbbcf6c348ffde8511c3139f5348dc7019ac03060ba4a673bab53ac-c5cf-4e3a-9a6b-1fa4cf6e3699 chatName=general
DEBUG[11-07|20:50:48.406|github.com/status-im/status-go/protocol/common/message_segmentation.go:47] message segmented segments=1
DEBUG[11-07|20:50:48.406|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:726] handling message scheduled
DEBUG[11-07|20:50:48.410|github.com/status-im/status-go/protocol/common/message_sender.go:805] sent-message: public message recipient=[] messageID=0x0faa0b5372bf27672719fba258aa66636be3bdcb825f2d913e44ab235fba382a contentType=CHAT_MESSAGE messageType=public hashes=[0x13c93de6f8bb8ab422afb1f9ccb0368deda13fc36f62cfce34ee6739a6182109]
DEBUG[11-07|20:50:48.410|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:731] handling message sent
DEBUG[11-07|20:50:48.410|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:890] handling public messages messageID="[15 170 11 83 114 191 39 103 39 25 251 162 88 170 102 99 107 227 189 203 130 95 45 145 62 68 171 35 95 186 56 42]"
INFO [11-07|20:50:48.410|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/publish/message_sender.go:115] publishing message via lightpush envelopeHash=0x13c93de6f8bb8ab422afb1f9ccb0368deda13fc36f62cfce34ee6739a6182109 pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x35f53ee3/rfc26 timestamp=1,731,012,648,410,353,000
DEBUG[11-07|20:50:48.410|github.com/status-im/status-go/wakuv2/common/filter.go:256] processing message: decrypted envelopeHash=13c93d..182109
DEBUG[11-07|20:50:48.410|github.com/status-im/status-go/wakuv2/waku.go:1596] filters did match envelopeHash=0x13c93de6f8bb8ab422afb1f9ccb0368deda13fc36f62cfce34ee6739a6182109 pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x35f53ee3/rfc26 timestamp=1,731,012,648,410,353,000
DEBUG[11-07|20:50:48.410|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:194] Select Peers selectionCriteria="{\"selectionType\":0,\"protocolId\":\"/vac/waku/lightpush/2.0.0-beta1\",\"pubsubTopics\":[\"/waku/2/rs/16/32\"],\"specificPeers\":null,\"maxPeerCount\":2,\"excludePeers\":null}" excludedPeers=
DEBUG[11-07|20:50:48.410|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:148] discovering peers by pubsubTopic pubsubTopics=[/waku/2/rs/16/32]
DEBUG[11-07|20:50:48.410|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/peermanager/peer_selection.go:159] could not retrieve random peer from slot
DEBUG[11-07|20:50:48.410|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/lightpush/waku_lightpush.go:334] publishing message hash=0x13c93de6f8bb8ab422afb1f9ccb0368deda13fc36f62cfce34ee6739a6182109 pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x35f53ee3/rfc26 timestamp=1,731,012,648,410,353,000 peers="[16Uiu2HAm8mUZ18tBWPXDQsaF7PbCKYA35z7WB2xNZH2EVq1qS8LJ 16Uiu2HAmAR24Mbb6VuzoyUiGx42UenDkshENVDj4qnmmbabLvo31]"
- After a few minutes, I sent another message. Here are the logs from the sender device:
DEBUG[11-07|20:52:11.641|github.com/status-im/status-go/protocol/messenger.go:5638] sending community chat message site=Send chatID=0x0232936b0d051b28c55dbbcf6c348ffde8511c3139f5348dc7019ac03060ba4a673bab53ac-c5cf-4e3a-9a6b-1fa4cf6e3699 chatName=general
DEBUG[11-07|20:52:11.642|github.com/status-im/status-go/protocol/messenger.go:2272] sending community chat message site=dispatchMessage chatID=0x0232936b0d051b28c55dbbcf6c348ffde8511c3139f5348dc7019ac03060ba4a673bab53ac-c5cf-4e3a-9a6b-1fa4cf6e3699 chatName=general
DEBUG[11-07|20:52:11.647|github.com/status-im/status-go/protocol/common/message_segmentation.go:47] message segmented segments=1
DEBUG[11-07|20:52:11.647|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:726] handling message scheduled
DEBUG[11-07|20:52:11.650|github.com/status-im/status-go/protocol/common/message_sender.go:805] sent-message: public message recipient=[] messageID=0x02db0aa693e60e366093b0a220d0c986ca46d9e6fdf5bae3aacf7fe9756e6aa8 contentType=CHAT_MESSAGE messageType=public hashes=[0x6568a610a0d49eec1e3988d287e3812ebc8365ef4abe62b151eb08d7a5d5247d]
INFO [11-07|20:52:11.650|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/publish/message_sender.go:115] publishing message via lightpush envelopeHash=0x6568a610a0d49eec1e3988d287e3812ebc8365ef4abe62b151eb08d7a5d5247d pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x35f53ee3/rfc26 timestamp=1,731,012,731,650,327,000
- And this time the message was received immediately. Here are the logs from the receiver device:
wakuV2 storenode status verification
DEBUG[11-07|20:52:12.528|github.com/status-im/status-go/protocol/transport/transport.go:264] message not cached namespace=Transport site=retrieveRawAll hash=0x6568a610a0d49eec1e3988d287e3812ebc8365ef4abe62b151eb08d7a5d5247d
DEBUG[11-07|20:52:12.529|github.com/status-im/status-go/protocol/common/message_segmentation.go:237] handling message segment site=handleSegmentationLayerV2 hash=0x6568a610a0d49eec1e3988d287e3812ebc8365ef4abe62b151eb08d7a5d5247d EntireMessageHash=0x Index=0 SegmentsCount=0 ParitySegmentIndex=0 ParitySegmentsCount=0
DEBUG[11-07|20:52:12.530|github.com/status-im/status-go/protocol/common/message_sender.go:969] failed to handle segmentation layer message site=handleMessage hash=0x6568a610a0d49eec1e3988d287e3812ebc8365ef4abe62b151eb08d7a5d5247d error="invalid segments count"
DEBUG[11-07|20:52:12.530|github.com/status-im/status-go/protocol/encryption/protocol.go:756] received a protocol message namespace=Protocol site=HandleMessage sender-public-key=0x04035e0735303bb9a08ecb09145ece682eee59863e04bf6b1634a3ef5b66e3c6e728bd09d94763b7036e74e2af01e2e94b9a4dff30e3cca0677ef8c57acd6dcae2 my-installation-id=e8b026da-1bf0-47c5-9be9-970f940fdae7 messageID=0x6568a610a0d49eec1e3988d287e3812ebc8365ef4abe62b151eb08d7a5d5247d
DEBUG[11-07|20:52:12.538|github.com/status-im/status-go/protocol/common/message_sender.go:994] failed to handle datasync message site=handleMessage hash=0x6568a610a0d49eec1e3988d287e3812ebc8365ef4abe62b151eb08d7a5d5247d error="handling non-datasync message"
DEBUG[11-07|20:52:12.539|github.com/status-im/status-go/protocol/v1/status_message.go:170] calculated ID for envelope envelopeHash=0x6568a610a0d49eec1e3988d287e3812ebc8365ef4abe62b151eb08d7a5d5247d messageId=0x02db0aa693e60e366093b0a220d0c986ca46d9e6fdf5bae3aacf7fe9756e6aa8
DEBUG[11-07|20:52:12.539|github.com/status-im/status-go/protocol/messenger.go:3639] processing messages further site=RetrieveAll hash=0x6568a610a0d49eec1e3988d287e3812ebc8365ef4abe62b151eb08d7a5d5247d count=1
INFO [11-07|20:52:12.539|github.com/status-im/status-go/protocol/messenger.go:3655] processing message site=RetrieveAll hash=0x6568a610a0d49eec1e3988d287e3812ebc8365ef4abe62b151eb08d7a5d5247d message-id=0x02db0aa693e60e366093b0a220d0c986ca46d9e6fdf5bae3aacf7fe9756e6aa8 type=CHAT_MESSAGE senderID=0x04035e0735303bb9a08ecb09145ece682eee59863e04bf6b1634a3ef5b66e3c6e728bd09d94763b7036e74e2af01e2e94b9a4dff30e3cca0677ef8c57acd6dcae2
INFO [11-07|20:52:12.542|github.com/status-im/status-go/protocol/messenger_handlers.go:275] handling ChatMessage
DEBUG[11-07|20:52:12.555|github.com/status-im/status-go/protocol/messenger.go:3713] Handled parsed message site=RetrieveAll hash=0x6568a610a0d49eec1e3988d287e3812ebc8365ef4abe62b151eb08d7a5d5247d message-id=0x02db0aa693e60e366093b0a220d0c986ca46d9e6fdf5bae3aacf7fe9756e6aa8
DEBUG[11-07|20:52:12.556|github.com/status-im/status-go/protocol/transport/transport.go:551] confirming message processed namespace=Transport ids=[0x6568a610a0d49eec1e3988d287e3812ebc8365ef4abe62b151eb08d7a5d5247d] timestamp=1,731,012,732,556
- Notably, I also notice logs indicating that the tasks for "retrieving missing messages" are being cancelled on the receiver device. Here are the logs from the receiver device:
DEBUG[11-07|20:50:43.083|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:197] inside on connection status change new-status=true agg filters count=12 filter subs count=9
DEBUG[11-07|20:50:43.215|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving history to check for missing messages peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics="[/waku/1/0x28154c87/rfc26 /waku/1/0xcaca5536/rfc26 /waku/1/0x48a4b7fd/rfc26 /waku/1/0x3a79d86d/rfc26 /waku/1/0x6b05d553/rfc26 /waku/1/0x1aff0540/rfc26 /waku/1/0x9d48b918/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731006213200264000 to=1731006273202123000 attempt=1
DEBUG[11-07|20:50:43.215|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving history to check for missing messages peerID=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM contentTopics="[/waku/1/0x52920ec5/rfc26 /waku/1/0x28154c87/rfc26 /waku/1/0x3a79d86d/rfc26 /waku/1/0x1aff0540/rfc26 /waku/1/0xa4383181/rfc26 /waku/1/0x48a4b7fd/rfc26 /waku/1/0x6b05d553/rfc26 /waku/1/0xcaca5536/rfc26 /waku/1/0x35f53ee3/rfc26 /waku/1/0xe76f7b02/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731007119659290000 to=1731007173225791000 attempt=1
DEBUG[11-07|20:50:43.216|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ requestId=61393365323834303533353066616639356639333365366664623165306533643736643965393636326165333834303336633032656465366332353766623337
ERROR[11-07|20:50:43.216|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:190] could not query storenode peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics="[/waku/1/0x28154c87/rfc26 /waku/1/0xcaca5536/rfc26 /waku/1/0x48a4b7fd/rfc26 /waku/1/0x3a79d86d/rfc26 /waku/1/0x6b05d553/rfc26 /waku/1/0x1aff0540/rfc26 /waku/1/0x9d48b918/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731006213200264000 to=1731006273202123000 error="context canceled" attempt=1
DEBUG[11-07|20:50:43.216|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM requestId=61636364326632623337396230623034333032393538326162643165363332666133663337643766626630353339633134336363303932373830323632626266
ERROR[11-07|20:50:43.216|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:190] could not query storenode peerID=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM contentTopics="[/waku/1/0x52920ec5/rfc26 /waku/1/0x28154c87/rfc26 /waku/1/0x3a79d86d/rfc26 /waku/1/0x1aff0540/rfc26 /waku/1/0xa4383181/rfc26 /waku/1/0x48a4b7fd/rfc26 /waku/1/0x6b05d553/rfc26 /waku/1/0xcaca5536/rfc26 /waku/1/0x35f53ee3/rfc26 /waku/1/0xe76f7b02/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731007119659290000 to=1731007173225791000 error="context canceled" attempt=1
...
DEBUG[11-07|20:51:05.234|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving history to check for missing messages peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics="[/waku/1/0x28154c87/rfc26 /waku/1/0xcaca5536/rfc26 /waku/1/0x48a4b7fd/rfc26 /waku/1/0x3a79d86d/rfc26 /waku/1/0x6b05d553/rfc26 /waku/1/0x1aff0540/rfc26 /waku/1/0x9d48b918/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731006213200264000 to=1731006273202123000 attempt=1
DEBUG[11-07|20:51:05.234|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving history to check for missing messages peerID=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM contentTopics="[/waku/1/0x52920ec5/rfc26 /waku/1/0x28154c87/rfc26 /waku/1/0x3a79d86d/rfc26 /waku/1/0x1aff0540/rfc26 /waku/1/0xa4383181/rfc26 /waku/1/0x48a4b7fd/rfc26 /waku/1/0x6b05d553/rfc26 /waku/1/0xcaca5536/rfc26 /waku/1/0x35f53ee3/rfc26 /waku/1/0xe76f7b02/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731007119659290000 to=1731007173225791000 attempt=1
DEBUG[11-07|20:51:05.235|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ requestId=33646130306531623462363436386634663564613037386638323438306537353866653534353933323238633131643235313134323939383966313162366339
DEBUG[11-07|20:51:05.235|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM requestId=62663234353636623266373837353439383533666563653533363630383036323732663664376434653439306662346666356330633837656166393563623932
ERROR[11-07|20:51:05.235|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:190] could not query storenode peerID=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM contentTopics="[/waku/1/0x52920ec5/rfc26 /waku/1/0x28154c87/rfc26 /waku/1/0x3a79d86d/rfc26 /waku/1/0x1aff0540/rfc26 /waku/1/0xa4383181/rfc26 /waku/1/0x48a4b7fd/rfc26 /waku/1/0x6b05d553/rfc26 /waku/1/0xcaca5536/rfc26 /waku/1/0x35f53ee3/rfc26 /waku/1/0xe76f7b02/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731007119659290000 to=1731007173225791000 error="context canceled" attempt=1
ERROR[11-07|20:51:05.235|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:190] could not query storenode peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics="[/waku/1/0x28154c87/rfc26 /waku/1/0xcaca5536/rfc26 /waku/1/0x48a4b7fd/rfc26 /waku/1/0x3a79d86d/rfc26 /waku/1/0x6b05d553/rfc26 /waku/1/0x1aff0540/rfc26 /waku/1/0x9d48b918/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731006213200264000 to=1731006273202123000 error="context canceled" attempt=1
Thoughts
In summary, a message sent while the devices were initially re-connecting was delayed but eventually retrieved. After that successful retrieval, however, a subsequent message was neither received nor retrieved.
- I wonder whether the timing of re-connecting to the network, and of sending a message before or after a "missing message retrieval" run, affects the results (?)
- It seems that something happening during re-connection fails or cancels the "retrieving missing messages" tasks, after which the missing message is never retrieved. (?)
- So perhaps the connection state change triggers some kind of cancellation that the tasks never recover from, even though they appear to be retried (see the sketch after these notes) (?)
The only problem that seems to persist is that one message remains missing.
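To make that hypothesis concrete, here is a minimal Go sketch (purely illustrative; `queryStorenode` and the connection-scoped context are assumptions, not the actual go-waku code) of how a store query bound to a context that is cancelled on a connection-status change would surface the `context canceled` errors above, and why retries that reuse the same context keep failing:

```go
package main

import (
	"context"
	"errors"
	"fmt"
	"time"
)

// queryStorenode stands in for a missing-messages store query. The name and
// signature are illustrative only, not the real go-waku API.
func queryStorenode(ctx context.Context, from, to int64) error {
	select {
	case <-time.After(200 * time.Millisecond): // pretend network round-trip
		return nil
	case <-ctx.Done():
		return ctx.Err() // shows up in logs as "context canceled"
	}
}

func main() {
	// Hypothetical context scoped to the current connection; it is cancelled
	// whenever the connection status changes.
	connCtx, cancel := context.WithCancel(context.Background())

	// Simulate the connection-status callback firing while a query is in flight.
	go func() {
		time.Sleep(50 * time.Millisecond)
		cancel()
	}()

	if err := queryStorenode(connCtx, 1731006213200264000, 1731006273202123000); errors.Is(err, context.Canceled) {
		fmt.Println("store query failed:", err)
	}

	// A retry that reuses the same (already cancelled) context can never
	// succeed, even once the network is healthy again.
	if err := queryStorenode(connCtx, 1731006213200264000, 1731006273202123000); err != nil {
		fmt.Println("retry also failed:", err)
	}
}
```

If the real code behaves anything like this, recovery would require re-creating the query context (and re-scheduling the missed time window) once the connection settles, not just retrying with the old one.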
This looks like an issue with the way missing_message store queries are being made. Because we go offline and come back online, the store-query loop lags behind and keeps fetching only very old messages, never catching up. Notice that the from and to times used here are very different from the actual time the query was made; this is why the message always appears to be missing. I will investigate what is causing this, but I think I know a simple fix (though I need to think through its side effects).
DEBUG[11-07|20:50:45.217|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:185] retrieving history to check for missing messages peerID=16Uiu2HAmJnVR7ZzFaYvciPVafUXuYGLHPzSUigqAmeNw9nJUVGeM contentTopics="[/waku/1/0x52920ec5/rfc26 /waku/1/0x28154c87/rfc26 /waku/1/0x3a79d86d/rfc26 /waku/1/0x1aff0540/rfc26 /waku/1/0xa4383181/rfc26 /waku/1/0x48a4b7fd/rfc26 /waku/1/0x6b05d553/rfc26 /waku/1/0xcaca5536/rfc26 /waku/1/0x35f53ee3/rfc26 /waku/1/0xe76f7b02/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731007119659290000 to=1731007173225791000 attempt=1
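As a toy illustration of that lag (everything here is hypothetical: `interval`, `maxLag`, and the loop shape are made up, not the real missing-messages implementation), the query window below advances only one interval per iteration, so after an offline period the from/to values trail far behind the time the query is actually made; the catch-up rule shown is just one possible shape a fix could take:

```go
package main

import (
	"fmt"
	"time"
)

const (
	interval = 60 * time.Second // history covered per query (illustrative)
	maxLag   = 5 * time.Minute  // hypothetical cap on how far the window may trail "now"
)

func main() {
	// Suppose the loop last queried up to this point, and the device then
	// spent ~20 minutes offline.
	lastQueried := time.Now().Add(-20 * time.Minute)

	for i := 0; i < 3; i++ {
		from := lastQueried
		to := from.Add(interval)

		// Without a catch-up rule the window advances only one interval per
		// iteration, so from/to stay far behind the moment the query is made,
		// matching the stale from/to values in the log above.
		fmt.Printf("query %d: from=%s to=%s (trailing now by %s)\n",
			i, from.Format("15:04:05"), to.Format("15:04:05"),
			time.Since(to).Round(time.Second))

		if time.Since(to) > maxLag {
			// One possible fix (side effects to be thought through): jump the
			// window forward instead of crawling one interval at a time.
			lastQueried = time.Now().Add(-maxLag)
		} else {
			lastQueried = to
		}
	}
}
```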
I have pushed a potential fix so that the missing-messages check is not so delayed. @seanstrom Can you please rerun and share the details?
Hey @chaitanyaprem, I've re-run the tests with the latest changes. The issue was still reproducible, though tricky to encounter.
Overview
- In this case, I needed to lock the screen on the receiver device and wait several minutes before re-opening.
- Then I sent a message to the receiver device, which was received immediately; after that I toggled the network on the receiver device and sent another message.
- I waited roughly 2 minutes and the message was not retrieved by the "missing message" service.
- I then sent another message to the receiver device and that was received immediately.
Details
- contentTopic: /waku/1/0xbca9db4a/rfc26
- missing message ID: 0xcc1386a45d0331b61f0b0ab2ada527f04bc20a1a8b9f575a28a0cb37261d0fa8
- missing message envelope hash: 0x1cbd8f05042856e32efb72fe02240fc293f9c7af616e502daba748595a55fcc2
Logs
- Here are the logs from the sender device for sending the message that went missing:
DEBUG[11-11|19:33:09.394|github.com/status-im/status-go/protocol/messenger.go:5638] sending community chat message site=Send chatID=0x03893ee3455c3a866162b4d4433db796cbed42bbf3bdcef3dc3b944680ca2403f0d9a9fedd-77c6-4304-b820-ed6bbf37db04 chatName=general
DEBUG[11-11|19:33:09.395|github.com/status-im/status-go/protocol/messenger.go:2272] sending community chat message site=dispatchMessage chatID=0x03893ee3455c3a866162b4d4433db796cbed42bbf3bdcef3dc3b944680ca2403f0d9a9fedd-77c6-4304-b820-ed6bbf37db04 chatName=general
DEBUG[11-11|19:33:09.400|github.com/status-im/status-go/protocol/common/message_segmentation.go:47] message segmented segments=1
DEBUG[11-11|19:33:09.400|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:726] handling message scheduled
DEBUG[11-11|19:33:09.402|github.com/status-im/status-go/protocol/common/message_sender.go:805] sent-message: public message recipient=[] messageID=0xcc1386a45d0331b61f0b0ab2ada527f04bc20a1a8b9f575a28a0cb37261d0fa8 contentType=CHAT_MESSAGE messageType=public hashes=[0x1cbd8f05042856e32efb72fe02240fc293f9c7af616e502daba748595a55fcc2]
DEBUG[11-11|19:33:09.402|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:731] handling message sent
DEBUG[11-11|19:33:09.402|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:890] handling public messages messageID="[204 19 134 164 93 3 49 182 31 11 10 178 173 165 39 240 75 194 10 26 139 159 87 90 40 160 203 55 38 29 15 168]"
INFO [11-11|19:33:09.402|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/publish/message_sender.go:115] publishing message via lightpush envelopeHash=0x1cbd8f05042856e32efb72fe02240fc293f9c7af616e502daba748595a55fcc2 pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0xbca9db4a/rfc26 timestamp=1,731,353,589,402,255,000
- Here are the logs from the sender device for sending the last message (the one after the missing message):
DEBUG[11-11|19:36:54.770|github.com/status-im/status-go/protocol/messenger.go:5638] sending community chat message site=Send chatID=0x03893ee3455c3a866162b4d4433db796cbed42bbf3bdcef3dc3b944680ca2403f0d9a9fedd-77c6-4304-b820-ed6bbf37db04 chatName=general
DEBUG[11-11|19:36:54.771|github.com/status-im/status-go/protocol/messenger.go:2272] sending community chat message site=dispatchMessage chatID=0x03893ee3455c3a866162b4d4433db796cbed42bbf3bdcef3dc3b944680ca2403f0d9a9fedd-77c6-4304-b820-ed6bbf37db04 chatName=general
DEBUG[11-11|19:36:54.777|github.com/status-im/status-go/protocol/common/message_segmentation.go:47] message segmented segments=1
DEBUG[11-11|19:36:54.777|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:726] handling message scheduled
DEBUG[11-11|19:36:54.780|github.com/status-im/status-go/protocol/common/message_sender.go:805] sent-message: public message recipient=[] messageID=0x9912159991ff1bdc1cad64bd066b5a292db8e42a6515c994ea0a3c2d02ba41f0 contentType=CHAT_MESSAGE messageType=public hashes=[0xbb80b97df1bfca7c2c966d3a8da7a6af04bed675f45b78893e93eea2526f13ef]
DEBUG[11-11|19:36:54.780|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:731] handling message sent
DEBUG[11-11|19:36:54.780|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:890] handling public messages messageID="[153 18 21 153 145 255 27 220 28 173 100 189 6 107 90 41 45 184 228 42 101 21 201 148 234 10 60 45 2 186 65 240]"
INFO [11-11|19:36:54.780|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/publish/message_sender.go:115] publishing message via lightpush envelopeHash=0xbb80b97df1bfca7c2c966d3a8da7a6af04bed675f45b78893e93eea2526f13ef pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0xbca9db4a/rfc26 timestamp=1,731,353,814,780,330,000
- Here are the logs from the receiver device for receiving that last message:
DEBUG[11-11|19:36:55.520|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:117] checking for missing messages...
DEBUG[11-11|19:36:55.528|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:192] retrieving next page peerID=16Uiu2HAmAUdrQ3uwzuE4Gy4D56hX6uLKEeerJAnhKEHZ3DxF1EfT contentTopics="[/waku/1/0x8a2ca87f/rfc26 /waku/1/0x1916e98a/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731353791641722000 to=1731353809785075000 cursor= attempt=1
DEBUG[11-11|19:36:55.549|github.com/status-im/status-go/vendor/github.com/status-im/mvds/node/node.go:221] Epoch processing namespace=mvds node=0416f5fd epoch=4042
DEBUG[11-11|19:36:55.594|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:114] filter stats agg filters count=20 filter subs count=39
DEBUG[11-11|19:36:55.607|github.com/status-im/status-go/vendor/go.uber.org/zap/sugar.go:198] [16Uiu2HAmBREkVj4RKn8K4vvFcgYka5nQGAhUf2xEbZNY9Zfy7Zj7] opening stream to peer [16Uiu2HAmAUdrQ3uwzuE4Gy4D56hX6uLKEeerJAnhKEHZ3DxF1EfT]
DEBUG[11-11|19:36:55.622|github.com/status-im/status-go/protocol/messenger_mailserver_cycle.go:492] wakuV2 storenode status verification
DEBUG[11-11|19:36:55.626|github.com/status-im/status-go/protocol/transport/transport.go:264] message not cached namespace=Transport site=retrieveRawAll hash=0xbb80b97df1bfca7c2c966d3a8da7a6af04bed675f45b78893e93eea2526f13ef
DEBUG[11-11|19:36:55.641|github.com/status-im/status-go/protocol/common/message_segmentation.go:237] handling message segment site=handleSegmentationLayerV2 hash=0xbb80b97df1bfca7c2c966d3a8da7a6af04bed675f45b78893e93eea2526f13ef EntireMessageHash=0x Index=0 SegmentsCount=0 ParitySegmentIndex=0 ParitySegmentsCount=0
DEBUG[11-11|19:36:55.641|github.com/status-im/status-go/protocol/common/message_sender.go:969] failed to handle segmentation layer message site=handleMessage hash=0xbb80b97df1bfca7c2c966d3a8da7a6af04bed675f45b78893e93eea2526f13ef error="invalid segments count"
DEBUG[11-11|19:36:55.642|github.com/status-im/status-go/protocol/encryption/protocol.go:756] received a protocol message namespace=Protocol site=HandleMessage sender-public-key=0x04fe8601e3e633a776fa155573e9fc8d6a28e0944e7a4916c9c660360b4ff0e3ba7a3a4ed4f4df32b3a5eb9bed7f6225c21c5c29072ac5412c486ef0f62f28a3e1 my-installation-id=8a3b5748-2279-4279-bb0e-07de96cc7a1a messageID=0xbb80b97df1bfca7c2c966d3a8da7a6af04bed675f45b78893e93eea2526f13ef
DEBUG[11-11|19:36:55.646|github.com/status-im/status-go/protocol/common/message_sender.go:994] failed to handle datasync message site=handleMessage hash=0xbb80b97df1bfca7c2c966d3a8da7a6af04bed675f45b78893e93eea2526f13ef error="handling non-datasync message"
DEBUG[11-11|19:36:55.646|github.com/status-im/status-go/protocol/v1/status_message.go:170] calculated ID for envelope envelopeHash=0xbb80b97df1bfca7c2c966d3a8da7a6af04bed675f45b78893e93eea2526f13ef messageId=0x9912159991ff1bdc1cad64bd066b5a292db8e42a6515c994ea0a3c2d02ba41f0
DEBUG[11-11|19:36:55.646|github.com/status-im/status-go/protocol/messenger.go:3639] processing messages further site=RetrieveAll hash=0xbb80b97df1bfca7c2c966d3a8da7a6af04bed675f45b78893e93eea2526f13ef count=1
INFO [11-11|19:36:55.646|github.com/status-im/status-go/protocol/messenger.go:3655] processing message site=RetrieveAll hash=0xbb80b97df1bfca7c2c966d3a8da7a6af04bed675f45b78893e93eea2526f13ef message-id=0x9912159991ff1bdc1cad64bd066b5a292db8e42a6515c994ea0a3c2d02ba41f0 type=CHAT_MESSAGE senderID=0x04fe8601e3e633a776fa155573e9fc8d6a28e0944e7a4916c9c660360b4ff0e3ba7a3a4ed4f4df32b3a5eb9bed7f6225c21c5c29072ac5412c486ef0f62f28a3e1
DEBUG[11-11|19:36:55.647|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:192] retrieving next page peerID=16Uiu2HAmAUdrQ3uwzuE4Gy4D56hX6uLKEeerJAnhKEHZ3DxF1EfT contentTopics="[/waku/1/0x1916e98a/rfc26 /waku/1/0x458b4c80/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731353791641722000 to=1731353809896079000 cursor= attempt=1
INFO [11-11|19:36:55.648|github.com/status-im/status-go/protocol/messenger_handlers.go:275] handling ChatMessage
DEBUG[11-11|19:36:55.659|github.com/status-im/status-go/protocol/messenger.go:3713] Handled parsed message site=RetrieveAll hash=0xbb80b97df1bfca7c2c966d3a8da7a6af04bed675f45b78893e93eea2526f13ef message-id=0x9912159991ff1bdc1cad64bd066b5a292db8e42a6515c994ea0a3c2d02ba41f0
DEBUG[11-11|19:36:55.659|github.com/status-im/status-go/protocol/transport/transport.go:551] confirming message processed namespace=Transport ids=[0xbb80b97df1bfca7c2c966d3a8da7a6af04bed675f45b78893e93eea2526f13ef] timestamp=1,731,353,815,659
- And here are some logs from the receiver device that may be related to failed requests to retrieve missing messages:
DEBUG[11-11|19:33:35.152|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:192] retrieving history to check for missing messages peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics="[/waku/1/0x01dd794e/rfc26 /waku/1/0x08788a36/rfc26 /waku/1/0x72e60904/rfc26 /waku/1/0x2a00c047/rfc26 /waku/1/0xea26a23a/rfc26 /waku/1/0x2f57390e/rfc26 /waku/1/0x458b4c80/rfc26 /waku/1/0xa2577d5b/rfc26 /waku/1/0xfe7ee9fd/rfc26 /waku/1/0xd5a6b1fd/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731352029627647000 to=1731352222024578000 attempt=1
DEBUG[11-11|19:33:35.153|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ requestId=65396435646633646332303261363763376331343637666566313066393364653763363636303737366331343666616135643434633361346136636331646432
DEBUG[11-11|19:33:35.153|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ requestId=66343863393133303338643462393063303061616563616439363439373663346636623534643330373934323934323864343266633931626436373839316232
DEBUG[11-11|19:33:35.154|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ requestId=30653830613763343764333031346538633738616630653561323237333865653833326363353466323230393134336465353130303034636536636238383932
ERROR[11-11|19:33:35.154|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:197] could not query storenode peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics="[/waku/1/0x01dd794e/rfc26 /waku/1/0x08788a36/rfc26 /waku/1/0x72e60904/rfc26 /waku/1/0x2a00c047/rfc26 /waku/1/0xea26a23a/rfc26 /waku/1/0x2f57390e/rfc26 /waku/1/0x458b4c80/rfc26 /waku/1/0xa2577d5b/rfc26 /waku/1/0xfe7ee9fd/rfc26 /waku/1/0xd5a6b1fd/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731352029627647000 to=1731352222024578000 error="rate: Wait(n=1) would exceed context deadline" attempt=1
DEBUG[11-11|19:33:35.154|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:192] retrieving history to check for missing messages peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics="[/waku/1/0x29dcabf8/rfc26 /waku/1/0xbca9db4a/rfc26 /waku/1/0xde213ba3/rfc26 /waku/1/0x01dd794e/rfc26 /waku/1/0x08788a36/rfc26 /waku/1/0x72e60904/rfc26 /waku/1/0x2a00c047/rfc26 /waku/1/0xea26a23a/rfc26 /waku/1/0x623d7840/rfc26 /waku/1/0x2f57390e/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731352029627647000 to=1731352259858045000 attempt=1
ERROR[11-11|19:33:35.154|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:197] could not query storenode peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics="[/waku/1/0x2f57390e/rfc26 /waku/1/0xea26a23a/rfc26 /waku/1/0xd5a6b1fd/rfc26 /waku/1/0x257bc1b1/rfc26 /waku/1/0xbca9db4a/rfc26 /waku/1/0xf0dbcdc4/rfc26 /waku/1/0x1916e98a/rfc26 /waku/1/0x2a00c047/rfc26 /waku/1/0xe514bc4a/rfc26 /waku/1/0x8a2ca87f/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731353502952161000 to=1731353523867347000 error="rate: Wait(n=1) would exceed context deadline" attempt=1
DEBUG[11-11|19:33:35.154|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:192] retrieving history to check for missing messages peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics="[/waku/1/0xe514bc4a/rfc26 /waku/1/0xf0dbcdc4/rfc26 /waku/1/0x257bc1b1/rfc26 /waku/1/0x1916e98a/rfc26 /waku/1/0x29dcabf8/rfc26 /waku/1/0x0e0fc39a/rfc26 /waku/1/0x0e01e57f/rfc26 /waku/1/0x08788a36/rfc26 /waku/1/0x72e60904/rfc26 /waku/1/0x2a00c047/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731352029627647000 to=1731352093901187000 attempt=1
DEBUG[11-11|19:33:35.154|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ requestId=34646331636334633633373330333663383461666362663733666563363266343566396264356633363234343333346437366635613564613930306434633237
DEBUG[11-11|19:33:35.155|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/protocol/store/client.go:268] sending store request peer=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ requestId=38346637646434393336376463313937363562303863626262393262393532653666343265613566656539396534356334643238643732343831386332313335
ERROR[11-11|19:33:35.155|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:197] could not query storenode peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics="[/waku/1/0x29dcabf8/rfc26 /waku/1/0xbca9db4a/rfc26 /waku/1/0xde213ba3/rfc26 /waku/1/0x01dd794e/rfc26 /waku/1/0x08788a36/rfc26 /waku/1/0x72e60904/rfc26 /waku/1/0x2a00c047/rfc26 /waku/1/0xea26a23a/rfc26 /waku/1/0x623d7840/rfc26 /waku/1/0x2f57390e/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731352029627647000 to=1731352259858045000 error="rate: Wait(n=1) would exceed context deadline" attempt=1
ERROR[11-11|19:33:35.155|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/missing/missing_messages.go:197] could not query storenode peerID=16Uiu2HAmMELCo218hncCtTvC2Dwbej3rbyHQcR8erXNnKGei7WPZ contentTopics="[/waku/1/0xe514bc4a/rfc26 /waku/1/0xf0dbcdc4/rfc26 /waku/1/0x257bc1b1/rfc26 /waku/1/0x1916e98a/rfc26 /waku/1/0x29dcabf8/rfc26 /waku/1/0x0e0fc39a/rfc26 /waku/1/0x0e01e57f/rfc26 /waku/1/0x08788a36/rfc26 /waku/1/0x72e60904/rfc26 /waku/1/0x2a00c047/rfc26]" pubsubTopic=/waku/2/rs/16/32 from=1731352029627647000 to=1731352093901187000 error="rate: Wait(n=1) would exceed context deadline" attempt=1
Thoughts
- There are some failed "missing message retrieval" requests, but those seem to stop after a few minutes; the retrieval requests then continue to run, yet they still do not retrieve the missing message for some reason. (?)
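For reference, the `rate: Wait(n=1) would exceed context deadline` errors above are consistent with a token-bucket limiter such as golang.org/x/time/rate rejecting a call whose context deadline is shorter than the wait for the next token. A minimal sketch (the rate, burst, and timeout values are made up, not go-waku's actual configuration):

```go
package main

import (
	"context"
	"fmt"
	"time"

	"golang.org/x/time/rate"
)

func main() {
	// One request per second, burst of one (illustrative values only).
	limiter := rate.NewLimiter(rate.Every(time.Second), 1)

	// Fire several "store queries" back to back with short deadlines, roughly
	// what a burst of missing-message checks after reconnecting looks like.
	for i := 0; i < 4; i++ {
		ctx, cancel := context.WithTimeout(context.Background(), 500*time.Millisecond)
		err := limiter.Wait(ctx) // equivalent to WaitN(ctx, 1)
		cancel()
		if err != nil {
			// Once the burst is spent, the next token is ~1s away, longer than
			// the 500ms deadline, so Wait fails immediately with
			// "rate: Wait(n=1) would exceed context deadline".
			fmt.Printf("query %d rejected: %v\n", i, err)
			continue
		}
		fmt.Printf("query %d allowed\n", i)
	}
}
```

That would also fit the observation that these errors stop after a few minutes: once the backlog of queued checks drains, later requests no longer have to wait for a token.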