
arm64-v8a. Packets time out in singleplayer. RE-SENDING timed-out

stellio opened this issue 6 years ago · 27 comments

Minetest version
5.1.0-dev
5.2.0-dev
OS / Hardware

Operating system: Android 8.1.0
CPU: Qualcomm MSM8994 Snapdragon 810

Summary

Slow media loading at the first world start on the arm64-v8a build; on arm7 it is normal. The issue is similar to #8207.

The bug is presumably on the server side: connecting from the arm64-v8a build to a remote server or a local server loads normally, while connecting to an arm64-v8a-hosted game from a desktop build or an arm7 build shows the same bug.

Could it be some kind of incorrect float handling or type cast on arm64-v8a?

17:38:44: INFO[Main]: Connecting to server at 127.0.0.1:59067
17:38:44: INFO[Main]: Client packetcounter (20):
17:38:44: VERBOSE[Server]: Server::peerAdded(): peer->id=2
17:38:44: VERBOSE[Server]: Server: Handling peer change: id=2, timeout=0
17:38:44: INFO[Server]: Server: Maximum lag peaked to 1.551 s
17:38:44: INFO[Main]: Client::peerAdded(): peer->id=1
17:38:44: VERBOSE[Server]: Server: Got TOSERVER_INIT from 127.0.0.1 (peer_id=2)
17:38:44: VERBOSE[Server]: Server: 127.0.0.1: Protocol version: min: 37, max: 38, chosen: 38
17:38:44: INFO[Server]: Server: New connection: "singleplayer" from 127.0.0.1 (peer_id=2)
17:38:44: VERBOSE[Server]: Sending TOCLIENT_HELLO with auth method field: 2
17:38:44: INFO[Main]: Client: TOCLIENT_HELLO received with serialization_ver=28, auth_mechs=2, proto_ver=38, compression_mode=0. Doing auth with mech 2
17:38:44: INFO[Server]: Server: TOSERVER_SRP_BYTES_A received with based_on=1 and len_A=256.
17:38:44: INFO[Main]: Client: Received TOCLIENT_SRP_BYTES_S_B.
17:38:44: VERBOSE[Server]: Server: Received TOCLIENT_SRP_BYTES_M.
17:38:44: INFO[Main]: Client: received map seed: 13341598796440393803
17:38:44: INFO[Main]: Client: received recommended send interval 0.09
17:38:44: VERBOSE[Server]: Server: Got TOSERVER_INIT2 from 2
17:38:44: INFO[Server]: Server: Sending content to [id=2]
17:38:44: VERBOSE[Server]: Server: Sending item definitions to id(2): size=15290
17:38:44: VERBOSE[Server]: Server: Sending node definitions to id(2): size=14566
17:38:44: VERBOSE[Server]: Server: Announcing files to id(2)
17:38:45: VERBOSE[ConnectionSend]: con(39/1)RE-SENDING timed-out RELIABLE to 127.0.0.1(t/o=0.1): from_peer_id=1, channel=0, seqnum=65532
17:38:45: VERBOSE[ConnectionSend]: con(39/1)RE-SENDING timed-out RELIABLE to 127.0.0.1(t/o=0.1): from_peer_id=1, channel=0, seqnum=65533
17:38:45: VERBOSE[ConnectionSend]: con(39/1)RE-SENDING timed-out RELIABLE to 127.0.0.1(t/o=0.1): from_peer_id=1, channel=0, seqnum=65534
17:38:45: INFO[Main]: Client: Received item definitions: packet size: 15290
17:38:45: VERBOSE[Main]: ItemDefManager: registering "default:dirt_with_grass"
17:38:45: VERBOSE[Main]: ItemDefManager: registering "default:dirt_with_grass_footsteps"
17:38:45: VERBOSE[Main]: ItemDefManager: registering "default:dirt_with_rainforest_litter"
17:38:45: VERBOSE[Main]: ItemDefManager: registering "default:dirt_with_snow"
17:38:45: VERBOSE[Main]: ItemDefManager: registering "default:dry_dirt"
17:38:45: VERBOSE[ConnectionSend]: con(39/1)RE-SENDING timed-out RELIABLE to 127.0.0.1(t/o=0.1): from_peer_id=1, channel=0, seqnum=11
17:38:45: VERBOSE[Main]: con(39/1)RE-SENDING timed-out RELIABLE to 127.0.0.1(t/o=0.1): from_peer_id=1, channel=0, seqnum=11ItemDefManager: registering "default:dry_dirt_with_dry_grass"
17:38:45: VERBOSE[ConnectionSend]: con(39/1)RE-SENDING timed-out RELIABLE to 127.0.0.1(t/o=0.1): from_peer_id=1, channel=0, seqnum=12
17:38:45: VERBOSE[Main]: con(39/1)RE-SENDING timed-out RELIABLE to 127.0.0.1(t/o=0.1): from_peer_id=1, channel=0, seqnum=12ItemDefManager: registering "default:dry_grass_1"
17:38:45: VERBOSE[ConnectionSend]: con(39/1)RE-SENDING timed-out RELIABLE to 127.0.0.1(t/o=0.1): from_peer_id=1, channel=0, seqnum=13
...

stellio · Nov 12 '19

Not only similar, I believe this is a duplicate. The other issue has also been seen on arm64 (linked forum post).

sfan5 · Nov 12 '19

Could this be related to the currently miscalculated reliable-UDP RTT? "Fixed" in https://github.com/minetest/minetest/pull/7428, reverted in https://github.com/minetest/minetest/pull/8187
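For context, a resend timeout is typically derived from a smoothed RTT estimate, so a miscalculated RTT directly shrinks or inflates it. Below is a hedged RFC 6298-style sketch, not Minetest's actual formula, just to illustrate what such a miscalculation would affect (the 0.1 s floor mirrors the localhost timeout visible in the logs above):

```cpp
#include <algorithm> // std::clamp (C++17)
#include <cmath>     // std::fabs

struct RttEstimator {
    float srtt = -1.0f;  // smoothed RTT in seconds; <0 means no sample yet
    float rttvar = 0.0f; // RTT variance estimate

    // Feed one RTT sample (only from packets that were NOT retransmitted,
    // per Karn's algorithm) and return the resend timeout to use.
    float update(float sample) {
        if (srtt < 0.0f) { // first sample initializes both estimators
            srtt = sample;
            rttvar = sample / 2.0f;
        } else {
            rttvar = 0.75f * rttvar + 0.25f * std::fabs(srtt - sample);
            srtt   = 0.875f * srtt + 0.125f * sample;
        }
        // If srtt is computed wrongly (e.g. resent packets counted as
        // samples), this clamp is all that stands between the peer and a
        // flood of spurious resends. Bounds here are illustrative.
        return std::clamp(srtt + 4.0f * rttvar, 0.1f, 3.0f);
    }
};
```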

SmallJoker · Nov 12 '19

This is the problem I talked about in my PR. It occurs in 0.4.17 as well.

MoNTE48 · Nov 12 '19

I wasted a few hours debugging this today. It's not an AArch64-specific problem.

Sometimes the receive thread runs behind on processing ACK packets, and since the resend timeout on localhost is low (0.1 s), packets are then resent very quickly even though this is not needed. Eventually the receive thread catches up and transmission resumes normally.
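To make the failure mode concrete, here is a minimal illustrative sketch (an assumed structure, not Minetest's actual connection code) of a send-thread timeout pass. With a 0.1 s timeout, any ACK stuck behind a slow receive thread makes its packet look timed out, producing the resends and the later "ACKed packet not in outgoing queue" warnings:

```cpp
#include <cstdio>
#include <vector>

typedef unsigned short u16;

struct OutgoingReliable {
    u16 seqnum;
    float time_since_sent; // seconds, advanced each send-thread iteration
};

// Called periodically by the send thread: resend anything not ACKed within
// resend_timeout. If the receive thread has merely not yet *processed* an
// ACK that already arrived on the wire, the resend is spurious.
void runTimeouts(std::vector<OutgoingReliable> &window, float dtime,
                 float resend_timeout /* ~0.1 s on localhost */)
{
    for (OutgoingReliable &pkt : window) {
        pkt.time_since_sent += dtime;
        if (pkt.time_since_sent >= resend_timeout) {
            printf("RE-SENDING timed-out RELIABLE seqnum=%u\n",
                   (unsigned)pkt.seqnum);
            pkt.time_since_sent = 0.0f; // restart the timer after resending
        }
    }
}
```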

Example log:
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=0
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=0
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=20
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=20
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=40
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=40
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=60
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=60
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=80
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=100
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=80
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=120
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=100
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=140
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=120
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=160
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=140
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=180
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=160
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=200
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=220
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=180
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=240
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=200
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=260
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=220
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=280
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=240
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=300
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=260
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=320
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=280
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=340
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=300
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=260
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=320
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=280
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=340
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=300
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=360
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=320
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=380
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=340
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=400
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=360
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=420
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=440
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=380
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=460
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=400
2020-04-08 17:10:52: ACTION[ConnectionSend]: timed_outs=71
2020-04-08 17:10:52: VERBOSE[ConnectionSend]: con(5/1)RE-SENDING timed-out RELIABLE to 10.0.2.2(t/o=0.1): from_peer_id=1, channel=2, seqnum=420
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=420
2020-04-08 17:10:52: VERBOSE[ConnectionSend]: con(5/1)RE-SENDING timed-out RELIABLE to 10.0.2.2(t/o=0.1): from_peer_id=1, channel=2, seqnum=440
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=440
2020-04-08 17:10:52: VERBOSE[ConnectionSend]: con(5/1)RE-SENDING timed-out RELIABLE to 10.0.2.2(t/o=0.1): from_peer_id=1, channel=2, seqnum=460
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=460
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=480
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=500
2020-04-08 17:10:52: VERBOSE[ConnectionReceive]: con(5/1)WARNING: ACKed packet not in outgoing queue seqnum=420
2020-04-08 17:10:52: VERBOSE[ConnectionReceive]: con(5/1)WARNING: ACKed packet not in outgoing queue seqnum=440
2020-04-08 17:10:52: VERBOSE[ConnectionReceive]: con(5/1)WARNING: ACKed packet not in outgoing queue seqnum=460
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=480
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=500
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=520
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=520
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=540
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=540
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=560
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=560
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=580
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=580
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=600
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=600
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=620
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=620
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=640
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=640
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=660
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=660
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=680
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=680
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=700
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=700
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=720
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=720
2020-04-08 17:10:52: ACTION[ConnectionSend]: con(5/1) sending seqnum=740
2020-04-08 17:10:52: ACTION[ConnectionReceive]: con(5/1) acked seqnum=740

Possible causes are the device just being too slow (does this make sense?) or different threads fighting over locks.

sfan5 · Apr 08 '20

@sfan5, is a test device with a Snapdragon 820 and 6 GB of RAM slow for you?

MoNTE48 · Apr 08 '20

It's taking tens of minutes to load a game. This needs to be fixed sooner rather than later.

Will the Play Store allow us to cheat the 64-bit requirement and ship a 32-bit NDK .so for 64-bit?

rubenwardy · Jun 20 '20

@rubenwardy

  1. With a large subgame, probably 5 minutes on a cheap Android phone.
  2. No; an application with a 32-bit library in an arm64 folder will crash at startup.

MoNTE48 · Jun 20 '20

That estimate is with MineClone 2; Minetest Game loads much faster. I dread to think how slow something like Dreambuilder would be.

rubenwardy · Jun 20 '20

The best I can do is point out which code can partially help with this problem (armv7 still loads the game 3 times faster) and declare a bounty for solving it.

MoNTE48 · Jun 20 '20

Briefly looked at this again today; here's a theory (tested with resend_timeout fixed to 10): when the networking code queues to-be-sent reliable packets because the window is full, the send thread doesn't wake up in time when ACKs arrive.

Sorry if this post doesn't make sense, just writing it here so I don't forget.
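If that theory holds, the textbook remedy is for the receive thread to signal the send thread the moment an ACK frees window space, instead of letting it sleep out a fixed poll interval. A minimal sketch with hypothetical names (none of these are Minetest's real identifiers):

```cpp
#include <chrono>
#include <condition_variable>
#include <mutex>

std::mutex window_mutex;
std::condition_variable window_freed;
int packets_in_flight = 0;
const int window_size = 16;

// Receive thread: an ACK arrived, so shrink the window and wake the sender.
void onAckReceived() {
    {
        std::lock_guard<std::mutex> lock(window_mutex);
        --packets_in_flight;
    }
    window_freed.notify_one();
}

// Send thread: instead of sleeping a fixed interval, wait until either
// window space opens up or the resend-timeout check is due. Returns true
// if space opened before max_wait elapsed.
bool waitForWindowSpace(std::chrono::milliseconds max_wait) {
    std::unique_lock<std::mutex> lock(window_mutex);
    return window_freed.wait_for(lock, max_wait,
            [] { return packets_in_flight < window_size; });
}
```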

sfan5 · Jun 20 '20

Try commenting out these lines: https://github.com/minetest/minetest/blob/master/src/network/connection.cpp#L688-L726

MoNTE48 · Jun 20 '20

The beta (5.3.0, Jun 25, 2020) from the Play Store has this problem. Samsung Galaxy Tab S6 Lite. Please let me know if/when you have a beta build to test.

kai11 · Jun 29 '20

I have a server on 5.2.0; 5.3.0 clients on Android (any arch) time out immediately upon joining.

IvanTurgenev · Jul 15 '20

@IvanTurgenev, it's a bug in the server running on arm64, not the client.

MoNTE48 · Jul 15 '20

> @IvanTurgenev, it's a bug in the server running on arm64, not the client.

should i open a new issue and investigate?

IvanTurgenev · Jul 16 '20

We are seeing this issue between an amd64 server and an armhf client (BeagleBone Black). Same question as @IvanTurgenev: do you want a new issue, or can we report any findings here?

Given that the client and server are severely constrained in processing power, our setup might be unique. It also seems to reproduce the issue reliably: on every connect of the stressed client.

bbros-dev · Jan 22 '21

If you have any new findings you can report them here in this issue.

sfan5 · Jan 22 '21

> Briefly looked at this again today; here's a theory (tested with resend_timeout fixed to 10): when the networking code queues to-be-sent reliable packets because the window is full, the send thread doesn't wake up in time when ACKs arrive.
>
> Sorry if this post doesn't make sense, just writing it here so I don't forget.

Was this proven to be false? What's the process on this?

rubenwardy · Feb 17 '21

No, this is as far as I got last time. I have no idea how you'd best test the theory that threads are waiting for locks (perf?).
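One low-tech alternative to perf is to wrap the suspect mutex so any acquisition that blocks noticeably gets logged. A hedged sketch (a hypothetical wrapper, not existing Minetest code):

```cpp
#include <chrono>
#include <cstdio>
#include <mutex>

// Drop-in replacement for a contended std::mutex: logs how long lock()
// had to wait, which would confirm (or rule out) the lock-fighting theory.
class TimedMutex {
    std::mutex m;
public:
    void lock() {
        auto t0 = std::chrono::steady_clock::now();
        m.lock();
        double waited_ms = std::chrono::duration<double, std::milli>(
                std::chrono::steady_clock::now() - t0).count();
        if (waited_ms > 1.0) // only report noticeable contention
            printf("lock contended for %.2f ms\n", waited_ms);
    }
    void unlock() { m.unlock(); }
};
```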

The process would be to (see the sketch after this list):

  • edit timeout calculation so resend_timeout is always >= 10.0 (optional? I think this was just to help debugging)
  • add logging to the send thread to log the ACKs it sends with the respective sequence number
  • add logging to the receive thread to log the reliable packets it sends with the respective seqno
  • you should then be able to reproduce the situation as described here https://github.com/minetest/minetest/issues/9107#issuecomment-611085761
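A hedged sketch of those steps with hypothetical helper names (not the real connection.cpp identifiers), following the thread assignments as listed above:

```cpp
#include <cstdio>

typedef unsigned short u16;

// Step 1: clamp the computed resend timeout so only genuine losses resend
// and the log isn't drowned in spurious retransmissions.
float clampResendTimeout(float computed) {
    return computed < 10.0f ? 10.0f : computed;
}

// Step 2: in the send thread, log each ACK as it goes out.
void logAckSent(int peer_id, int channel, u16 seqnum) {
    printf("ACTION[ConnectionSend]: peer=%d ch=%d sending ACK seqnum=%u\n",
           peer_id, channel, (unsigned)seqnum);
}

// Step 3: in the receive thread, log each reliable packet it sends.
void logReliableSent(int peer_id, int channel, u16 seqnum) {
    printf("ACTION[ConnectionReceive]: peer=%d ch=%d sending RELIABLE seqnum=%u\n",
           peer_id, channel, (unsigned)seqnum);
}
```

Comparing the interleaving of the two log streams should reproduce the situation linked above.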

sfan5 · Feb 17 '21

I see the title says 'single player', but I am experiencing this issue on a server as well, when trying to connect with Android.

Git-Forked · Apr 09 '21

It's possible that what you're describing is related to this or has a similar cause, but this issue is only about timeouts in singleplayer (where those should never happen).

sfan5 · Apr 09 '21

I had this problem with the arm64-v8a build of v5.4.0, but only the first time loading a MineClone2 map. I thought it was the game downloading all the needed mod dependencies on first start.

NeroBurner · May 25 '21

This should be fixed by the PR linked above ^. I've verified it on my phone at least; if it's still broken for anyone, please comment here.

sfan5 · Sep 17 '21

I tried the arm64 build artifact of that PR (signed by an app), and it works fine; media loading finally doesn't take much time for MineClone 5, Hamlet's Quest, or Minetest Extended. Loading with texture packs such as Soothing 32 and John Smith seems to take a bit longer than with a stable armv7 build.

I tried the armv7 build artifact as well, and it seems fine for the most part too.

That is, until I tried MineClone 5 with the Refi texture pack: it takes an unusual amount of time to load on both builds. I waited up to 4 minutes for each before giving up (at 3%).

These are the warnings from when I was trying the arm64 build with MineClone 5 and the Refi texture pack (before giving up):

2021-09-18 06:11:58: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 850 pkts
2021-09-18 06:11:58: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1009 pkts
2021-09-18 06:11:59: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:11:59: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 420 pkts
2021-09-18 06:11:59: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 589 pkts
2021-09-18 06:11:59: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 332 pkts
2021-09-18 06:11:59: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 416 pkts
2021-09-18 06:12:00: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 506 pkts
2021-09-18 06:12:00: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 516 pkts
2021-09-18 06:12:00: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:00: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 466 pkts
2021-09-18 06:12:00: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 581 pkts
2021-09-18 06:12:00: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 732 pkts
2021-09-18 06:12:01: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:01: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:01: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 503 pkts
2021-09-18 06:12:01: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 712 pkts
2021-09-18 06:12:01: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 149 pkts
2021-09-18 06:12:01: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:02: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:02: WARNING[ConnectionSend]: con(87/1) Packet quota used up after re-sending packets, max=1024
2021-09-18 06:12:02: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 630 pkts
2021-09-18 06:12:02: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:02: WARNING[ConnectionSend]: con(87/1) Packet quota used up after re-sending packets, max=1024
2021-09-18 06:12:02: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:02: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:03: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:03: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:03: WARNING[ConnectionSend]: con(87/1) Packet quota used up after re-sending packets, max=1024
2021-09-18 06:12:03: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 467 pkts
2021-09-18 06:12:03: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 239 pkts
2021-09-18 06:12:03: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 382 pkts
2021-09-18 06:12:04: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:04: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 581 pkts
2021-09-18 06:12:04: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:04: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 317 pkts
2021-09-18 06:12:04: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 710 pkts
2021-09-18 06:12:05: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 325 pkts
2021-09-18 06:12:05: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 398 pkts
2021-09-18 06:12:05: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 547 pkts
2021-09-18 06:12:05: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:05: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 270 pkts
2021-09-18 06:12:05: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 483 pkts
2021-09-18 06:12:05: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 495 pkts
2021-09-18 06:12:05: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 189 pkts
2021-09-18 06:12:05: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 769 pkts
2021-09-18 06:12:06: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:06: WARNING[ConnectionSend]: con(87/1) Packet quota used up after re-sending packets, max=1024
2021-09-18 06:12:06: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 345 pkts
2021-09-18 06:12:07: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 1024 pkts
2021-09-18 06:12:08: WARNING[ConnectionSend]: con(87/1) Packet quota used up for peer_id=2, was 987 pkts

AbduSharif · Sep 18 '21

Looks like the limit at which it falls apart was just raised (though this time armv7 is equally broken); thanks for testing. I'll untag "High Priority" since most users should never hit this.
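For readers following along: the "Packet quota used up" warnings come from a per-peer rate limit on outgoing packets, conceptually a token bucket. A hedged sketch of the idea (names and parameters are assumptions, not Minetest's actual implementation; the 1024 cap mirrors the "max=1024" seen in the log):

```cpp
// Illustrative per-peer packet quota as a token bucket.
struct PacketQuota {
    float tokens = 0.0f;
    float max_tokens = 1024.0f; // cap matching "max=1024" in the log above

    // Refill proportionally to elapsed time, capped at the maximum.
    void refill(float dtime, float tokens_per_second) {
        tokens += dtime * tokens_per_second;
        if (tokens > max_tokens)
            tokens = max_tokens;
    }

    // Each outgoing packet spends one token; when the bucket runs dry the
    // sender must wait, which is when a "quota used up" warning appears.
    bool trySend() {
        if (tokens < 1.0f)
            return false;
        tokens -= 1.0f;
        return true;
    }
};
```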

sfan5 · Sep 18 '21

I'm getting this as well on "slower" phones when too many mods are enabled. I updated to 5.5.0; initially it seemed to work, but it didn't last.

Server: Local WiFi, Windows, MT 5.5.0

Device              MT Version   Working
OnePlus 8T          5.5.0 v8a
OnePlus 8T          5.4.2
OnePlus 6           5.5.0 v8a
OnePlus 6           5.4.2
OnePlus Nord N100   5.5.0 v8a
OnePlus Nord N100   5.4.2

When they time out, they do so immediately at "Connecting to the server".

19:24:45: INFO[Main]: Connecting to server at 192.168.0.141:30000
19:24:45: INFO[Main]: Client packetcounter (30s): sum=0 avg=0/s
19:24:45: INFO[Main]: Client::peerAdded(): peer->id=1
19:24:45: VERBOSE[ConnectionSend]: con(5/0)RE-SENDING timed-out RELIABLE to 192.168.0.141(t/o=0.5): count=1, channel=0, seqnum=65500
[...]
19:24:55: VERBOSE[ConnectionSend]: con(5/0)RE-SENDING timed-out RELIABLE to 192.168.0.141(t/o=0.5): count=19, channel=0, seqnum=65500
19:24:55: ERROR[Main]: Connection timed out.
19:24:55: INFO[ConnectionSend]: con(5/0)RunTimeouts(): Peer 1 has timed out.

After updating to 5.5.0, the two slower devices connected to the server and spent around an hour "Initializing nodes", then timed out at around 80%. When trying to connect again, they immediately time out as in the log above. (Initializing nodes on the working OP8T takes around 7 minutes.)

If I disable the Home Decor mod, I'm able to connect again on the OP6 device.

List of mods used (all mods are available on ContentDB; doors is Doors Redux):

Be aware that some are set to false.

load_mod_i3 = true
load_mod_wieldview = false
load_mod_shields = false
load_mod_3d_armor_ui = false
load_mod_3d_armor_stand = false
load_mod_3d_armor_sfinv = false
load_mod_3d_armor = true
load_mod_3d_armor_ip = false
load_mod_terraform = true
load_mod_mobkit = true
load_mod_mob_core = true
load_mod_cottages = true
load_mod_comboblock = true
load_mod_mtc_chisel = true
load_mod_doors = true
load_mod_config = true
load_mod_creatura = true
load_mod_rocks = true
load_mod_dreambuilder_hotbar = true
load_mod_nuke = true
load_mod_modlib = true
load_mod_cmdlib = true
load_mod_markers = true
load_mod_areas = true
load_mod_castle_masonry = true
load_mod_matrix = true
load_mod_rhotator = true
load_mod_doc_items = true
load_mod_doc = true
load_mod_travelnet = true
load_mod_tnt_revamped = true
load_mod_unifieddyes = true
load_mod_basic_materials = true
load_mod_morelights_dim_morelights_extras = true
load_mod_morelights_modern = true
load_mod_morelights_extras = true
load_mod_morelights_vintage = true
load_mod_facade = true
load_mod_xdecor = true
load_mod_morelights_dim = true
load_mod_morelights_dim_morelights_modern = true
load_mod_morelights = true
load_mod_luxury_decor = true
load_mod_morelights_dim_morelights_vintage = true
load_mod_pkarcs_doors3 = true
load_mod_pkarcs_doors = true
load_mod_pkarcs = true
load_mod_formspecs = true
load_mod_falls = true
load_mod_snippets = true
load_mod_h_windows = true
load_mod_moreblocks = true
load_mod_cleaner = true
load_mod_mover = true
mod_storage_backend = sqlite3
load_mod_lavalamp = false
load_mod_homedecor_tables = false
load_mod_homedecor_exterior = true
load_mod_homedecor_cobweb = false
load_mod_homedecor_office = false
load_mod_homedecor_gastronomy = false
load_mod_homedecor_foyer = false
load_mod_homedecor_furniture = false
load_mod_homedecor_trash_cans = false
load_mod_fake_fire = false
load_mod_homedecor_wardrobe = true
load_mod_homedecor_furniture_medieval = false
load_mod_homedecor_lighting = false
load_mod_itemframes = true
load_mod_homedecor_books = false
load_mod_homedecor_roofing = true
load_mod_homedecor_seating = false
load_mod_homedecor_3d_extras = false
load_mod_homedecor_climate_control = false
load_mod_homedecor_plasmascreen = false
load_mod_building_blocks = false
load_mod_homedecor_laundry = false
load_mod_homedecor_bathroom = true
load_mod_homedecor_bedroom = false
load_mod_homedecor_common = true
load_mod_homedecor_doors_and_gates = false
load_mod_homedecor_kitchen = false
load_mod_homedecor_electrical = false
load_mod_homedecor_electronics = false
load_mod_homedecor_fences = true
load_mod_homedecor_inbox = false
load_mod_homedecor_windows_and_treatments = true
load_mod_homedecor_clocks = false
load_mod_homedecor_misc = false
load_mod_homedecor_pictures_and_paintings = false

depascalis · Mar 25 '22

Re-test needed after the latest networking tuning changes (5.9.1).

sfan5 · Oct 08 '24