
random ECONNRESET using microtask operations in server.fetch

Open · tinchoz49 opened this issue 2 years ago · 1 comment

What version of Bun is running?

0.5.7

What platform is your computer?

Darwin 22.2.0 arm64 arm

What steps can reproduce the bug?

  1. Create a basic server that performs a microtask operation before returning the response.
export default {
  port: 3000,
  async fetch () {
    await new Promise(resolve => {
      resolve()
    })

    return new Response('test')
  }
}
  2. Use autocannon
$ bunx autocannon -c 100 -d 10 -p 10 --debug localhost:3000

What is the expected behavior?

I expect autocannon to return the benchmark result without ECONNRESET issues.

What do you see instead?

Multiple debug logs from autocannon showing:

Error: read ECONNRESET
    at TCP.onStreamRead (node:internal/stream_base_commons:217:20) {
  errno: -54,
  code: 'ECONNRESET',
  syscall: 'read'
}

Additional information

Adding a macrotask (setTimeout) or removing the promise solves the problem.

export default {
  port: 3000,
  async fetch () {
    await new Promise(resolve => {
      setTimeout(resolve, 1)
    })

    return new Response('test')
  }
}
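
For reference, here is a minimal sketch of the other workaround mentioned above (dropping the promise entirely); the handler body is my simplification of the original repro, not code taken from the report:

export default {
  port: 3000,
  fetch () {
    // no microtask scheduled before responding, so the resets do not occur
    return new Response('test')
  }
}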

tinchoz49 · Feb 25 '23 04:02

A few more things:

  1. I know that autocannon is not a great option for testing the Bun server, but even so, it is odd that autocannon finishes without errors when the promise is removed. Also, I'm running autocannon with bunx, so I expect to be using Bun to test Bun.

  2. It seems that microtask scheduling is slower than in Node.js:

import { run, bench } from 'mitata'

bench(async () => {
  await Promise.resolve()
})

run()
❯ bun bench.js
cpu: Apple M1 Max
runtime: bun 0.5.8 (arm64-darwin)

benchmark      time (avg)             (min … max)       p75       p99      p995
------------------------------------------------- -----------------------------
           145.31 ns/iter (138.18 ns … 280.74 ns) 141.35 ns 213.96 ns 254.32 ns
❯ node bench.js
cpu: Apple M1 Max
runtime: node v16.18.1 (arm64-darwin)

benchmark      time (avg)             (min … max)       p75       p99      p995
------------------------------------------------- -----------------------------
           112.53 ns/iter (104.81 ns … 233.58 ns) 115.95 ns 173.65 ns 220.27 ns

The difference is not huge, but maybe something is going on there.
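
As a rough cross-check (a sketch I'm adding here, not part of the numbers above), the same per-iteration cost can be estimated without mitata by timing a large batch of awaited, already-resolved promises. The iteration count and the .mjs extension (so top-level await also works under Node) are my own assumptions:

// bench-loop.mjs: rough estimate of the cost of awaiting a resolved promise
const iterations = 1_000_000

const start = performance.now()
for (let i = 0; i < iterations; i++) {
  await Promise.resolve()
}
const elapsedMs = performance.now() - start

// convert total elapsed milliseconds into nanoseconds per iteration
console.log(`${((elapsedMs * 1e6) / iterations).toFixed(2)} ns/iter`)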

Another test, this time with the Bun server and wrk:

export default {
  port: 4000,
  async fetch () {
    return new Response('hello')
  }
}
❯ wrk -t12 -c400 -d5s http://0.0.0.0:4000
Running 5s test @ http://0.0.0.0:4000
  12 threads and 400 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     2.63ms  800.60us  15.66ms   66.47%
    Req/Sec    12.53k     2.76k   36.22k    80.30%
  753594 requests in 5.10s, 86.24MB read
  Socket errors: connect 0, read 375, write 0, timeout 0
Requests/sec: 147633.79
Transfer/sec:     16.90MB

But adding a promise drops the performance by about 30k requests/sec:

export default {
  port: 4000,
  async fetch () {
    await Promise.resolve()
    return new Response('hello')
  }
}
Running 5s test @ http://0.0.0.0:4000
  12 threads and 400 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     3.42ms  389.76us  15.65ms   73.35%
    Req/Sec     9.60k   394.77    10.33k    90.17%
  573376 requests in 5.01s, 65.62MB read
  Socket errors: connect 0, read 382, write 0, timeout 0
Requests/sec: 114409.77
Transfer/sec:     13.09MB
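
One way to read the gap is that, with the promise, every request pays for at least one extra microtask tick before the response can be written. Below is a sketch (my own, not from the report; loadBody is a hypothetical helper) of a handler that only goes async when it actually receives a promise:

// hypothetical helper for illustration: sometimes sync, sometimes async
function loadBody (req) {
  return req.url.endsWith('/async') ? Promise.resolve('hello') : 'hello'
}

export default {
  port: 4000,
  fetch (req) {
    const body = loadBody(req)

    // only pay the extra microtask when there is actually a promise to await
    if (body instanceof Promise) {
      return body.then(text => new Response(text))
    }

    return new Response(body)
  }
}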

tinchoz49 · Feb 27 '23 13:02

It appears this has been fixed since Bun v1.0.

❯ bunx autocannon -c 100 -d 10 -p 10 --debug localhost:3002
Running 10s test @ http://localhost:3002
100 connections with 10 pipelining factor


┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 3 ms │ 3 ms │ 4 ms  │ 4 ms │ 3.36 ms │ 0.64 ms │ 26 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬────────┬────────┬─────────┬────────┬──────────┬─────────┬─────────┐
│ Stat      │ 1%     │ 2.5%   │ 50%     │ 97.5%  │ Avg      │ Stdev   │ Min     │
├───────────┼────────┼────────┼─────────┼────────┼──────────┼─────────┼─────────┤
│ Req/Sec   │ 243199 │ 243199 │ 251519  │ 252415 │ 249894.4 │ 3455.91 │ 243150  │
├───────────┼────────┼────────┼─────────┼────────┼──────────┼─────────┼─────────┤
│ Bytes/Sec │ 29 MB  │ 29 MB  │ 29.9 MB │ 30 MB  │ 29.7 MB  │ 409 kB  │ 28.9 MB │
└───────────┴────────┴────────┴─────────┴────────┴──────────┴─────────┴─────────┘

Req/Bytes counts sampled once per second.
# of samples: 10

2500k requests in 10.04s, 297 MB read

If you still run into this problem, please feel free to re-open this issue and we'll take another look.

Electroid · Oct 25 '23 22:10