
src: make ReqWrap weak

Open RafaelGSS opened this pull request 1 year ago • 9 comments

This commit allows throwing an exception after creating an FSReqCallback

Reference: https://github.com/nodejs/node/pull/44004#discussion_r930964065

RafaelGSS avatar Jul 31 '22 23:07 RafaelGSS
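For readers new to the term: roughly, making the ReqWrap "weak" means the request's JS wrapper is held through a weak handle, so the garbage collector may reclaim it once nothing on the JS side references it (for example when validation throws right after the FSReqCallback was constructed), instead of the C++ side keeping it alive unconditionally. The snippet below is a minimal sketch of weak-reference semantics in plain JavaScript; it does not use Node's internal ReqWrap/FSReqCallback classes, and the `makeRequest` helper and `'fake-req-wrap'` label are made up for illustration.

```js
// A minimal sketch of weak-reference semantics; run with: node --expose-gc sketch.js
// It does NOT touch Node's internal ReqWrap; the object below is a stand-in.
'use strict';

const registry = new FinalizationRegistry((label) => {
  // May or may not print before the process exits; finalizers run lazily.
  console.log(`${label} was garbage-collected`);
});

function makeRequest() {
  const req = { oncomplete() {} };       // stand-in for a request wrap
  registry.register(req, 'fake-req-wrap');
  return new WeakRef(req);               // only a weak reference escapes
}

const ref = makeRequest();
console.log('still reachable?', ref.deref() !== undefined); // true

setTimeout(() => {
  globalThis.gc();                       // force a collection (requires --expose-gc)
  console.log('after gc:', ref.deref()); // typically undefined: nothing holds req strongly
}, 0);
```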

CI: https://ci.nodejs.org/job/node-test-pull-request/45801/

nodejs-github-bot avatar Aug 02 '22 14:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45888/

nodejs-github-bot avatar Aug 06 '22 23:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45908/

nodejs-github-bot avatar Aug 07 '22 22:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45910/

nodejs-github-bot avatar Aug 08 '22 01:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45922/

nodejs-github-bot avatar Aug 08 '22 13:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45926/

nodejs-github-bot avatar Aug 08 '22 16:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45939/

nodejs-github-bot avatar Aug 08 '22 20:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45946/

nodejs-github-bot avatar Aug 09 '22 12:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45958/

nodejs-github-bot avatar Aug 09 '22 21:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45963/

nodejs-github-bot avatar Aug 10 '22 10:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45970/

nodejs-github-bot avatar Aug 10 '22 13:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45988/

nodejs-github-bot avatar Aug 10 '22 21:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45989/

nodejs-github-bot avatar Aug 10 '22 22:08 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/45990/

nodejs-github-bot avatar Aug 10 '22 22:08 nodejs-github-bot

It looks like this PR introduces flakiness in parallel/test-net-server-max-connections on Windows, @RafaelGSS can you look into it? I don't think it should land as is.

aduh95 avatar Aug 17 '22 18:08 aduh95

It looks like this PR introduces flakiness in parallel/test-net-server-max-connections on Windows, @RafaelGSS can you look into it? I don't think it should land as is.

Sorry for the delay, busy week. I've looked into the Reliability issues and couldn't find a reference to test-net-server-max-connections. I have also run the test locally several times (on my Windows machine) and it works fine.

RafaelGSS avatar Aug 19 '22 23:08 RafaelGSS

If you look at the above CI links, you can see that it's always the same test-net-server-max-connections test that fails on 2,win2012r2-COMPILED_BY-vs2019-x86. Having to spawn 13 Jenkins jobs to get a passing CI is also a sign that something is not right. It's unfortunate that you can't reproduce it; maybe we should mark the test as flaky?

aduh95 avatar Aug 22 '22 17:08 aduh95
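For context, parallel/test-net-server-max-connections starts a TCP server with server.maxConnections set, opens a batch of client sockets, and checks which connections are accepted and which are dropped. Below is a rough, simplified sketch of that pattern, not the actual test; the counts, the 'accepted' payload, and the logging are made up for illustration.

```js
'use strict';
const net = require('net');

// Illustrative numbers only; the real test uses different counts and extra assertions.
const MAX = 5;
const TOTAL = 10;

const server = net.createServer((socket) => {
  socket.end('accepted\n');
});
server.maxConnections = MAX;             // connections beyond this are dropped

server.listen(0, () => {
  const { port } = server.address();
  let closed = 0;

  for (let i = 0; i < TOTAL; i++) {
    const client = net.connect(port);
    client.on('data', () => console.log(`client ${i}: accepted`));
    // A dropped connection may surface on the client as an 'error'
    // (e.g. ECONNRESET) and/or simply a 'close'.
    client.on('error', () => {});
    client.on('close', () => {
      console.log(`closed ${i}`);
      if (++closed === TOTAL) server.close();
    });
  }
});
```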

If you look at the above CI links, you can see that it's always the same test-net-server-max-connections test that fails on 2,win2012r2-COMPILED_BY-vs2019-x86. Having to spawn 13 Jenkins jobs to get a passing CI is also a sign that something is not right.

I realized I was testing on Windows 10, not Windows Server 2012 R2. I'll rebase on main and see if it's still flaky. If so, I'll try to get access to that machine.

RafaelGSS avatar Aug 24 '22 14:08 RafaelGSS

It looks like this PR introduces flakiness in parallel/test-net-server-max-connections on Windows, @RafaelGSS can you look into it? I don't think it should land as is.

Sorry for the delay, busy week. I've looked into the Reliability issues and couldn't find a reference to test-net-server-max-connections. I have also run the test locally several times (on my Windows machine) and it works fine.

Reliability only reports tests that have failed on more than one PR, so it will not spot failures introduced by a single PR (as they won't be occurring for other PRs).

richardlau avatar Aug 24 '22 14:08 richardlau

It looks like this PR introduces flakiness in parallel/test-net-server-max-connections on Windows, @RafaelGSS can you look into it? I don't think it should land as is.

Sorry for the delay, busy week. I've looked into the Reliability issues and couldn't find a reference to test-net-server-max-connections. I have also run the test locally several times (on my Windows machine) and it works fine.

Reliability only reports tests that have failed on more than one PR, so it will not spot failures introduced by a single PR (as they won't be occurring for other PRs).

Yeah, I realized that a bit late. I looked at the past builds in this PR, and that test was failing often, but I couldn't get a stack trace or any other details; the output seems truncated.

RafaelGSS avatar Aug 24 '22 14:08 RafaelGSS

CI: https://ci.nodejs.org/job/node-test-pull-request/46222/

nodejs-github-bot avatar Aug 24 '22 14:08 nodejs-github-bot

@richardlau @nodejs/build I've set up a Windows Server 2012 R2 machine and I couldn't reproduce this bug. Is there any chance the build machine is stuck on an old cache?


I'll run the CI again. Let's see...

RafaelGSS avatar Sep 08 '22 00:09 RafaelGSS

CI: https://ci.nodejs.org/job/node-test-pull-request/46473/

nodejs-github-bot avatar Sep 08 '22 00:09 nodejs-github-bot

Yeah, I realized that a bit late. I looked at the past builds in this PR, and that test was failing often, but I couldn't get a stack trace or any other details; the output seems truncated.

The test failed with 3221226505, which can be a memory access violation (https://docs.microsoft.com/en-us/shows/inside/c0000409), so it seems related to this PR. For tests that crash like this we do not have stack traces in the CI (that probably requires some system-level handler to be configured to symbolicate the stack frames and write them to stderr; I think in the CI that's only been done for a minority of setups on other OSes). If the memory issue depends on GC to reproduce (which is probably the case for this PR), it's not rare that it cannot be reproduced reliably and only shows up on certain systems under certain load...

joyeecheung avatar Sep 08 '22 05:09 joyeecheung
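As a quick sanity check on the numbers (a worked example, not part of the original thread): 3221226505 is the unsigned form of the Windows status code 0xC0000409 (STATUS_STACK_BUFFER_OVERRUN, the fail-fast code from the linked page), and reinterpreted as a signed 32-bit integer it is -1073740791, which matches the %ERRORLEVEL% shown further down in this thread.

```js
const code = 3221226505;
console.log(code.toString(16));                 // 'c0000409'
console.log(code | 0);                          // -1073740791 (same bits, signed 32-bit)
console.log((0xC0000409 | 0) === -1073740791);  // true
```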

On another note, the failure only reproduced on 32-bit Windows; that might be why it reproduced relatively consistently in the CI but not with a 64-bit local build. You can check https://ci.nodejs.org/job/node-compile-windows/46854/nodes=win-vs2019-x86/consoleFull for the build configuration (specifically, it used python configure --with-ltcg "--download=all" --with-intl=full-icu --dest-cpu=x86 --verbose --no-cross-compiling).

joyeecheung avatar Sep 08 '22 06:09 joyeecheung

@joyeecheung Thanks!

Unfortunately, it looks like Microsoft doesn't provide access to the 32-bit versions. They're available neither on the Azure cloud nor as an ISO download (https://www.microsoft.com/en-us/evalcenter/download-windows-server-2012-r2), so I can't reproduce it accurately (we should probably think about dropping support for it in the next major).

On another note, the failure only reproduced on 32-bit Windows; that might be why it reproduced relatively consistently in the CI but not with a 64-bit local build. You can check https://ci.nodejs.org/job/node-compile-windows/46854/nodes=win-vs2019-x86/consoleFull for the build configuration (specifically, it used python configure --with-ltcg "--download=all" --with-intl=full-icu --dest-cpu=x86 --verbose --no-cross-compiling).

That worked! It's indeed flaky:

C:\Users\rafaelgss\node>Release\node.exe test\parallel\test-net-server-max-connections.js
C:\Users\rafaelgss\node>echo %ERRORLEVEL%
-1073740791

C:\Users\rafaelgss\node>Release\node.exe test\parallel\test-net-server-max-connections.js
closed 10
closed 11
closed 12
closed 13
closed 14
closed 15
closed 16
closed 17
closed 18
closed 19
calling wait callback.
closed 0
closed 1
closed 2
closed 3
closed 4
closed 5
closed 6
closed 7
closed 8
closed 9

C:\Users\rafaelgss\node>echo %ERRORLEVEL%
0

I'll investigate.

RafaelGSS avatar Sep 08 '22 19:09 RafaelGSS
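One way to confirm this kind of intermittent crash is to re-run the test in a loop and watch the exit status. A hedged sketch (the relative test path and the 50 iterations are assumptions, and this is not how the CI runs it; start it with the freshly built node.exe so process.execPath points at the right binary):

```js
'use strict';
const { spawnSync } = require('child_process');
const path = require('path');

// Assumed path relative to the node checkout; adjust as needed.
const test = path.join('test', 'parallel', 'test-net-server-max-connections.js');

for (let i = 0; i < 50; i++) {
  const r = spawnSync(process.execPath, [test], { encoding: 'utf8' });
  if (r.status !== 0) {
    // On Windows a crash typically shows up as a status like 3221226505 (0xC0000409).
    console.log(`run ${i}: exit code ${r.status}`);
    console.log(r.stdout);
    console.error(r.stderr);
    break;
  }
  console.log(`run ${i}: ok`);
}
```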

CI: https://ci.nodejs.org/job/node-test-pull-request/46554/

nodejs-github-bot avatar Sep 13 '22 03:09 nodejs-github-bot

CI: https://ci.nodejs.org/job/node-test-pull-request/46563/

nodejs-github-bot avatar Sep 13 '22 12:09 nodejs-github-bot

The flakiness is gone; it's just hitting some other flaky tests (not related to this PR).

RafaelGSS avatar Sep 13 '22 16:09 RafaelGSS

CI: https://ci.nodejs.org/job/node-test-pull-request/46569/

nodejs-github-bot avatar Sep 13 '22 16:09 nodejs-github-bot