jsdom-worker

A jest worker process was terminated by another process

Open nfwyst opened this issue 2 years ago • 31 comments

error detail: signal=SIGSEGV, exitCode=null. Operating system logs may contain more information on why this occurred

nfwyst avatar Apr 07 '23 10:04 nfwyst

my test case:

import { QueryClient, QueryClientProvider } from '@tanstack/react-query'
import { cleanup, render } from '@testing-library/react'
import '@testing-library/jest-dom' // provides the toBeInTheDocument matcher

import { Form } from '../index'

export const client = new QueryClient({
  defaultOptions: {
    queries: {
      refetchOnWindowFocus: false,
      retry: 1
    }
  }
})

afterEach(cleanup)

it('create test', () => {
  const { container } = render(
    <QueryClientProvider client={client}>
      <Form />
    </QueryClientProvider>
  )
  const name = container.querySelector('#name')
  expect(name).toBeInTheDocument()
})

nfwyst avatar Apr 07 '23 10:04 nfwyst

hi @nfwyst did you ever figure it out? I have the same issue

Udbhav8 avatar Jun 08 '23 00:06 Udbhav8

same issue here

tavaneftekhar avatar Jun 08 '23 17:06 tavaneftekhar

Found this thread while researching the Jest bug, but I think this is caused by Jest itself: https://github.com/jestjs/jest/issues/13976

zk-ant avatar Jul 05 '23 12:07 zk-ant

+1. Critical error for a test framework.

rwilliams3088 avatar Aug 20 '23 13:08 rwilliams3088

Have the same problem.

ithrforu avatar Aug 25 '23 05:08 ithrforu

Bump. The OS logs don't show anything useful.

black-snow avatar Oct 30 '23 11:10 black-snow

Same issue.

revskill10 avatar Nov 14 '23 12:11 revskill10

same.

diemperdiem avatar Nov 15 '23 20:11 diemperdiem

same

rodrigovallades avatar Nov 17 '23 18:11 rodrigovallades

Same issue.

vgermes avatar Nov 20 '23 06:11 vgermes

Same issue

sharifsheraz avatar Nov 22 '23 05:11 sharifsheraz

PID 90503 received SIGSEGV for address: 0x0
0   segfault-handler.node               0x00000001119e1190 _ZL16segfault_handleriP9__siginfoPv + 296
1   libsystem_platform.dylib            0x000000018fa1ea24 _sigtramp + 56
2   node                                0x0000000104512fcc _ZN4node6loaderL23ImportModuleDynamicallyEN2v85LocalINS1_7ContextEEENS2_INS1_4DataEEENS2_INS1_5ValueEEENS2_INS1_6StringEEENS2_INS1_10FixedArrayEEE + 232
3   node                                0x00000001047e9a2c _ZN2v88internal7Isolate38RunHostImportModuleDynamicallyCallbackENS0_6HandleINS0_6ScriptEEENS2_INS0_6ObjectEEENS0_11MaybeHandleIS5_EE + 852
4   node                                0x0000000104bcbd9c _ZN2v88internal25Runtime_DynamicImportCallEiPmPNS0_7IsolateE + 276
5   node                                0x0000000104f152c4 Builtins_CEntry_Return1_DontSaveFPRegs_ArgvInRegister_NoBuiltinExit + 100
6   node                                0x0000000104faa1bc Builtins_CallRuntimeHandler + 92
7   node                                0x0000000104ea0198 Builtins_InterpreterEntryTrampoline + 248
8   node                                0x0000000104ea0198 Builtins_InterpreterEntryTrampoline + 248
9   node                                0x0000000104ea0198 Builtins_InterpreterEntryTrampoline + 248
10  node                                0x0000000104ea0198 Builtins_InterpreterEntryTrampoline + 248
11  ???                                 0x0000000109edd0d8 0x0 + 4461547736
12  ???                                 0x000000010a0393f0 0x0 + 4462973936
13  ???                                 0x0000000109e9c108 0x0 + 4461281544
14  ???                                 0x0000000109f88ba4 0x0 + 4462250916
15  ???                                 0x0000000109ee1f8c 0x0 + 4461567884
16  ???                                 0x0000000109ee23b8 0x0 + 4461568952
17  ???                                 0x0000000109ed79c0 0x0 + 4461525440
18  ???                                 0x0000000109fab148 0x0 + 4462391624
19  ???                                 0x0000000109ede4d8 0x0 + 4461552856
20  ???                                 0x0000000109f5bd90 0x0 + 4462067088
21  ???                                 0x0000000109f781a0 0x0 + 4462182816
22  node                                0x0000000104ea0198 Builtins_InterpreterEntryTrampoline + 248
23  node                                0x0000000104ea0198 Builtins_InterpreterEntryTrampoline + 248
24  node                                0x0000000104ea0198 Builtins_InterpreterEntryTrampoline + 248
25  node                                0x0000000104ed1ef4 Builtins_AsyncFunctionAwaitResolveClosure + 84
26  node                                0x0000000104f60738 Builtins_PromiseFulfillReactionJob + 56
27  node                                0x0000000104ec3c4c Builtins_RunMicrotasks + 588
28  node                                0x0000000104e9e3a4 Builtins_JSRunMicrotasksEntry + 164
29  node                                0x00000001047cf9ac _ZN2v88internal12_GLOBAL__N_16InvokeEPNS0_7IsolateERKNS1_12InvokeParamsE + 2680
30  node                                0x00000001047cfe9c _ZN2v88internal12_GLOBAL__N_118InvokeWithTryCatchEPNS0_7IsolateERKNS1_12InvokeParamsE + 88
31  node                                0x00000001047d0078 _ZN2v88internal9Execution16TryRunMicrotasksEPNS0_7IsolateEPNS0_14MicrotaskQueueEPNS0_11MaybeHandleINS0_6ObjectEEE + 64

daqi avatar Dec 02 '23 13:12 daqi

Same issue.

This fails:

node --expose-gc ./node_modules/.bin/jest --config ./jest.config.json --no-cache --logHeapUsage --forceExit --maxWorkers=6

This works fine:

node --expose-gc ./node_modules/.bin/jest --config ./jest.config.json --runInBand --forceExit
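The contrast above (parallel workers crash, --runInBand passes) suggests capping the worker count; the same effect can be set in the Jest config instead of on the command line. A minimal sketch — the values are assumptions for illustration, not the reporter's actual config:

```javascript
// jest.config.js — hedged sketch, not the reporter's real config.
module.exports = {
  // Run all tests in a single worker, equivalent to --runInBand:
  maxWorkers: 1,
  // Jest >= 29: restart any idle worker whose heap exceeds this limit,
  // which helps long runs avoid exhausting memory.
  workerIdleMemoryLimit: '512MB',
};
```

Raising maxWorkers back up later (e.g. 2 or 4) can narrow down how much parallelism the environment tolerates.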

Jus1x-by avatar Dec 18 '23 11:12 Jus1x-by

Same issue.

danjor avatar Dec 19 '23 09:12 danjor

Same issue.

clucca-gb avatar Dec 19 '23 14:12 clucca-gb

Same issue

bellomayowa avatar Jan 03 '24 17:01 bellomayowa

same issue

stickmy avatar Jan 05 '24 09:01 stickmy

same issue

pamelalozano16 avatar Jan 11 '24 01:01 pamelalozano16

Same issue. This started happening when I changed my Docker image from a Debian-based one to Alpine; maybe this helps.

maxidr avatar Jan 30 '24 15:01 maxidr

Same issue, any plans on this? Thanks

haskelcurry avatar Feb 14 '24 13:02 haskelcurry

I have it as well. Jest version 29.6.0

opoveshchenko avatar Feb 15 '24 15:02 opoveshchenko

Same issue, and hard to debug. It's quite possible that one test is disrupting the others, but how do you isolate it?

ThibaudAV avatar Feb 21 '24 09:02 ThibaudAV

I faced the same error message. In my case the Jest tests failed because their worker processes were killed by the OS OOM (out-of-memory) killer.

If you run Jest with the --runInBand option it runs the tests in sequence; otherwise it uses the number of available cores minus one. If your tests use a lot of memory, parallel workers can easily exhaust it.

In my case I was running the tests inside an Alpine Docker container. I needed to run it in privileged mode (docker run --privileged -e "container=docker" -it .....) to be able to see the dmesg output and find this out.

A worker can also die because the Node.js process running it has insufficient heap space. You can try setting it to 4GB as a start, for example: NODE_OPTIONS=--max_old_space_size=4096

If you are using yarn, you can also examine the heap usage with something like this: yarn node --expose-gc $(yarn test:my_app --ci --maxWorkers=1 --logHeapUsage), where yarn test:my_app runs the jest test suites.
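To check whether memory growth is the culprit, the same heapUsed figure that --logHeapUsage reports can be sampled directly with process.memoryUsage(). A minimal, self-contained sketch in plain Node — the leaky allocation is simulated for illustration, not taken from the original report:

```javascript
// Sample heap usage the way Jest's --logHeapUsage does, before and after
// an allocation, to see how much memory a "test" retains.
function heapUsedMB() {
  return process.memoryUsage().heapUsed / 1024 / 1024;
}

const before = heapUsedMB();
// Simulate a test that retains memory (e.g. a cache never cleaned up):
const retained = new Array(1_000_000).fill({ leaked: true });
const after = heapUsedMB();

console.log(
  `heap: ${before.toFixed(1)}MB -> ${after.toFixed(1)}MB ` +
  `(+${(after - before).toFixed(1)}MB, retained ${retained.length} entries)`
);
```

If deltas like this accumulate across test files, parallel workers can exhaust a container's memory budget and get killed by the OOM killer.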

vargabi avatar Feb 21 '24 15:02 vargabi

Had a similar issue:

A jest worker process (pid=X) was terminated by another process: signal=SIGBUS, exitCode=null. Operating system logs may contain more information on why this occurred.

Deleting node_modules and reinstalling fixed the issue.

garik-galstyan avatar Apr 22 '24 15:04 garik-galstyan