Assert error in V1F_FetchRespHdr(), ../../../../bin/varnishd/http1/cache_http1_fetch.c line 296:
An unrelated panic from b53 on CircleCI:
**** v1 CLI RX|Panic at: Wed, 26 Oct 2022 13:11:04 GMT
**** v1 CLI RX|Assert error in V1F_FetchRespHdr(), ../../../../bin/varnishd/http1/cache_http1_fetch.c line 296:
**** v1 CLI RX| Condition(bo->vfc->resp == bo->beresp) not true.
**** v1 CLI RX|version = varnish-trunk revision 3ef5390dd417cdc4625745753bb4a715829012d0, vrt api = 16.0
**** v1 CLI RX|ident = Linux,5.4.0-1021-gcp,x86_64,-jnone,-sdefault,-sdefault,-hcritbit,epoll
**** v1 CLI RX|now = 380.022943 (mono), 1666789864.462510 (real)
**** v1 CLI RX|Backtrace:
**** v1 CLI RX| ip=0x565c2614 sp=0xf7862460 <pan_ic+0x2a5>
**** v1 CLI RX| ip=0x5665dba5 sp=0xf7862550 <VAS_Fail+0x44>
**** v1 CLI RX| ip=0x565fb2af sp=0xf7862590 <V1F_FetchRespHdr+0x899>
**** v1 CLI RX| ip=0x5658e84d sp=0xf78625f0 <vbe_dir_gethdrs+0x501>
**** v1 CLI RX| ip=0x5659e626 sp=0xf7862630 <VDI_GetHdr+0x138>
**** v1 CLI RX| ip=0x565aa211 sp=0xf78626c0 <vbf_stp_startfetch+0x381>
**** v1 CLI RX| ip=0x565aca5b sp=0xf7862710 <vbf_fetch_thread+0x40d>
**** v1 CLI RX| ip=0x565eef60 sp=0xf78627b0 <Pool_Work_Thread+0x665>
**** v1 CLI RX| ip=0x565edbda sp=0xf7862850 <WRK_Thread+0x2cc>
**** v1 CLI RX| ip=0x565ef0e0 sp=0xf7863330 <pool_thread+0x9c>
**** v1 CLI RX| ip=0xf7d2f3bd sp=0xf7863360 <start_thread+0xfd>
**** v1 CLI RX| ip=0xf7c45d86 sp=0xf7863430 <clone+0x66>
**** v1 CLI RX|argv = {
**** v1 CLI RX| [0] = \"varnishd\",
**** v1 CLI RX| [1] = \"-d\",
**** v1 CLI RX| [2] = \"-n\",
**** v1 CLI RX| [3] = \"/tmp/vtc.41380.587e0f5e/v1\",
**** v1 CLI RX| [4] = \"-i\",
**** v1 CLI RX| [5] = \"v1\",
**** v1 CLI RX| [6] = \"-l\",
**** v1 CLI RX| [7] = \"2m\",
**** v1 CLI RX| [8] = \"-p\",
**** v1 CLI RX| [9] = \"auto_restart=off\",
**** v1 CLI RX| [10] = \"-p\",
**** v1 CLI RX| [11] = \"syslog_cli_traffic=off\",
**** v1 CLI RX| [12] = \"-p\",
**** v1 CLI RX| [13] = \"thread_pool_min=10\",
**** v1 CLI RX| [14] = \"-p\",
**** v1 CLI RX| [15] = \"debug=+vtc_mode\",
**** v1 CLI RX| [16] = \"-p\",
**** v1 CLI RX| [17] = \"vsl_mask=+Debug\",
**** v1 CLI RX| [18] = \"-p\",
**** v1 CLI RX| [19] = \"h2_initial_window_size=1m\",
**** v1 CLI RX| [20] = \"-p\",
**** v1 CLI RX| [21] = \"h2_rx_window_low_water=64k\",
**** v1 CLI RX| [22] = \"-M\",
**** v1 CLI RX| [23] = \"127.0.0.1 45769\",
**** v1 CLI RX| [24] = \"-P\",
**** v1 CLI RX| [25] = \"/tmp/vtc.41380.587e0f5e/v1/varnishd.pid\",
**** v1 CLI RX| [26] = \"-p\",
**** v1 CLI RX| [27] = \"vmod_path=/workspace/varnish-trunk/_build/sub/vmod/.libs\",
**** v1 CLI RX| [28] = \"-a\",
**** v1 CLI RX| [29] = \"/tmp/vtc.41380.587e0f5e/v1.sock\",
**** v1 CLI RX| [30] = \"-a\",
**** v1 CLI RX| [31] = \"127.0.0.1:0\",
**** v1 CLI RX|}
**** v1 CLI RX|pthread.self = 0xf7863b40
**** v1 CLI RX|pthread.name = (cache-worker)
**** v1 CLI RX|pthread.attr = {
**** v1 CLI RX| guard = 4096,
**** v1 CLI RX| stack_bottom = 0xf7854000,
**** v1 CLI RX| stack_top = 0xf7864000,
**** v1 CLI RX| stack_size = 65536,
**** v1 CLI RX|}
**** v1 CLI RX|thr.req = NULL
**** v1 CLI RX|thr.busyobj = 0xf50c4018 {
**** v1 CLI RX| end = 0xf50c9000,
**** v1 CLI RX| retries = 0,
**** v1 CLI RX| sess = 0xf50af318 {
**** v1 CLI RX| fd = 24, vxid = 1000,
**** v1 CLI RX| t_open = 1666789864.461713,
**** v1 CLI RX| t_idle = 1666789864.461713,
**** v1 CLI RX| ws = 0xf50af34c {
**** v1 CLI RX| id = \"ses\",
**** v1 CLI RX| {s, f, r, e} = {0xf50af394, +48, (nil), +616},
**** v1 CLI RX| },
**** v1 CLI RX| transport = HTTP/1 {
**** v1 CLI RX| state = HTTP1::Proc
**** v1 CLI RX| }
**** v1 CLI RX| client = 0.0.0.0 0 /tmp/vtc.41380.587e0f5e/v1.sock,
**** v1 CLI RX| local.endpoint = /tmp/vtc.41380.587e0f5e/v1.sock,
**** v1 CLI RX| local.socket = a0,
**** v1 CLI RX| local.ip = 0.0.0.0:0,
**** v1 CLI RX| remote.ip = 0.0.0.0:0,
**** v1 CLI RX| server.ip = 0.0.0.0:0,
**** v1 CLI RX| client.ip = 0.0.0.0:0,
**** v1 CLI RX| },
**** v1 CLI RX| worker = 0xf7863294 {
**** v1 CLI RX| ws = 0xf78632f8 {
**** v1 CLI RX| id = \"wrk\",
**** v1 CLI RX| {s, f, r, e} = {0xf7862870, +0, (nil), +2044},
**** v1 CLI RX| },
**** v1 CLI RX| VCL::method = BACKEND_FETCH,
**** v1 CLI RX| VCL::methods = {BACKEND_FETCH},
**** v1 CLI RX| },
**** v1 CLI RX| vfc = 0xf50c5860 {
**** v1 CLI RX| failed = 1999131479,
**** v1 CLI RX| req = 0x202c6b72,
**** v1 CLI RX| resp = 0x37667830,
**** v1 CLI RX| wrk = 0x32333638,
**** v1 CLI RX| oc = 0x20293866,
**** v1 CLI RX| filters = {
**** v1 CLI RX|
---- v1 Unexpected panic
The panic dump seems to segfault before it completes...
https://app.circleci.com/pipelines/github/varnishcache/varnish-cache/4015/workflows/7d9199bf-7b01-4afa-aafe-d336ad2e03e1/jobs/43693
On Alpine, all test failures appear to happen when the child is started after it was stopped, all with this kind of output:
** top === varnish v1 -start
** v1 Start
**** v1 CLI TX|start
**** dT 2.689
*** v1 debug|Debug: Child (60739) Started
**** dT 2.859
*** v1 vsl|VSM_Map(): Could not open segment
**** dT 2.959
*** v1 vsl|VSM_Map(): Could not open segment
**** dT 2.963
*** v1 debug|Info: Child (-1) said WARNING: Could not disable ASLR
*** v1 debug|Info: Child (-1) said sizeof(struct smp_ident) = 112 = 0x70
*** v1 debug|Info: Child (-1) said sizeof(struct smp_sign) = 32 = 0x20
*** v1 debug|Info: Child (-1) said sizeof(struct smp_segptr) = 32 = 0x20
*** v1 debug|Info: Child (-1) said sizeof(struct smp_object) = 72 = 0x48
*** v1 debug|Info: Child (-1) said min_nseg = 10, max_segl = 104441
*** v1 debug|Info: Child (-1) said max_nseg = 6553, min_segl = 159
*** v1 debug|Info: Child (-1) said aim_nseg = 255, aim_segl = 4095
*** v1 debug|Info: Child (-1) said free_reserve = 40950
*** v1 debug|Error: Child failed on launch
**** v1 STDOUT EOF
https://app.circleci.com/pipelines/github/varnishcache/varnish-cache/4015/workflows/7d9199bf-7b01-4afa-aafe-d336ad2e03e1/jobs/43686
All presumably unrelated to this change, but noteworthy.
Originally posted by @Dridi in https://github.com/varnishcache/varnish-cache/issues/3859#issuecomment-1292173239
The panic happened again in https://app.circleci.com/pipelines/github/varnishcache/varnish-cache/4064/workflows/e2440611-b33c-422a-9d98-eab921278fd6/jobs/44473 for #3862.
This could be caused by 4ed6181f8d505283c879d34ac9bcbdc07074cdd9.