
FAIL tests/c00057.vtc on MacOS X 14.5 with Varnish branch 6.0 LTS and master branch #4124

Open
dwalrond opened this issue Jun 17, 2024 · 8 comments

Comments

@dwalrond

dwalrond commented Jun 17, 2024

Expected Behavior

make check has no failures

Current Behavior

====================================================
   Varnish 6.0.13: bin/varnishtest/test-suite.log
====================================================

# TOTAL: 802
# PASS:  772
# SKIP:  29
# XFAIL: 0
# FAIL:  1
# XPASS: 0
# ERROR: 0

FAIL: tests/c00057
==================

**** top   0.0 extmacro def pwd=/Users/dawa/src/varnish-cache/bin/varnishtest
**** top   0.0 extmacro def date(...)
**** top   0.0 extmacro def string(...)
**** top   0.0 extmacro def localhost=127.0.0.1
**** top   0.0 extmacro def bad_backend=127.0.0.1 56450
**** top   0.0 extmacro def bad_ip=192.0.2.255
**** top   0.0 extmacro def topbuild=/Users/dawa/src/varnish-cache
**** top   0.0 macro def testdir=/Users/dawa/src/varnish-cache/bin/varnishtest/./tests
**** top   0.0 macro def tmpdir=/var/folders/cw/hxx9vcj55s72xc8yrxn53r540000gn/T//vtc.54272.3078712e
*    top   0.0 TEST ./tests/c00057.vtc starting
**   top   0.0 === varnishtest	"test sigsegv handler"
*    top   0.0 TEST test sigsegv handler
**   top   0.0 === feature cmd "test -z $(env | grep ASAN_OPTIONS)"
**   top   0.0 === server s1 {
**   s1    0.0 Starting server
**** s1    0.0 macro def s1_addr=127.0.0.1
**** s1    0.0 macro def s1_port=56451
**** s1    0.0 macro def s1_sock=127.0.0.1 56451
*    s1    0.0 Listen on 127.0.0.1 56451
**   top   0.0 === varnish v1 \
**   s1    0.0 Started on 127.0.0.1 56451 (1 iterations)
**   v1    0.0 Launch
***  v1    0.0 CMD: cd ${pwd} && exec varnishd  -d -n /var/folders/cw/hxx9vcj55s72xc8yrxn53r540000gn/T//vtc.54272.3078712e/v1 -l 2m -p auto_restart=off -p syslog_cli_traffic=off -p sigsegv_handler=on -p thread_pool_min=10 -p debug=+vtc_mode -p vsl_mask=+Debug,+H2RxHdr,+H2RxBody -p h2_initial_window_size=1m -p h2_rx_window_low_water=64k -a '127.0.0.1:0' -M '127.0.0.1 56452' -P /var/folders/cw/hxx9vcj55s72xc8yrxn53r540000gn/T//vtc.54272.3078712e/v1/varnishd.pid -p vmod_path=/Users/dawa/src/varnish-cache/lib/libvmod_std/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_debug/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_directors/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_purge/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_vtc/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_blob/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_unix/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_proxy/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_h2/.libs  -p feature=+no_coredump -p vcc_allow_inline_c=true -p thread_pool_stack=48k
***  v1    0.0 CMD: cd /Users/dawa/src/varnish-cache/bin/varnishtest && exec varnishd  -d -n /var/folders/cw/hxx9vcj55s72xc8yrxn53r540000gn/T//vtc.54272.3078712e/v1 -l 2m -p auto_restart=off -p syslog_cli_traffic=off -p sigsegv_handler=on -p thread_pool_min=10 -p debug=+vtc_mode -p vsl_mask=+Debug,+H2RxHdr,+H2RxBody -p h2_initial_window_size=1m -p h2_rx_window_low_water=64k -a '127.0.0.1:0' -M '127.0.0.1 56452' -P /var/folders/cw/hxx9vcj55s72xc8yrxn53r540000gn/T//vtc.54272.3078712e/v1/varnishd.pid -p vmod_path=/Users/dawa/src/varnish-cache/lib/libvmod_std/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_debug/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_directors/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_purge/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_vtc/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_blob/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_unix/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_proxy/.libs:/Users/dawa/src/varnish-cache/lib/libvmod_h2/.libs  -p feature=+no_coredump -p vcc_allow_inline_c=true -p thread_pool_stack=48k
***  v1    0.0 PID: 54294
**** v1    0.0 macro def v1_pid=54294
**** v1    0.0 macro def v1_name=/var/folders/cw/hxx9vcj55s72xc8yrxn53r540000gn/T//vtc.54272.3078712e/v1
***  v1    0.0 debug|Debug: Version: varnish-6.0.13 revision ec8f05464fbed40565103beb4a1d834a1f5b24a1
***  v1    0.0 debug|Debug: Platform: Darwin,23.5.0,arm64,-jnone,-sdefault,-sdefault,-hcritbit
***  v1    0.0 debug|200 308
***  v1    0.0 debug|-----------------------------
***  v1    0.0 debug|Varnish Cache CLI 1.0
***  v1    0.0 debug|-----------------------------
***  v1    0.0 debug|Darwin,23.5.0,arm64,-jnone,-sdefault,-sdefault,-hcritbit
***  v1    0.0 debug|varnish-6.0.13 revision ec8f05464fbed40565103beb4a1d834a1f5b24a1
***  v1    0.0 debug|
***  v1    0.0 debug|Type 'help' for command list.
***  v1    0.0 debug|Type 'quit' to close CLI session.
***  v1    0.0 debug|Type 'start' to launch worker process.
***  v1    0.0 debug|
**** v1    0.1 CLIPOLL 1 0x1 0x0
***  v1    0.1 CLI connection fd = 6
***  v1    0.1 CLI RX  107
**** v1    0.1 CLI RX|gzhaveqyqcpnsrlqcreubmmwrkqaojxa
**** v1    0.1 CLI RX|
**** v1    0.1 CLI RX|Authentication required.
**** v1    0.1 CLI TX|auth d91b9b7a5f3ac30008c3dc480d674d1062c95859bd9ef6d3f4c38134b8cb3c82
***  v1    0.1 CLI RX  200
**** v1    0.1 CLI RX|-----------------------------
**** v1    0.1 CLI RX|Varnish Cache CLI 1.0
**** v1    0.1 CLI RX|-----------------------------
**** v1    0.1 CLI RX|Darwin,23.5.0,arm64,-jnone,-sdefault,-sdefault,-hcritbit
**** v1    0.1 CLI RX|varnish-6.0.13 revision ec8f05464fbed40565103beb4a1d834a1f5b24a1
**** v1    0.1 CLI RX|
**** v1    0.1 CLI RX|Type 'help' for command list.
**** v1    0.1 CLI RX|Type 'quit' to close CLI session.
**** v1    0.1 CLI RX|Type 'start' to launch worker process.
**** v1    0.1 CLI TX|vcl.inline vcl1 << %XJEIFLH|)Xspa8P
**** v1    0.1 CLI TX|vcl 4.1;
**** v1    0.1 CLI TX|backend s1 { .host = "127.0.0.1"; .port = "56451"; }
**** v1    0.1 CLI TX|
**** v1    0.1 CLI TX|
**** v1    0.1 CLI TX|\tC{
**** v1    0.1 CLI TX|#include <signal.h>
**** v1    0.1 CLI TX|#include <unistd.h>
**** v1    0.1 CLI TX|#include <stdio.h>
**** v1    0.1 CLI TX|
**** v1    0.1 CLI TX|static void _accessor(volatile char *p) {
**** v1    0.1 CLI TX|     p[0] = 'V'; p[1] = '\0';
**** v1    0.1 CLI TX|     fprintf(stderr, "%p %s\n", p, p);
**** v1    0.1 CLI TX|}
**** v1    0.1 CLI TX|void (*accessor)(volatile char *p) = _accessor;
**** v1    0.1 CLI TX|
**** v1    0.1 CLI TX|}C
**** v1    0.1 CLI TX|\tsub vcl_recv { C{
**** v1    0.1 CLI TX|\t    int i;
**** v1    0.1 CLI TX|\t    volatile char overflow[48*1024];
**** v1    0.1 CLI TX|
**** v1    0.1 CLI TX|\t    /* for downwards stack, take care to hit a single guard page */
**** v1    0.1 CLI TX|\t    for (i = 47*1024; i >= 0; i -= 1024)
**** v1    0.1 CLI TX|\t\taccessor(overflow + i);
**** v1    0.1 CLI TX|\t    /* NOTREACHED */
**** v1    0.1 CLI TX|\t    sleep(2);
**** v1    0.1 CLI TX|\t}C }
**** v1    0.1 CLI TX|
**** v1    0.1 CLI TX|%XJEIFLH|)Xspa8P
***  v1    0.2 vsl|No VSL chunk found (child not started ?)
***  v1    0.3 vsl|No VSL chunk found (child not started ?)
***  v1    0.4 CLI RX  200
**** v1    0.4 CLI RX|VCL compiled.
**** v1    0.4 CLI TX|vcl.use vcl1
***  v1    0.4 CLI RX  200
**   v1    0.4 Start
**** v1    0.4 CLI TX|start
***  v1    0.4 debug|Debug: Child (54303) Started
***  v1    0.5 CLI RX  200
***  v1    0.5 wait-running
**** v1    0.5 CLI TX|status
***  v1    0.5 debug|Info: Child (54303) said Child starts
***  v1    0.5 CLI RX  200
**** v1    0.5 CLI RX|Child in state running
**** v1    0.5 CLI TX|debug.listen_address
***  v1    0.5 CLI RX  200
**** v1    0.5 CLI RX|127.0.0.1 56453
**** v1    0.5 CLI TX|debug.xid 999
***  v1    0.5 CLI RX  200
**** v1    0.5 CLI RX|XID is 999
**** v1    0.5 CLI TX|debug.listen_address
***  v1    0.5 CLI RX  200
**** v1    0.5 CLI RX|127.0.0.1 56453
**   v1    0.5 Listen on 127.0.0.1 56453
**** v1    0.5 macro def v1_addr=127.0.0.1
**** v1    0.5 macro def v1_port=56453
**** v1    0.5 macro def v1_sock=127.0.0.1 56453
**** v1    0.5 vsl|          0 CLI             - Rd vcl.load "vcl1" vcl_vcl1.1718617500.910606/vgc.so 1auto
**** v1    0.5 vsl|          0 CLI             - Wr 200 52 Loaded "vcl_vcl1.1718617500.910606/vgc.so" as "vcl1"
**** v1    0.5 vsl|          0 CLI             - Rd vcl.use "vcl1"
**** v1    0.5 vsl|          0 CLI             - Wr 200 0
**** v1    0.5 vsl|          0 CLI             - Rd start
**** v1    0.5 vsl|          0 Debug           - sockopt: Setting SO_LINGER for a0=127.0.0.1:56453
**** v1    0.5 vsl|          0 Debug           - sockopt: Setting SO_KEEPALIVE for a0=127.0.0.1:56453
**** v1    0.5 vsl|          0 Debug           - sockopt: Setting SO_SNDTIMEO for a0=127.0.0.1:56453
**** v1    0.5 vsl|          0 Debug           - sockopt: Setting SO_RCVTIMEO for a0=127.0.0.1:56453
**** v1    0.5 vsl|          0 Debug           - sockopt: Setting TCP_NODELAY for a0=127.0.0.1:56453
**** v1    0.5 vsl|          0 CLI             - Wr 200 0
**** v1    0.5 vsl|          0 CLI             - Rd debug.listen_address
**** v1    0.5 vsl|          0 CLI             - Wr 200 16 127.0.0.1 56453

**** v1    0.5 vsl|          0 CLI             - Rd debug.xid 999
**** v1    0.5 vsl|          0 CLI             - Wr 200 10 XID is 999
**** v1    0.5 vsl|          0 CLI             - Rd debug.listen_address
**** v1    0.5 vsl|          0 CLI             - Wr 200 16 127.0.0.1 56453

**   top   0.6 === client c1 {
**   c1    0.6 Starting client
**   c1    0.6 Waiting for client
***  c1    0.6 Connect to 127.0.0.1 56453
***  c1    0.6 connected fd 16 from 127.0.0.1 56457 to 127.0.0.1 56453
**   c1    0.6 === txreq
**** c1    0.6 txreq|GET / HTTP/1.1\r
**** c1    0.6 txreq|Host: 127.0.0.1\r
**** c1    0.6 txreq|\r
**   c1    0.6 === expect_close
**** c1    0.6 Expecting close (fd = 16)
***  v1    0.6 debug|Info: Child (54303) said 0x16ff51cf8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff518f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff514f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff510f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff50cf8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff508f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff504f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff500f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4fcf8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4f8f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4f4f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4f0f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4ecf8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4e8f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4e4f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4e0f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4dcf8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4d8f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4d4f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4d0f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4ccf8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4c8f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4c4f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4c0f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4bcf8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4b8f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4b4f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4b0f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4acf8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4a8f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4a4f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff4a0f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff49cf8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff498f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff494f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff490f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff48cf8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff488f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff484f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff480f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff47cf8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff478f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff474f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff470f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff46cf8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff468f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff464f8 V
***  v1    0.6 debug|Info: Child (54303) said 0x16ff460f8 V
**** v1    0.6 vsl|       1000 Begin           c sess 0 HTTP/1
**** v1    0.6 vsl|       1000 SessOpen        c 127.0.0.1 56457 a0 127.0.0.1 56453 1718617501.344371 22
**** v1    0.6 vsl|       1000 Debug           c sockopt: Test confirmed SO_KEEPALIVE non heredity for a0=127.0.0.1:56453
**** v1    0.6 vsl|       1000 Debug           c sockopt: Test confirmed SO_SNDTIMEO non heredity for a0=127.0.0.1:56453
**** v1    0.6 vsl|       1000 Debug           c sockopt: Test confirmed SO_RCVTIMEO non heredity for a0=127.0.0.1:56453
**** v1    0.6 vsl|       1000 Debug           c sockopt: Test confirmed TCP_NODELAY non heredity for a0=127.0.0.1:56453
**** v1    0.6 vsl|       1000 Debug           c sockopt: SO_LINGER may be inherited for a0=127.0.0.1:56453
**** v1    0.6 vsl|       1000 Debug           c sockopt: Setting SO_KEEPALIVE for a0=127.0.0.1:56453
**** v1    0.6 vsl|       1000 Debug           c sockopt: Setting SO_SNDTIMEO for a0=127.0.0.1:56453
**** v1    0.6 vsl|       1000 Debug           c sockopt: Setting SO_RCVTIMEO for a0=127.0.0.1:56453
**** v1    0.6 vsl|       1000 Debug           c sockopt: Setting TCP_NODELAY for a0=127.0.0.1:56453
**** v1    0.6 vsl|       1000 Link            c req 1001 rxreq
***  s1    2.6 accepted fd 17 127.0.0.1 56471
**   s1    2.6 === rxreq
**** s1    2.6 rxhdr|GET / HTTP/1.1\r
**** s1    2.6 rxhdr|Host: 127.0.0.1\r
**** s1    2.6 rxhdr|X-Forwarded-For: 127.0.0.1\r
**** s1    2.6 rxhdr|Accept-Encoding: gzip\r
**** s1    2.6 rxhdr|X-Varnish: 1002\r
**** s1    2.6 rxhdr|\r
**** s1    2.6 rxhdrlen = 103
**** s1    2.6 http[ 0] |GET
**** s1    2.6 http[ 1] |/
**** s1    2.6 http[ 2] |HTTP/1.1
**** s1    2.6 http[ 3] |Host: 127.0.0.1
**** s1    2.6 http[ 4] |X-Forwarded-For: 127.0.0.1
**** s1    2.6 http[ 5] |Accept-Encoding: gzip
**** s1    2.6 http[ 6] |X-Varnish: 1002
**** s1    2.6 bodylen = 0
**   s1    2.6 === txresp
**** s1    2.6 txresp|HTTP/1.1 200 OK\r
**** s1    2.6 txresp|Content-Length: 0\r
**** s1    2.6 txresp|\r
***  s1    2.6 shutting fd 17
**   s1    2.6 Ending
---- c1    2.6 Expecting close: read = 1, c = 0x48
*    top   2.6 RESETTING after ./tests/c00057.vtc
**   s1    2.6 Waiting for server (4/-1)
**   v1    2.6 Wait
**** v1    2.6 CLI TX|panic.clear
***  v1    2.6 CLI RX  300
**** v1    2.6 CLI RX|No panic to clear
***  v1    2.6 debug|Info: manager stopping child
***  v1    2.6 debug|Debug: Stopping Child
**** v1    2.6 vsl|          0 ExpKill         - EXP_Inbox flg=1e p=0x124e05bf0 e=0.000000 f=0x0
**** v1    2.6 vsl|          0 ExpKill         - EXP_When p=0x124e05bf0 e=1718617633.347532 f=0x1e
**** v1    2.6 vsl|          0 ExpKill         - EXP_expire p=0x124e05bf0 e=129.987463 f=0x0
**** v1    2.6 vsl|       1002 Begin           b bereq 1001 fetch
**** v1    2.6 vsl|       1002 VCL_use         b vcl1
**** v1    2.6 vsl|       1002 Timestamp       b Start: 1718617503.346064 0.000000 0.000000
**** v1    2.6 vsl|       1002 BereqMethod     b GET
**** v1    2.6 vsl|       1002 BereqURL        b /
**** v1    2.6 vsl|       1002 BereqProtocol   b HTTP/1.1
**** v1    2.6 vsl|       1002 BereqHeader     b Host: 127.0.0.1
**** v1    2.6 vsl|       1002 BereqHeader     b X-Forwarded-For: 127.0.0.1
**** v1    2.6 vsl|       1002 BereqHeader     b Accept-Encoding: gzip
**** v1    2.6 vsl|       1002 BereqHeader     b X-Varnish: 1002
**** v1    2.6 vsl|       1002 VCL_call        b BACKEND_FETCH
**** v1    2.6 vsl|       1002 VCL_return      b fetch
**** v1    2.6 vsl|       1002 BackendOpen     b 26 vcl1.s1 127.0.0.1 56451 127.0.0.1 56471
**** v1    2.6 vsl|       1002 BackendStart    b 127.0.0.1 56451
**** v1    2.6 vsl|       1002 Timestamp       b Bereq: 1718617503.346691 0.000627 0.000627
**** v1    2.6 vsl|       1002 Timestamp       b Beresp: 1718617503.347532 0.001468 0.000841
**** v1    2.6 vsl|       1002 BerespProtocol  b HTTP/1.1
**** v1    2.6 vsl|       1002 BerespStatus    b 200
**** v1    2.6 vsl|       1002 BerespReason    b OK
**** v1    2.6 vsl|       1002 BerespHeader    b Content-Length: 0
**** v1    2.6 vsl|       1002 BerespHeader    b Date: Mon, 17 Jun 2024 09:45:03 GMT
**** v1    2.6 vsl|       1002 TTL             b RFC 120 10 0 1718617503 1718617503 1718617503 0 0 cacheable
**** v1    2.6 vsl|       1002 VCL_call        b BACKEND_RESPONSE
**** v1    2.6 vsl|       1002 VCL_return      b deliver
**** v1    2.6 vsl|       1002 Storage         b malloc s0
**** v1    2.6 vsl|       1002 Fetch_Body      b 0 none -
**** v1    2.6 vsl|       1002 BackendReuse    b 26 vcl1.s1
**** v1    2.6 vsl|       1002 Timestamp       b BerespBody: 1718617503.360049 0.013985 0.012517
**** v1    2.6 vsl|       1002 Length          b 0
**** v1    2.6 vsl|       1002 BereqAcct       b 103 0 103 38 0 38
**** v1    2.6 vsl|       1002 End             b
**** v1    2.6 vsl|       1001 Begin           c req 1000 rxreq
**** v1    2.6 vsl|       1001 Timestamp       c Start: 1718617501.344564 0.000000 0.000000
**** v1    2.6 vsl|       1001 Timestamp       c Req: 1718617501.344564 0.000000 0.000000
**** v1    2.6 vsl|       1001 ReqStart        c 127.0.0.1 56457 a0
**** v1    2.6 vsl|       1001 ReqMethod       c GET
**** v1    2.6 vsl|       1001 ReqURL          c /
**** v1    2.6 vsl|       1001 ReqProtocol     c HTTP/1.1
**** v1    2.6 vsl|       1001 ReqHeader       c Host: 127.0.0.1
**** v1    2.6 vsl|       1001 ReqHeader       c X-Forwarded-For: 127.0.0.1
**** v1    2.6 vsl|       1001 VCL_call        c RECV
**** v1    2.6 vsl|       1001 VCL_return      c hash
**** v1    2.6 vsl|       1001 VCL_call        c HASH
**** v1    2.6 vsl|       1001 VCL_return      c lookup
**** v1    2.6 vsl|       1001 VCL_call        c MISS
**** v1    2.6 vsl|       1001 VCL_return      c fetch
**** v1    2.6 vsl|       1001 Link            c bereq 1002 fetch
**** v1    2.6 vsl|       1001 Timestamp       c Fetch: 1718617503.360090 2.015526 2.015526
**** v1    2.6 vsl|       1001 RespProtocol    c HTTP/1.1
**** v1    2.6 vsl|       1001 RespStatus      c 200
**** v1    2.6 vsl|       1001 RespReason      c OK
**** v1    2.6 vsl|       1001 RespHeader      c Content-Length: 0
**** v1    2.6 vsl|       1001 RespHeader      c Date: Mon, 17 Jun 2024 09:45:03 GMT
**** v1    2.6 vsl|       1001 RespHeader      c X-Varnish: 1001
**** v1    2.6 vsl|       1001 RespHeader      c Age: 0
**** v1    2.6 vsl|       1001 RespHeader      c Via: 1.1 varnish (Varnish/6.0)
**** v1    2.6 vsl|       1001 VCL_call        c DELIVER
**** v1    2.6 vsl|       1001 VCL_return      c deliver
**** v1    2.6 vsl|       1001 Timestamp       c Process: 1718617503.360121 2.015557 0.000031
**** v1    2.6 vsl|       1001 RespHeader      c Accept-Ranges: bytes
**** v1    2.6 vsl|       1001 RespHeader      c Connection: keep-alive
**** v1    2.6 vsl|       1001 Timestamp       c Resp: 1718617503.360273 2.015709 0.000152
**** v1    2.6 vsl|       1001 ReqAcct         c 35 0 35 178 0 178
**** v1    2.6 vsl|       1001 End             c
**** v1    2.6 vsl|          0 CLI             - EOF on CLI connection, worker stops
***  v1    2.7 debug|Info: Child (54303) ended
***  v1    2.7 debug|Info: Child (54303) said Child dies
***  v1    2.7 debug|Debug: Child cleanup complete
***  v1    2.7 debug|Info: manager dies
**** v1    2.7 STDOUT poll 0x11
**   v1    2.7 WAIT4 pid=54294 status=0x0000 (user 0.082511 sys 0.074753)
*    top   2.7 TEST ./tests/c00057.vtc FAILED
#    top  TEST ./tests/c00057.vtc FAILED (2.729) exit=2
FAIL tests/c00057.vtc (exit status: 2)

Possible Solution

No response

Steps to Reproduce (for bugs)

  1. git clone https://github.com/varnishcache/varnish-cache.git
  2. ./autogen.sh
  3. ./configure
  4. make
  5. make check

The same test failed when I built from the release tarball https://varnish-cache.org/downloads/varnish-6.0.13.tgz with commit a8dd3c8cad8bea29f378a68823f96e36a9cbfe17 ("Cast to (u)intmax_t for %ju and %jd") applied so that it would build.

Context

I am building Varnish 6.0 LTS mainly for access to varnishtest, to match the version running on our production servers.

Varnish Cache version

varnishd (varnish-6.0.13 revision ec8f054)

Operating system

Mac OS X 14.5

Source of binary packages used (if any)

No response

@dridi
Member

dridi commented Jun 24, 2024

Does it pass if you apply 88948d9 to the 6.0 branch?

@dwalrond
Author

It does not pass with 88948d9 applied to the 6.0 branch. The same test also fails at the HEAD of master (commit 23ec1fc was HEAD when I compiled). I will adjust the issue title accordingly.

@dwalrond dwalrond changed the title FAIL tests/c00057.vtc on MacOS X 14.5 with Varnish branch 6.0 LTS FAIL tests/c00057.vtc on MacOS X 14.5 with Varnish branch 6.0 LTS and master branch Jun 25, 2024
@dridi
Member

dridi commented Jun 25, 2024

We don't know what MacOS is doing such that the intentional segmentation fault no longer occurs. We don't consider this important; just ignore or remove the test case before building. Please note that this test case passes on Darwin (MacOS's base) in our CI.

@bsdphk
Contributor

bsdphk commented Jul 2, 2024

That CI box is running a pretty old MacOS though...

@nigoroll
Member

nigoroll commented Jul 8, 2024

@dwalrond could you please get the return value of sysconf(_SC_THREAD_STACK_MIN) for your system?

@nigoroll
Member

nigoroll commented Jul 8, 2024

Actually, although the dynamic parameter limits have gone through a couple of rework iterations, we have reported the minimum since 9008a85.
So varnishadm param.show thread_pool_stack on a running instance should report the correct Minimum value. Could you please report it, @dwalrond?

@dwalrond
Author

dwalrond commented Jul 8, 2024

The return value of sysconf(_SC_THREAD_STACK_MIN) is 16384.

The output of varnishadm param.show thread_pool_stack is:

-----------------------------
Varnish Cache CLI 1.0
-----------------------------
Darwin,23.5.0,arm64,-jnone,-smalloc,-sdefault,-hcritbit
varnish-6.0.13 revision ec8f05464fbed40565103beb4a1d834a1f5b24a1

Type 'help' for command list.
Type 'quit' to close CLI session.

varnish> param.show thread_pool_stack
200
thread_pool_stack
        Value is: 64k [bytes] (default)
        Minimum is: 16k
...

On Darwin, is jemalloc disabled? Does that mean there are no guard pages, and does test c00057.vtc rely on them?

@nigoroll
Copy link
Member

nigoroll commented Jul 8, 2024

test c00057.vtc relies upon the guard pages?

Yes. The test assumes a downward-growing stack and stack overflow detection via a red zone / guard page of at least 2KB.
