Bug 1401252 - Preliminary attempt to recompile 'varnish-5.0.0-1.fc26' SRPM under CentOS 7
Summary: Preliminary attempt to recompile 'varnish-5.0.0-1.fc26' SRPM under CentOS 7
Keywords:
Status: CLOSED INSUFFICIENT_DATA
Alias: None
Product: Fedora
Classification: Fedora
Component: varnish
Version: rawhide
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: unspecified
Target Milestone: ---
Assignee: Ingvar Hagelund
QA Contact: Fedora Extras Quality Assurance
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2016-12-04 00:30 UTC by George Notaras
Modified: 2016-12-04 04:01 UTC
CC: 1 user

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2016-12-04 04:00:03 UTC
Type: Bug



Description George Notaras 2016-12-04 00:30:56 UTC
I'm not sure if this is the right place to report this issue. If not, please feel free to close it.

I tried to rebuild the 'varnish-5.0.0-1.fc26.src.rpm' SRPM from Fedora Rawhide under the latest stable release of CentOS 7 (7.2.1511).
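
For reference, rebuilding an SRPM on CentOS 7 amounts to something like the following. This is only a sketch that prints the assumed commands rather than executing them; `yum-builddep` is provided by the yum-utils package, and the exact invocation used on my builder may have differed:

```shell
# Print (not execute) the assumed rebuild steps for the SRPM.
srpm="varnish-5.0.0-1.fc26.src.rpm"
printf '%s\n' \
    "yum-builddep -y $srpm" \
    "rpmbuild --rebuild $srpm"
```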

Everything went well except for a single test, 'tests/v00045.vtc', which failed.

Here is the relevant output from the RPM build process:

...
...
FAIL: tests/v00045.vtc
...
...
make[5]: Entering directory `/home/builder/rpmbuild/BUILD/varnish-5.0.0/bin/varnishtest'
make[5]: Nothing to be done for `all'.
make[5]: Leaving directory `/home/builder/rpmbuild/BUILD/varnish-5.0.0/bin/varnishtest'
===================================================
   Varnish 5.0.0: bin/varnishtest/test-suite.log
===================================================

# TOTAL: 557
# PASS:  556
# SKIP:  0
# XFAIL: 0
# FAIL:  1
# XPASS: 0
# ERROR: 0

.. contents:: :depth: 2

FAIL: tests/v00045
==================

**** top   0.0 extmacro def pwd=/home/builder/rpmbuild/BUILD/varnish-5.0.0/bin/varnishtest
**** top   0.0 extmacro def localhost=127.0.0.1
**** top   0.0 extmacro def bad_ip=192.0.2.255
**** top   0.0 extmacro def topbuild=/home/builder/rpmbuild/BUILD/varnish-5.0.0
**** top   0.0 macro def tmpdir=/tmp/vtc.29672.097e2816
*    top   0.0 TEST ./tests/v00045.vtc starting
**   top   0.0 === varnishtest "Hold a reference to a VCL after a COLD event"
*    top   0.0 TEST Hold a reference to a VCL after a COLD event
**   top   0.0 === server s1 -start
**   s1    0.0 Starting server
**** s1    0.0 macro def s1_addr=127.0.0.1
**** s1    0.0 macro def s1_port=42036
**** s1    0.0 macro def s1_sock=127.0.0.1 42036
*    s1    0.0 Listen on 127.0.0.1 42036
**   top   0.0 === varnish v1 -vcl+backend {
**   s1    0.0 Started on 127.0.0.1 42036
**   v1    0.2 Launch
***  v1    0.2 CMD: cd ${pwd} && exec varnishd  -d -n /tmp/vtc.29672.097e2816/v1 -l 2m,1m,- -p auto_restart=off -p syslog_cli_traffic=off -p sigsegv_handler=on -p thread_pool_min=10 -p debug=+vtc_mode -a '127.0.0.1:0' -M '127.0.0.1 38637' -P /tmp/vtc.29672.097e2816/v1/varnishd.pid -p vmod_path=/home/builder/rpmbuild/BUILD/varnish-5.0.0/lib/libvmod_std/.libs:/home/builder/rpmbuild/BUILD/varnish-5.0.0/lib/libvmod_debug/.libs:/home/builder/rpmbuild/BUILD/varnish-5.0.0/lib/libvmod_directors/.libs
***  v1    0.2 CMD: cd /home/builder/rpmbuild/BUILD/varnish-5.0.0/bin/varnishtest && exec varnishd  -d -n /tmp/vtc.29672.097e2816/v1 -l 2m,1m,- -p auto_restart=off -p syslog_cli_traffic=off -p sigsegv_handler=on -p thread_pool_min=10 -p debug=+vtc_mode -a '127.0.0.1:0' -M '127.0.0.1 38637' -P /tmp/vtc.29672.097e2816/v1/varnishd.pid -p vmod_path=/home/builder/rpmbuild/BUILD/varnish-5.0.0/lib/libvmod_std/.libs:/home/builder/rpmbuild/BUILD/varnish-5.0.0/lib/libvmod_debug/.libs:/home/builder/rpmbuild/BUILD/varnish-5.0.0/lib/libvmod_directors/.libs
***  v1    0.2 PID: 29714
**** v1    0.2 macro def v1_pid=29714
**** v1    0.2 macro def v1_name=/tmp/vtc.29672.097e2816/v1
***  v1    1.3 debug| Debug: Platform: Linux,3.10.0-327.36.3.el7.x86_64,x86_64,-jnone,-smalloc,-smalloc,-hcritbit\n
***  v1    1.3 debug| 200 292     \n
***  v1    1.3 debug| -----------------------------\n
***  v1    1.3 debug| Varnish Cache CLI 1.0\n
***  v1    1.3 debug| -----------------------------\n
***  v1    1.3 debug| Linux,3.10.0-327.36.3.el7.x86_64,x86_64,-jnone,-smalloc,-smalloc,-hcritbit\n
***  v1    1.3 debug| varnish-5.0.0 revision 99d036f\n
***  v1    1.3 debug| \n
***  v1    1.3 debug| Type 'help' for command list.\n
***  v1    1.3 debug| Type 'quit' to close CLI session.\n
***  v1    1.3 debug| Type 'start' to launch worker process.\n
***  v1    1.3 debug| \n
**** v1    1.4 CLIPOLL 1 0x1 0x0
***  v1    1.4 CLI connection fd = 10
***  v1    1.4 CLI RX  107
**** v1    1.4 CLI RX| cuwxjkyokzbrcluahfwmpihbpeomieaz\n
**** v1    1.4 CLI RX| \n
**** v1    1.4 CLI RX| Authentication required.\n
**** v1    1.4 CLI TX| auth a60874d64aa2b2ffff1af0905399dc4aa25c6d23a62fca75353f6505b8bec3c2\n
***  v1    1.4 CLI RX  200
**** v1    1.4 CLI RX| -----------------------------\n
**** v1    1.4 CLI RX| Varnish Cache CLI 1.0\n
**** v1    1.4 CLI RX| -----------------------------\n
**** v1    1.4 CLI RX| Linux,3.10.0-327.36.3.el7.x86_64,x86_64,-jnone,-smalloc,-smalloc,-hcritbit\n
**** v1    1.4 CLI RX| varnish-5.0.0 revision 99d036f\n
**** v1    1.4 CLI RX| \n
**** v1    1.4 CLI RX| Type 'help' for command list.\n
**** v1    1.4 CLI RX| Type 'quit' to close CLI session.\n
**** v1    1.4 CLI RX| Type 'start' to launch worker process.\n
**** v1    1.4 CLI TX| vcl.inline vcl1 << %XJEIFLH|)Xspa8P\n
**** v1    1.4 CLI TX| vcl 4.0;\n
**** v1    1.4 CLI TX| backend s1 { .host = "127.0.0.1"; .port = "42036"; }\n
**** v1    1.4 CLI TX| \n
**** v1    1.4 CLI TX| \n
**** v1    1.4 CLI TX| \timport debug;\n
**** v1    1.4 CLI TX| \tsub vcl_init {\n
**** v1    1.4 CLI TX| \t\tdebug.vcl_release_delay(3s);\n
**** v1    1.4 CLI TX| \t}\n
**** v1    1.4 CLI TX| \n
**** v1    1.4 CLI TX| %XJEIFLH|)Xspa8P\n
***  v1    2.5 CLI RX  200
**** v1    2.5 CLI RX| VCL compiled.\n
**** v1    2.5 CLI TX| vcl.use vcl1
***  v1    2.5 CLI RX  200
**   v1    2.5 Start
**** v1    2.5 CLI TX| start
***  v1    2.6 debug| Debug: Child (29857) Started\n
***  v1    2.8 CLI RX  200
***  v1    2.8 wait-running
**** v1    2.8 CLI TX| status
***  v1    2.8 debug| Info: Child (29857) said Child starts\n
***  v1    2.8 CLI RX  200
**** v1    2.8 CLI RX| Child in state running
**** v1    2.8 CLI TX| debug.xid 999
***  v1    2.9 CLI RX  200
**** v1    2.9 CLI RX| XID is 999
**** v1    2.9 CLI TX| debug.listen_address
***  v1    2.9 CLI RX  200
**** v1    2.9 CLI RX| 127.0.0.1 41492\n
**   v1    2.9 Listen on 127.0.0.1 41492
**** v1    2.9 macro def v1_addr=127.0.0.1
**** v1    2.9 macro def v1_port=41492
**** v1    2.9 macro def v1_sock=127.0.0.1 41492
**   top   2.9 === varnish v1 -vcl+backend {}
**** v1    2.9 CLI TX| vcl.inline vcl2 << %XJEIFLH|)Xspa8P\n
**** v1    2.9 CLI TX| vcl 4.0;\n
**** v1    2.9 CLI TX| backend s1 { .host = "127.0.0.1"; .port = "42036"; }\n
**** v1    2.9 CLI TX| \n
**** v1    2.9 CLI TX| \n
**** v1    2.9 CLI TX| %XJEIFLH|)Xspa8P\n
***  v1    4.9 CLI RX  200
**** v1    4.9 CLI RX| VCL compiled.\n
**** v1    4.9 CLI TX| vcl.use vcl2
***  v1    4.9 CLI RX  200
**** v1    4.9 CLI RX| VCL 'vcl2' now active
**   top   4.9 === varnish v1 -cliok "vcl.state vcl1 cold"
**** v1    4.9 CLI TX| vcl.state vcl1 cold
***  v1    4.9 CLI RX  200
**   v1    4.9 CLI 200 <vcl.state vcl1 cold>
**   top   4.9 === delay 1
***  top   4.9 delaying 1 second(s)
**   top   5.9 === shell {
**** top   5.9 shell| \n
**** top   5.9 shell| \tvarnishadm -n /tmp/vtc.29672.097e2816/v1 vcl.list |\n
**** top   5.9 shell| \tgrep "auto/cooling.*vcl1" >/dev/null\n
**   top   6.4 === delay 1
***  top   6.4 delaying 1 second(s)
**   top   7.4 === shell {
**** top   7.4 shell| \n
**** top   7.4 shell| \tvarnishadm -n /tmp/vtc.29672.097e2816/v1 vcl.state vcl1 warm 2>/dev/null |\n
**** top   7.4 shell| \tgrep "vmod-debug ref on vcl1" >/dev/null\n
---- top   8.0 CMD '
        varnishadm -n /tmp/vtc.29672.097e2816/v1 vcl.state vcl1 warm 2>/dev/null |
        grep "vmod-debug ref on vcl1" >/dev/null
' failed with status 1 (Success)
*    top   8.0 RESETTING after ./tests/v00045.vtc
**   s1    8.0 Waiting for server (4/-1)
**** s1    8.0 macro undef s1_addr
**** s1    8.0 macro undef s1_port
**** s1    8.0 macro undef s1_sock
**   v1    8.0 Wait
**** v1    8.0 CLI TX| backend.list
***  v1    8.1 CLI RX  200
**** v1    8.1 CLI RX| Backend name                   Admin      Probe                Last updated\n
**** v1    8.1 CLI RX| vcl2.s1                        probe      Healthy (no probe)   Sat, 03 Dec 2016 20:44:44 GMT
***  v1    8.1 debug| Debug: Stopping Child\n
***  v1    9.1 debug| Info: Child (29857) ended\n
***  v1    9.1 debug| Info: Child (29857) said Child dies\n
***  v1    9.1 debug| Debug: Child cleanup complete\n
**** v1    9.1 STDOUT poll 0x11
**   v1    9.1 R 29714 Status: 0000 (u 0.717461 s 0.690157)
*    top   9.1 TEST ./tests/v00045.vtc FAILED

#     top  TEST ./tests/v00045.vtc FAILED (9.169) exit=1

============================================================================
Testsuite summary for Varnish 5.0.0
============================================================================
# TOTAL: 557
# PASS:  556
# SKIP:  0
# XFAIL: 0
# FAIL:  1
# XPASS: 0
# ERROR: 0
============================================================================
See bin/varnishtest/test-suite.log
Please report to varnish-dev
============================================================================
make[4]: *** [test-suite.log] Error 1
make[4]: Leaving directory `/home/builder/rpmbuild/BUILD/varnish-5.0.0/bin/varnishtest'
make[3]: *** [check-TESTS] Error 2
make[3]: Leaving directory `/home/builder/rpmbuild/BUILD/varnish-5.0.0/bin/varnishtest'
make[2]: *** [check-am] Error 2
make[2]: Leaving directory `/home/builder/rpmbuild/BUILD/varnish-5.0.0/bin/varnishtest'
make[1]: *** [check-recursive] Error 1
make[1]: Leaving directory `/home/builder/rpmbuild/BUILD/varnish-5.0.0/bin'
make: *** [check-recursive] Error 1
error: Bad exit status from /home/builder/rpmbuild/tmp/rpm-tmp.DrHuBz (%check)


RPM build errors:
    Bad exit status from /home/builder/rpmbuild/tmp/rpm-tmp.DrHuBz (%check)
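
For what it's worth, the step that fails is the shell check inside the test, which greps the reply of `vcl.state vcl1 warm` for a vmod reference message. A minimal sketch of that check with the replies simulated (both reply texts below are hypothetical; only the grep pattern is taken from the log above):

```shell
# The test expects warming vcl1 to be refused while vmod-debug still holds
# a reference (vcl_release_delay(3s) is set in vcl_init), and greps the
# reply for that refusal.
pattern='vmod-debug ref on vcl1'

# Expected reply while the reference is still held: grep matches, check passes.
printf '%s\n' "Cannot warm: vmod-debug ref on vcl1" | grep "$pattern"

# Apparent behaviour in this run: the warm succeeded early (reply text is a
# guess), so grep matches nothing and exits 1 -- the "failed with status 1"
# seen in the log above.
printf '%s\n' "VCL 'vcl1' now warm" | grep "$pattern" || echo "grep exit status: $?"
```

This suggests a timing issue: on a slow builder the 3-second release delay may have elapsed before the check ran, so vcl1 warmed without the expected refusal.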

Comment 1 George Notaras 2016-12-04 04:00:03 UTC
Since I noticed some problems with my build server, I'm closing this bug report and will submit a new one once I try to recompile the RPM in a clean environment.

I'm sorry about this.

