Red Hat Bugzilla – Attachment 1466801 Details for Bug 1605900 – python-scrapy: FTBFS in Fedora rawhide
build.log (text/plain), 25.25 KB, created by Mohan Boddu on 2018-07-20 17:32:04 UTC
Mock Version: 1.3.4
Mock Version: 1.3.4
ENTER ['do'](['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target noarch --nodeps /builddir/build/SPECS/python-scrapy.spec'], chrootPath='/var/lib/mock/f29-build-13056306-949986/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'en_US.UTF-8'}shell=Falselogger=<mockbuild.trace_decorator.getLog object at 0x7f50570a4fd0>timeout=172800uid=1000gid=425user='mockbuild'nspawn_args=[]printOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -bs --target noarch --nodeps /builddir/build/SPECS/python-scrapy.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'en_US.UTF-8'} and shell False
Building target platforms: noarch
Building for target noarch
Wrote: /builddir/build/SRPMS/python-scrapy-1.5.0-3.fc29.src.rpm
Child return code was: 0
ENTER ['do'](['bash', '--login', '-c', '/usr/bin/rpmbuild -bb --target noarch --nodeps /builddir/build/SPECS/python-scrapy.spec'], chrootPath='/var/lib/mock/f29-build-13056306-949986/root'env={'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'en_US.UTF-8'}shell=Falselogger=<mockbuild.trace_decorator.getLog object at 0x7f50570a4fd0>timeout=172800uid=1000gid=425user='mockbuild'nspawn_args=[]printOutput=False)
Executing command: ['bash', '--login', '-c', '/usr/bin/rpmbuild -bb --target noarch --nodeps /builddir/build/SPECS/python-scrapy.spec'] with env {'TERM': 'vt100', 'SHELL': '/bin/bash', 'HOME': '/builddir', 'HOSTNAME': 'mock', 'PATH': '/usr/bin:/bin:/usr/sbin:/sbin', 'PROMPT_COMMAND': 'printf "\\033]0;<mock-chroot>\\007"', 'PS1': '<mock-chroot> \\s-\\v\\$ ', 'LANG': 'en_US.UTF-8'} and shell False
Building target platforms: noarch
Building for target noarch
Executing(%prep): /bin/sh -e /var/tmp/rpm-tmp.gtrT2N
+ umask 022
+ cd /builddir/build/BUILD
+ cd /builddir/build/BUILD
+ rm -rf Scrapy-1.5.0
+ /usr/bin/gzip -dc /builddir/build/SOURCES/Scrapy-1.5.0.tar.gz
+ /usr/bin/tar -xof -
+ STATUS=0
+ '[' 0 -ne 0 ']'
+ cd Scrapy-1.5.0
+ /usr/bin/chmod -Rf a+rX,u+w,g-w,o-w .
+ exit 0
Executing(%build): /bin/sh -e /var/tmp/rpm-tmp.PwokyL
+ umask 022
+ cd /builddir/build/BUILD
+ cd Scrapy-1.5.0
+ CFLAGS='-O2 -g -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fexceptions -fstack-protector-strong -grecord-gcc-switches -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection'
+ LDFLAGS='-Wl,-z,relro -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld'
+ /usr/bin/python2 setup.py build '--executable=/usr/bin/python2 -s'
running build
running build_py
creating build
creating build/lib
creating build/lib/scrapy
copying scrapy/statscollectors.py -> build/lib/scrapy
copying scrapy/squeue.py -> build/lib/scrapy
copying scrapy/extension.py -> build/lib/scrapy
copying scrapy/conf.py -> build/lib/scrapy
copying scrapy/mail.py -> build/lib/scrapy
copying scrapy/signals.py -> build/lib/scrapy
copying scrapy/__main__.py -> build/lib/scrapy
copying scrapy/link.py -> build/lib/scrapy
copying scrapy/squeues.py -> build/lib/scrapy
copying scrapy/log.py -> build/lib/scrapy
copying scrapy/_monkeypatches.py -> build/lib/scrapy
copying scrapy/stats.py -> build/lib/scrapy
copying scrapy/exporters.py -> build/lib/scrapy
copying scrapy/cmdline.py -> build/lib/scrapy
copying scrapy/linkextractor.py -> build/lib/scrapy
copying scrapy/command.py -> build/lib/scrapy
copying scrapy/spidermanager.py -> build/lib/scrapy
copying scrapy/dupefilter.py -> build/lib/scrapy
copying scrapy/project.py -> build/lib/scrapy
copying scrapy/middleware.py -> build/lib/scrapy
copying scrapy/exceptions.py -> build/lib/scrapy
copying scrapy/shell.py -> build/lib/scrapy
copying scrapy/telnet.py -> build/lib/scrapy
copying scrapy/spider.py -> build/lib/scrapy
copying scrapy/signalmanager.py -> build/lib/scrapy
copying scrapy/dupefilters.py -> build/lib/scrapy
copying scrapy/item.py -> build/lib/scrapy
copying scrapy/crawler.py -> build/lib/scrapy
copying scrapy/logformatter.py -> build/lib/scrapy
copying scrapy/__init__.py -> build/lib/scrapy
copying scrapy/resolver.py -> build/lib/scrapy
copying scrapy/statscol.py -> build/lib/scrapy
copying scrapy/interfaces.py -> build/lib/scrapy
copying scrapy/spiderloader.py -> build/lib/scrapy
copying scrapy/responsetypes.py -> build/lib/scrapy
creating build/lib/scrapy/commands
copying scrapy/commands/check.py -> build/lib/scrapy/commands
copying scrapy/commands/settings.py -> build/lib/scrapy/commands
copying scrapy/commands/view.py -> build/lib/scrapy/commands
copying scrapy/commands/bench.py -> build/lib/scrapy/commands
copying scrapy/commands/list.py -> build/lib/scrapy/commands
copying scrapy/commands/edit.py -> build/lib/scrapy/commands
copying scrapy/commands/genspider.py -> build/lib/scrapy/commands
copying scrapy/commands/startproject.py -> build/lib/scrapy/commands
copying scrapy/commands/runspider.py -> build/lib/scrapy/commands
copying scrapy/commands/version.py -> build/lib/scrapy/commands
copying scrapy/commands/shell.py -> build/lib/scrapy/commands
copying scrapy/commands/fetch.py -> build/lib/scrapy/commands
copying scrapy/commands/crawl.py -> build/lib/scrapy/commands
copying scrapy/commands/parse.py -> build/lib/scrapy/commands
copying scrapy/commands/__init__.py -> build/lib/scrapy/commands
creating build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/chunked.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/stats.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/decompression.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/ajaxcrawl.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/httpauth.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/httpproxy.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/downloadtimeout.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/robotstxt.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/cookies.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/defaultheaders.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/httpcompression.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/retry.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/httpcache.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/redirect.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/useragent.py -> build/lib/scrapy/downloadermiddlewares
copying scrapy/downloadermiddlewares/__init__.py -> build/lib/scrapy/downloadermiddlewares
creating build/lib/scrapy/spidermiddlewares
copying scrapy/spidermiddlewares/httperror.py -> build/lib/scrapy/spidermiddlewares
copying scrapy/spidermiddlewares/urllength.py -> build/lib/scrapy/spidermiddlewares
copying scrapy/spidermiddlewares/referer.py -> build/lib/scrapy/spidermiddlewares
copying scrapy/spidermiddlewares/offsite.py -> build/lib/scrapy/spidermiddlewares
copying scrapy/spidermiddlewares/__init__.py -> build/lib/scrapy/spidermiddlewares
copying scrapy/spidermiddlewares/depth.py -> build/lib/scrapy/spidermiddlewares
creating build/lib/scrapy/extensions
copying scrapy/extensions/feedexport.py -> build/lib/scrapy/extensions
copying scrapy/extensions/corestats.py -> build/lib/scrapy/extensions
copying scrapy/extensions/debug.py -> build/lib/scrapy/extensions
copying scrapy/extensions/throttle.py -> build/lib/scrapy/extensions
copying scrapy/extensions/statsmailer.py -> build/lib/scrapy/extensions
copying scrapy/extensions/memusage.py -> build/lib/scrapy/extensions
copying scrapy/extensions/memdebug.py -> build/lib/scrapy/extensions
copying scrapy/extensions/logstats.py -> build/lib/scrapy/extensions
copying scrapy/extensions/closespider.py -> build/lib/scrapy/extensions
copying scrapy/extensions/spiderstate.py -> build/lib/scrapy/extensions
copying scrapy/extensions/telnet.py -> build/lib/scrapy/extensions
copying scrapy/extensions/httpcache.py -> build/lib/scrapy/extensions
copying scrapy/extensions/__init__.py -> build/lib/scrapy/extensions
creating build/lib/scrapy/xlib
copying scrapy/xlib/pydispatch.py -> build/lib/scrapy/xlib
copying scrapy/xlib/tx.py -> build/lib/scrapy/xlib
copying scrapy/xlib/__init__.py -> build/lib/scrapy/xlib
creating build/lib/scrapy/contrib
copying scrapy/contrib/feedexport.py -> build/lib/scrapy/contrib
copying scrapy/contrib/corestats.py -> build/lib/scrapy/contrib
copying scrapy/contrib/debug.py -> build/lib/scrapy/contrib
copying scrapy/contrib/throttle.py -> build/lib/scrapy/contrib
copying scrapy/contrib/statsmailer.py -> build/lib/scrapy/contrib
copying scrapy/contrib/memusage.py -> build/lib/scrapy/contrib
copying scrapy/contrib/memdebug.py -> build/lib/scrapy/contrib
copying scrapy/contrib/logstats.py -> build/lib/scrapy/contrib
copying scrapy/contrib/closespider.py -> build/lib/scrapy/contrib
copying scrapy/contrib/spiderstate.py -> build/lib/scrapy/contrib
copying scrapy/contrib/httpcache.py -> build/lib/scrapy/contrib
copying scrapy/contrib/__init__.py -> build/lib/scrapy/contrib
creating build/lib/scrapy/contracts
copying scrapy/contracts/default.py -> build/lib/scrapy/contracts
copying scrapy/contracts/__init__.py -> build/lib/scrapy/contracts
creating build/lib/scrapy/selector
copying scrapy/selector/csstranslator.py -> build/lib/scrapy/selector
copying scrapy/selector/lxmlsel.py -> build/lib/scrapy/selector
copying scrapy/selector/unified.py -> build/lib/scrapy/selector
copying scrapy/selector/__init__.py -> build/lib/scrapy/selector
creating build/lib/scrapy/spiders
copying scrapy/spiders/feed.py -> build/lib/scrapy/spiders
copying scrapy/spiders/init.py -> build/lib/scrapy/spiders
copying scrapy/spiders/crawl.py -> build/lib/scrapy/spiders
copying scrapy/spiders/__init__.py -> build/lib/scrapy/spiders
copying scrapy/spiders/sitemap.py -> build/lib/scrapy/spiders
creating build/lib/scrapy/http
copying scrapy/http/common.py -> build/lib/scrapy/http
copying scrapy/http/headers.py -> build/lib/scrapy/http
copying scrapy/http/cookies.py -> build/lib/scrapy/http
copying scrapy/http/__init__.py -> build/lib/scrapy/http
creating build/lib/scrapy/settings
copying scrapy/settings/deprecated.py -> build/lib/scrapy/settings
copying scrapy/settings/default_settings.py -> build/lib/scrapy/settings
copying scrapy/settings/__init__.py -> build/lib/scrapy/settings
creating build/lib/scrapy/linkextractors
copying scrapy/linkextractors/sgml.py -> build/lib/scrapy/linkextractors
copying scrapy/linkextractors/regex.py -> build/lib/scrapy/linkextractors
copying scrapy/linkextractors/lxmlhtml.py -> build/lib/scrapy/linkextractors
copying scrapy/linkextractors/__init__.py -> build/lib/scrapy/linkextractors
copying scrapy/linkextractors/htmlparser.py -> build/lib/scrapy/linkextractors
creating build/lib/scrapy/utils
copying scrapy/utils/ftp.py -> build/lib/scrapy/utils
copying scrapy/utils/conf.py -> build/lib/scrapy/utils
copying scrapy/utils/display.py -> build/lib/scrapy/utils
copying scrapy/utils/request.py -> build/lib/scrapy/utils
copying scrapy/utils/gz.py -> build/lib/scrapy/utils
copying scrapy/utils/test.py -> build/lib/scrapy/utils
copying scrapy/utils/misc.py -> build/lib/scrapy/utils
copying scrapy/utils/python.py -> build/lib/scrapy/utils
copying scrapy/utils/log.py -> build/lib/scrapy/utils
copying scrapy/utils/defer.py -> build/lib/scrapy/utils
copying scrapy/utils/testproc.py -> build/lib/scrapy/utils
copying scrapy/utils/engine.py -> build/lib/scrapy/utils
copying scrapy/utils/versions.py -> build/lib/scrapy/utils
copying scrapy/utils/deprecate.py -> build/lib/scrapy/utils
copying scrapy/utils/template.py -> build/lib/scrapy/utils
copying scrapy/utils/response.py -> build/lib/scrapy/utils
copying scrapy/utils/reactor.py -> build/lib/scrapy/utils
copying scrapy/utils/ossignal.py -> build/lib/scrapy/utils
copying scrapy/utils/trackref.py -> build/lib/scrapy/utils
copying scrapy/utils/serialize.py -> build/lib/scrapy/utils
copying scrapy/utils/http.py -> build/lib/scrapy/utils
copying scrapy/utils/project.py -> build/lib/scrapy/utils
copying scrapy/utils/url.py -> build/lib/scrapy/utils
copying scrapy/utils/benchserver.py -> build/lib/scrapy/utils
copying scrapy/utils/datatypes.py -> build/lib/scrapy/utils
copying scrapy/utils/iterators.py -> build/lib/scrapy/utils
copying scrapy/utils/decorators.py -> build/lib/scrapy/utils
copying scrapy/utils/spider.py -> build/lib/scrapy/utils
copying scrapy/utils/boto.py -> build/lib/scrapy/utils
copying scrapy/utils/markup.py -> build/lib/scrapy/utils
copying scrapy/utils/testsite.py -> build/lib/scrapy/utils
copying scrapy/utils/job.py -> build/lib/scrapy/utils
copying scrapy/utils/reqser.py -> build/lib/scrapy/utils
copying scrapy/utils/__init__.py -> build/lib/scrapy/utils
copying scrapy/utils/multipart.py -> build/lib/scrapy/utils
copying scrapy/utils/httpobj.py -> build/lib/scrapy/utils
copying scrapy/utils/decorator.py -> build/lib/scrapy/utils
copying scrapy/utils/console.py -> build/lib/scrapy/utils
copying scrapy/utils/sitemap.py -> build/lib/scrapy/utils
copying scrapy/utils/signal.py -> build/lib/scrapy/utils
creating build/lib/scrapy/contrib_exp
copying scrapy/contrib_exp/iterators.py -> build/lib/scrapy/contrib_exp
copying scrapy/contrib_exp/__init__.py -> build/lib/scrapy/contrib_exp
creating build/lib/scrapy/loader
copying scrapy/loader/common.py -> build/lib/scrapy/loader
copying scrapy/loader/processors.py -> build/lib/scrapy/loader
copying scrapy/loader/__init__.py -> build/lib/scrapy/loader
creating build/lib/scrapy/pipelines
copying scrapy/pipelines/files.py -> build/lib/scrapy/pipelines
copying scrapy/pipelines/images.py -> build/lib/scrapy/pipelines
copying scrapy/pipelines/media.py -> build/lib/scrapy/pipelines
copying scrapy/pipelines/__init__.py -> build/lib/scrapy/pipelines
creating build/lib/scrapy/core
copying scrapy/core/scraper.py -> build/lib/scrapy/core
copying scrapy/core/engine.py -> build/lib/scrapy/core
copying scrapy/core/scheduler.py -> build/lib/scrapy/core
copying scrapy/core/__init__.py -> build/lib/scrapy/core
copying scrapy/core/spidermw.py -> build/lib/scrapy/core
creating build/lib/scrapy/contrib/spidermiddleware
copying scrapy/contrib/spidermiddleware/httperror.py -> build/lib/scrapy/contrib/spidermiddleware
copying scrapy/contrib/spidermiddleware/urllength.py -> build/lib/scrapy/contrib/spidermiddleware
copying scrapy/contrib/spidermiddleware/referer.py -> build/lib/scrapy/contrib/spidermiddleware
copying scrapy/contrib/spidermiddleware/offsite.py -> build/lib/scrapy/contrib/spidermiddleware
copying scrapy/contrib/spidermiddleware/__init__.py -> build/lib/scrapy/contrib/spidermiddleware
copying scrapy/contrib/spidermiddleware/depth.py -> build/lib/scrapy/contrib/spidermiddleware
creating build/lib/scrapy/contrib/pipeline
copying scrapy/contrib/pipeline/files.py -> build/lib/scrapy/contrib/pipeline
copying scrapy/contrib/pipeline/images.py -> build/lib/scrapy/contrib/pipeline
copying scrapy/contrib/pipeline/media.py -> build/lib/scrapy/contrib/pipeline
copying scrapy/contrib/pipeline/__init__.py -> build/lib/scrapy/contrib/pipeline
creating build/lib/scrapy/contrib/exporter
copying scrapy/contrib/exporter/__init__.py -> build/lib/scrapy/contrib/exporter
creating build/lib/scrapy/contrib/spiders
copying scrapy/contrib/spiders/feed.py -> build/lib/scrapy/contrib/spiders
copying scrapy/contrib/spiders/init.py -> build/lib/scrapy/contrib/spiders
copying scrapy/contrib/spiders/crawl.py -> build/lib/scrapy/contrib/spiders
copying scrapy/contrib/spiders/__init__.py -> build/lib/scrapy/contrib/spiders
copying scrapy/contrib/spiders/sitemap.py -> build/lib/scrapy/contrib/spiders
creating build/lib/scrapy/contrib/linkextractors
copying scrapy/contrib/linkextractors/sgml.py -> build/lib/scrapy/contrib/linkextractors
copying scrapy/contrib/linkextractors/regex.py -> build/lib/scrapy/contrib/linkextractors
copying scrapy/contrib/linkextractors/lxmlhtml.py -> build/lib/scrapy/contrib/linkextractors
copying scrapy/contrib/linkextractors/__init__.py -> build/lib/scrapy/contrib/linkextractors
copying scrapy/contrib/linkextractors/htmlparser.py -> build/lib/scrapy/contrib/linkextractors
creating build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/chunked.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/stats.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/decompression.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/ajaxcrawl.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/httpauth.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/httpproxy.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/downloadtimeout.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/robotstxt.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/cookies.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/defaultheaders.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/httpcompression.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/retry.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/httpcache.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/redirect.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/useragent.py -> build/lib/scrapy/contrib/downloadermiddleware
copying scrapy/contrib/downloadermiddleware/__init__.py -> build/lib/scrapy/contrib/downloadermiddleware
creating build/lib/scrapy/contrib/loader
copying scrapy/contrib/loader/common.py -> build/lib/scrapy/contrib/loader
copying scrapy/contrib/loader/__init__.py -> build/lib/scrapy/contrib/loader
copying scrapy/contrib/loader/processor.py -> build/lib/scrapy/contrib/loader
creating build/lib/scrapy/http/request
copying scrapy/http/request/form.py -> build/lib/scrapy/http/request
copying scrapy/http/request/rpc.py -> build/lib/scrapy/http/request
copying scrapy/http/request/__init__.py -> build/lib/scrapy/http/request
creating build/lib/scrapy/http/response
copying scrapy/http/response/html.py -> build/lib/scrapy/http/response
copying scrapy/http/response/xml.py -> build/lib/scrapy/http/response
copying scrapy/http/response/text.py -> build/lib/scrapy/http/response
copying scrapy/http/response/__init__.py -> build/lib/scrapy/http/response
creating build/lib/scrapy/contrib_exp/downloadermiddleware
copying scrapy/contrib_exp/downloadermiddleware/decompression.py -> build/lib/scrapy/contrib_exp/downloadermiddleware
copying scrapy/contrib_exp/downloadermiddleware/__init__.py -> build/lib/scrapy/contrib_exp/downloadermiddleware
creating build/lib/scrapy/core/downloader
copying scrapy/core/downloader/webclient.py -> build/lib/scrapy/core/downloader
copying scrapy/core/downloader/tls.py -> build/lib/scrapy/core/downloader
copying scrapy/core/downloader/middleware.py -> build/lib/scrapy/core/downloader
copying scrapy/core/downloader/__init__.py -> build/lib/scrapy/core/downloader
copying scrapy/core/downloader/contextfactory.py -> build/lib/scrapy/core/downloader
creating build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/s3.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/file.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/ftp.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/http10.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/datauri.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/http.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/http11.py -> build/lib/scrapy/core/downloader/handlers
copying scrapy/core/downloader/handlers/__init__.py -> build/lib/scrapy/core/downloader/handlers
running egg_info
writing requirements to Scrapy.egg-info/requires.txt
writing Scrapy.egg-info/PKG-INFO
writing top-level names to Scrapy.egg-info/top_level.txt
writing dependency_links to Scrapy.egg-info/dependency_links.txt
writing entry points to Scrapy.egg-info/entry_points.txt
reading manifest file 'Scrapy.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'license.txt' under directory 'scrapy'
no previously-included directories found matching 'docs/build'
warning: no files found matching '*' under directory 'bin'
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
writing manifest file 'Scrapy.egg-info/SOURCES.txt'
copying scrapy/VERSION -> build/lib/scrapy
copying scrapy/mime.types -> build/lib/scrapy
creating build/lib/scrapy/templates
creating build/lib/scrapy/templates/project
copying scrapy/templates/project/scrapy.cfg -> build/lib/scrapy/templates/project
creating build/lib/scrapy/templates/project/module
copying scrapy/templates/project/module/__init__.py -> build/lib/scrapy/templates/project/module
copying scrapy/templates/project/module/items.py.tmpl -> build/lib/scrapy/templates/project/module
copying scrapy/templates/project/module/middlewares.py.tmpl -> build/lib/scrapy/templates/project/module
copying scrapy/templates/project/module/pipelines.py.tmpl -> build/lib/scrapy/templates/project/module
copying scrapy/templates/project/module/settings.py.tmpl -> build/lib/scrapy/templates/project/module
creating build/lib/scrapy/templates/project/module/spiders
copying scrapy/templates/project/module/spiders/__init__.py -> build/lib/scrapy/templates/project/module/spiders
creating build/lib/scrapy/templates/spiders
copying scrapy/templates/spiders/basic.tmpl -> build/lib/scrapy/templates/spiders
copying scrapy/templates/spiders/crawl.tmpl -> build/lib/scrapy/templates/spiders
copying scrapy/templates/spiders/csvfeed.tmpl -> build/lib/scrapy/templates/spiders
copying scrapy/templates/spiders/xmlfeed.tmpl -> build/lib/scrapy/templates/spiders
+ sleep 1
+ CFLAGS='-O2 -g -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -fexceptions -fstack-protector-strong -grecord-gcc-switches -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection -fcf-protection'
+ LDFLAGS='-Wl,-z,relro -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld'
+ /usr/bin/python3 setup.py build '--executable=/usr/bin/python3 -s'
running build
running build_py
running egg_info
writing Scrapy.egg-info/PKG-INFO
writing dependency_links to Scrapy.egg-info/dependency_links.txt
writing entry points to Scrapy.egg-info/entry_points.txt
writing requirements to Scrapy.egg-info/requires.txt
writing top-level names to Scrapy.egg-info/top_level.txt
reading manifest file 'Scrapy.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'license.txt' under directory 'scrapy'
no previously-included directories found matching 'docs/build'
warning: no files found matching '*' under directory 'bin'
warning: no previously-included files matching '__pycache__' found anywhere in distribution
warning: no previously-included files matching '*.py[cod]' found anywhere in distribution
writing manifest file 'Scrapy.egg-info/SOURCES.txt'
+ sleep 1
++ pwd
+ PYTHONPATH=/builddir/build/BUILD/Scrapy-1.5.0
+ make -C docs html
make: Entering directory '/builddir/build/BUILD/Scrapy-1.5.0/docs'
mkdir -p build/html build/doctrees
sphinx-build -b html -d build/doctrees -D latex_elements.papersize= . build/html
Running Sphinx v1.7.5
loading translations [en]... done
Extension error:
Could not import extension scrapydocs (exception: cannot import name Directive)
make: *** [Makefile:33: build] Error 2
make: Leaving directory '/builddir/build/BUILD/Scrapy-1.5.0/docs'
error: Bad exit status from /var/tmp/rpm-tmp.PwokyL (%build)
    Bad exit status from /var/tmp/rpm-tmp.PwokyL (%build)
RPM build errors:
Child return code was: 1
EXCEPTION: [Error()]
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/mockbuild/trace_decorator.py", line 89, in trace
    result = func(*args, **kw)
  File "/usr/lib/python3.6/site-packages/mockbuild/util.py", line 582, in do
    raise exception.Error("Command failed. See logs for output.\n # %s" % (command,), child.returncode)
mockbuild.exception.Error: Command failed. See logs for output.
 # bash --login -c /usr/bin/rpmbuild -bb --target noarch --nodeps /builddir/build/SPECS/python-scrapy.spec
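The decisive line in the log is "Could not import extension scrapydocs (exception: cannot import name Directive)". Sphinx used to re-export docutils' `Directive` class from `sphinx.util.compat`; that shim was deprecated and then removed, so under the Sphinx 1.7.5 in rawhide, extensions that still import it fail and `make -C docs html` aborts the %build. The usual remedy is to import `Directive` from its original home, `docutils.parsers.rst`, with a fallback for old Sphinx. A minimal sketch of that fallback-import pattern follows; `import_from_first` is a hypothetical helper for illustration, not the patch actually applied to scrapydocs, and the final demo uses stdlib modules only so it runs without Sphinx or docutils installed:

```python
import importlib

def import_from_first(candidates, name):
    """Return attribute `name` from the first module in `candidates`
    that can be imported and actually provides that attribute."""
    for module_name in candidates:
        try:
            return getattr(importlib.import_module(module_name), name)
        except (ImportError, AttributeError):
            continue  # try the next candidate location
    raise ImportError("%r not found in any of %r" % (name, candidates))

# For a Sphinx extension like scrapydocs, the fix would amount to:
#   Directive = import_from_first(
#       ("docutils.parsers.rst", "sphinx.util.compat"), "Directive")
# preferring the docutils original, which exists on all versions.

# Demonstration using only stdlib modules (first candidate is bogus,
# so the helper falls through to json.loads):
loads = import_from_first(("no_such_module_xyz", "json"), "loads")
print(loads("[1, 2, 3]"))
```

This keeps the extension importable across Sphinx versions rather than pinning an old Sphinx in the buildroot.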