Bug 1380134 - FTBFS: APLpy fails to build due to test failures and other issues
Summary: FTBFS: APLpy fails to build due to test failures and other issues
Keywords:
Status: CLOSED RAWHIDE
Alias: None
Product: Fedora
Classification: Fedora
Component: APLpy
Version: rawhide
Hardware: Unspecified
OS: Unspecified
Target Milestone: ---
Assignee: Sergio Pascual
QA Contact: Fedora Extras Quality Assurance
URL:
Whiteboard:
Depends On: 1380135
Blocks:
 
Reported: 2016-09-28 17:58 UTC by Dominik 'Rathann' Mierzejewski
Modified: 2016-10-02 20:57 UTC
CC List: 2 users

Fixed In Version:
Clone Of:
Environment:
Last Closed: 2016-10-02 20:57:28 UTC
Type: Bug
Embargoed:



Description Dominik 'Rathann' Mierzejewski 2016-09-28 17:58:45 UTC
Description of problem:
Multiple issues prevent APLpy from building:
1. missing python3 support (already fixed upstream: https://github.com/aplpy/aplpy/issues/304)
2. python-astropy instead of python2-astropy in (Build)Requires: (this is actually a bug in the python-astropy package; see the sketch after this list)
3. two actual test failures once the above are fixed
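
To be concrete, here is a minimal sketch of how the dependency lines could look once the first two points are fixed. The versioned package names below (python2-astropy, python3-astropy, python{2,3}-devel, python{2,3}-matplotlib) are assumptions based on common Fedora naming, not copied from the current spec:

# sketch only: assumed versioned (Build)Requires, one per line
BuildRequires:  python2-devel
BuildRequires:  python2-astropy
BuildRequires:  python2-matplotlib
BuildRequires:  python3-devel
BuildRequires:  python3-astropy
BuildRequires:  python3-matplotlib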

Version-Release number of selected component (if applicable):
APLpy-1.0-6.fc26

How reproducible:
Always.

Actual results:
=================================== FAILURES ===================================
___________________________ TestVectors.test_default ___________________________

self = <aplpy.tests.test_vectors.TestVectors object at 0x7fc635e52e50>
generate = None

    def test_default(self, generate):
        f = FITSFigure(IMAGE, figsize=(4,4))
        f.show_grayscale()
        f.show_vectors(PDATA, ADATA, color='orange')
>       self.generate_or_test(generate, f, 'vectors_default.png', tolerance=2.5)

aplpy/tests/test_vectors.py:22: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
aplpy/tests/test_images.py:51: in generate_or_test
    msg = compare_images(baseline_image, test_image, tol=tolerance)
/usr/lib64/python2.7/site-packages/matplotlib/testing/compare.py:345: in compare_images
    rms = calculate_rms(expectedImage, actualImage)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

expectedImage = array([[[255, 255, 255],
        [255, 255, 255],
        [255, 255, 255],
   ...55, 255, 255],
        [255, 255, 255],
        [255, 255, 255]]], dtype=int16)
actualImage = array([[[255, 255, 255],
        [255, 255, 255],
        [255, 255, 255],
   ...55, 255, 255],
        [255, 255, 255],
        [255, 255, 255]]], dtype=int16)

    def calculate_rms(expectedImage, actualImage):
        "Calculate the per-pixel errors, then compute the root mean square error."
        if expectedImage.shape != actualImage.shape:
            raise ImageComparisonFailure(
                "image sizes do not match expected size: {0} "
>               "actual size {1}".format(expectedImage.shape, actualImage.shape))
E           ImageComparisonFailure: image sizes do not match expected size: (378, 378, 3) actual size (374, 379, 3)

/usr/lib64/python2.7/site-packages/matplotlib/testing/compare.py:256: ImageComparisonFailure
----------------------------- Captured stdout call -----------------------------
INFO: Auto-setting vmin to -8.523e-02 [aplpy.core]
INFO: Auto-setting vmax to  1.096e+00 [aplpy.core]
----------------------------- Captured stderr call -----------------------------
WARNING: No WCS information found in header - using pixel coordinates [aplpy.header]
WARNING: No WCS information found in header - using pixel coordinates [aplpy.header]
WARNING: No WCS information found in header - using pixel coordinates [aplpy.header]
_________________________ TestVectors.test_step_scale __________________________

self = <aplpy.tests.test_vectors.TestVectors object at 0x7fc6374b1ed0>
generate = None

    def test_step_scale(self, generate):
        f = FITSFigure(IMAGE, figsize=(4,4))
        f.show_grayscale()
        f.show_vectors(PDATA, ADATA, step=2, scale=0.8, color='orange')
>       self.generate_or_test(generate, f, 'vectors_step_scale.png', tolerance=2.5)

aplpy/tests/test_vectors.py:29: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
aplpy/tests/test_images.py:51: in generate_or_test
    msg = compare_images(baseline_image, test_image, tol=tolerance)
/usr/lib64/python2.7/site-packages/matplotlib/testing/compare.py:345: in compare_images
    rms = calculate_rms(expectedImage, actualImage)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

expectedImage = array([[[255, 255, 255],
        [255, 255, 255],
        [255, 255, 255],
   ...55, 255, 255],
        [255, 255, 255],
        [255, 255, 255]]], dtype=int16)
actualImage = array([[[255, 255, 255],
        [255, 255, 255],
        [255, 255, 255],
   ...55, 255, 255],
        [255, 255, 255],
        [255, 255, 255]]], dtype=int16)

    def calculate_rms(expectedImage, actualImage):
        "Calculate the per-pixel errors, then compute the root mean square error."
        if expectedImage.shape != actualImage.shape:
            raise ImageComparisonFailure(
                "image sizes do not match expected size: {0} "
>               "actual size {1}".format(expectedImage.shape, actualImage.shape))
E           ImageComparisonFailure: image sizes do not match expected size: (378, 378, 3) actual size (374, 379, 3)

/usr/lib64/python2.7/site-packages/matplotlib/testing/compare.py:256: ImageComparisonFailure
----------------------------- Captured stdout call -----------------------------
INFO: Auto-setting vmin to -8.523e-02 [aplpy.core]
INFO: Auto-setting vmax to  1.096e+00 [aplpy.core]
----------------------------- Captured stderr call -----------------------------
WARNING: No WCS information found in header - using pixel coordinates [aplpy.header]
WARNING: No WCS information found in header - using pixel coordinates [aplpy.header]
WARNING: No WCS information found in header - using pixel coordinates [aplpy.header]
========== 9 tests deselected by '-knot test_images and not test_rgb' ==========
============= 2 failed, 248 passed, 9 deselected in 63.73 seconds ==============
error: Bad exit status from /var/tmp/rpm-tmp.As6i0l (%check)

Expected results:
No test failures.
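
If the two remaining failures turn out to be baseline-image mismatches caused by a newer matplotlib rather than real regressions, a possible stopgap would be to also deselect the vector tests in %check until the baselines are regenerated. This is only a sketch: the invocation is assumed, while the -k expression extends the deselection already visible in the build log above (the failing tests live in test_vectors.py):

# hypothetical %check; extends the deselection already used in the build
%check
%{__python2} -m pytest aplpy -k "not test_images and not test_rgb and not test_vectors"
%{__python3} -m pytest aplpy -k "not test_images and not test_rgb and not test_vectors"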

Additional info:
This package should not use the %{py3dir} macro: it leaves leftovers in the build directory and is no longer part of the packaging guidelines. Instead, please either build in place for both python2 and python3 or create separate directories inside the build directory. You should also use the new %py{2,3}_build and %py{2,3}_install macros for a cleaner spec file; a sketch follows below. Finally, I recommend splitting the (Build)Requires into one per line and sorting them alphabetically.
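
Something along these lines is what I have in mind for the build and install sections; it is only a sketch assuming the usual setup.py-based build and the standard python-rpm-macros, not the actual spec:

# sketch only: %py2_build/%py3_build and %py2_install/%py3_install come from
# python-rpm-macros; the builds happen in place, so no %{py3dir} copy is left behind
%build
%py2_build
%py3_build

%install
%py2_install
%py3_install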

