Bug 2352553 - python-boutdata fails to build with Python 3.14: NotImplementedError: Dataset is not picklable
Keywords:
Status: CLOSED WORKSFORME
Alias: None
Product: Fedora
Classification: Fedora
Component: python-boutdata
Version: rawhide
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: unspecified
Target Milestone: ---
Assignee: david08741
QA Contact:
URL:
Whiteboard:
Depends On:
Blocks: PYTHON3.14
 
Reported: 2025-03-14 13:23 UTC by Karolina Surma
Modified: 2025-06-11 11:43 UTC
CC List: 3 users

Fixed In Version:
Clone Of:
Environment:
Last Closed: 2025-06-11 11:43:15 UTC
Type: Bug
Embargoed:



Description Karolina Surma 2025-03-14 13:23:42 UTC
python-boutdata fails to build with Python 3.14.0a5.

_ TestCollect.test_disconnected_doublenull[2-2-collect_kwargs5-squash_params2] _

self = <boutdata.tests.test_collect.TestCollect object at 0x7fe67a782b70>
tmp_path = PosixPath('/tmp/pytest-of-mockbuild/pytest-0/test_disconnected_doublenull_253')
squash_params = (True, {'parallel': 2})
collect_kwargs = {'xguards': False, 'yguards': False}, mxg = 2, myg = 2

    @pytest.mark.parametrize("squash_params", squash_params_list)
    @pytest.mark.parametrize("collect_kwargs", collect_kwargs_list)
    @pytest.mark.parametrize("mxg", [0, 1, 2])
    @pytest.mark.parametrize("myg", [0, 1, 2])
    def test_disconnected_doublenull(
        self, tmp_path, squash_params, collect_kwargs, mxg, myg
    ):
        """
        Check output from a disconnected double-null case using a large number of
        processes. 'Large' means there is at least one process in each region with no
        edges touching another region.
        """
        squash, squash_kwargs = squash_params
    
        grid_info = make_grid_info(
            mxg=mxg, myg=myg, nxpe=3, nype=18, ixseps1=6, ixseps2=11, xpoints=2
        )
    
        fieldperp_global_yind = 19
        fieldperp_yproc_ind = 4
    
        rng = np.random.default_rng(110)
    
        dump_params = [
            # inner, lower divertor leg
            (0, ["xinner", "ylower"], -1),
            (1, ["ylower"], -1),
            (2, ["xouter", "ylower"], -1),
            (3, ["xinner"], -1),
            (4, [], -1),
            (5, ["xouter"], -1),
            (6, ["xinner"], -1),
            (7, [], -1),
            (8, ["xouter"], -1),
            # inner core
            (9, ["xinner"], -1),
            (10, [], -1),
            (11, ["xouter"], -1),
            (12, ["xinner"], fieldperp_global_yind),
            (13, [], fieldperp_global_yind),
            (14, ["xouter"], fieldperp_global_yind),
            (15, ["xinner"], -1),
            (16, [], -1),
            (17, ["xouter"], -1),
            # inner, upper divertor leg
            (18, ["xinner"], -1),
            (19, [], -1),
            (20, ["xouter"], -1),
            (21, ["xinner"], -1),
            (22, [], -1),
            (23, ["xouter"], -1),
            (24, ["xinner", "yupper"], -1),
            (25, ["yupper"], -1),
            (26, ["xouter", "yupper"], -1),
            # outer, upper divertor leg
            (27, ["xinner", "ylower"], -1),
            (28, ["ylower"], -1),
            (29, ["xouter", "ylower"], -1),
            (30, ["xinner"], -1),
            (31, [], -1),
            (32, ["xouter"], -1),
            (33, ["xinner"], -1),
            (34, [], -1),
            (35, ["xouter"], -1),
            # outer core
            (36, ["xinner"], -1),
            (37, [], -1),
            (38, ["xouter"], -1),
            (39, ["xinner"], -1),
            (40, [], -1),
            (41, ["xouter"], -1),
            (42, ["xinner"], -1),
            (43, [], -1),
            (44, ["xouter"], -1),
            # outer, lower divertor leg
            (45, ["xinner"], -1),
            (46, [], -1),
            (47, ["xouter"], -1),
            (48, ["xinner"], -1),
            (49, [], -1),
            (50, ["xouter"], -1),
            (51, ["xinner", "yupper"], -1),
            (52, ["yupper"], -1),
            (53, ["xouter", "yupper"], -1),
        ]
        dumps = []
        for i, boundaries, fieldperp_yind in dump_params:
            dumps.append(
                create_dump_file(
                    tmpdir=tmp_path,
                    rng=rng,
                    grid_info=grid_info,
                    i=i,
                    boundaries=boundaries,
                    fieldperp_global_yind=fieldperp_yind,
                )
            )
    
        expected = concatenate_data(
            dumps, nxpe=grid_info["NXPE"], fieldperp_yproc_ind=fieldperp_yproc_ind
        )
    
>       check_collected_data(
            expected,
            fieldperp_global_yind=fieldperp_global_yind,
            doublenull=True,
            path=tmp_path,
            squash=squash,
            collect_kwargs=collect_kwargs,
            squash_kwargs=squash_kwargs,
        )

boutdata/tests/test_collect.py:1724: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
boutdata/tests/test_collect.py:84: in check_collected_data
    squashoutput(path, outputname="boutdata.nc", **collect_kwargs, **squash_kwargs)
boutdata/squashoutput.py:153: in squashoutput
    outputs = BoutOutputs(
boutdata/data.py:1175: in __init__
    self._init_parallel()
boutdata/data.py:1251: in _init_parallel
    worker.start()
/usr/lib64/python3.14/multiprocessing/process.py:121: in start
    self._popen = self._Popen(self)
/usr/lib64/python3.14/multiprocessing/context.py:224: in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
/usr/lib64/python3.14/multiprocessing/context.py:300: in _Popen
    return Popen(process_obj)
/usr/lib64/python3.14/multiprocessing/popen_forkserver.py:35: in __init__
    super().__init__(process_obj)
/usr/lib64/python3.14/multiprocessing/popen_fork.py:20: in __init__
    self._launch(process_obj)
/usr/lib64/python3.14/multiprocessing/popen_forkserver.py:47: in _launch
    reduction.dump(process_obj, buf)
/usr/lib64/python3.14/multiprocessing/reduction.py:60: in dump
    ForkingPickler(file, protocol).dump(obj)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>   ???
E   NotImplementedError: Dataset is not picklable
E   when serializing dict item 'handle'
E   when serializing boututils.datafile.DataFile_netCDF state
E   when serializing boututils.datafile.DataFile_netCDF object
E   when serializing dict item 'impl'
E   when serializing boututils.datafile.DataFile state
E   when serializing boututils.datafile.DataFile object
E   when serializing dict item '_file0'
E   when serializing boutdata.data.BoutOutputs state
E   when serializing boutdata.data.BoutOutputs object
E   when serializing tuple item 0
E   when serializing method reconstructor arguments
E   when serializing method object
E   when serializing dict item '_target'
E   when serializing multiprocessing.context.Process state
E   when serializing multiprocessing.context.Process object

This is presumably related to Python 3.14 changing the default multiprocessing start method on Linux from "fork" to "forkserver" (note popen_forkserver.py in the traceback): https://docs.python.org/3.14/whatsnew/3.14.html
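The failure mechanism can be reproduced without netCDF4: the forkserver start method pickles the Process target, so any unpicklable attribute reachable from it raises exactly as in the traceback above. In this sketch, DummyHandle and Outputs are hypothetical stand-ins for netCDF4's Dataset and boutdata.data.BoutOutputs, not boutdata's actual code:

```python
import multiprocessing as mp
import pickle

class DummyHandle:
    """Stand-in for netCDF4.Dataset, which refuses to be pickled."""
    def __reduce__(self):
        raise NotImplementedError("Dataset is not picklable")

class Outputs:
    """Stand-in for boutdata.data.BoutOutputs: keeps an open handle."""
    def __init__(self):
        self._file0 = DummyHandle()

    def _worker(self):
        pass  # the real worker would read data from the files

outputs = Outputs()

# "forkserver" (the new Linux default in 3.14) and "spawn" must pickle
# the Process target; the bound method drags `outputs`, and with it the
# unpicklable handle, through the pickler:
try:
    pickle.dumps(outputs._worker)
    pickling_failed = False
except NotImplementedError:
    pickling_failed = True

# The pre-3.14 Linux default, "fork", inherits the parent's memory
# instead of pickling, so the same worker starts fine:
proc = mp.get_context("fork").Process(target=outputs._worker)
proc.start()
proc.join()
```

One possible workaround is to request the old start method explicitly via mp.get_context("fork") when creating the workers, though fork is unsafe in threaded programs, which is why the default changed; a more robust fix would be to open the data files inside the worker processes rather than storing open handles on the object that gets pickled.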

For the build logs, see:
https://copr-be.cloud.fedoraproject.org/results/@python/python3.14/fedora-rawhide-x86_64/08768201-python-boutdata/

For all our attempts to build python-boutdata with Python 3.14, see:
https://copr.fedorainfracloud.org/coprs/g/python/python3.14/package/python-boutdata/

Testing and mass rebuild of packages is happening in copr.
You can follow these instructions to test locally in mock whether your package builds with Python 3.14:
https://copr.fedorainfracloud.org/coprs/g/python/python3.14/

Let us know here if you have any questions.

Python 3.14 is planned to be included in Fedora 43.
To make that update smoother, we're building Fedora packages with all pre-releases of Python 3.14.
A build failure prevents us from testing all dependent packages (transitive [Build]Requires),
so if this package is widely required, it's important for us to get it fixed soon.

We'd appreciate help from the people who know this package best,
but if you don't want to work on this now, let us know so we can try to work around it on our side.

