Bug 2218141 - openstack collection: Incompatible openstacksdk library found: Version MUST be >=0.36 and <=0.98.999, but 0.101.0 is larger than maximum version 0.98.999
Summary: openstack collection: Incompatible openstacksdk library found: Version MUST b...
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Fedora
Classification: Fedora
Component: ansible-collections-openstack
Version: 38
Hardware: Unspecified
OS: Linux
Priority: unspecified
Severity: medium
Target Milestone: ---
Assignee: Sagi Shnaidman
QA Contact:
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2023-06-28 10:06 UTC by Martin Pitt
Modified: 2023-07-23 01:27 UTC (History)
10 users

Fixed In Version: ansible-collections-openstack-2.1.0-1.fc38
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2023-07-23 01:27:42 UTC
Type: ---
Embargoed:
jmeng: needinfo-


Attachments
Hack to lower the version dependency on openstacksdk from 1.0.0 to 0.101.0 (2.06 KB, patch)
2023-07-12 19:25 UTC, Jakob Meng

Description Martin Pitt 2023-06-28 10:06:57 UTC
Apparently Fedora 38's ansible and python3-openstacksdk packages went out of sync. Trying to use the openstack inventory plugin fails:

❱❱❱ ansible-inventory -i inventory --list
[WARNING]:  * Failed to parse cockpituous/ansible/inventory/openstack.yml with auto plugin: Incompatible openstacksdk library found: Version MUST be >=0.36 and <=0.98.999, but 0.101.0 is larger than maximum version 0.98.999.
[WARNING]: Unable to parse cockpituous/ansible/inventory/openstack.yml as an inventory source

So either the openstack collection needs to be updated, or python3-openstacksdk downgraded, or the version check relaxed.

Reproducible: Always

Steps to Reproduce:
1. git clone https://github.com/cockpit-project/cockpituous/
2. cd cockpituous/ansible
3. ansible-inventory -i inventory --list
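The rejected-version warning boils down to a simple range comparison on the installed SDK version. Here is a minimal sketch of that check; the collection's real gate lives in its own validation code, so the function names here are purely illustrative:

```python
def parse(version):
    """Split a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

# The range declared by openstack.cloud 1.x: >=0.36 and <=0.98.999
MIN, MAX = parse("0.36"), parse("0.98.999")

def sdk_compatible(installed):
    """True if the installed openstacksdk version falls inside the range."""
    return MIN <= parse(installed) <= MAX

print(sdk_compatible("0.61.0"))   # True: within the supported range
print(sdk_compatible("0.101.0"))  # False: 101 > 98, above the maximum
```

Tuple comparison explains why 0.101.0 fails even though it "looks" smaller than 0.98.999 as a decimal: components are compared numerically left to right, and 101 > 98.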

Comment 1 Martin Pitt 2023-06-28 10:15:12 UTC
> or the version check relaxed.

This doesn't seem to work. Also, /usr/lib/python3.11/site-packages/ansible_collections/openstack/cloud/CHANGELOG.rst explicitly mentions:

  - Lowered maximum OpenStack SDK version to 0.98.999 in inventory plugin

so supposedly that happened for a reason.

Comment 2 Maxwell G 2023-06-28 16:02:01 UTC
It would've been helpful if you filled out the full Bugzilla template with the component versions for the ansible, ansible-core, and python-openstacksdk packages, but okay. 

I would suggest installing the parallel-installable `ansible-collections-openstack` package, which contains v2 and is compatible with newer python-openstacksdk versions. It will override the version included in the ansible package. I don't think we can do a major version bump of the ansible package in a stable release.

Comment 3 Martin Pitt 2023-06-28 16:16:30 UTC
Argh, sorry! I meant to, but forgot about it:

python3-openstacksdk-0.101.0-2.fc38.noarch
ansible-7.6.0-1.fc38.noarch
ansible-core-2.14.6-1.fc38.noarch

Thanks for the hint with "ansible-collections-openstack". This gets past the version check indeed. I now get

   [WARNING]:  * Failed to parse /var/home/martin/upstream/cockpituous/ansible/inventory/openstack.yml with auto plugin: 'region'

(on https://github.com/cockpit-project/cockpituous/blob/main/ansible/inventory/openstack.yml), but I'll poke at that -- presumably the format changed somehow.

Thanks!

Comment 4 Maxwell G 2023-06-28 16:47:28 UTC
> Argh, sorry! I meant to, but forgot about it:

No worries :).

> Thanks for the hint with "ansible-collections-openstack". This gets past the version check indeed.

Cool! I'll close this issue, as I think that's the best solution for now. Hopefully, we can work with the openstacksdk maintainers to avoid these types of synchronization problems in the future. If anything else pops up, let us know! Thanks for the report.

Comment 5 Martin Pitt 2023-06-29 04:28:52 UTC
For anyone following along at home, the combination above is completely broken, at least for Red Hat's PSI OpenStack. The above "failed to parse" expands to a stack trace with -vvv:

  File "/usr/lib/python3.11/site-packages/ansible/inventory/manager.py", line 293, in parse_source
    plugin.parse(self._inventory, self._loader, source, cache=cache)
  File "/usr/lib/python3.11/site-packages/ansible/plugins/inventory/auto.py", line 59, in parse
    plugin.parse(inventory, loader, path, cache=cache)
  File "/usr/share/ansible/collections/ansible_collections/openstack/cloud/plugins/inventory/openstack.py", line 276, in parse
    self._populate_from_source(source_data)
  File "/usr/share/ansible/collections/ansible_collections/openstack/cloud/plugins/inventory/openstack.py", line 294, in _populate_from_source
    self._append_hostvars(hostvars, groups, name, servers[0])
  File "/usr/share/ansible/collections/ansible_collections/openstack/cloud/plugins/inventory/openstack.py", line 405, in _append_hostvars
    for group in self._get_groups_from_server(server, namegroup=namegroup):
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/share/ansible/collections/ansible_collections/openstack/cloud/plugins/inventory/openstack.py", line 345, in _get_groups_from_server
    region = server_vars['region']
             ~~~~~~~~~~~^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/openstack/resource.py", line 698, in __getitem__
    raise KeyError(name)

Yes: server_vars has a different structure, and the region and cloud name are in a sub-dictionary "location":

  openstack.compute.v2.server.Server(id=[...], metadata={...},  location=Munch({'cloud': 'rhos-01', 'region_name': 'regionOne', 'zone': 'nova', ...}), ...)

Hacking /usr/share/ansible/collections/ansible_collections/openstack/cloud/plugins/inventory/openstack.py _get_groups_from_server() from

        region = server_vars['region']
        cloud = server_vars['cloud']

to

        region = server_vars['location']['region_name']
        cloud = server_vars['location']['cloud']

gets a little further. But then it crashes with:

ERROR! Unexpected Exception, this is probably a bug: 'Image' object has no attribute 'owner_seen'
the full traceback was:

Traceback (most recent call last):
  File "/usr/lib/python3.11/site-packages/openstack/resource.py", line 170, in __get__
    value = attributes[self.name]
            ~~~~~~~~~~^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/openstack/resource.py", line 275, in __getitem__
    return self.attributes[key]
           ~~~~~~~~~~~~~~~^^^^^
KeyError: 'owner'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.11/site-packages/ansible/cli/__init__.py", line 647, in cli_executor
    exit_code = cli.run()
                ^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/cli/inventory.py", line 156, in run
    results = self.dump(results)
              ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/cli/inventory.py", line 199, in dump
    results = json.dumps(stuff, cls=AnsibleJSONEncoder, sort_keys=True, indent=4, preprocess_unsafe=True, ensure_ascii=False)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/json/__init__.py", line 238, in dumps
    **kw).encode(obj)
          ^^^^^^^^^^^
  File "/usr/lib64/python3.11/json/encoder.py", line 200, in encode
    chunks = self.iterencode(o, _one_shot=True)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/module_utils/common/json.py", line 84, in iterencode
    o = _preprocess_unsafe_encode(o)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/module_utils/common/json.py", line 37, in _preprocess_unsafe_encode
    value = dict((k, _preprocess_unsafe_encode(v)) for k, v in value.items())
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/module_utils/common/json.py", line 37, in <genexpr>
    value = dict((k, _preprocess_unsafe_encode(v)) for k, v in value.items())
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/module_utils/common/json.py", line 37, in _preprocess_unsafe_encode
    value = dict((k, _preprocess_unsafe_encode(v)) for k, v in value.items())
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/module_utils/common/json.py", line 37, in <genexpr>
    value = dict((k, _preprocess_unsafe_encode(v)) for k, v in value.items())
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/module_utils/common/json.py", line 37, in _preprocess_unsafe_encode
    value = dict((k, _preprocess_unsafe_encode(v)) for k, v in value.items())
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/module_utils/common/json.py", line 37, in <genexpr>
    value = dict((k, _preprocess_unsafe_encode(v)) for k, v in value.items())
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/module_utils/common/json.py", line 37, in _preprocess_unsafe_encode
    value = dict((k, _preprocess_unsafe_encode(v)) for k, v in value.items())
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/module_utils/common/json.py", line 37, in <genexpr>
    value = dict((k, _preprocess_unsafe_encode(v)) for k, v in value.items())
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/module_utils/common/json.py", line 37, in _preprocess_unsafe_encode
    value = dict((k, _preprocess_unsafe_encode(v)) for k, v in value.items())
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/module_utils/common/json.py", line 37, in <genexpr>
    value = dict((k, _preprocess_unsafe_encode(v)) for k, v in value.items())
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/ansible/module_utils/common/json.py", line 37, in _preprocess_unsafe_encode
    value = dict((k, _preprocess_unsafe_encode(v)) for k, v in value.items())
                                                               ^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/openstack/resource.py", line 762, in items
    res.append((attr, self[attr]))
                      ~~~~^^^^^^
  File "/usr/lib/python3.11/site-packages/openstack/resource.py", line 683, in __getitem__
    return getattr(self, name)
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/openstack/resource.py", line 670, in __getattribute__
    raise e
  File "/usr/lib/python3.11/site-packages/openstack/resource.py", line 658, in __getattribute__
    return object.__getattribute__(self, name)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/site-packages/openstack/resource.py", line 191, in __get__
    delattr(instance, seen_flag)
AttributeError: 'Image' object has no attribute 'owner_seen'

I gave up at this point, and just reverted to the earlier openstacksdk version:

  sudo dnf install https://kojipkgs.fedoraproject.org//packages/python-openstacksdk/0.61.0/5.fc37/noarch/python3-openstacksdk-0.61.0-5.fc37.noarch.rpm

This works fine.
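The two layouts Martin describes could be bridged with a small compatibility shim that tries the new nested "location" structure first and falls back to the old top-level keys. A sketch using plain dicts (the real objects are openstacksdk Resource/Munch instances, and the helper name is made up):

```python
def region_and_cloud(server_vars):
    """Return (region, cloud), preferring the new nested 'location'
    layout used by openstacksdk >= 0.99, falling back to the old
    top-level keys used by older releases."""
    location = server_vars.get("location") or {}
    region = location.get("region_name", server_vars.get("region"))
    cloud = location.get("cloud", server_vars.get("cloud"))
    return region, cloud

# Old-style flat structure (openstacksdk < 0.99)
old_style = {"region": "regionOne", "cloud": "rhos-01"}
# New-style nested structure (openstacksdk >= 0.99)
new_style = {"location": {"cloud": "rhos-01", "region_name": "regionOne", "zone": "nova"}}

print(region_and_cloud(old_style))  # ('regionOne', 'rhos-01')
print(region_and_cloud(new_style))  # ('regionOne', 'rhos-01')
```

This only papers over the inventory plugin's `region`/`cloud` lookup; as the second traceback shows, other parts of the old collection break against the new SDK as well.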

Comment 6 Maxwell G 2023-06-29 05:11:55 UTC
Thanks for the follow up, Martin. I'm going to reassign this bug to ansible-collections-openstack, as this is no longer a bug with ansible itself. I believe the Fedora maintainers are also the upstream maintainers, so they should have more context than I.

Comment 7 Maxwell G 2023-06-29 05:17:12 UTC
I'm not sure about the underlying issue with ansible-collections-openstack-2.0.0-0.2.ed36d82git.fc38 and python3-openstacksdk-0.101.0-2.fc38.noarch, but we could fix the ansible 7 (with v1 of the openstack collection) version compat issue by introducing a python3-openstacksdk compat package.

Comment 8 Jakob Meng 2023-06-29 07:00:23 UTC
ansible-collections-openstack, aka openstack.cloud (its collection name on Galaxy), has two release series [0]; in a nutshell:

* openstack.cloud 1.x.x is for openstacksdk <0.99.0
* openstack.cloud 2.x.x is for openstacksdk >=0.99.0
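That split can be expressed as a tiny selector. This sketch merely restates the rule above (the collection itself ships no such helper):

```python
def required_series(sdk_version):
    """Map an installed openstacksdk version to the matching
    openstack.cloud release series, per the 0.99.0 break."""
    major, minor = (int(p) for p in sdk_version.split(".")[:2])
    return "2.x.x" if (major, minor) >= (0, 99) else "1.x.x"

print(required_series("0.61.0"))   # 1.x.x
print(required_series("0.101.0"))  # 2.x.x
```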

Fedora 37 and 38 [2] ship Ansible 7, which contains openstack.cloud 1.10.0 and thus requires openstacksdk <0.99.0. In Fedora 38, though, openstacksdk was bumped to 0.101.0 [3]. This bump was necessary for RDO and other reasons, but openstack.cloud 2.x.x had not been released yet (we had to port dozens upon dozens of modules), so a prerelease of 2.0.0 was added to Fedora 38 (and RDO). In January we finally released 2.0.0, but that was too late for Ansible 7, so it is only available in Ansible 8.

For users the safest / easiest option is probably to install openstacksdk and ansible in a venv and fetch openstack.cloud from Ansible Galaxy.

I do not see any solution for Fedora 38 that does not break anything:

I assume we cannot bump openstack.cloud from 1.x.x to 2.x.x in Ansible 7, and we cannot bump Ansible from 7 to 8 in Fedora 38, without breaking backward-compat promises.

I'm not sure how the openstacksdk compat package proposed by Maxwell would work. It would have to conflict with the openstacksdk 0.101.0 package in Fedora 38, but then openstackclient would break. Installing both at the same time would probably not help either, because Ansible, or rather the Python modules in openstack.cloud, would always pull in sdk 0.101.0.

Fedora's ansible-collections-openstack 2.0.0 prerelease RPM is horribly outdated and half-broken. If possible we should drop the whole package; what benefit does it have for users anyway?

Since openstack.cloud as pulled in from F38's ansible RPM is not usable at the moment anyway, there is one ugly hack we could do: replace openstack.cloud 1.10.0 in F38's ansible RPM with openstack.cloud 2.1.0. Of course this breaks backward compat within the F38 release, but again, that collection is currently broken regardless.

@Sagi, @Alfredo: Any other ideas? Suggestions? Rants? :D

[0] https://opendev.org/openstack/ansible-collections-openstack/src/branch/master/docs/branching.md
[1] https://github.com/ansible-community/ansible-build-data/blob/main/7/CHANGELOG-v7.rst
[2] https://packages.fedoraproject.org/pkgs/ansible/ansible/
[3] https://packages.fedoraproject.org/pkgs/python-openstacksdk/python3-openstacksdk/
[4] https://src.fedoraproject.org/rpms/ansible-collections-openstack
[5] https://github.com/ansible-community/ansible-build-data/blob/main/8/CHANGELOG-v8.rst

Comment 9 Jakob Meng 2023-06-29 07:07:30 UTC
Or we bump RPM ansible-collections-openstack in F38 to 2.1.0 and require it from F38's ansible (v7) package?

Comment 10 Maxwell G 2023-06-29 19:17:01 UTC
I think we should update ansible-collections-openstack to 2.1.0 regardless. Is there a reason you've kept the prerelease up until now?

Comment 11 Sagi Shnaidman 2023-06-29 21:20:27 UTC
Agree about bumping the ansible-collections-openstack collection; actually, I think it should be done for all Fedora releases.
If @amorelej doesn't have an automated mechanism to do it, we'll do it manually. I think the RDO folks have one for CentOS, and maybe for Fedora too?

Comment 12 Jakob Meng 2023-07-07 12:41:44 UTC
Argh, we cannot simply update the ansible-collections-openstack RPM to >=2.0.0 in F38, because >=2.0.0 requires python3-openstacksdk >=1.0.0, which is only available in F39. F38 has 0.101.0, and bumping it would also require bumping python-openstackclient and probably other packages. The initial plan was to backport openstacksdk 1.0.0 to OpenStack Zed officially [0], then bump the RPMs in Fedora and in RDO Zed, and finally replace this prerelease with the final 2.0.0 of ansible-collections-openstack. Nobody had time to do it, though.

The prerelease was meant as a temporary solution to allow bumping the RPMs for openstacksdk etc. in RDO Zed to >=0.99.0 without breaking RDO (which used to(?) use ansible-collections-openstack). The proper solution to get rid of this cursed prerelease in F38 would be to bump python-openstacksdk, python-openstackclient etc. to what we have in F39, and then bump a-c-o to 2.1.0. Despite what the version numbers say, the big breaking change was openstacksdk 0.99.0, not 1.0.0. But bumping all those versions is a lot of work for minimal benefit. People should really pull ansible-collections-openstack from Ansible Galaxy.

[0] Backporting openstacksdk 1.0.0 (and thus replacing 0.101.0 or whatever it has) to Zed is probably safe, because 0.99.0 broke backward compat with <0.99.0 while 1.0.0 has only small non-breaking changes compared to 0.99.0/0.101.0.

Comment 13 Alfredo Moralejo 2023-07-12 14:52:56 UTC
The policy we have been following in Fedora is to update all the openstack related clients as a whole in rawhide on each new OpenStack major release:

F37 -> yoga
F38 -> zed
F39 -> antelope

As explained by Jakob, updating a single package to a different version can be problematic because of compatibility issues between packages (only the client and library versions within the same OpenStack major release are tested together, and there is no guarantee that an arbitrary mix of releases works together). The case of the prerelease in Zed has already been explained as well.

- Would a-c-o 2.1.0 work with openstacksdk 0.101.0? (You mentioned 1.0.0 has only small non-breaking changes compared to 0.99.0/0.101.0.)

If that's the case, I think updating only a-c-o to 2.1.0 would be the best solution for F38.

Comment 14 Alfredo Moralejo 2023-07-12 14:57:23 UTC
Another question: given that, IIUC, the openstack collection is bundled in the ansible RPM, does it make sense to also maintain it in a separate package for F39?

Comment 15 Maxwell G 2023-07-12 17:26:18 UTC
Well, we established that the openstack collection in the ansible RPM doesn't work on F38, because the openstacksdk version is too new. In general, I think having standalone collections is useful. In most cases, users shouldn't need the entire ansible bundle. We encourage users to utilize individual collections, whether installed through the RPM or with ansible-galaxy. I prefer managing my software and updates in one place (i.e. through the system package manager) where possible.

Comment 16 Jakob Meng 2023-07-12 19:23:02 UTC
Most modules in the Ansible OpenStack collection 2.1.0 are (or should be?) compatible with openstacksdk 0.101.0, but some code actually requires openstacksdk 1.0.0. Still, introducing 2.1.0 to F38 will greatly improve the situation for users, because the prerelease is horribly broken in many places.

As discussed with Alfredo, what we will do for F38 is bump the package to 2.1.0 and add a patch, just for the F38 RPM, that lowers the dependency on openstacksdk from 1.0.0 to 0.101.0.

In general I agree with Maxwell that it's better for users to manage all software in a single place, but that requires continuous maintenance. Who would be interested in maintaining the collection both upstream and here in Fedora, though?

Comment 17 Jakob Meng 2023-07-12 19:25:26 UTC
Created attachment 1975460 [details]
Hack to lower the version dependency on openstacksdk from 1.0.0 to 0.101.0

Patch for the F38 RPM which lowers the version dependency on openstacksdk from 1.0.0 to 0.101.0.

Comment 18 Alfredo Moralejo 2023-07-14 11:13:54 UTC
Sent PR for F38 update to 2.1.0 + Jakob's patch:

https://src.fedoraproject.org/rpms/ansible-collections-openstack/pull-request/3#

Comment 19 Fedora Update System 2023-07-14 12:36:22 UTC
FEDORA-2023-88df046168 has been submitted as an update to Fedora 38. https://bodhi.fedoraproject.org/updates/FEDORA-2023-88df046168

Comment 20 Martin Pitt 2023-07-14 15:00:35 UTC
Thanks! I tested this and karma'ed in bodhi.

Comment 21 Fedora Update System 2023-07-15 01:32:49 UTC
FEDORA-2023-88df046168 has been pushed to the Fedora 38 testing repository.
Soon you'll be able to install the update with the following command:
`sudo dnf upgrade --enablerepo=updates-testing --refresh --advisory=FEDORA-2023-88df046168`
You can provide feedback for this update here: https://bodhi.fedoraproject.org/updates/FEDORA-2023-88df046168

See also https://fedoraproject.org/wiki/QA:Updates_Testing for more information on how to test updates.

Comment 22 Fedora Update System 2023-07-23 01:27:42 UTC
FEDORA-2023-88df046168 has been pushed to the Fedora 38 stable repository.
If problem still persists, please make note of it in this bug report.

