Bug 2100002
| Summary: | Updating ovirt-node installs old ovirt-node-ng-image-update-placeholder | | |
|---|---|---|---|
| Product: | [oVirt] ovirt-release | Reporter: | Jean-Louis Dupond <jean-louis> |
| Component: | ovirt-release-host-node | Assignee: | Asaf Rachmani <arachman> |
| Status: | CLOSED CURRENTRELEASE | QA Contact: | cshao <cshao> |
| Severity: | high | Docs Contact: | |
| Priority: | unspecified | | |
| Version: | 4.5.0 | CC: | arachman, bugs, cshao, delfassy, jean-louis, lsvaty, lveyde, mavital, peyu, sbonazzo, weiwang, yaniwang |
| Target Milestone: | ovirt-4.5.2 | Flags: | pm-rhel: ovirt-4.5?, cshao: testing_ack+ |
| Target Release: | 4.5.2 | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
| Whiteboard: | | | |
| Fixed In Version: | ovirt-release-host-node-4.5.2 | Doc Type: | If docs needed, set a value |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2022-08-02 08:29:24 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | Node | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
Description

Jean-Louis Dupond 2022-06-22 07:49:33 UTC
Can you please share the imgbase log file and steps to reproduce?

I did some more tests and think I found the root cause. When you run the cluster update with 'Check for updates' enabled, it runs the following flow:
https://github.com/oVirt/ovirt-engine/blob/master/packaging/ansible-runner-service-project/project/roles/ovirt-host-check-upgrade/tasks/main.yml#L33

This gives:

```
# yum check-update -q --exclude=ansible
ovirt-node-ng-image-update.noarch                 4.5.1-1.el8    ovirt-45-upstream
ovirt-node-ng-image-update-placeholder.noarch     4.5.1-1.el8    centos-ovirt45
Obsoleting Packages
ovirt-node-ng-image-update.noarch                 4.5.1-1.el8    ovirt-45-upstream
    ovirt-node-ng-image-update.noarch             4.5.0.3-1.el8  @System
ovirt-node-ng-image-update.noarch                 4.5.1-1.el8    ovirt-45-upstream
    ovirt-node-ng-image-update-placeholder.noarch 4.5.0.3-1.el8  @System
```

Later, we run an update on those packages:
https://github.com/oVirt/ovirt-engine/blob/master/packaging/ansible-runner-service-project/project/roles/ovirt-host-upgrade/tasks/main.yml#L66

And this causes the following:

```
python3[3853683]: ansible-ansible.legacy.dnf Invoked with name=['ovirt-node-ng-image-update-placeholder.noarch'] state=latest lock_timeout=300 conf_file=/tmp/yum.conf allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True allowerasing=False nobest=False disable_excludes=None download_dir=None list=None releasever=None
```

Which results in:

```
2022-06-29T14:51:46+0200 INFO Persisting: ovirt-node-ng-image-update-placeholder-4.5.1-1.el8.noarch.rpm
```

And you end up with:

```
# cd /var/imgbased/persisted-rpms/
# ls -la
-rw-r--r--. 1 root root 7.2K Jun 29 14:51 ovirt-node-ng-image-update-placeholder-4.5.1-1.el8.noarch.rpm
```

The next time you upgrade oVirt via the cluster update, imgbased will see the RPM in the persisted-rpms folder and try to install it, causing the up-to-date ovirt-node-ng-image-update-placeholder to be downgraded to the one from the previous version you were running.

ovirt-node-ng-image-update-placeholder shouldn't appear in the yum check-update query; this sounds like a bug in the repo config.

The root cause for that seems to be:
https://github.com/oVirt/ovirt-release/blob/master/ovirt-release-host-node.spec.in#L183

Since the new repo files are now named CentOS-oVirt-xxx.repo, this pattern no longer matches, so includepkgs is no longer set.

Move back due to mistake.

This bugzilla is included in the oVirt 4.5.2 release, published on August 10th 2022. Since the problem described in this bug report should be resolved in the oVirt 4.5.2 release, it has been closed with a resolution of CURRENT RELEASE. If the solution does not work for you, please open a new bug report.
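To illustrate the root cause described above, here is a minimal, hypothetical sketch of the kind of repo tweak the ovirt-release-host-node scriptlet is responsible for: restricting the oVirt repos on a node image to the image-update package via includepkgs. The file glob, the package list, and the single-section assumption below are illustrative only and do not reproduce the actual contents of ovirt-release-host-node.spec.in; the point is that a pattern covering only the old ovirt-*.repo names silently skips the renamed CentOS-oVirt-*.repo files, so no includepkgs filter is applied and the placeholder shows up in yum check-update.

```sh
#!/bin/sh
# Hypothetical sketch (NOT the actual scriptlet from ovirt-release-host-node.spec.in):
# append an includepkgs filter to every oVirt repo file so that dnf on the node
# only sees the image-update package from those repos.
#
# Assumptions: each matched .repo file contains a single repository section,
# and the package list below stands in for whatever the real spec sets.
for repo in /etc/yum.repos.d/ovirt-*.repo /etc/yum.repos.d/CentOS-oVirt-*.repo; do
    [ -e "$repo" ] || continue                     # glob did not match anything
    grep -q '^includepkgs=' "$repo" && continue    # filter already present
    echo 'includepkgs=ovirt-node-ng-image-update' >> "$repo"
done
```

A glob limited to ovirt-*.repo is exactly the failure mode reported here: the renamed CentOS-oVirt-xxx.repo files are never touched, so ovirt-node-ng-image-update-placeholder remains visible as an update candidate.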
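For hosts that already hit this, the persisted RPM mentioned in the log above can be inspected directly. The path comes from this report; removing the stale placeholder RPM is an assumption about a reasonable cleanup, not an officially documented workaround.

```sh
# Check whether imgbased persisted an old placeholder RPM (path from this report).
ls -la /var/imgbased/persisted-rpms/

# Hypothetical cleanup: dropping the stale placeholder RPM should keep imgbased
# from re-installing (and thereby downgrading) it on the next layer update.
rm -f /var/imgbased/persisted-rpms/ovirt-node-ng-image-update-placeholder-*.rpm
```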