Bug 2176490 - Command 'pcs config checkpoint diff' does not show configuration differences between checkpoints
Summary: Command 'pcs config checkpoint diff' does not show configuration differences ...
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Linux 8
Classification: Red Hat
Component: pcs
Version: 8.2
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: rc
Target Release: 8.9
Assignee: Miroslav Lisik
QA Contact: cluster-qe
Docs Contact: Steven J. Levine
URL:
Whiteboard:
Depends On:
Blocks: 2180700 2180701 2180702 2180703
 
Reported: 2023-03-08 14:13 UTC by Tomas Jelinek
Modified: 2023-11-14 15:56 UTC
CC List: 10 users

Fixed In Version: pcs-0.10.16-1.el8
Doc Type: Bug Fix
Doc Text:
.The `pcs config checkpoint diff` command now works correctly for all configuration sections

When it was introduced in the RHEL 8.1 release, the `pcs config checkpoint diff` command did not show differences for the Fencing Levels, Ordering Constraints, Colocation Constraints, and Ticket Constraints configuration sections. As of the RHEL 8.4 release, the command had also stopped showing differences for the Resources Defaults and Operations Defaults configuration sections, and as of the RHEL 8.7 release for the Resources and Stonith Devices configuration sections. This happened because, as the code responsible for displaying each configuration section switched to a new mechanism for loading CIB files, the loaded content was cached: the second file used for the comparison was never loaded, and the cached content of the first file was used instead. As a result, the `diff` command yielded no output for those sections. With this fix, the CIB file content is no longer cached and the `pcs config checkpoint diff` command shows differences for all configuration sections.
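The failure mode described above (a cached first load shadowing the second file) can be illustrated with a small self-contained sketch. This is not the actual pcs source, just the same bug pattern reproduced in shell:

```shell
# Illustrative sketch only -- not pcs code. A loader that caches its first
# result and ignores later arguments reproduces the symptom: the diff of
# two different files comes out empty.
tmp=$(mktemp -d)
printf 'stickiness=1\n' > "$tmp/checkpoint-cib"
printf 'stickiness=2\n' > "$tmp/live-cib"

cache=""
load_cib() {                       # buggy: returns cached content, ignores $1
    [ -z "$cache" ] && cache=$(cat "$1")
    printf '%s\n' "$cache"
}

load_cib "$tmp/checkpoint-cib" > "$tmp/out1"
load_cib "$tmp/live-cib"       > "$tmp/out2"   # cache hit: wrong content

# Both "loads" returned the checkpoint content, so diff reports nothing:
diff "$tmp/out1" "$tmp/out2" && echo "no differences reported (the bug)"
```

The fix in pcs corresponds to dropping the cache, so that each file in the comparison is read fresh.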
Clone Of: 2175881
Clones: 2180700 2180701 2180702 2180703
Environment:
Last Closed: 2023-11-14 15:22:35 UTC
Type: Bug
Target Upstream Version:
Embargoed:


Attachments


Links
System ID Private Priority Status Summary Last Updated
Red Hat Bugzilla 1655055 1 None None None 2023-03-08 14:13:15 UTC
Red Hat Issue Tracker CLUSTERQE-6465 0 None None None 2023-03-20 12:02:10 UTC
Red Hat Issue Tracker RHELPLAN-151060 0 None None None 2023-03-08 14:14:49 UTC
Red Hat Product Errata RHBA-2023:6903 0 None None None 2023-11-14 15:23:35 UTC

Description Tomas Jelinek 2023-03-08 14:13:15 UTC
+++ This bug was initially created as a clone of Bug #2175881 +++

Description of problem:
Command 'pcs config checkpoint diff' does not show configuration differences between checkpoints. Only differences for the 'Cluster Properties' section are displayed.

Displaying differences does not work for the following configuration sections:
* Resources
* Stonith Devices
* Fencing Levels
* Location Constraints
* Ordering Constraints
* Colocation Constraints
* Ticket Constraints
* Alerts
* Resources Defaults
* Operations Defaults
* Tags

Version-Release number of selected component (if applicable):
pcs-0.11.4-6.el9

How reproducible:
always

Steps to Reproduce:
1. Create an empty cluster
2. Create resources, stonith devices, fencing levels, constraints of all types, alerts, resource and operation defaults, and tags.
3. Run command `pcs config checkpoint diff 1 live`

Actual results:
Only the 'Cluster Properties' section displays differences, marked with a '+' sign.

Expected results:
All sections with configuration changes should display their differences.

Additional info:
Differences should be displayed for added, removed, or modified configuration.
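For reference, the expected output follows ordinary unified-diff conventions: lines prefixed '-' exist only in the first (checkpoint) configuration, lines prefixed '+' only in the second (live) one, and a modified value shows up as a -/+ pair. A tiny stand-alone illustration of the marking convention (the miniature config files below are made up, not real pcs output):

```shell
# Made-up miniature configs, just to demonstrate the -/+ marking convention
tmp=$(mktemp -d)
printf 'Tags:\n  No tags defined\n' > "$tmp/checkpoint"
printf 'Tags:\n  TAG\n    p-1\n    p-2\n' > "$tmp/live"

# Skip the ---/+++/@@ header lines; keep only the marked body
diff -u "$tmp/checkpoint" "$tmp/live" | tail -n +4
```

This is why, with the bug present, only the Cluster Properties section in the 'Actual results' above carries any '+' lines.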

Comment 2 Miroslav Lisik 2023-03-20 10:16:40 UTC
Upstream commit: https://github.com/ClusterLabs/pcs/commit/2e225d1e46f962fddb1d6e6bd1d13a2348dc5c41
Updated commands:
  * pcs config checkpoint diff

Test:

Prepare configuration for `pcs config`:

export NODELIST=(r8-node-01 r8-node-02 r8-node-03)
pcs host auth -u hacluster -p $PASSWORD ${NODELIST[*]}
pcs cluster setup HACluster ${NODELIST[*]} --start --wait
for node in ${NODELIST[*]}; do
    pcs stonith create fence-1-$node fence_xvm;
done
for node in ${NODELIST[*]}; do
    pcs stonith create fence-2-$node fence_xvm;
done
for node in ${NODELIST[*]}; do
    pcs stonith level add 1 $node fence-1-$node;
    pcs stonith level add 2 $node fence-2-$node;
done
pcs resource create p-1 ocf:pacemaker:Dummy --no-default-ops
pcs resource create p-2 ocf:pacemaker:Dummy --no-default-ops
pcs constraint location p-1 prefers ${NODELIST[0]}
pcs constraint location p-2 avoids ${NODELIST[0]}
pcs resource create s-1 ocf:pacemaker:Stateful promotable --no-default-ops
pcs constraint location s-1-clone rule role=master "#uname" eq ${NODELIST[0]}
pcs resource create oc-1 ocf:pacemaker:Dummy --no-default-ops
pcs resource create oc-2 ocf:pacemaker:Dummy --no-default-ops
pcs constraint order oc-1 then oc-2
pcs constraint colocation add oc-2 with oc-1
pcs resource create oc-set-1 ocf:pacemaker:Dummy --no-default-ops
pcs resource create oc-set-2 ocf:pacemaker:Dummy --no-default-ops
pcs constraint order set oc-set-1 oc-set-2
pcs constraint colocation set oc-set-2 oc-set-1
pcs resource create t ocf:pacemaker:Dummy --no-default-ops
pcs constraint ticket add Ticket t
pcs constraint ticket set p-1 p-2 setoptions ticket=Ticket-set
pcs alert create path=/usr/bin/true id=Alert
pcs alert recipient add Alert value=recipient-value
pcs resource defaults resource-stickiness=2
pcs resource op defaults timeout=90
pcs property set maintenance-mode=false
pcs tag create TAG p-1 p-2
pcs resource defaults set create id=set-1 meta target-role=Started
pcs resource op defaults set create id=op-set-1 score=10 meta interval=30s


Test `pcs config checkpoint diff` command:

pcs config checkpoint diff 1 live
pcs config checkpoint diff live 1

Comment 12 Michal Pospisil 2023-05-29 10:10:03 UTC
DevTestResults:

[root@r08-09-a ~]# rpm -q pcs
pcs-0.10.16-1.el8.x86_64

[root@r08-09-a ~]# pcs config checkpoint diff 1 live
Differences between checkpoint 1 (-) and live configuration (+):
  Resources:
+   Resource: r1 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       migrate_from: r1-migrate_from-interval-0s
+         interval=0s
+         timeout=20s
+       migrate_to: r1-migrate_to-interval-0s
+         interval=0s
+         timeout=20s
+       monitor: r1-monitor-interval-10s
+         interval=10s
+         timeout=20s
+       reload: r1-reload-interval-0s
+         interval=0s
+         timeout=20s
+       reload-agent: r1-reload-agent-interval-0s
+         interval=0s
+         timeout=20s
+       start: r1-start-interval-0s
+         interval=0s
+         timeout=20s
+       stop: r1-stop-interval-0s
+         interval=0s
+         timeout=20s
+   Resource: p-1 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: p-1-monitor-interval-10s
+         interval=10s
+         timeout=20s
+   Resource: p-2 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: p-2-monitor-interval-10s
+         interval=10s
+         timeout=20s
+   Resource: oc-1 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: oc-1-monitor-interval-10s
+         interval=10s
+         timeout=20s
+   Resource: oc-2 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: oc-2-monitor-interval-10s
+         interval=10s
+         timeout=20s
+   Resource: oc-set-1 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: oc-set-1-monitor-interval-10s
+         interval=10s
+         timeout=20s
+   Resource: oc-set-2 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: oc-set-2-monitor-interval-10s
+         interval=10s
+         timeout=20s
+   Resource: t (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: t-monitor-interval-10s
+         interval=10s
+         timeout=20s
+   Clone: s-1-clone
+     Meta Attributes: s-1-clone-meta_attributes
+       promotable=true
+     Resource: s-1 (class=ocf provider=pacemaker type=Stateful)
+       Operations:
+         monitor: s-1-monitor-interval-10s
+           interval=10s
+           timeout=20s
+           role=Master
+         monitor: s-1-monitor-interval-11s
+           interval=11s
+           timeout=20s
+           role=Slave

  Stonith Devices:
+   Resource: fence-1-r8-09-a.vm (class=stonith type=fence_xvm)
+     Operations:
+       monitor: fence-1-r8-09-a.vm-monitor-interval-60s
+         interval=60s
+   Resource: fence-1-r08-09-b.vm (class=stonith type=fence_xvm)
+     Operations:
+       monitor: fence-1-r08-09-b.vm-monitor-interval-60s
+         interval=60s
+   Resource: fence-2-r8-09-a.vm (class=stonith type=fence_xvm)
+     Operations:
+       monitor: fence-2-r8-09-a.vm-monitor-interval-60s
+         interval=60s
+   Resource: fence-2-r08-09-b.vm (class=stonith type=fence_xvm)
+     Operations:
+       monitor: fence-2-r08-09-b.vm-monitor-interval-60s
+         interval=60s
+   Resource: fence-1-r08-09-a.vm (class=stonith type=fence_xvm)
+     Operations:
+       monitor: fence-1-r08-09-a.vm-monitor-interval-60s
+         interval=60s
+   Resource: fence-2-r08-09-a.vm (class=stonith type=fence_xvm)
+     Operations:
+       monitor: fence-2-r08-09-a.vm-monitor-interval-60s
+         interval=60s
  Fencing Levels:
+   Target: r08-09-a.vm
+     Level 1 - fence-1-r08-09-a.vm
+     Level 2 - fence-2-r08-09-a.vm
+   Target: r08-09-b.vm
+     Level 1 - fence-1-r08-09-b.vm
+     Level 2 - fence-2-r08-09-b.vm

  Location Constraints:
+   Resource: p-1
+     Enabled on:
+       Node: r08-09-a.vm (score:INFINITY) (id:location-p-1-r08-09-a.vm-INFINITY)
+   Resource: p-2
+     Disabled on:
+       Node: r08-09-a.vm (score:-INFINITY) (id:location-p-2-r08-09-a.vm--INFINITY)
+   Resource: s-1-clone
+     Constraint: location-s-1-clone
+       Rule: role=Master score=INFINITY (id:location-s-1-clone-rule)
        Expression: #uname eq r08-09-a.vm (id:location-s-1-clone-rule-expr)
  Ordering Constraints:
+   start oc-1 then start oc-2 (kind:Mandatory) (id:order-oc-1-oc-2-mandatory)
+   Resource Sets:
+     set oc-set-1 oc-set-2 (id:order_set_o1o2_set) (id:order_set_o1o2)
  Colocation Constraints:
+   oc-2 with oc-1 (score:INFINITY) (id:colocation-oc-2-oc-1-INFINITY)
+   Resource Sets:
+     set oc-set-2 oc-set-1 (id:colocation_set_o2o1_set) setoptions score=INFINITY (id:colocation_set_o2o1)
  Ticket Constraints:
+   t ticket=Ticket (id:ticket-Ticket-t)
+   Resource Sets:
+     set p-1 p-2 (id:ticket_set_p1p2_set) setoptions ticket=Ticket-set (id:ticket_set_p1p2)

  Alerts:
-  No alerts defined
+  Alert: Alert (path=/usr/bin/true)
+   Recipients:
+    Recipient: Alert-recipient (value=recipient-value)

  Resources Defaults:
-   No defaults set
+   Meta Attrs: rsc_defaults-meta_attributes
+     resource-stickiness=2
+   Meta Attrs: set-1
+     target-role=Started
  Operations Defaults:
-   No defaults set
+   Meta Attrs: op_defaults-meta_attributes
+     timeout=90
+   Meta Attrs: op-set-1 score=10
+     interval=30s

- Cluster Properties:
+ Cluster Properties: cib-bootstrap-options
+   cluster-infrastructure=corosync
+   cluster-name=pre1
+   dc-version=2.1.5-8.el8-a3f44794f94
+   have-watchdog=false
+   maintenance-mode=false
+   placement-strategy=minimal

  Tags:
-  No tags defined
+  TAG
+    p-1
+    p-2

Comment 17 svalasti 2023-06-29 10:08:40 UTC
[root@virt-500 ~]# rpm -q pcs
pcs-0.10.16-1.el8.x86_64

Starting tests, logging all output to /tmp/vedder.CHERRY.STSRHTS11062.202306291158
Starting XMLRPC server....
DEBUG:STSXMLRPC_DAEMON:Debugging enabled
INFO:STSXMLRPC:Server started. Logging to /tmp/vedder.CHERRY.STSRHTS11062.202306291158/stsxmlrpc.log
XMLRPC server started at http://virt-482:36909/
<start name="pcs,cli,CheckpointDiff" id="pcs,cli,CheckpointDiff" pid="256418" time="Thu Jun 29 11:58:17 2023 +0200" type="cmd" />
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:17 INFO:	PCS_CLUSTER_DESTROY
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:20 INFO:	cluster destroyed.
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:20 INFO:	CHECK_CLUSTER_HEALTH
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:20 INFO:	SETUP_PCSD_PORTS
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:20 INFO:	CLEAR_DAEMON_CONFIGS
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:20 INFO:	RESTART_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:22 INFO:	
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:22 INFO:	SET_PASSWORD
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:23 INFO:	RESTART_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:24 INFO:	ENABLE_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:24 INFO:	PCS_HOST_AUTH
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:24 INFO:	running auth from virt-500
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:26 INFO:	PCS_STATUS_AUTH
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:27 INFO:	PCS_CLUSTER_SETUP
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:27 INFO:	running: pcs cluster setup STSRHTS11062 virt-500 virt-501 
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:32 INFO:	cluster created
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:32 INFO:	PCS_CLUSTER_START
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:32 INFO:	starting cluster from virt-500 with --all
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:58 INFO:	cluster started
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:58 INFO:	== SUBTEST: StonithCheckpointDiff ==
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:58 INFO:	PCS_STONITH_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:58 INFO:	running: pcs stonith create fence-virt-500 fence_xvm pcmk_host_list=virt-500 on virt-500
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:59 INFO:	Stonith device fence-virt-500 created.
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:59 INFO:	GET_DEVICES_BY_NODE
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:59 INFO:	PCS_STONITH_LEVEL_ADD
[pcs,cli,CheckpointDiff] 2023-06-29 11:58:59 INFO:	running: pcs stonith level add 1 virt-500 fence-virt-500 on virt-500
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:00 INFO:	PCS_STONITH_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:00 INFO:	running: pcs stonith create fence-virt-501 fence_xvm pcmk_host_list=virt-501 on virt-501
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:01 INFO:	Stonith device fence-virt-501 created.
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:01 INFO:	GET_DEVICES_BY_NODE
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:01 INFO:	PCS_STONITH_LEVEL_ADD
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:01 INFO:	running: pcs stonith level add 2 virt-501 fence-virt-501 on virt-500
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:02 INFO:	PCS_CONFIG_CHECKPOINT_DIFF
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:02 INFO:	running: pcs config checkpoint diff 1 live
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:04 INFO:	Differences between checkpoint 1 (-) and live configuration (+):
[pcs,cli,CheckpointDiff]  Resources:
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Stonith Devices:
[pcs,cli,CheckpointDiff] + Resource: fence-virt-500 (class=stonith type=fence_xvm)
[pcs,cli,CheckpointDiff] + Attributes: fence-virt-500-instance_attributes
[pcs,cli,CheckpointDiff] + pcmk_host_list=virt-500
[pcs,cli,CheckpointDiff] + Operations:
[pcs,cli,CheckpointDiff] + monitor: fence-virt-500-monitor-interval-60s
[pcs,cli,CheckpointDiff] + interval=60s
[pcs,cli,CheckpointDiff] + Resource: fence-virt-501 (class=stonith type=fence_xvm)
[pcs,cli,CheckpointDiff] + Attributes: fence-virt-501-instance_attributes
[pcs,cli,CheckpointDiff] + pcmk_host_list=virt-501
[pcs,cli,CheckpointDiff] + Operations:
[pcs,cli,CheckpointDiff] + monitor: fence-virt-501-monitor-interval-60s
[pcs,cli,CheckpointDiff] + interval=60s
[pcs,cli,CheckpointDiff]  Fencing Levels:
[pcs,cli,CheckpointDiff] + Target: virt-500
[pcs,cli,CheckpointDiff] + Level 1 - fence-virt-500
[pcs,cli,CheckpointDiff] + Target: virt-501
[pcs,cli,CheckpointDiff] + Level 2 - fence-virt-501
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Location Constraints:
[pcs,cli,CheckpointDiff]  Ordering Constraints:
[pcs,cli,CheckpointDiff]  Colocation Constraints:
[pcs,cli,CheckpointDiff]  Ticket Constraints:
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Alerts:
[pcs,cli,CheckpointDiff]  No alerts defined
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Resources Defaults:
[pcs,cli,CheckpointDiff]  No defaults set
[pcs,cli,CheckpointDiff]  Operations Defaults:
[pcs,cli,CheckpointDiff]  No defaults set
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff] - Cluster Properties:
[pcs,cli,CheckpointDiff] + Cluster Properties: cib-bootstrap-options
[pcs,cli,CheckpointDiff] + cluster-infrastructure=corosync
[pcs,cli,CheckpointDiff] + cluster-name=STSRHTS11062
[pcs,cli,CheckpointDiff] + dc-version=2.1.6-1.el8-6fdc9deea29
[pcs,cli,CheckpointDiff] + have-watchdog=false
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Tags:
[pcs,cli,CheckpointDiff]  No tags defined
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:05 INFO:	PCS_STONITH_LEVEL_REMOVE
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:05 INFO:	running: pcs stonith level remove 1 target virt-500 stonith fence-virt-500
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:06 INFO:	PCS_STONITH_LEVEL_REMOVE
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:06 INFO:	running: pcs stonith level remove 2 target virt-501 stonith fence-virt-501
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:08 INFO:	PCS_CLUSTER_DESTROY
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:11 INFO:	cluster destroyed.
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:11 INFO:	CHECK_CLUSTER_HEALTH
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:12 INFO:	SETUP_PCSD_PORTS
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:12 INFO:	CLEAR_DAEMON_CONFIGS
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:12 INFO:	RESTART_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:14 INFO:	
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:14 INFO:	SET_PASSWORD
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:14 INFO:	RESTART_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:16 INFO:	ENABLE_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:16 INFO:	PCS_HOST_AUTH
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:16 INFO:	running auth from virt-500
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:18 INFO:	PCS_STATUS_AUTH
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:19 INFO:	PCS_CLUSTER_SETUP
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:19 INFO:	running: pcs cluster setup STSRHTS11062 virt-500 virt-501 
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:24 INFO:	cluster created
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:24 INFO:	PCS_CLUSTER_START
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:24 INFO:	starting cluster from virt-500 with --all
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:51 INFO:	cluster started
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:51 INFO:	== SUBTEST: ResourceConstraintLocationCheckpointDiff ==
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:51 INFO:	PCS_RESOURCE_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:51 INFO:	running: pcs resource create d-0 ocf:heartbeat:Dummy --no-default-ops 
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:52 INFO:	Resource created.
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:52 INFO:	PCS_LOCATION_PREFERS
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:52 INFO:	running: pcs constraint location d-0 prefers virt-500 
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:53 INFO:	Location constraint created.
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:53 INFO:	PCS_RESOURCE_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:53 INFO:	running: pcs resource create d-1 ocf:heartbeat:Dummy --no-default-ops 
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:54 INFO:	Resource created.
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:54 INFO:	PCS_LOCATION_PREFERS
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:54 INFO:	running: pcs constraint location d-1 prefers virt-501 
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:55 INFO:	Location constraint created.
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:55 INFO:	PCS_CONFIG_CHECKPOINT_DIFF
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:55 INFO:	running: pcs config checkpoint diff 1 live
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:56 INFO:	Differences between checkpoint 1 (-) and live configuration (+):
[pcs,cli,CheckpointDiff]  Resources:
[pcs,cli,CheckpointDiff] + Resource: d-0 (class=ocf provider=heartbeat type=Dummy)
[pcs,cli,CheckpointDiff] + Operations:
[pcs,cli,CheckpointDiff] + monitor: d-0-monitor-interval-10s
[pcs,cli,CheckpointDiff] + interval=10s
[pcs,cli,CheckpointDiff] + timeout=20s
[pcs,cli,CheckpointDiff] + Resource: d-1 (class=ocf provider=heartbeat type=Dummy)
[pcs,cli,CheckpointDiff] + Operations:
[pcs,cli,CheckpointDiff] + monitor: d-1-monitor-interval-10s
[pcs,cli,CheckpointDiff] + interval=10s
[pcs,cli,CheckpointDiff] + timeout=20s
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Stonith Devices:
[pcs,cli,CheckpointDiff]  Fencing Levels:
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Location Constraints:
[pcs,cli,CheckpointDiff] + Resource: d-0
[pcs,cli,CheckpointDiff] + Enabled on:
[pcs,cli,CheckpointDiff] + Node: virt-500 (score:INFINITY) (id:location-d-0-virt-500-INFINITY)
[pcs,cli,CheckpointDiff] + Resource: d-1
[pcs,cli,CheckpointDiff] + Enabled on:
[pcs,cli,CheckpointDiff] + Node: virt-501 (score:INFINITY) (id:location-d-1-virt-501-INFINITY)
[pcs,cli,CheckpointDiff]  Ordering Constraints:
[pcs,cli,CheckpointDiff]  Colocation Constraints:
[pcs,cli,CheckpointDiff]  Ticket Constraints:
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Alerts:
[pcs,cli,CheckpointDiff]  No alerts defined
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Resources Defaults:
[pcs,cli,CheckpointDiff]  No defaults set
[pcs,cli,CheckpointDiff]  Operations Defaults:
[pcs,cli,CheckpointDiff]  No defaults set
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff] - Cluster Properties:
[pcs,cli,CheckpointDiff] + Cluster Properties: cib-bootstrap-options
[pcs,cli,CheckpointDiff] + cluster-infrastructure=corosync
[pcs,cli,CheckpointDiff] + cluster-name=STSRHTS11062
[pcs,cli,CheckpointDiff] + dc-version=2.1.6-1.el8-6fdc9deea29
[pcs,cli,CheckpointDiff] + have-watchdog=false
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Tags:
[pcs,cli,CheckpointDiff]  No tags defined
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:57 INFO:	PCS_RESOURCE_DELETE
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:57 INFO:	running: pcs resource delete d-0 
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:58 INFO:	Resource deleted.
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:58 INFO:	PCS_RESOURCE_DELETE
[pcs,cli,CheckpointDiff] 2023-06-29 11:59:58 INFO:	running: pcs resource delete d-1 
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:00 INFO:	Resource deleted.
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:01 INFO:	PCS_CLUSTER_DESTROY
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:04 INFO:	cluster destroyed.
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:04 INFO:	CHECK_CLUSTER_HEALTH
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:04 INFO:	SETUP_PCSD_PORTS
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:04 INFO:	CLEAR_DAEMON_CONFIGS
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:05 INFO:	RESTART_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:07 INFO:	
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:07 INFO:	SET_PASSWORD
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:07 INFO:	RESTART_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:09 INFO:	ENABLE_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:09 INFO:	PCS_HOST_AUTH
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:09 INFO:	running auth from virt-500
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:10 INFO:	PCS_STATUS_AUTH
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:11 INFO:	PCS_CLUSTER_SETUP
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:11 INFO:	running: pcs cluster setup STSRHTS11062 virt-500 virt-501 
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:16 INFO:	cluster created
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:16 INFO:	PCS_CLUSTER_START
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:16 INFO:	starting cluster from virt-500 with --all
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:42 INFO:	cluster started
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:42 INFO:	== SUBTEST: TagCheckpointDiff ==
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:42 INFO:	PCS_RESOURCE_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:42 INFO:	running: pcs resource create d-0 ocf:heartbeat:Dummy --no-default-ops 
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:43 INFO:	Resource created.
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:43 INFO:	PCS_TAG_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:43 INFO:	running: pcs tag create tag-d-0 d-0
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:44 INFO:	Tag created.
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:44 INFO:	PCS_RESOURCE_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:44 INFO:	running: pcs resource create d-1 ocf:heartbeat:Dummy --no-default-ops 
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:45 INFO:	Resource created.
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:45 INFO:	PCS_TAG_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:45 INFO:	running: pcs tag create tag-d-1 d-1
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:46 INFO:	Tag created.
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:46 INFO:	PCS_CONFIG_CHECKPOINT_DIFF
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:46 INFO:	running: pcs config checkpoint diff 1 live
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:48 INFO:	Differences between checkpoint 1 (-) and live configuration (+):
[pcs,cli,CheckpointDiff]  Resources:
[pcs,cli,CheckpointDiff] + Resource: d-0 (class=ocf provider=heartbeat type=Dummy)
[pcs,cli,CheckpointDiff] + Operations:
[pcs,cli,CheckpointDiff] + monitor: d-0-monitor-interval-10s
[pcs,cli,CheckpointDiff] + interval=10s
[pcs,cli,CheckpointDiff] + timeout=20s
[pcs,cli,CheckpointDiff] + Resource: d-1 (class=ocf provider=heartbeat type=Dummy)
[pcs,cli,CheckpointDiff] + Operations:
[pcs,cli,CheckpointDiff] + monitor: d-1-monitor-interval-10s
[pcs,cli,CheckpointDiff] + interval=10s
[pcs,cli,CheckpointDiff] + timeout=20s
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Stonith Devices:
[pcs,cli,CheckpointDiff]  Fencing Levels:
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Location Constraints:
[pcs,cli,CheckpointDiff]  Ordering Constraints:
[pcs,cli,CheckpointDiff]  Colocation Constraints:
[pcs,cli,CheckpointDiff]  Ticket Constraints:
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Alerts:
[pcs,cli,CheckpointDiff]  No alerts defined
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Resources Defaults:
[pcs,cli,CheckpointDiff]  No defaults set
[pcs,cli,CheckpointDiff]  Operations Defaults:
[pcs,cli,CheckpointDiff]  No defaults set
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff] - Cluster Properties:
[pcs,cli,CheckpointDiff] + Cluster Properties: cib-bootstrap-options
[pcs,cli,CheckpointDiff] + cluster-infrastructure=corosync
[pcs,cli,CheckpointDiff] + cluster-name=STSRHTS11062
[pcs,cli,CheckpointDiff] + dc-version=2.1.6-1.el8-6fdc9deea29
[pcs,cli,CheckpointDiff] + have-watchdog=false
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff]  Tags:
[pcs,cli,CheckpointDiff] - No tags defined
[pcs,cli,CheckpointDiff] + tag-d-0
[pcs,cli,CheckpointDiff] + d-0
[pcs,cli,CheckpointDiff] + tag-d-1
[pcs,cli,CheckpointDiff] + d-1
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:49 INFO:	PCS_TAG_DELETE
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:49 INFO:	running: pcs tag delete tag-d-0
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:50 INFO:	Tag deleted.
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:50 INFO:	PCS_RESOURCE_DELETE
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:50 INFO:	running: pcs resource delete d-0 
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:51 INFO:	Resource deleted.
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:51 INFO:	PCS_TAG_DELETE
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:51 INFO:	running: pcs tag delete tag-d-1
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:52 INFO:	Tag deleted.
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:52 INFO:	PCS_RESOURCE_DELETE
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:52 INFO:	running: pcs resource delete d-1 
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:53 INFO:	Resource deleted.
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:54 INFO:	CHECK_LOGS
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:54 INFO:	CHECK_LOGS_FOR_CIB_REPLACE
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:54 INFO:	PCS_CLUSTER_DESTROY
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:58 INFO:	cluster destroyed.
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:58 INFO:	CHECK_CLUSTER_HEALTH
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:58 INFO:	SETUP_PCSD_PORTS
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:58 INFO:	CLEAR_DAEMON_CONFIGS
[pcs,cli,CheckpointDiff] 2023-06-29 12:00:58 INFO:	RESTART_DAEMON
<pass name="pcs,cli,CheckpointDiff" id="pcs,cli,CheckpointDiff" pid="256418" time="Thu Jun 29 12:01:00 2023 +0200" type="cmd" duration="163" />
------------------- Summary ---------------------
Testcase                                 Result    
--------                                 ------    
pcs,cli,CheckpointDiff                             PASS      
=================================================
Total Tests Run: 1
Total PASS:      1
Total FAIL:      0
Total TIMEOUT:   0
Total KILLED:    0
Total STOPPED:   0
Test output in /tmp/vedder.CHERRY.STSRHTS11062.202306291158
Killing XMLRPC server...
DEBUG:STSXMLRPC_DAEMON:Debugging enabled
DEBUG:STSXMLRPC:Killing server with PID 256416 (SIGTERM)
INFO:STSXMLRPC:Server terminated.

> Marking as VERIFIED for pcs-0.10.16-1.el8.x86_64

Comment 22 errata-xmlrpc 2023-11-14 15:22:35 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (pcs bug fix and enhancement update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2023:6903

