Bug 2175881 - Command 'pcs config checkpoint diff' does not show configuration differences between checkpoints [NEEDINFO]
Keywords:
Status: VERIFIED
Alias: None
Product: Red Hat Enterprise Linux 9
Classification: Red Hat
Component: pcs
Version: 9.0
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: rc
Target Release: 9.3
Assignee: Miroslav Lisik
QA Contact: cluster-qe
Docs Contact: Steven J. Levine
URL:
Whiteboard:
Depends On:
Blocks: 2180697 2180698 2180699
 
Reported: 2023-03-06 17:07 UTC by Miroslav Lisik
Modified: 2023-08-16 13:28 UTC
CC List: 10 users

Fixed In Version: pcs-0.11.5-1.el9
Doc Type: Bug Fix
Doc Text:
.The `pcs config checkpoint diff` command now works correctly again

When a new mechanism for loading CIB files was implemented, the `pcs config checkpoint diff` command stopped showing differences for some configuration sections. This happened because the overhauled code for displaying configuration sections used the new loading mechanism, which cached the loaded content: the second file in the comparison was never loaded, and the cached content of the first file was used instead. As a result, the `diff` command yielded no output for those sections. With this fix, the CIB file content is no longer cached and the `pcs config checkpoint diff` command shows differences for all configuration sections.
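
For illustration, a minimal shell sketch of the caching pattern described above. It is hypothetical, not the actual pcs code (pcs is written in Python), and the file names are made up:

# Hypothetical sketch of the bug pattern only, not pcs source code.
load_cib() {
    # BUG: the cache ignores "$1", so whatever file is read first
    # is returned for every later call.
    if [ -z "${_cib_cache+set}" ]; then
        _cib_cache=$(cat "$1")
    fi
    printf '%s\n' "$_cib_cache"
}

load_cib checkpoint-1.xml > first.out
load_cib checkpoint-2.xml > second.out   # silently returns checkpoint-1 content again
diff first.out second.out                # empty output, as in the bug
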
Clone Of:
Clones: 2176490 2180697 2180698 2180699
Environment:
Last Closed:
Type: Bug
Target Upstream Version:
Embargoed:
slevine: needinfo? (mlisik)




Links
System ID Private Priority Status Summary Last Updated
Red Hat Bugzilla 1655055 1 None None None 2023-03-07 07:56:13 UTC
Red Hat Issue Tracker CLUSTERQE-6466 0 None None None 2023-03-20 13:06:23 UTC
Red Hat Issue Tracker RHELPLAN-150852 0 None None None 2023-03-06 17:09:21 UTC

Description Miroslav Lisik 2023-03-06 17:07:47 UTC
Description of problem:
Command 'pcs config checkpoint diff' does not show configuration differences between checkpoints. Only differences for the 'Cluster Properties' section are displayed.

Displaying differences for the following configuration sections does not work:
* Resources
* Stonith Devices
* Fencing Levels
* Location Constraints
* Ordering Constraints
* Colocation Constraints
* Ticket Constraints
* Alerts
* Resources Defaults
* Operations Defaults
* Tags

Version-Release number of selected component (if applicable):
pcs-0.11.4-6.el9

How reproducible:
always

Steps to Reproduce:
1. Create an empty cluster
2. Create resources, stonith devices, fencing levels, constraints of all types, alerts, resource and operation defaults, and tags.
3. Run command `pcs config checkpoint diff 1 live`

Actual results:
Only the 'Cluster Properties' section shows differences, marked by a '+' sign.

Expected results:
All sections should show their configuration differences.

Additional info:
Differences should be displayed for added, removed, or modified configuration.
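
For context, pcs saves a numbered configuration checkpoint on each configuration change; running `pcs config checkpoint` with no arguments lists the available checkpoints, which is a quick way to confirm that checkpoint 1 exists before diffing:

pcs config checkpoint
pcs config checkpoint diff 1 live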

Comment 6 Miroslav Lisik 2023-03-20 14:56:17 UTC
Upstream commit: https://github.com/ClusterLabs/pcs/commit/76b2f524f64e66d99b8a9fef68c70532361f8a65
Updated commands:
  * pcs config checkpoint diff

Test:

Prepare configuration for `pcs config`:

# Nodes and authentication
export NODELIST=(r9-node-01 r9-node-02)
pcs host auth -u hacluster -p $PASSWORD ${NODELIST[*]}
pcs cluster setup HACluster ${NODELIST[*]} --start --wait

# Stonith devices and fencing levels: two fence_xvm devices per node
for node in ${NODELIST[*]}; do
    pcs stonith create fence-1-$node fence_xvm;
done
for node in ${NODELIST[*]}; do
    pcs stonith create fence-2-$node fence_xvm;
done
for node in ${NODELIST[*]}; do
    pcs stonith level add 1 $node fence-1-$node;
    pcs stonith level add 2 $node fence-2-$node;
done

# Resources and location constraints
pcs resource create p-1 ocf:pacemaker:Dummy --no-default-ops
pcs resource create p-2 ocf:pacemaker:Dummy --no-default-ops
pcs constraint location p-1 prefers ${NODELIST[0]}
pcs constraint location p-2 avoids ${NODELIST[0]}
pcs resource create s-1 ocf:pacemaker:Stateful promotable --no-default-ops
pcs constraint location s-1-clone rule role=master "#uname" eq ${NODELIST[0]}

# Ordering and colocation constraints, plain and set forms
pcs resource create oc-1 ocf:pacemaker:Dummy --no-default-ops
pcs resource create oc-2 ocf:pacemaker:Dummy --no-default-ops
pcs constraint order oc-1 then oc-2
pcs constraint colocation add oc-2 with oc-1
pcs resource create oc-set-1 ocf:pacemaker:Dummy --no-default-ops
pcs resource create oc-set-2 ocf:pacemaker:Dummy --no-default-ops
pcs constraint order set oc-set-1 oc-set-2
pcs constraint colocation set oc-set-2 oc-set-1

# Ticket constraints, plain and set forms
pcs resource create t ocf:pacemaker:Dummy --no-default-ops
pcs constraint ticket add Ticket t
pcs constraint ticket set p-1 p-2 setoptions ticket=Ticket-set

# Alerts, defaults, properties, and tags
pcs alert create path=/usr/bin/true id=Alert
pcs alert recipient add Alert value=recipient-value
pcs resource defaults resource-stickiness=2
pcs resource op defaults timeout=90
pcs property set maintenance-mode=false
pcs tag create TAG p-1 p-2
pcs resource defaults set create id=set-1 meta target-role=Started
pcs resource op defaults set create id=op-set-1 score=10 meta interval=30s


Test the `pcs config checkpoint diff` command:

pcs config checkpoint diff 1 live
pcs config checkpoint diff live 1
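
A rough two-way sanity check (a sketch; it assumes the output format shown in the test results below, where added lines start with '+' and removed lines with '-'):

pcs config checkpoint diff 1 live | grep -c '^+'   # should be non-zero
pcs config checkpoint diff live 1 | grep -c '^-'   # should report the same count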

Comment 11 Michal Pospisil 2023-05-26 09:28:33 UTC
DevTestResults:

[root@r09-03-a ~]# rpm -q pcs
pcs-0.11.5-1.el9.x86_64

[root@r09-03-a ~]# pcs config checkpoint diff 1 live
Differences between checkpoint 1 (-) and live configuration (+):
+ Resources:
+   Resource: r1 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       migrate_from: r1-migrate_from-interval-0s
+         interval=0s timeout=20s
+       migrate_to: r1-migrate_to-interval-0s
+         interval=0s timeout=20s
+       monitor: r1-monitor-interval-10s
+         interval=10s timeout=20s
+       reload: r1-reload-interval-0s
+         interval=0s timeout=20s
+       reload-agent: r1-reload-agent-interval-0s
+         interval=0s timeout=20s
+       start: r1-start-interval-0s
+         interval=0s timeout=20s
+       stop: r1-stop-interval-0s
+         interval=0s timeout=20s
+   Resource: r2 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       migrate_from: r2-migrate_from-interval-0s
+         interval=0s timeout=20s
+       migrate_to: r2-migrate_to-interval-0s
+         interval=0s timeout=20s
+       monitor: r2-monitor-interval-10s
+         interval=10s timeout=20s
+       reload: r2-reload-interval-0s
+         interval=0s timeout=20s
+       reload-agent: r2-reload-agent-interval-0s
+         interval=0s timeout=20s
+       start: r2-start-interval-0s
+         interval=0s timeout=20s
+       stop: r2-stop-interval-0s
+         interval=0s timeout=20s
+   Resource: p-1 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: p-1-monitor-interval-10s
+         interval=10s timeout=20s
+   Resource: p-2 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: p-2-monitor-interval-10s
+         interval=10s timeout=20s
+   Resource: oc-1 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: oc-1-monitor-interval-10s
+         interval=10s timeout=20s
+   Resource: oc-2 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: oc-2-monitor-interval-10s
+         interval=10s timeout=20s
+   Resource: oc-set-1 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: oc-set-1-monitor-interval-10s
+         interval=10s timeout=20s
+   Resource: oc-set-2 (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: oc-set-2-monitor-interval-10s
+         interval=10s timeout=20s
+   Resource: t (class=ocf provider=pacemaker type=Dummy)
+     Operations:
+       monitor: t-monitor-interval-10s
+         interval=10s timeout=20s
+   Clone: s-1-clone
+     Meta Attributes: s-1-clone-meta_attributes
+       promotable=true
+     Resource: s-1 (class=ocf provider=pacemaker type=Stateful)
+       Operations:
+         monitor: s-1-monitor-interval-10s
+           interval=10s timeout=20s role=Promoted
+         monitor: s-1-monitor-interval-11s
+           interval=11s timeout=20s role=Unpromoted
+
+ Stonith Devices:
+   Resource: xvm (class=stonith type=fence_xvm)
+     Operations:
+       monitor: xvm-monitor-interval-60s
+         interval=60s
+   Resource: fence-1-r09-03-a.vm (class=stonith type=fence_xvm)
+     Operations:
+       monitor: fence-1-r09-03-a.vm-monitor-interval-60s
+         interval=60s
+   Resource: fence-1-r09-03-b.vm (class=stonith type=fence_xvm)
+     Operations:
+       monitor: fence-1-r09-03-b.vm-monitor-interval-60s
+         interval=60s
+   Resource: fence-2-r09-03-a.vm (class=stonith type=fence_xvm)
+     Operations:
+       monitor: fence-2-r09-03-a.vm-monitor-interval-60s
+         interval=60s
+   Resource: fence-2-r09-03-b.vm (class=stonith type=fence_xvm)
+     Operations:
+       monitor: fence-2-r09-03-b.vm-monitor-interval-60s
+         interval=60s
+
+ Fencing Levels:
+   Target: r09-03-a.vm
+     Level 1 - fence-1-r09-03-a.vm
+     Level 2 - fence-2-r09-03-a.vm
+   Target: r09-03-b.vm
+     Level 1 - fence-1-r09-03-b.vm
+     Level 2 - fence-2-r09-03-b.vm
+
+ Location Constraints:
+   resource 'r2' (id: location-r2)
+     Rules:
+       Rule: score=INFINITY (id: location-r2-rule)
+         Expression: #uname eq r09-03-a.vm (id: location-r2-rule-expr)
+   resource 'r1' prefers node 'r09-03-b.vm' with score INFINITY (id: location-r1-r09-03-b.vm-INFINITY)
+   resource 'p-1' prefers node 'r09-03-a.vm' with score INFINITY (id: location-p-1-r09-03-a.vm-INFINITY)
+   resource 'p-2' avoids node 'r09-03-a.vm' with score INFINITY (id: location-p-2-r09-03-a.vm--INFINITY)
+   resource 's-1-clone' (id: location-s-1-clone)
+     Rules:
+       Rule: role=Promoted score=INFINITY (id: location-s-1-clone-rule)
+         Expression: #uname eq r09-03-a.vm (id: location-s-1-clone-rule-expr)
+ Colocation Constraints:
+   resource 'oc-2' with resource 'oc-1' (id: colocation-oc-2-oc-1-INFINITY)
+     score=INFINITY
+ Colocation Set Constraints:
+   Set Constraint: colocation_set_o2o1
+     score=INFINITY
+     Resource Set: colocation_set_o2o1_set
+       Resources: 'oc-set-1', 'oc-set-2'
+ Order Constraints:
+   start resource 'oc-1' then start resource 'oc-2' (id: order-oc-1-oc-2-mandatory)
+ Order Set Constraints:
+   Set Constraint: order_set_o1o2
+     Resource Set: order_set_o1o2_set
+       Resources: 'oc-set-1', 'oc-set-2'
+ Ticket Constraints:
+   resource 't' depends on ticket 'Ticket' (id: ticket-Ticket-t)
+ Ticket Set Constraints:
+   Set Constraint: ticket_set_p1p2
+     ticket=Ticket-set
+     Resource Set: ticket_set_p1p2_set
+       Resources: 'p-1', 'p-2'
+
+ Alerts:
+   Alert: Alert (path=/usr/bin/true)
+     Recipients:
+       Recipient: Alert-recipient (value=recipient-value)
+
  Resources Defaults:
    Meta Attrs: build-resource-defaults
-     resource-stickiness=1 (id: build-resource-stickiness)
?                         ^
+     resource-stickiness=2 (id: build-resource-stickiness)
?                         ^
+   Meta Attrs: set-1
+     target-role=Started (id: set-1-target-role)
+
+ Operations Defaults:
+   Meta Attrs: op_defaults-meta_attributes
+     timeout=90 (id: op_defaults-meta_attributes-timeout)
+   Meta Attrs: op-set-1 score=10
+     interval=30s (id: op-set-1-interval)
+
+ Cluster Properties: cib-bootstrap-options
+   cluster-infrastructure=corosync
+   cluster-name=rh93
+   dc-version=2.1.5-7.el9-a3f44794f94
+   have-watchdog=false
+   maintenance-mode=false
+   migration-limit=10
+   placement-strategy=minimal
+
+ Tags:
+   TAG
+     p-1
+     p-2

Comment 17 svalasti 2023-06-29 11:30:50 UTC
[root@virt-018 ~]# rpm -q pcs
pcs-0.11.5-2.el9.x86_64


Starting tests, logging all output to /tmp/vedder.CHERRY.STSRHTS9444.202306291321
Starting XMLRPC server....
DEBUG:STSXMLRPC_DAEMON:Debugging enabled
INFO:STSXMLRPC:Server started. Logging to /tmp/vedder.CHERRY.STSRHTS9444.202306291321/stsxmlrpc.log
XMLRPC server started at http://virt-016.cluster-qe.lab.eng.brq.redhat.com:34631/
<start name="pcs,cli,CheckpointDiff" id="pcs,cli,CheckpointDiff" pid="644956" time="Thu Jun 29 13:21:16 2023 +0200" type="cmd" />
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:16 INFO:	PCS_CLUSTER_DESTROY
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:19 INFO:	cluster destroyed.
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:19 INFO:	CHECK_CLUSTER_HEALTH
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:19 INFO:	SETUP_PCSD_PORTS
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:19 INFO:	CLEAR_DAEMON_CONFIGS
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:19 INFO:	RESTART_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:21 INFO:	
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:21 INFO:	SET_PASSWORD
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:21 INFO:	RESTART_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:23 INFO:	ENABLE_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:23 INFO:	PCS_HOST_AUTH
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:23 INFO:	running auth from virt-019
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:24 INFO:	PCS_STATUS_AUTH
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:24 INFO:	PCS_CLUSTER_SETUP
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:24 INFO:	running: pcs cluster setup STSRHTS9444 virt-018 virt-019 
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:28 INFO:	cluster created
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:28 INFO:	PCS_CLUSTER_START
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:28 INFO:	starting cluster from virt-019 with --all
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:53 INFO:	cluster started
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:53 INFO:	== SUBTEST: StonithCheckpointDiff ==
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:53 INFO:	PCS_STONITH_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:53 INFO:	running: pcs stonith create fence-virt-018 fence_xvm pcmk_host_list=virt-018 on virt-018
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:54 INFO:	Stonith device fence-virt-018 created.
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:54 INFO:	GET_DEVICES_BY_NODE
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:54 INFO:	PCS_STONITH_LEVEL_ADD
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:54 INFO:	running: pcs stonith level add 1 virt-018 fence-virt-018 on virt-019
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:55 INFO:	PCS_STONITH_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:55 INFO:	running: pcs stonith create fence-virt-019 fence_xvm pcmk_host_list=virt-019 on virt-019
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:55 INFO:	Stonith device fence-virt-019 created.
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:55 INFO:	GET_DEVICES_BY_NODE
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:55 INFO:	PCS_STONITH_LEVEL_ADD
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:55 INFO:	running: pcs stonith level add 2 virt-019 fence-virt-019 on virt-019
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:56 INFO:	PCS_CONFIG_CHECKPOINT_DIFF
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:56 INFO:	running: pcs config checkpoint diff 1 live
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:57 INFO:	Differences between checkpoint 1 (-) and live configuration (+):
[pcs,cli,CheckpointDiff] + Stonith Devices:
[pcs,cli,CheckpointDiff] + Resource: fence-virt-018 (class=stonith type=fence_xvm)
[pcs,cli,CheckpointDiff] + Attributes: fence-virt-018-instance_attributes
[pcs,cli,CheckpointDiff] + pcmk_host_list=virt-018
[pcs,cli,CheckpointDiff] + Operations:
[pcs,cli,CheckpointDiff] + monitor: fence-virt-018-monitor-interval-60s
[pcs,cli,CheckpointDiff] + interval=60s
[pcs,cli,CheckpointDiff] + Resource: fence-virt-019 (class=stonith type=fence_xvm)
[pcs,cli,CheckpointDiff] + Attributes: fence-virt-019-instance_attributes
[pcs,cli,CheckpointDiff] + pcmk_host_list=virt-019
[pcs,cli,CheckpointDiff] + Operations:
[pcs,cli,CheckpointDiff] + monitor: fence-virt-019-monitor-interval-60s
[pcs,cli,CheckpointDiff] + interval=60s
[pcs,cli,CheckpointDiff] +
[pcs,cli,CheckpointDiff] + Fencing Levels:
[pcs,cli,CheckpointDiff] + Target: virt-018
[pcs,cli,CheckpointDiff] + Level 1 - fence-virt-018
[pcs,cli,CheckpointDiff] + Target: virt-019
[pcs,cli,CheckpointDiff] + Level 2 - fence-virt-019
[pcs,cli,CheckpointDiff] +
[pcs,cli,CheckpointDiff]  Resources Defaults:
[pcs,cli,CheckpointDiff]  Meta Attrs: build-resource-defaults
[pcs,cli,CheckpointDiff]  resource-stickiness=1 (id: build-resource-stickiness)
[pcs,cli,CheckpointDiff] +
[pcs,cli,CheckpointDiff] + Cluster Properties: cib-bootstrap-options
[pcs,cli,CheckpointDiff] + cluster-infrastructure=corosync
[pcs,cli,CheckpointDiff] + cluster-name=STSRHTS9444
[pcs,cli,CheckpointDiff] + dc-version=2.1.6-2.el9-6fdc9deea29
[pcs,cli,CheckpointDiff] + have-watchdog=false
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:58 INFO:	PCS_STONITH_LEVEL_REMOVE
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:58 INFO:	running: pcs stonith level remove 1 target virt-018 stonith fence-virt-018
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:58 INFO:	PCS_STONITH_LEVEL_REMOVE
[pcs,cli,CheckpointDiff] 2023-06-29 13:21:58 INFO:	running: pcs stonith level remove 2 target virt-019 stonith fence-virt-019
[pcs,cli,CheckpointDiff] 2023-06-29 13:22:00 INFO:	PCS_CLUSTER_DESTROY
[pcs,cli,CheckpointDiff] 2023-06-29 13:22:03 INFO:	cluster destroyed.
[pcs,cli,CheckpointDiff] 2023-06-29 13:22:03 INFO:	CHECK_CLUSTER_HEALTH
[pcs,cli,CheckpointDiff] 2023-06-29 13:22:03 INFO:	SETUP_PCSD_PORTS
[pcs,cli,CheckpointDiff] 2023-06-29 13:22:03 INFO:	CLEAR_DAEMON_CONFIGS
[pcs,cli,CheckpointDiff] 2023-06-29 13:22:03 INFO:	RESTART_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 13:23:36 INFO:	
[pcs,cli,CheckpointDiff] 2023-06-29 13:23:36 INFO:	SET_PASSWORD
[pcs,cli,CheckpointDiff] 2023-06-29 13:23:37 INFO:	RESTART_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 13:23:38 INFO:	ENABLE_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 13:23:38 INFO:	PCS_HOST_AUTH
[pcs,cli,CheckpointDiff] 2023-06-29 13:23:38 INFO:	running auth from virt-019
[pcs,cli,CheckpointDiff] 2023-06-29 13:23:39 INFO:	PCS_STATUS_AUTH
[pcs,cli,CheckpointDiff] 2023-06-29 13:23:40 INFO:	PCS_CLUSTER_SETUP
[pcs,cli,CheckpointDiff] 2023-06-29 13:23:40 INFO:	running: pcs cluster setup STSRHTS9444 virt-018 virt-019 
[pcs,cli,CheckpointDiff] 2023-06-29 13:23:43 INFO:	cluster created
[pcs,cli,CheckpointDiff] 2023-06-29 13:23:43 INFO:	PCS_CLUSTER_START
[pcs,cli,CheckpointDiff] 2023-06-29 13:23:43 INFO:	starting cluster from virt-019 with --all
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:09 INFO:	cluster started
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:09 INFO:	== SUBTEST: ResourceConstraintLocationCheckpointDiff ==
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:09 INFO:	PCS_RESOURCE_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:09 INFO:	running: pcs resource create d-0 ocf:heartbeat:Dummy --no-default-ops 
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:09 INFO:	Resource created.
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:09 INFO:	PCS_LOCATION_PREFERS
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:09 INFO:	running: pcs constraint location d-0 prefers virt-018 
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:10 INFO:	Location constraint created.
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:10 INFO:	PCS_RESOURCE_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:10 INFO:	running: pcs resource create d-1 ocf:heartbeat:Dummy --no-default-ops 
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:11 INFO:	Resource created.
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:11 INFO:	PCS_LOCATION_PREFERS
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:11 INFO:	running: pcs constraint location d-1 prefers virt-019 
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:11 INFO:	Location constraint created.
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:11 INFO:	PCS_CONFIG_CHECKPOINT_DIFF
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:11 INFO:	running: pcs config checkpoint diff 1 live
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:12 INFO:	Differences between checkpoint 1 (-) and live configuration (+):
[pcs,cli,CheckpointDiff] + Resources:
[pcs,cli,CheckpointDiff] + Resource: d-0 (class=ocf provider=heartbeat type=Dummy)
[pcs,cli,CheckpointDiff] + Operations:
[pcs,cli,CheckpointDiff] + monitor: d-0-monitor-interval-10s
[pcs,cli,CheckpointDiff] + interval=10s timeout=20s
[pcs,cli,CheckpointDiff] + Resource: d-1 (class=ocf provider=heartbeat type=Dummy)
[pcs,cli,CheckpointDiff] + Operations:
[pcs,cli,CheckpointDiff] + monitor: d-1-monitor-interval-10s
[pcs,cli,CheckpointDiff] + interval=10s timeout=20s
[pcs,cli,CheckpointDiff] +
[pcs,cli,CheckpointDiff] + Location Constraints:
[pcs,cli,CheckpointDiff] + resource 'd-0' prefers node 'virt-018' with score INFINITY (id: location-d-0-virt-018-INFINITY)
[pcs,cli,CheckpointDiff] + resource 'd-1' prefers node 'virt-019' with score INFINITY (id: location-d-1-virt-019-INFINITY)
[pcs,cli,CheckpointDiff] +
[pcs,cli,CheckpointDiff]  Resources Defaults:
[pcs,cli,CheckpointDiff]  Meta Attrs: build-resource-defaults
[pcs,cli,CheckpointDiff]  resource-stickiness=1 (id: build-resource-stickiness)
[pcs,cli,CheckpointDiff] +
[pcs,cli,CheckpointDiff] + Cluster Properties: cib-bootstrap-options
[pcs,cli,CheckpointDiff] + cluster-infrastructure=corosync
[pcs,cli,CheckpointDiff] + cluster-name=STSRHTS9444
[pcs,cli,CheckpointDiff] + dc-version=2.1.6-2.el9-6fdc9deea29
[pcs,cli,CheckpointDiff] + have-watchdog=false
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:13 INFO:	PCS_RESOURCE_DELETE
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:13 INFO:	running: pcs resource delete d-0 
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:14 INFO:	Resource deleted.
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:14 INFO:	PCS_RESOURCE_DELETE
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:14 INFO:	running: pcs resource delete d-1 
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:14 INFO:	Resource deleted.
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:15 INFO:	PCS_CLUSTER_DESTROY
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:18 INFO:	cluster destroyed.
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:18 INFO:	CHECK_CLUSTER_HEALTH
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:18 INFO:	SETUP_PCSD_PORTS
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:18 INFO:	CLEAR_DAEMON_CONFIGS
[pcs,cli,CheckpointDiff] 2023-06-29 13:24:18 INFO:	RESTART_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 13:25:51 INFO:	
[pcs,cli,CheckpointDiff] 2023-06-29 13:25:51 INFO:	SET_PASSWORD
[pcs,cli,CheckpointDiff] 2023-06-29 13:25:51 INFO:	RESTART_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 13:25:52 INFO:	ENABLE_DAEMON
[pcs,cli,CheckpointDiff] 2023-06-29 13:25:52 INFO:	PCS_HOST_AUTH
[pcs,cli,CheckpointDiff] 2023-06-29 13:25:52 INFO:	running auth from virt-019
[pcs,cli,CheckpointDiff] 2023-06-29 13:25:53 INFO:	PCS_STATUS_AUTH
[pcs,cli,CheckpointDiff] 2023-06-29 13:25:54 INFO:	PCS_CLUSTER_SETUP
[pcs,cli,CheckpointDiff] 2023-06-29 13:25:54 INFO:	running: pcs cluster setup STSRHTS9444 virt-018 virt-019 
[pcs,cli,CheckpointDiff] 2023-06-29 13:25:58 INFO:	cluster created
[pcs,cli,CheckpointDiff] 2023-06-29 13:25:58 INFO:	PCS_CLUSTER_START
[pcs,cli,CheckpointDiff] 2023-06-29 13:25:58 INFO:	starting cluster from virt-019 with --all
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:23 INFO:	cluster started
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:23 INFO:	== SUBTEST: TagCheckpointDiff ==
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:23 INFO:	PCS_RESOURCE_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:23 INFO:	running: pcs resource create d-0 ocf:heartbeat:Dummy --no-default-ops 
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:24 INFO:	Resource created.
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:24 INFO:	PCS_TAG_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:24 INFO:	running: pcs tag create tag-d-0 d-0
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:24 INFO:	Tag created.
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:24 INFO:	PCS_RESOURCE_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:24 INFO:	running: pcs resource create d-1 ocf:heartbeat:Dummy --no-default-ops 
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:25 INFO:	Resource created.
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:25 INFO:	PCS_TAG_CREATE
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:25 INFO:	running: pcs tag create tag-d-1 d-1
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:25 INFO:	Tag created.
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:25 INFO:	PCS_CONFIG_CHECKPOINT_DIFF
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:25 INFO:	running: pcs config checkpoint diff 1 live
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:26 INFO:	Differences between checkpoint 1 (-) and live configuration (+):
[pcs,cli,CheckpointDiff] + Resources:
[pcs,cli,CheckpointDiff] + Resource: d-0 (class=ocf provider=heartbeat type=Dummy)
[pcs,cli,CheckpointDiff] + Operations:
[pcs,cli,CheckpointDiff] + monitor: d-0-monitor-interval-10s
[pcs,cli,CheckpointDiff] + interval=10s timeout=20s
[pcs,cli,CheckpointDiff] + Resource: d-1 (class=ocf provider=heartbeat type=Dummy)
[pcs,cli,CheckpointDiff] + Operations:
[pcs,cli,CheckpointDiff] + monitor: d-1-monitor-interval-10s
[pcs,cli,CheckpointDiff] + interval=10s timeout=20s
[pcs,cli,CheckpointDiff] +
[pcs,cli,CheckpointDiff]  Resources Defaults:
[pcs,cli,CheckpointDiff]  Meta Attrs: build-resource-defaults
[pcs,cli,CheckpointDiff]  resource-stickiness=1 (id: build-resource-stickiness)
[pcs,cli,CheckpointDiff] +
[pcs,cli,CheckpointDiff] + Cluster Properties: cib-bootstrap-options
[pcs,cli,CheckpointDiff] + cluster-infrastructure=corosync
[pcs,cli,CheckpointDiff] + cluster-name=STSRHTS9444
[pcs,cli,CheckpointDiff] + dc-version=2.1.6-2.el9-6fdc9deea29
[pcs,cli,CheckpointDiff] + have-watchdog=false
[pcs,cli,CheckpointDiff] +
[pcs,cli,CheckpointDiff] + Tags:
[pcs,cli,CheckpointDiff] + tag-d-0
[pcs,cli,CheckpointDiff] + d-0
[pcs,cli,CheckpointDiff] + tag-d-1
[pcs,cli,CheckpointDiff] + d-1
[pcs,cli,CheckpointDiff] 
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:27 INFO:	PCS_TAG_DELETE
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:27 INFO:	running: pcs tag delete tag-d-0
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:28 INFO:	Tag deleted.
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:28 INFO:	PCS_RESOURCE_DELETE
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:28 INFO:	running: pcs resource delete d-0 
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:28 INFO:	Resource deleted.
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:28 INFO:	PCS_TAG_DELETE
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:28 INFO:	running: pcs tag delete tag-d-1
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:29 INFO:	Tag deleted.
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:29 INFO:	PCS_RESOURCE_DELETE
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:29 INFO:	running: pcs resource delete d-1 
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:30 INFO:	Resource deleted.
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:31 INFO:	CHECK_LOGS
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:31 INFO:	CHECK_LOGS_FOR_CIB_REPLACE
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:31 INFO:	PCS_CLUSTER_DESTROY
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:34 INFO:	cluster destroyed.
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:34 INFO:	CHECK_CLUSTER_HEALTH
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:34 INFO:	SETUP_PCSD_PORTS
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:34 INFO:	CLEAR_DAEMON_CONFIGS
[pcs,cli,CheckpointDiff] 2023-06-29 13:26:34 INFO:	RESTART_DAEMON
<pass name="pcs,cli,CheckpointDiff" id="pcs,cli,CheckpointDiff" pid="644956" time="Thu Jun 29 13:26:37 2023 +0200" type="cmd" duration="321" />
------------------- Summary ---------------------
Testcase                                 Result    
--------                                 ------    
pcs,cli,CheckpointDiff                     PASS      
=================================================
Total Tests Run: 1
Total PASS:      1
Total FAIL:      0
Total TIMEOUT:   0
Total KILLED:    0
Total STOPPED:   0


> Marking as VERIFIED for pcs-0.11.5-2.el9.x86_64

