Note: This bug is displayed in read-only format because the product is no longer active in Red Hat Bugzilla.

Bug 1872477

Summary: virt-who fails to parse output from hypervisor. [rhel-7.9.z]
Product: Red Hat Enterprise Linux 7
Reporter: Rudnei Bertol Jr. <rbertolj>
Component: virt-who
Assignee: candlepin-bugs
Status: CLOSED ERRATA
QA Contact: Eko <hsun>
Severity: high
Priority: high
Docs Contact:
Version: 7.8
CC: csnyder, hsun, jreznik, kuhuang, phess, redakkan, wpoteat
Target Milestone: rc
Keywords: Reopened, Triaged, ZStream
Target Release: ---
Hardware: x86_64
OS: Linux
Whiteboard:
Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:
Clones: 1876927 (view as bug list)
Environment:
Last Closed: 2020-12-15 11:19:06 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:
Bug Depends On:
Bug Blocks: 1876927

Description Rudnei Bertol Jr. 2020-08-25 21:02:37 UTC
Description of problem:

virt-who fails to parse the output from the hypervisor when the hypervisor's API URL-encodes special characters, e.g. '/' to '%2f'.

Version-Release number of selected component (if applicable):

]# rpm -qa |grep virt-who
virt-who-0.26.5-1.el7.noarch


How reproducible:

A complete reproducer will be provided in a follow-up update.

Steps to Reproduce:
1.
2.
3.

Actual results:

The virt-who debug command fails to parse the JSON.

~~~
2020-08-25 17:00:50,699 [virtwho.destination_8596163159926476453 DEBUG] MainProcess(16987):Thread-3 @subscriptionmanager.py:_is_rhsm_server_async:290 - Server has capability 'hypervisors_async'
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/virtwho/log.py", line 95, in emit
    self._queue.put_nowait(self.prepare(record))
  File "/usr/lib/python2.7/site-packages/virtwho/log.py", line 87, in prepare
    record.msg = record.msg % record.args
TypeError: not enough arguments for format string
Logged from file subscriptionmanager.py, line 207
~~~
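The traceback can be reproduced in isolation. '%2f' happens to be a valid printf-style conversion specifier ('%', width 2, 'f' for float), so when the URL-escaped cluster name ends up inside a log message that is later %-formatted against empty arguments (as `record.msg % record.args` does in log.py's prepare()), Python raises exactly this TypeError. A minimal sketch, independent of virt-who:

```python
# '%2f' in the cluster name is parsed as a printf-style float conversion,
# so %-formatting the message with no arguments fails with the same
# "not enough arguments for format string" seen in the traceback.
msg = "Sending update for cluster Test_1%2f2"

try:
    msg % ()  # roughly what record.msg % record.args does in log.py
except TypeError as exc:
    print(exc)  # not enough arguments for format string
```

Passing the name as a lazy logging argument (`logger.debug("cluster %s", name)`) instead of pre-formatting it into the message string avoids this class of crash, since the logger never re-applies %-formatting to the substituted value.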


Expected results:

The virt-who debug command should print the debug output.

Additional info:

Comment 5 William Poteat 2020-08-26 20:58:45 UTC
The attached JSON does not contain characters that cause the error on my machine. Can you check the file and confirm?

Thanks

Comment 6 William Poteat 2020-08-26 21:06:34 UTC
I edited the file to match the error described above. I do not need a new file.

Any idea where the slash '/' is getting converted to URL escaping?

Comment 8 William Poteat 2020-08-27 12:47:26 UTC
Where is this conversion of '/' to %2f happening? Is it given to us from vCenter as '/' or %2f? 
We try not to use virt-who to translate data if at all possible.

Comment 9 Rudnei Bertol Jr. 2020-08-27 16:41:17 UTC
Hey William,

I am not able to see the raw file collected by virt-who. However, in our internal VMware environment I created a cluster called 'Test_1/2' and used the 'vmware_cluster_facts' module to collect the clusters from vCenter, and we can see that the encoding comes from the VMware API.


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

PLAY [localhost] ************************************************************************************************************************************************************

TASK [Gathering Facts] ******************************************************************************************************************************************************
ok: [localhost]

TASK [Gather cluster info from given datacenter] ****************************************************************************************************************************
ok: [localhost]

TASK [debug] ****************************************************************************************************************************************************************
ok: [localhost] => {
    "cluster_info": {
        "changed": false, 
        "clusters": {
            "Test_1%2f2": {                                       <=================== Cluster named as 'Test_1/2', but it is being collected as 'Test_1%2f2'
                "drs_default_vm_behavior": "fullyAutomated", 
                "drs_enable_vm_behavior_overrides": true, 
                "drs_vmotion_rate": 3, 
                "enable_ha": false, 
                "enabled_drs": false, 
                "enabled_vsan": false, 
                "ha_admission_control_enabled": true, 
                "ha_failover_level": 1, 
                "ha_host_monitoring": "enabled", 
                "ha_restart_priority": [
                    "medium"
                ], 
                "ha_vm_failure_interval": [
                    30
                ], 
                "ha_vm_max_failure_window": [
                    -1
                ], 
                "ha_vm_max_failures": [
                    3
                ], 
                "ha_vm_min_up_time": [
                    120
                ], 
                "ha_vm_monitoring": "vmMonitoringDisabled", 
                "ha_vm_tools_monitoring": [
                    "vmMonitoringDisabled"
                ], 
                "vsan_auto_claim_storage": false
            }, 
            "vMotion-Cluster": {
                "drs_default_vm_behavior": "fullyAutomated", 
                "drs_enable_vm_behavior_overrides": true, 
                "drs_vmotion_rate": 3, 
                "enable_ha": false, 
                "enabled_drs": true, 
                "enabled_vsan": false, 
                "ha_admission_control_enabled": true, 
                "ha_failover_level": 1, 
                "ha_host_monitoring": "enabled", 
                "ha_restart_priority": [
                    "medium"
                ], 
                "ha_vm_failure_interval": [
                    30
                ], 
                "ha_vm_max_failure_window": [
                    -1
                ], 
                "ha_vm_max_failures": [
                    3
                ], 
                "ha_vm_min_up_time": [
                    120
                ], 
                "ha_vm_monitoring": "vmMonitoringDisabled", 
                "ha_vm_tools_monitoring": [
                    "vmMonitoringDisabled"
                ], 
                "vsan_auto_claim_storage": false
            }
        }, 
        "failed": false
    }
}

PLAY RECAP ******************************************************************************************************************************************************************
localhost                  : ok=3    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0   

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

I applied the same idea used in 'log.py' to fix the output:

~~~
]# diff /usr/lib/python2.7/site-packages/ansible/modules/cloud/vmware/vmware_cluster_facts.py /root/vmware_cluster_facts.py 
99a100,103
> try:
>     from urllib import unquote as urldecode
> except:
>     from urllib.parse import unquote as urldecode
184c188
<             results['clusters'][cluster.name] = dict(
---
>             results['clusters'][urldecode(cluster.name)] = dict(
~~~
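The patch boils down to running the cluster name through urllib's unquote before it is used as a dictionary key. A standalone sketch of the same Python 2/3 compatible decode (variable names here are illustrative, not from the module):

```python
# Decode URL-escaped names coming back from the vCenter API,
# working on both Python 2 and Python 3.
try:
    from urllib import unquote as urldecode        # Python 2
except ImportError:
    from urllib.parse import unquote as urldecode  # Python 3

raw_name = "Test_1%2f2"
print(urldecode(raw_name))  # Test_1/2
```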

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
]# ansible-playbook playbook.yml -e vcenter_password=$SENHA
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'


PLAY [localhost] *****************************************************************************************************************************************************************************

TASK [Gathering Facts] ***********************************************************************************************************************************************************************
ok: [localhost]

TASK [Gather cluster info from given datacenter] *********************************************************************************************************************************************
ok: [localhost]

TASK [debug] *********************************************************************************************************************************************************************************
ok: [localhost] => {
    "cluster_info": {                                       <=================== Cluster named as expected.
        "changed": false, 
        "clusters": {
            "Test_1/2": {
                "drs_default_vm_behavior": "fullyAutomated", 
                "drs_enable_vm_behavior_overrides": true, 
                "drs_vmotion_rate": 3, 
                "enable_ha": false, 
                "enabled_drs": false, 
                "enabled_vsan": false, 
                "ha_admission_control_enabled": true, 
                "ha_failover_level": 1, 
                "ha_host_monitoring": "enabled", 
                "ha_restart_priority": [
                    "medium"
                ], 
                "ha_vm_failure_interval": [
                    30
                ], 
                "ha_vm_max_failure_window": [
                    -1
                ], 
                "ha_vm_max_failures": [
                    3
                ], 
                "ha_vm_min_up_time": [
                    120
                ], 
                "ha_vm_monitoring": "vmMonitoringDisabled", 
                "ha_vm_tools_monitoring": [
                    "vmMonitoringDisabled"
                ], 
                "vsan_auto_claim_storage": false
            }, 
            "vMotion-Cluster": {
                "drs_default_vm_behavior": "fullyAutomated", 
                "drs_enable_vm_behavior_overrides": true, 
                "drs_vmotion_rate": 3, 
                "enable_ha": false, 
                "enabled_drs": true, 
                "enabled_vsan": false, 
                "ha_admission_control_enabled": true, 
                "ha_failover_level": 1, 
                "ha_host_monitoring": "enabled", 
                "ha_restart_priority": [
                    "medium"
                ], 
                "ha_vm_failure_interval": [
                    30
                ], 
                "ha_vm_max_failure_window": [
                    -1
                ], 
                "ha_vm_max_failures": [
                    3
                ], 
                "ha_vm_min_up_time": [
                    120
                ], 
                "ha_vm_monitoring": "vmMonitoringDisabled", 
                "ha_vm_tools_monitoring": [
                    "vmMonitoringDisabled"
                ], 
                "vsan_auto_claim_storage": false
            }
        }, 
        "failed": false
    }
}

PLAY RECAP ***********************************************************************************************************************************************************************************
localhost                  : ok=3    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0  
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

regards
rbertol

Comment 10 Rudnei Bertol Jr. 2020-08-27 17:14:18 UTC
In addition, doing some tests, I realized that the patch only fixes the debug output; the facts are still being created on the Satellite as 'Test_1%2f2' instead of 'Test_1/2'.

I am looking into it, and it looks like the root issue is in the file 'virt/esx/esx.py', which collects the information from VMware.

regards
rbertol

Comment 11 Rudnei Bertol Jr. 2020-08-28 14:20:41 UTC
Hey guys,

Just to let you know, this issue raised https://github.com/ansible-collections/vmware/issues/365 on the Ansible VMware plugin.

regards
rbertol

Comment 19 Chris Williams 2020-11-11 21:55:18 UTC
Red Hat Enterprise Linux 7 shipped its final minor release on September 29th, 2020. 7.9 was the last minor release scheduled for RHEL 7.
From initial triage it does not appear the remaining Bugzillas meet the inclusion criteria for Maintenance Phase 2, and they will now be closed.

From the RHEL life cycle page:
https://access.redhat.com/support/policy/updates/errata#Maintenance_Support_2_Phase
"During Maintenance Support 2 Phase for Red Hat Enterprise Linux version 7, Red Hat defined Critical and Important impact Security Advisories (RHSAs) and selected (at Red Hat discretion) Urgent Priority Bug Fix Advisories (RHBAs) may be released as they become available."

If this BZ was closed in error and meets the above criteria, please re-open it, flag it for 7.9.z, provide suitable business and technical justifications, and follow the process for Accelerated Fixes:
https://source.redhat.com/groups/public/pnt-cxno/pnt_customer_experience_and_operations_wiki/support_delivery_accelerated_fix_release_handbook  

Feature Requests can be re-opened and moved to RHEL 8 if the desired functionality is not already present in the product.

Please reach out to the applicable Product Experience Engineer[0] if you have any questions or concerns.  

[0] https://bugzilla.redhat.com/page.cgi?id=agile_component_mapping.html&product=Red+Hat+Enterprise+Linux+7

Comment 20 Chris Williams 2020-11-11 23:16:55 UTC
Apologies for the inadvertent closure.

Comment 29 errata-xmlrpc 2020-12-15 11:19:06 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (virt-who bug fix and enhancement update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:5444