Bug 1278321 - Failed to activate iSCSI storage after RHEV-H registered to RHEV-M.
Status: CLOSED NOTABUG
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: vdsm
Version: 3.6.0
Priority: urgent
Severity: high
Assigned To: Amit Aviram
QA Contact: Aharon Canan
Whiteboard: node
Reported: 2015-11-05 04:24 EST by cshao
Modified: 2016-02-10 15:11 EST (History)
Doc Type: Bug Fix
Type: Bug
Last Closed: 2015-11-16 22:32:34 EST
oVirt Team: Node
Attachments:
failed to active iscsi.png (21.32 KB, image/png), 2015-11-05 04:24 EST, cshao
failed-active-iscsi.tar.gz (978.07 KB, application/x-gzip), 2015-11-05 04:25 EST, cshao
old-vs-new (231.92 KB, application/x-gzip), 2015-11-16 05:58 EST, cshao
Description cshao 2015-11-05 04:24:28 EST
Created attachment 1089990 [details]
failed to active iscsi.png

Description of problem:
RHEV-H fails to activate iSCSI storage after registering to RHEV-M.

Test version:
rhev-hypervisor7-7.2-20151025.0
ovirt-node-3.3.0-0.18.20151022git82dc52c.el7ev.noarch
vdsm-4.17.10-5.el7ev.noarch
RHEV-M 3.6.0.2-0.1.el6

Test steps:
1. Install RHEV-H 7.2 on an iSCSI machine via PXE.
2. Register it to RHEV-M 3.6.
3. Try to create a new iSCSI storage domain on the RHEV-M side.

Test results:
RHEV-H can retrieve the iSCSI LUN, but fails to activate the iSCSI storage after registering to RHEV-M.

Please see the new attachment for more details.
Comment 1 cshao 2015-11-05 04:25 EST
Created attachment 1089991 [details]
failed-active-iscsi.tar.gz
Comment 2 Tal Nisan 2015-11-05 08:28:04 EST
Looking at the VDSM log, this seems like a node-specific issue and not a general storage one; moving to node.
Comment 10 Fabian Deutsch 2015-11-09 10:11:10 EST
Tal, what part in the logs makes you think that this is node specific?

I only see:
Thread-744::DEBUG::2015-11-04 07:58:06,646::iscsiadm::97::Storage.Misc.excCmd::(_runCmd) FAILED: <err> = 'iscsiadm: Could not login to [iface: default, target: iqn.2001-05.com.equallogic:0-8a0906-3831f7d03-857f49b26655031e-s1-gouyang-165404-02, portal: 10.66.90.100,3260].\niscsiadm: initiator reported error (24 - iSCSI login failed due to authorization failure)\niscsiadm: Could not log into all portals\n'; <rc> = 24
Thread-744::DEBUG::2015-11-04 07:58:06,646::iscsiadm::97::Storage.Misc.excCmd::(_runCmd) /usr/bin/sudo -n /sbin/iscsiadm -m iface (cwd None)
Thread-744::DEBUG::2015-11-04 07:58:06,725::iscsiadm::97::Storage.Misc.excCmd::(_runCmd) SUCCESS: <err> = ''; <rc> = 0
Thread-744::DEBUG::2015-11-04 07:58:06,726::iscsiadm::97::Storage.Misc.excCmd::(_runCmd) /usr/bin/sudo -n /sbin/iscsiadm -m node -T iqn.2001-05.com.equallogic:0-8a0906-3831f7d03-857f49b26655031e-s1-gouyang-165404-02 -I default -p 10.66.90.100:3260,1 -u (cwd None)
Thread-744::DEBUG::2015-11-04 07:58:06,803::iscsiadm::97::Storage.Misc.excCmd::(_runCmd) FAILED: <err> = 'iscsiadm: No matching sessions found\n'; <rc> = 21
Thread-744::DEBUG::2015-11-04 07:58:06,803::iscsiadm::97::Storage.Misc.excCmd::(_runCmd) /usr/bin/sudo -n /sbin/iscsiadm -m iface (cwd None)
Thread-744::DEBUG::2015-11-04 07:58:06,881::iscsiadm::97::Storage.Misc.excCmd::(_runCmd) SUCCESS: <err> = ''; <rc> = 0
Thread-744::DEBUG::2015-11-04 07:58:06,881::iscsiadm::97::Storage.Misc.excCmd::(_runCmd) /usr/bin/sudo -n /sbin/iscsiadm -m node -T iqn.2001-05.com.equallogic:0-8a0906-3831f7d03-857f49b26655031e-s1-gouyang-165404-02 -I default -p 10.66.90.100:3260,1 --op=delete (cwd None)
Thread-744::DEBUG::2015-11-04 07:58:06,958::iscsiadm::97::Storage.Misc.excCmd::(_runCmd) SUCCESS: <err> = ''; <rc> = 0
Thread-744::INFO::2015-11-04 07:58:06,958::iscsi::564::Storage.ISCSI::(setRpFilterIfNeeded) iSCSI iface.net_ifacename not provided. Skipping.
Thread-744::ERROR::2015-11-04 07:58:06,958::hsm::2465::Storage.HSM::(connectStorageServer) Could not connect to storageServer
Traceback (most recent call last):
  File "/usr/share/vdsm/storage/hsm.py", line 2462, in connectStorageServer
    conObj.connect()
  File "/usr/share/vdsm/storage/storageServer.py", line 473, in connect
    iscsi.addIscsiNode(self._iface, self._target, self._cred)
  File "/usr/share/vdsm/storage/iscsi.py", line 201, in addIscsiNode
    iscsiadm.node_login(iface.name, portalStr, target.iqn)
  File "/usr/share/vdsm/storage/iscsiadm.py", line 312, in node_login
    raise IscsiAuthenticationError(rc, out, err)
IscsiAuthenticationError: (24, ['Logging in to [iface: default, target: iqn.2001-05.com.equallogic:0-8a0906-3831f7d03-857f49b26655031e-s1-gouyang-165404-02, portal: 10.66.90.100,3260] (multiple)'], ['iscsiadm: Could not login to [iface: default, target: iqn.2001-05.com.equallogic:0-8a0906-3831f7d03-857f49b26655031e-s1-gouyang-165404-02, portal: 10.66.90.100,3260].', 'iscsiadm: initiator reported error (24 - iSCSI login failed due to authorization failure)', 'iscsiadm: Could not log into all portals'])


This looks like an incorrect password was used.
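If incorrect credentials are indeed the cause, a manual login attempt from the host should reproduce the same rc=24 failure. A sketch using the target and portal from the log above; the CHAP username and password values are placeholders, not known values from this bug:

```shell
# Discovery succeeded per the logs, so this should list the target:
iscsiadm -m discovery -t sendtargets -p 10.66.90.100:3260

TGT=iqn.2001-05.com.equallogic:0-8a0906-3831f7d03-857f49b26655031e-s1-gouyang-165404-02
PORTAL=10.66.90.100:3260

# Configure CHAP on the node record; <user>/<pass> are placeholders
# for whatever the target is actually configured with.
iscsiadm -m node -T "$TGT" -p "$PORTAL" --op=update \
    -n node.session.auth.authmethod -v CHAP
iscsiadm -m node -T "$TGT" -p "$PORTAL" --op=update \
    -n node.session.auth.username -v '<user>'
iscsiadm -m node -T "$TGT" -p "$PORTAL" --op=update \
    -n node.session.auth.password -v '<pass>'

# Attempt the login; with wrong credentials iscsiadm exits with 24,
# matching the authorization failure seen in the VDSM log.
iscsiadm -m node -T "$TGT" -p "$PORTAL" --login
echo "exit code: $?"
```

If this manual login fails the same way, the problem is the credentials (or the target-side CHAP configuration), not VDSM.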
Comment 12 Tal Nisan 2015-11-15 05:19:49 EST
Fabian, I was basing that on the fact that it does not occur on RHEV alone with iSCSI storage
Comment 13 Amit Aviram 2015-11-15 09:37:52 EST
shaochen, did you try to connect manually to the iSCSI storage from the host? Did it work? It would be helpful if you could provide the commands you executed so we can compare them to what VDSM is doing.
Also, how is the target server configured in terms of authentication?

Thanks
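For the comparison Amit asks for, the host-side node records that VDSM created (including the auth settings it applied) can be inspected directly. A sketch, with the target name and portal taken from the log in comment 10:

```shell
TGT=iqn.2001-05.com.equallogic:0-8a0906-3831f7d03-857f49b26655031e-s1-gouyang-165404-02
PORTAL=10.66.90.100:3260

# Dump the stored node record; the node.session.auth.* fields show
# which CHAP method/username was configured on the host side.
iscsiadm -m node -T "$TGT" -p "$PORTAL" -o show

# List active sessions (none are expected here, since the login failed).
iscsiadm -m session
```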
Comment 15 cshao 2015-11-16 05:58 EST
Created attachment 1094848 [details]
old-vs-new
Comment 16 Ying Cui 2015-11-16 06:27:59 EST
Chen, according to comment 14, did you test this bug with the rhev-hypervisor7-7.2-20151025.0 version reported in the bug description? If so, we will consider closing this bug as NOTABUG.
Comment 17 cshao 2015-11-16 22:32:34 EST
(In reply to Ying Cui from comment #16)
> Chen, according to comment 14, did you test this bug with the
> rhev-hypervisor7-7.2-20151025.0 version reported in the bug
> description? If so, we will consider closing this bug as NOTABUG.

Yes, I tested it with rhev-hypervisor7-7.2-20151025.0.
So I am closing this bug as WORKSFORME.

Thanks!
