Bug 1277552 - VDSM erroneously reports storage as RO and downs all storage domains
Summary: VDSM erroneously reports storage as RO and downs all storage domains
Keywords:
Status: CLOSED INSUFFICIENT_DATA
Alias: None
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: vdsm
Version: 3.6.0
Hardware: x86_64
OS: Unspecified
Priority: unspecified
Severity: urgent
Target Milestone: ovirt-3.6.1
Target Release: 3.6.0
Assignee: Fred Rolland
QA Contact: Aharon Canan
URL:
Whiteboard: storage
Depends On:
Blocks:
 
Reported: 2015-11-03 14:37 UTC by Kevin Alon Goldblatt
Modified: 2016-03-10 07:33 UTC
8 users

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2015-11-04 15:47:20 UTC
oVirt Team: Storage
Target Upstream Version:
Embargoed:


Attachments
engine, vdsm, server logs (3.14 MB, application/x-gzip)
2015-11-03 14:51 UTC, Kevin Alon Goldblatt

Description Kevin Alon Goldblatt 2015-11-03 14:37:55 UTC
Description of problem:
VDSM erroneously reports all storage pools as read-only and takes all storage domains down

Version-Release number of selected component (if applicable):
vdsm-4.17.8-1.el7ev.noarch
rhevm-3.6.0-0.18.el6.noarch


How reproducible:
Happened on at least two occasions over the last few days

Steps to Reproduce:
1. Have a RHEV-M setup with at least 12 storage domains (4 NFS domains including 2 masters, 8 Gluster domains, and 2 block domains).
2. During a test session that included creating Gluster domains, while the system was idle for about an hour, VDSM erroneously reported errors monitoring each of the storage pools and claimed the file system was read-only. All the storage domains were set to Down.
3. Upon investigation on the host, all file systems are actually read/write and no error is reported when accessing them.
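
One quick cross-check when VDSM claims read-only is to compare its report against what the kernel itself says in /proc/mounts. A minimal sketch (the helper name and the parameterized mounts path are hypothetical, added here only for illustration):

```python
def mount_flags(mountpoint, mounts_path="/proc/mounts"):
    """Return the mount option list for a mountpoint, or None if it is
    not mounted. Reads /proc/mounts by default; the path is a parameter
    so the parsing can also be exercised against a saved copy."""
    with open(mounts_path) as f:
        for line in f:
            fields = line.split()
            # /proc/mounts fields: device, mountpoint, fstype, options, ...
            if len(fields) >= 4 and fields[1] == mountpoint:
                return fields[3].split(",")
    return None
```

If the flags contain `rw` while VDSM's monitor reports EROFS, the discrepancy points at the probe path rather than the mount itself.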


Actual results:
All the storage pools are reported as read-only

Expected results:
All the storage pools should be accessible

Additional info:

VDSM LOG
---------------------------

Thread-498::DEBUG::2015-11-03 12:47:37,693::fileSD::173::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '0+1 records in\n0+1 records out\n339 bytes (339 B) copied, 0.00478487 s, 70.8 kB/s\n'; <rc> = 0
Thread-498::INFO::2015-11-03 12:47:37,697::clusterlock::219::Storage.SANLock::(acquireHostId) Acquiring host id for domain 67d3271b-b1c9-4401-8fa0-0b9ced7c53e7 (id: 1)
Thread-498::DEBUG::2015-11-03 12:47:37,698::clusterlock::237::Storage.SANLock::(acquireHostId) Host id for domain 67d3271b-b1c9-4401-8fa0-0b9ced7c53e7 successfully acquired (id: 1)
Thread-473::DEBUG::2015-11-03 12:47:37,963::monitor::288::Storage.Monitor::(_refreshDomain) Refreshing domain d4f33bae-ab63-4b46-9330-cd98544e9a6b
Thread-473::DEBUG::2015-11-03 12:47:37,987::fileSD::157::Storage.StorageDomainManifest::(__init__) Reading domain in path /rhev/data-center/mnt/glusterSD/10.35.160.6:_gluster4444/d4f33bae-ab63-4b46-9330-cd98544e9a6b
Thread-473::DEBUG::2015-11-03 12:47:37,992::persistentDict::192::Storage.PersistentDict::(__init__) Created a persistent dict with FileMetadataRW backend
Thread-473::ERROR::2015-11-03 12:47:37,999::monitor::250::Storage.Monitor::(_monitorDomain) Error monitoring domain d4f33bae-ab63-4b46-9330-cd98544e9a6b
Traceback (most recent call last):
  File "/usr/share/vdsm/storage/monitor.py", line 246, in _monitorDomain
    self._performDomainSelftest()
  File "/usr/lib/python2.7/site-packages/vdsm/utils.py", line 774, in wrapper
    value = meth(self, *a, **kw)
  File "/usr/share/vdsm/storage/monitor.py", line 313, in _performDomainSelftest
    self.domain.selftest()
  File "/usr/share/vdsm/storage/sdc.py", line 49, in __getattr__
    return getattr(self.getRealDomain(), attrName)
  File "/usr/share/vdsm/storage/sdc.py", line 52, in getRealDomain
    return self._cache._realProduce(self._sdUUID)
  File "/usr/share/vdsm/storage/sdc.py", line 123, in _realProduce
    domain = self._findDomain(sdUUID)
  File "/usr/share/vdsm/storage/sdc.py", line 142, in _findDomain
    dom = findMethod(sdUUID)
  File "/usr/share/vdsm/storage/glusterSD.py", line 32, in findDomain
    return GlusterStorageDomain(GlusterStorageDomain.findDomainPath(sdUUID))
  File "/usr/share/vdsm/storage/fileSD.py", line 198, in __init__
    validateFileSystemFeatures(manifest.sdUUID, manifest.mountpoint)
  File "/usr/share/vdsm/storage/fileSD.py", line 93, in validateFileSystemFeatures
    oop.getProcessPool(sdUUID).directTouch(testFilePath)
  File "/usr/share/vdsm/storage/outOfProcess.py", line 350, in directTouch
    ioproc.touch(path, flags, mode)
  File "/usr/lib/python2.7/site-packages/ioprocess/__init__.py", line 507, in touch
    self.timeout)
  File "/usr/lib/python2.7/site-packages/ioprocess/__init__.py", line 391, in _sendCommand
    raise OSError(errcode, errstr)
OSError: [Errno 30] Read-only file system
Thread-473::INFO::2015-11-03 12:47:38,000::monitor::273::Storage.Monitor::(_notifyStatusChanges) Domain d4f33bae-ab63-4b46-9330-cd98544e9a6b became INVALID
Thread-265864::DEBUG::2015-11-03 12:47:38,001::misc::777::Storage.Event.Storage.DomainMonitor.onDomainStateChange::(_emit) Emitting event
Thread-265864::DEBUG::2015-11-03 12:47:38,002::misc::787::Storage.Event.Storage.DomainMonitor.onDomainStateChange::(_emit) Calling registered method `contEIOVms`
Thread-265864::DEBUG::2015-11-03 12:47:38,003::misc::797::Storage.Event.Storage.DomainMonitor.onDomainStateChange::(_emit) Event emitted
Reactor thread::INFO::2015-11-03 12:47:38,317::protocoldetector::72::ProtocolDetector.AcceptorImpl::(handle_accept) Accepting connection from 127.0.0.1:34022
Reactor thread::DEBUG::2015-11-03 12:47:38,329::protocoldetector::82::ProtocolDetector.Detector::(__init__) Using required_size=11
Reactor thread::INFO::2015-11-03 12:47:38,330::protocoldetector::118::ProtocolDetector.Detector::(handle_read) Detected protocol xml from 127.0.0.1:34022
Reactor thread::DEBUG::2015-11-03 12:47:38,331::bindingxmlrpc::1297::XmlDetector::(handle_socket) xml over http detected from ('127.0.0.1', 34022)
BindingXMLRPC::INFO::2015-11-03 12:47:38,331::xmlrpc::73::vds.XMLRPCServer::(handle_request) Starting request handler for 127.0.0.1:34022
Thread-265866::INFO::2015-11-03 12:47:38,332::xmlrpc::84::vds.XMLRPCServer::(_process_requests) Request handler for 127.0.0.1:34022 started
Thread-265866::INFO::2015-11-03 12:47:38,346::xmlrpc::92::vds.XMLRPCServer::(_process_requests) Request handler for 127.0.0.1:34022 stopped
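
The traceback above shows where the EROFS originates: validateFileSystemFeatures asks the out-of-process ioprocess helper to "touch" a probe file with direct I/O, and the resulting OSError propagates up to the domain monitor, which then marks the domain INVALID. A minimal sketch of that style of probe (simplified and hypothetical: real VDSM goes through the ioprocess API, and O_DIRECT is made optional here because some file systems reject it):

```python
import errno
import os

def probe_writable(test_file_path, use_direct=False):
    """Create and remove a probe file, mimicking a directTouch-style
    self-test. Returns True if the file system accepted the write,
    False on EROFS (errno 30); any other error propagates."""
    flags = os.O_CREAT | os.O_WRONLY
    if use_direct and hasattr(os, "O_DIRECT"):
        # Closer to VDSM's actual check, but not supported everywhere.
        flags |= os.O_DIRECT
    try:
        fd = os.open(test_file_path, flags, 0o644)
        os.close(fd)
        os.unlink(test_file_path)
        return True
    except OSError as e:
        if e.errno == errno.EROFS:
            return False
        raise
```

In this bug the probe raised EROFS even though the mount was writable, which is why every domain sharing that monitor path went INVALID at once.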


ENGINE LOG
-----------------------------
2015-11-03 11:47:25,276 WARN  [org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] (org.ovirt.thread.pool-7-thread-40) [] domain 'd4f33bae-ab63-4b46-9330-cd98544e9a6b:gluster4444' in problem. vds: 'blond-vdsf'
2015-11-03 11:47:25,277 INFO  [org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] (org.ovirt.thread.pool-7-thread-40) [] Host '42c269ef-04fd-4815-9482-fb9f463c336d' has reported new storage access problem to the following domains 'd4f33bae-ab63-4b46-9330-cd98544e9a6b' marking it for storage connections and pool metadata refresh (report id: 'c9775a64-7e00-4bfc-aa6c-953649a97700')
2015-11-03 11:47:31,076 INFO  [org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] (DefaultQuartzScheduler_Worker-45) [2f93570f] Running storage connections refresh for hosts '[42c269ef-04fd-4815-9482-fb9f463c336d]'
2015-11-03 11:47:31,078 INFO  [org.ovirt.engine.core.bll.storage.ConnectHostToStoragePoolServersCommand] (org.ovirt.thread.pool-7-thread-29) [7b74054d] Running command: ConnectHostToStoragePoolServersCommand internal: true. Entities affected :  ID: c5f8a645-549e-4e3f-8a65-c669b4665852 Type: StoragePool
2015-11-03 11:47:31,133 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand] (org.ovirt.thread.pool-7-thread-29) [7b74054d] START, ConnectStorageServerVDSCommand(HostName = blond-vdsf, StorageServerConnectionManagementVDSParameters:{runAsync='true', hostId='42c269ef-04fd-4815-9482-fb9f463c336d', storagePoolId='c5f8a645-549e-4e3f-8a65-c669b4665852', storageType='ISCSI', connectionList='[StorageServerConnections:{id='f0d2bce9-c866-4488-a9c3-bae29ca29056', connection='10.35.160.104', iqn='iqn.1992-04.com.emc:cx.ckm00121000438.a6', vfsType='null', mountOptions='null', nfsVersion='null', nfsRetrans='null', nfsTimeo='null', iface='null', netIfaceName='null'}, StorageServerConnections:{id='e0426226-afd3-4bf7-a296-a2f30f643f73', connection='10.35.160.105', iqn='iqn.1992-04.com.emc:cx.ckm00121000438.a7', vfsType='null', mountOptions='null', nfsVersion='null', nfsRetrans='null', nfsTimeo='null', iface='null', netIfaceName='null'}, StorageServerConnections:{id='b8acb774-afa6-42d1-8217-b24b49f5b43a', connection='10.35.160.106', iqn='iqn.1992-04.com.emc:cx.ckm00121000438.b6', vfsType='null', mountOptions='null', nfsVersion='null', nfsRetrans='null', nfsTimeo='null', iface='null', netIfaceName='null'}, StorageServerConnections:{id='221242ef-cb2c-4fa2-aa89-e6b9ae9a3bc3', connection='10.35.160.107', iqn='iqn.1992-04.com.emc:cx.ckm00121000438.b7', vfsType='null', mountOptions='null', nfsVersion='null', nfsRetrans='null', nfsTimeo='null', iface='null', netIfaceName='null'}]'}), log id: 36b4ceed
2015-11-03 11:47:40,367 WARN  [org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] (org.ovirt.thread.pool-7-thread-32) [] domain '00ed0b5c-4fa3-4e12-98c2-47511ac6d2c7:gluster446' in problem. vds: 'blond-vdsf'
2015-11-03 11:47:40,371 WARN  [org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] (org.ovirt.thread.pool-7-thread-32) [] domain '67d3271b-b1c9-4401-8fa0-0b9ced7c53e7:gluster4448' in problem. vds: 'blond-vdsf'
2015-11-03 11:47:40,372 WARN  [org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] (org.ovirt.thread.pool-7-thread-32) [] domain 'bb9854bb-798b-4143-b68c-fda3879eb474:gluster4443' in problem. vds: 'blond-vdsf'
2015-11-03 11:47:40,373 INFO  [org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData] (org.ovirt.thread.pool-7-thread-32) [] Host '42c269ef-04fd-4815-9482-fb9f463c336d' has reported new storage access problem to the following domains '00ed0b5c-4fa3-4e12-98c2-47511ac6d2c7,d4f33bae-ab63-4b46-9330-cd98544e9a6b,67d3271b-b1c9-4401-8fa0-0b9ced7c53e7,bb9854bb-798b-4143-b68c-fda3879eb474' marking it for storage connections and pool metadata refresh (report id: '31384c34-0c53-4746-8ca7-d00552a0b9fb')
2015-11-03 11:47:44,097 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.ConnectStorageServerVDSCommand] (org.ovirt.thread.pool-7-thread-29) [7b74054d] FINISH, ConnectStorageServerVDSCommand, return: {221242ef-cb2c-4fa2-aa89-e6b9ae9a3bc3=0, f0d2bce9-c866-4488-a9c3-bae29ca29056=0, e0426226-afd3-4bf7-a296-a2f30f643f73=0, b8acb774-afa6-42d1-8217-b24b49f5b43a=0}, log id: 36b4ceed
2015-11-03 11:47:44,098 ERROR [org.ovirt.engine.core.bll.storage.ConnectHostToStoragePoolServersCommand] (org.ovirt.thread.pool-7-thread-29) [7b74054d] Command 'org.ovirt.engine.core.bll.storage.ConnectHostToStoragePoolServersCommand' failed: null
2015-11-03 11:47:44,098 ERROR [org.ovirt.engine.core.bll.storage.ConnectHostToStoragePoolServersCommand] (org.ovirt.thread.pool-7-thread-29) [7b74054d] Exception: java.lang.NullPointerException
        at org.ovirt.engine.core.bll.storage.GLUSTERFSStorageHelper.prepareConnectHostToStoragePoolServers(GLUSTERFSStorageHelper.java:54) [bll.jar:]
        at org.ovirt.engine.core.bll.storage.ConnectHostToStoragePoolServersCommand.connectStorageServersByType(ConnectHostToStoragePoolServersCommand.java:64) [bll.jar:]
        at org.ovirt.engine.core.bll.storage.ConnectHostToStoragePoolServersCommand.connectStorageServer(ConnectHostToStoragePoolServersCommand.java:55) [bll.jar:]
        at org.ovirt.engine.core.bll.storage.ConnectHostToStoragePoolServersCommand.executeCommand(ConnectHostToStoragePoolServersCommand.java:42) [bll.jar:]
        at org.ovirt.engine.core.bll.CommandBase.executeWithoutTransaction(CommandBase.java:1211) [bll.jar:]
        at org.ovirt.engine.core.bll.CommandBase.executeActionInTransactionScope(CommandBase.java:1355) [bll.jar:]
        at org.ovirt.engine.core.bll.CommandBase.runInTransaction(CommandBase.java:1979) [bll.jar:]
        at org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInSuppressed(TransactionSupport.java:174) [utils.jar:]
        at org.ovirt.engine.core.utils.transaction.TransactionSupport.executeInScope(TransactionSupport.java:116) [utils.jar:]
        at org.ovirt.engine.core.bll.CommandBase.execute(CommandBase.java:1392) [bll.jar:]
        at org.ovirt.engine.core.bll.CommandBase.executeAction(CommandBase.java:374) [bll.jar:]
        at org.ovirt.engine.core.bll.Backend.runAction(Backend.java:475) [bll.jar:]
        at org.ovirt.engine.core.bll.Backend.runActionImpl(Backend.java:457) [bll.jar:]
        at org.ovirt.engine.core.bll.Backend.runInternalAction(Backend.java:404) [bll.jar:]
        at sun.reflect.GeneratedMethodAccessor80.invoke(Unknown Source) [:1.8.0_51]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_51]
        at java.lang.reflect.Method.invoke(Method.java:497) [rt.jar:1.8.0_51]
        at org.jboss.as.ee.component.ManagedReferenceMethodInterceptor.processInvocation(ManagedReferenceMethodInterceptor.java:52) [jboss-as-ee.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.invocation.WeavedInterceptor.processInvocation(WeavedInterceptor.java:53) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ee.component.interceptors.UserInterceptorFactory$1.processInvocation(UserInterceptorFactory.java:63) [jboss-as-ee.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.invocation.InterceptorContext$Invocation.proceed(InterceptorContext.java:374) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.weld.ejb.Jsr299BindingsInterceptor.delegateInterception(Jsr299BindingsInterceptor.java:74) [jboss-as-weld.jar:7.5.3.Final-redhat-2]
        at org.jboss.as.weld.ejb.Jsr299BindingsInterceptor.doMethodInterception(Jsr299BindingsInterceptor.java:84) [jboss-as-weld.jar:7.5.3.Final-redhat-2]
        at org.jboss.as.weld.ejb.Jsr299BindingsInterceptor.processInvocation(Jsr299BindingsInterceptor.java:97) [jboss-as-weld.jar:7.5.3.Final-redhat-2]
        at org.jboss.as.ee.component.interceptors.UserInterceptorFactory$1.processInvocation(UserInterceptorFactory.java:63) [jboss-as-ee.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.invocation.WeavedInterceptor.processInvocation(WeavedInterceptor.java:53) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ee.component.interceptors.UserInterceptorFactory$1.processInvocation(UserInterceptorFactory.java:63) [jboss-as-ee.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ejb3.component.invocationmetrics.ExecutionTimeInterceptor.processInvocation(ExecutionTimeInterceptor.java:43) [jboss-as-ejb3.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.weld.ejb.EjbRequestScopeActivationInterceptor.processInvocation(EjbRequestScopeActivationInterceptor.java:93) [jboss-as-weld.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.invocation.InitialInterceptor.processInvocation(InitialInterceptor.java:21) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.invocation.ChainedInterceptor.processInvocation(ChainedInterceptor.java:61) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ee.component.interceptors.ComponentDispatcherInterceptor.processInvocation(ComponentDispatcherInterceptor.java:53) [jboss-as-ee.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ejb3.component.singleton.SingletonComponentInstanceAssociationInterceptor.processInvocation(SingletonComponentInstanceAssociationInterceptor.java:52) [jboss-as-ejb3.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ejb3.tx.CMTTxInterceptor.invokeInNoTx(CMTTxInterceptor.java:266) [jboss-as-ejb3.jar:7.5.3.Final-redhat-2]
        at org.jboss.as.ejb3.tx.CMTTxInterceptor.supports(CMTTxInterceptor.java:377) [jboss-as-ejb3.jar:7.5.3.Final-redhat-2]
        at org.jboss.as.ejb3.tx.CMTTxInterceptor.processInvocation(CMTTxInterceptor.java:246) [jboss-as-ejb3.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ejb3.component.interceptors.CurrentInvocationContextInterceptor.processInvocation(CurrentInvocationContextInterceptor.java:41) [jboss-as-ejb3.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ejb3.component.invocationmetrics.WaitTimeInterceptor.processInvocation(WaitTimeInterceptor.java:43) [jboss-as-ejb3.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ejb3.component.interceptors.ShutDownInterceptorFactory$1.processInvocation(ShutDownInterceptorFactory.java:64) [jboss-as-ejb3.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ejb3.component.interceptors.LoggingInterceptor.processInvocation(LoggingInterceptor.java:59) [jboss-as-ejb3.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ee.component.NamespaceContextInterceptor.processInvocation(NamespaceContextInterceptor.java:50) [jboss-as-ee.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ee.component.TCCLInterceptor.processInvocation(TCCLInterceptor.java:45) [jboss-as-ee.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.invocation.ChainedInterceptor.processInvocation(ChainedInterceptor.java:61) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ee.component.ViewService$View.invoke(ViewService.java:185) [jboss-as-ee.jar:7.5.3.Final-redhat-2]
        at org.jboss.as.ee.component.ViewDescription$1.processInvocation(ViewDescription.java:185) [jboss-as-ee.jar:7.5.3.Final-redhat-2]
        at org.jboss.invocation.InterceptorContext.proceed(InterceptorContext.java:288) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.invocation.ChainedInterceptor.processInvocation(ChainedInterceptor.java:61) [jboss-invocation.jar:1.1.2.Final-redhat-1]
        at org.jboss.as.ee.component.ProxyInvocationHandler.invoke(ProxyInvocationHandler.java:73) [jboss-as-ee.jar:7.5.3.Final-redhat-2]
        at org.ovirt.engine.core.bll.interfaces.BackendInternal$$$view2.runInternalAction(Unknown Source) [bll.jar:]
        at sun.reflect.GeneratedMethodAccessor79.invoke(Unknown Source) [:1.8.0_51]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) [rt.jar:1.8.0_51]
        at java.lang.reflect.Method.invoke(Method.java:497) [rt.jar:1.8.0_51]
        at org.jboss.weld.util.reflection.SecureReflections$13.work(SecureReflections.java:267) [weld-core.jar:1.1.30.Final-redhat-1]
        at org.jboss.weld.util.reflection.SecureReflectionAccess.run(SecureReflectionAccess.java:52) [weld-core.jar:1.1.30.Final-redhat-1]
        at org.jboss.weld.util.reflection.SecureReflectionAccess.runAsInvocation(SecureReflectionAccess.java:137) [weld-core.jar:1.1.30.Final-redhat-1]
        at org.jboss.weld.util.reflection.SecureReflections.invoke(SecureReflections.java:263) [weld-core.jar:1.1.30.Final-redhat-1]
        at org.jboss.weld.bean.proxy.EnterpriseBeanProxyMethodHandler.invoke(EnterpriseBeanProxyMethodHandler.java:115) [weld-core.jar:1.1.30.Final-redhat-1]
        at org.jboss.weld.bean.proxy.EnterpriseTargetBeanInstance.invoke(EnterpriseTargetBeanInstance.java:56) [weld-core.jar:1.1.30.Final-redhat-1]
        at org.jboss.weld.bean.proxy.ProxyMethodHandler.invoke(ProxyMethodHandler.java:105) [weld-core.jar:1.1.30.Final-redhat-1]
        at org.ovirt.engine.core.bll.BackendCommandObjectsHandler$BackendInternal$BackendLocal$142795634$Proxy$_$$_Weld$Proxy$.runInternalAction(BackendCommandObjectsHandler$BackendInternal$BackendLocal$142795634$Proxy$_$$_Weld$Proxy$.java) [bll.jar:]
        at org.ovirt.engine.core.bll.VdsEventListener.connectHostToDomainsInActiveOrUnknownStatus(VdsEventListener.java:344) [bll.jar:]
        at org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData$9.call(IrsProxyData.java:1537) [vdsbroker.jar:]
        at org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxyData$9.call(IrsProxyData.java:1533) [vdsbroker.jar:]
        at org.ovirt.engine.core.utils.threadpool.ThreadPoolUtil$InternalCallable.call(ThreadPoolUtil.java:114) [utils.jar:]
        at java.util.concurrent.FutureTask.run(FutureTask.java:266) [rt.jar:1.8.0_51]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [rt.jar:1.8.0_51]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [rt.jar:1.8.0_51]
        at java.lang.Thread.run(Thread.java:745) [rt.jar:1.8.0_51]

Comment 1 Kevin Alon Goldblatt 2015-11-03 14:51:04 UTC
Created attachment 1089036 [details]
engine, vdsm, server logs

Added logs

Comment 4 Kevin Alon Goldblatt 2015-11-04 15:47:20 UTC
Upon further investigation we found questionable behaviour on both our Gluster and NFS storage servers connected to this issue, which puts the validity of this defect into question. As such, I am closing this defect until the testing environment is stabilized.

