Bug 1484053 - create_lease task is not ordered to be cleared and gets stuck on SPM in state 'finished' in case of a failure on lease_info. Storage pool gets non-responsive
Keywords:
Status: CLOSED NOTABUG
Alias: None
Product: ovirt-engine
Classification: oVirt
Component: BLL.Storage
Version: 4.2.0
Hardware: x86_64
OS: Unspecified
Priority: unspecified
Severity: high
Target Milestone: ovirt-4.3.4
Assignee: Eyal Shenitzky
QA Contact: Elad
URL:
Whiteboard:
Depends On:
Blocks:
Reported: 2017-08-22 14:31 UTC by Elad
Modified: 2019-04-07 12:52 UTC (History)
6 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2019-03-18 09:35:51 UTC
oVirt Team: Storage
Embargoed:
aoconnor: ovirt-4.3+


Attachments (Terms of Use)
logs from host and engine (112 bytes, application/x-gzip)
2017-08-22 14:31 UTC, Elad
logs (3.29 MB, application/x-gzip)
2017-12-31 12:38 UTC, Elad
logs (2.37 MB, application/zip)
2019-02-10 10:05 UTC, Yosi Ben Shimon

Description Elad 2017-08-22 14:31:23 UTC
Created attachment 1316663 [details]
logs from host and engine

Description:
Querying for a VM lease while the lease is still being created fails with "NoSuchLease: No such lease" (the lease does not exist yet), and leaves the create_lease task stuck in state 'finished' on the SPM.
This causes the storage pool to flap between up and non-responsive every few seconds.

The bug seems to be in the engine, which does not order the SPM to clear the create_lease task.
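The race can be modeled in miniature: a lease only becomes visible in the xleases index once create_lease finishes, so any lookup racing with creation sees NoSuchLease. The following is a toy Python sketch of that timing (all class and function names here are hypothetical, not vdsm code); a retry loop is one way a caller could tolerate the window:

```python
import threading
import time

class NoSuchLease(Exception):
    """Raised when a lease id is not present in the index."""

class FakeLeaseStore:
    """Toy model of the xleases volume: a lease only becomes
    visible once create_lease has finished writing it."""
    def __init__(self):
        self._leases = {}
        self._lock = threading.Lock()

    def create_lease(self, lease_id, delay=0.2):
        time.sleep(delay)  # creation is not instantaneous
        with self._lock:
            self._leases[lease_id] = {"offset": 3145728}

    def lease_info(self, lease_id):
        with self._lock:
            if lease_id not in self._leases:
                raise NoSuchLease(lease_id)
            return self._leases[lease_id]

def lease_info_with_retry(store, lease_id, retries=10, wait=0.1):
    """Retry the lookup while the lease may still be under creation,
    instead of treating the first NoSuchLease as fatal."""
    for _ in range(retries):
        try:
            return store.lease_info(lease_id)
        except NoSuchLease:
            time.sleep(wait)
    raise NoSuchLease(lease_id)

store = FakeLeaseStore()
creator = threading.Thread(target=store.create_lease, args=("edc8d436",))
creator.start()
info = lease_info_with_retry(store, "edc8d436")  # succeeds once creation lands
creator.join()
```

A lookup without the retry loop reproduces the NoSuchLease failure seen in the logs whenever it runs inside the creation window.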


Version-Release number of selected component (if applicable):
vdsm-4.20.2-64.git072feb0.el7.centos.x86_64
libvirt-3.2.0-14.el7_4.2.x86_64
qemu-kvm-ev-2.6.0-28.el7.10.1.x86_64
sanlock-3.5.0-1.el7.x86_64
selinux-policy-targeted-3.13.1-166.el7.noarch
kernel - 3.10.0-693.1.1.el7.x86_64
Red Hat Enterprise Linux Server 7.4 (Maipo)
ovirt-engine-4.2.0-0.0.master.20170814200319.git3da5f35.el7.centos.noarch

How reproducible:
Reproduces always during execution of https://polarion.engineering.redhat.com/polarion/redirect/project/RHEVM3/workitem?id=RHEVM-17624

Steps to Reproduce:
1. Create a VM with a lease on the storage (NFS in my case) and during the creation, try to remove the lease (via REST)

Actual results:

lease_info fails with the following because the lease has not been created yet:

2017-08-22 16:02:48,242+0300 DEBUG (jsonrpc/1) [storage.TaskManager.Task] (Task='cf797248-6a89-4063-9720-8c6de2036214') moving from state init -> state preparing (task:599)
2017-08-22 16:02:48,243+0300 INFO  (jsonrpc/1) [vdsm.api] START lease_info(lease={'sd_id': '3415af18-36f8-41c1-8e6e-1a405c13b601', 'lease_id': 'edc8d436-49b6-4d7f-a983-079811b036bc'}) from=::ffff:10.35.161.182,51622, flow_id=db63c80, task_id=cf797248-6a89-4063-9720-8c6de2036214 (api:46)
2017-08-22 16:02:48,244+0300 DEBUG (jsonrpc/1) [storage.ResourceManager] Trying to register resource '00_storage.3415af18-36f8-41c1-8e6e-1a405c13b601' for lock type 'shared' (resourceManager:495)
2017-08-22 16:02:48,245+0300 DEBUG (jsonrpc/1) [storage.ResourceManager] Resource '00_storage.3415af18-36f8-41c1-8e6e-1a405c13b601' is free. Now locking as 'shared' (1 active user) (resourceManager:552)
2017-08-22 16:02:48,246+0300 DEBUG (jsonrpc/1) [storage.ResourceManager.Request] (ResName='00_storage.3415af18-36f8-41c1-8e6e-1a405c13b601', ReqID='f6775daa-8a9d-41b2-adb8-ee6665106171') Granted request (resourceManager:222)
2017-08-22 16:02:48,248+0300 DEBUG (jsonrpc/1) [storage.xlease] Loading index from u'/rhev/data-center/mnt/yellow-vdsb.qa.lab.tlv.redhat.com:_Storage__NFS_storage__local__ge2__nfs__1/3415af18-36f8-41c1-8e6e-1a405c13b601/dom_md/xleases' (xlease:476)
2017-08-22 16:02:48,251+0300 DEBUG (jsonrpc/1) [storage.xlease] Loaded <IndexMetadata version=1, lockspace=u'3415af18-36f8-41c1-8e6e-1a405c13b601', mtime=1502635412, updating=False at 0x2fbf150> (xlease:487)
2017-08-22 16:02:48,251+0300 DEBUG (jsonrpc/1) [storage.xlease] Looking up lease 'edc8d436-49b6-4d7f-a983-079811b036bc' in lockspace u'3415af18-36f8-41c1-8e6e-1a405c13b601' (xlease:515)
2017-08-22 16:02:48,253+0300 DEBUG (jsonrpc/1) [storage.xlease] Closing index for lockspace u'3415af18-36f8-41c1-8e6e-1a405c13b601' (xlease:612)
2017-08-22 16:02:48,254+0300 DEBUG (jsonrpc/1) [storage.ResourceManager] Trying to release resource '00_storage.3415af18-36f8-41c1-8e6e-1a405c13b601' (resourceManager:567)
2017-08-22 16:02:48,254+0300 DEBUG (jsonrpc/1) [storage.ResourceManager] Released resource '00_storage.3415af18-36f8-41c1-8e6e-1a405c13b601' (0 active users) (resourceManager:585)
2017-08-22 16:02:48,255+0300 DEBUG (jsonrpc/1) [storage.ResourceManager] Resource '00_storage.3415af18-36f8-41c1-8e6e-1a405c13b601' is free, finding out if anyone is waiting for it. (resourceManager:591)
2017-08-22 16:02:48,256+0300 DEBUG (jsonrpc/1) [storage.ResourceManager] No one is waiting for resource '00_storage.3415af18-36f8-41c1-8e6e-1a405c13b601', Clearing records. (resourceManager:599)
2017-08-22 16:02:48,257+0300 INFO  (jsonrpc/1) [vdsm.api] FINISH lease_info error=No such lease edc8d436-49b6-4d7f-a983-079811b036bc from=::ffff:10.35.161.182,51622, flow_id=db63c80, task_id=cf797248-6a89-4063-9720-8c6de2036214 (api:50)
2017-08-22 16:02:48,257+0300 ERROR (jsonrpc/1) [storage.TaskManager.Task] (Task='cf797248-6a89-4063-9720-8c6de2036214') Unexpected error (task:872)
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/vdsm/storage/task.py", line 879, in _run
    return fn(*args, **kargs)
  File "<string>", line 2, in lease_info
  File "/usr/lib/python2.7/site-packages/vdsm/common/api.py", line 48, in method
    ret = func(*args, **kwargs)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/hsm.py", line 3617, in lease_info
    info = dom.lease_info(lease.lease_id)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/sd.py", line 625, in lease_info
    return vol.lookup(lease_id)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/xlease.py", line 518, in lookup
    raise NoSuchLease(lease_id)
NoSuchLease: No such lease edc8d436-49b6-4d7f-a983-079811b036bc


Failure in getting lease info for VM edc8d436-49b6-4d7f-a983-079811b036bc, message: EngineException: org.ovirt.engine.core.vdsbroker.irsbroker.IRSErrorException: IRSGenericException: IRSErrorException: Failed to GetVmLeaseInfoVDS, error = No such lease edc8d436-49b6-4d7f-a983-079811b036bc, code = 100 (Failed with error GeneralException and code 100)



After that, the create_lease task remains stuck in state 'finished' on the SPM, which causes the whole storage pool to change states between up and non-responsive every few seconds.

2017-08-22 17:07:14,532+03 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (org.ovirt.thread.EE-ManagedThreadFactory-engine-Thread-35) [db63c80] EVENT_ID: IRS_BROKER_COMMAND_FAILURE(10,803), VDSM command GetVmLeaseInfoVDS failed: No such lease edc8d436-49b6-4d7f-a983-079811b036bc


The following SQL exception repeats in engine.log:

2017-08-22 17:06:59,658+03 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksInfoVDSCommand] (org.ovirt.thread.EE-ManagedThreadFactory-engine-Thread-27) [26726f43] START, HSMGetAllTasksInfoVDSCommand(HostName = host_mixed_1, VdsIdVDSCommandParametersBase:{hostId='bd4e1c67-931c-4b99-8289-ccebad3d6c75'}), log id: 7b533921
2017-08-22 17:06:59,671+03 ERROR [org.ovirt.engine.core.bll.job.JobRepositoryImpl] (org.ovirt.thread.EE-ManagedThreadFactory-engine-Thread-19) [db63c80] Failed to save step '788ce6f8-b25d-43c6-9d02-4288f6c87676', 'FINALIZING': CallableStatementCallback; SQL [{call insertstep(?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)}]; ERROR: insert or update on table "step" violates foreign key constraint "fk_step_job"
  Detail: Key (job_id)=(43cef4dd-0e16-4816-ba4e-c56df5b5b521) is not present in table "job".
  Where: SQL statement "INSERT INTO step (
        step_id,
        parent_step_id,
        job_id,
        step_type,
        description,
        step_number,
        status,
        progress,
        start_time,
        end_time,
        correlation_id,
        external_id,
        external_system_type,
        is_external
        )
    VALUES (
        v_step_id,
        v_parent_step_id,
        v_job_id,
        v_step_type,
        v_description,
        v_step_number,
        v_status,
        v_progress,
        v_start_time,
        v_end_time,
        v_correlation_id,
        v_external_id,
        v_external_system_type,
        v_is_external
        )"
PL/pgSQL function insertstep(uuid,uuid,uuid,character varying,text,integer,character varying,smallint,timestamp with time zone,timestamp with time zone,character varying,uuid,character varying,boolean) line 3 at SQL statement; nested exception is org.postgresql.util.PSQLException: ERROR: insert or update on table "step" violates foreign key constraint "fk_step_job"
  Detail: Key (job_id)=(43cef4dd-0e16-4816-ba4e-c56df5b5b521) is not present in table "job".
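The database failure mode above can be reproduced in miniature: the step row references a job_id that no longer exists in the job table, so the insert trips the foreign key. This sketch uses SQLite from Python with a drastically reduced schema (two columns instead of the engine's fourteen); it is an illustration of the constraint, not the engine's PostgreSQL schema:

```python
import sqlite3

# Minimal model of the engine's job/step relationship: step.job_id
# must reference an existing row in job, which is the constraint
# ("fk_step_job") that the engine violated.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this per connection
conn.execute("CREATE TABLE job (job_id TEXT PRIMARY KEY)")
conn.execute("""CREATE TABLE step (
    step_id TEXT PRIMARY KEY,
    job_id TEXT NOT NULL REFERENCES job (job_id))""")

try:
    # The job row was never inserted (or already deleted),
    # so inserting the step fails with a FK violation.
    conn.execute("INSERT INTO step VALUES (?, ?)",
                 ("788ce6f8", "43cef4dd"))
    conn.commit()
    fk_violation = False
except sqlite3.IntegrityError:
    fk_violation = True
```

The repeating exception in engine.log suggests the job row was removed while a step for it was still being finalized.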


Expected results:
create_lease task should be cleared
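The expected behavior can be sketched as follows: finished tasks are cleared before the SPM is stopped, so only genuinely running tasks can block SpmStop. This is a hypothetical simplified model (the function name and data shapes are invented for illustration, not engine code):

```python
# Hypothetical sketch: clear tasks in state 'finished' so they do
# not block SpmStop, leaving only running tasks behind.
def clear_finished_tasks(tasks, clear):
    """tasks: {task_id: status}; clear: callable(task_id).
    Clears every finished task and returns the ids that remain."""
    for task_id, status in list(tasks.items()):
        if status == "finished":
            clear(task_id)       # e.g. issue clearTask to the SPM
            del tasks[task_id]
    return list(tasks)

tasks = {
    "f5a37282": "finished",
    "0a4eaedd": "running",
    "9bb8e164": "finished",
}
cleared = []
remaining = clear_finished_tasks(tasks, cleared.append)
# only the running task still blocks SpmStop
```

In the logged failure, the finished create_lease task was never cleared, so SpmStop refused to run and the pool kept flapping.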

Workaround:
Restart ovirt-engine service

Additional info:
logs from host and engine

Comment 1 Eyal Shenitzky 2017-12-31 05:50:51 UTC
Elad, 
Can you please add the REST-API call which removes the lease?

Comment 2 Eyal Shenitzky 2017-12-31 07:37:22 UTC
Also, 
There are no logs in the attached file.
Can you please add the relevant logs?

Comment 3 Elad 2017-12-31 12:38:26 UTC
Created attachment 1374898 [details]
logs

I think I managed to reproduce by removing the lease via REST [1] during its creation



2017-12-31 14:31:44,366+02 ERROR [org.ovirt.engine.api.restapi.resource.validation.IOExceptionMapper] (default task-23) [] IO exception while processing "PUT" request for path "/vms/badc826d-6b2b-4248-89ee-0eeaf8a773e8"


[1]
https://storage-ge-02.scl.lab.tlv.redhat.com/ovirt-engine/api/vms/badc826d-6b2b-4248-89ee-0eeaf8a773e8   PUT


<vm><lease/>
</vm>

Comment 4 Eyal Shenitzky 2017-12-31 14:07:46 UTC
As you wrote, this is an error which suggests that the request you sent to the engine isn't valid:

2017-12-31 14:31:44,366+02 ERROR [org.ovirt.engine.api.restapi.resource.validation.IOExceptionMapper] (default task-23) [] IO exception while processing "PUT" request for path "/vms/badc826d-6b2b-4248-89ee-0eeaf8a773e8"
2017-12-31 14:31:44,366+02 ERROR [org.ovirt.engine.api.restapi.resource.validation.IOExceptionMapper] (default task-23) [] Exception: java.io.IOException: javax.xml.bind.UnmarshalException: unexpected element (uri:"", local:"lease"). Expected elements are <{}action>,<{}affinity_group>,<{}affinity_groups>,<{}affinity_label>,<{}affinity_labels>,<{}affinity_rule>,<{}affinity_rules>,<{}agent>.....

This error is not related to the removal of the lease.

Comment 5 Elad 2017-12-31 15:59:49 UTC
Could not reproduce the original scenario (remove lease AFTER its creation). 
Executed the mentioned automation job (from the bug description) several times, all passed.

Used:
rhvm-4.2.0.2-0.1.el7.noarch

Comment 6 Eyal Shenitzky 2018-01-03 08:05:33 UTC
(In reply to Elad from comment #5)
> Could not reproduce the original scenario (remove lease AFTER its creation). 
> Executed the mentioned automation job (from the bug description) several
> times, all passed.
> 
> Used:
> rhvm-4.2.0.2-0.1.el7.noarch

Can you please close the bug?

Comment 7 Yosi Ben Shimon 2019-02-10 10:04:51 UTC
Happened again during automation execution.
Failed to create a VM due to a lease creation failure.
* Relevant logs attached

*** From art log:
Request to create the VM at 2019-02-04 09:38:29,670 using correlation-Id: vms_create_b658132a-53bd-4186


*** From engine log:


2019-02-04 09:38:29,765+02 INFO  [org.ovirt.engine.core.bll.AddVmCommand] (default task-76) [vms_create_b658132a-53bd-4186] Lock Acquired to object 'EngineLock:{exclusiveLocks='[vm_0_TestCase17618_0409382851=VM_NAME]', sharedLocks='[62828ecd-a85f-4457-96b3-4ab56bd24e52=TEMPLATE, 862c8030-a7d4-41cb-b414-acec7cf4da7e=DISK]'}'
2019-02-04 09:38:29,820+02 INFO  [org.ovirt.engine.core.bll.AddVmCommand] (default task-76) [vms_create_b658132a-53bd-4186] Running command: AddVmCommand internal: false. Entities affected :  ID: 3390da29-630d-4094-9871-1f2b2a855741 Type: ClusterAction group CREATE_VM with role type USER,  ID: 62828ecd-a85f-4457-96b3-4ab56bd24e52 Type: VmTemplateAction group CREATE_VM with role type USER,  ID: 0e0366a3-25d2-4652-b169-1672fddd769e Type: StorageAction group CREATE_DISK with role type USER
2019-02-04 09:38:29,872+02 INFO  [org.ovirt.engine.core.bll.storage.lease.AddVmLeaseCommand] (default task-76) [vms_create_b658132a-53bd-4186] Running command: AddVmLeaseCommand internal: true. Entities affected :  ID: 0e0366a3-25d2-4652-b169-1672fddd769e Type: Storage
2019-02-04 09:38:29,916+02 INFO  [org.ovirt.engine.core.bll.storage.lease.GetVmLeaseInfoCommand] (default task-76) [vms_create_b658132a-53bd-4186] Running command: GetVmLeaseInfoCommand internal: true. Entities affected :  ID: 0e0366a3-25d2-4652-b169-1672fddd769e Type: Storage
2019-02-04 09:38:29,920+02 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.GetVmLeaseInfoVDSCommand] (default task-76) [vms_create_b658132a-53bd-4186] START, GetVmLeaseInfoVDSCommand( VmLeaseVDSParameters:{storagePoolId='1edf774c-e65f-422c-8d6f-de2623d60545', ignoreFailoverLimit='false', leaseId='902224d0-77e4-4478-af66-8d2e0db5d46c', storageDomainId='0e0366a3-25d2-4652-b169-1672fddd769e'}), log id: 580483c
2019-02-04 09:38:29,942+02 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.GetVmLeaseInfoVDSCommand] (default task-76) [vms_create_b658132a-53bd-4186] Failed in 'GetVmLeaseInfoVDS' method
2019-02-04 09:38:29,954+02 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (default task-76) [vms_create_b658132a-53bd-4186] EVENT_ID: IRS_BROKER_COMMAND_FAILURE(10,803), VDSM command GetVmLeaseInfoVDS failed: No such lease 902224d0-77e4-4478-af66-8d2e0db5d46c
2019-02-04 09:38:29,954+02 ERROR [org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] (default task-76) [vms_create_b658132a-53bd-4186] IrsBroker::Failed::GetVmLeaseInfoVDS: IRSGenericException: IRSErrorException: Failed to GetVmLeaseInfoVDS, error = No such lease 902224d0-77e4-4478-af66-8d2e0db5d46c, code = 100
2019-02-04 09:38:29,994+02 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStopVDSCommand] (default task-76) [vms_create_b658132a-53bd-4186] START, SpmStopVDSCommand(HostName = host_mixed_3, SpmStopVDSCommandParameters:{hostId='33d753d5-41eb-4955-94e3-8a3608965245', storagePoolId='1edf774c-e65f-422c-8d6f-de2623d60545'}), log id: 1dbbd1b5
2019-02-04 09:38:30,001+02 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] (default task-76) [vms_create_b658132a-53bd-4186] Failed in 'HSMGetAllTasksStatusesVDS' method
2019-02-04 09:38:30,002+02 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStopVDSCommand] (default task-76) [vms_create_b658132a-53bd-4186] SpmStopVDSCommand::Not stopping SPM on vds 'host_mixed_3', pool id '1edf774c-e65f-422c-8d6f-de2623d60545' as there are uncleared tasks 'Task 'f5a37282-725f-4557-b955-002221314089', status 'finished'
Task 'd28fcc5d-19f4-41b5-8617-526aa43f9839', status 'finished'
Task '0a4eaedd-a6f5-4242-ad27-e66cba7ba46d', status 'running'
Task '9bb8e164-0acd-43cb-855b-c64362a2175a', status 'finished'
Task '14675551-c22a-4f3c-8b6c-820fd3ab8c94', status 'finished'
Task '3ca33109-5477-4747-9869-45826063a83f', status 'finished'
Task '3482e4b9-af12-4701-a5dc-be9dd9305634', status 'running''
2019-02-04 09:38:30,002+02 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStopVDSCommand] (default task-76) [vms_create_b658132a-53bd-4186] FINISH, SpmStopVDSCommand, return: , log id: 1dbbd1b5
2019-02-04 09:38:30,002+02 ERROR [org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxy] (default task-76) [vms_create_b658132a-53bd-4186] IRS failover failed - can't allocate vds server
2019-02-04 09:38:30,002+02 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.GetVmLeaseInfoVDSCommand] (default task-76) [vms_create_b658132a-53bd-4186] FINISH, GetVmLeaseInfoVDSCommand, return: , log id: 580483c
2019-02-04 09:38:30,002+02 ERROR [org.ovirt.engine.core.bll.storage.lease.GetVmLeaseInfoCommand] (default task-76) [vms_create_b658132a-53bd-4186] Failure in getting lease info for VM '902224d0-77e4-4478-af66-8d2e0db5d46c' from storage domains '0e0366a3-25d2-4652-b169-1672fddd769e', message: EngineException: org.ovirt.engine.core.vdsbroker.irsbroker.IRSErrorException: IRSGenericException: IRSErrorException: Failed to GetVmLeaseInfoVDS, error = No such lease 902224d0-77e4-4478-af66-8d2e0db5d46c, code = 100 (Failed with error GeneralException and code 100)


*** After that, all storage domains switched to 'Unknown':


2019-02-04 09:38:30,093+02 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.AddVmLeaseVDSCommand] (default task-76) [vms_create_b658132a-53bd-4186] START, AddVmLeaseVDSCommand( VmLeaseVDSParameters:{storagePoolId='1edf774c-e65f-422c-8d6f-de2623d60545', ignoreFailoverLimit='false', leaseId='902224d0-77e4-4478-af66-8d2e0db5d46c', storageDomainId='0e0366a3-25d2-4652-b169-1672fddd769e'}), log id: 349f06c6
2019-02-04 09:38:30,114+02 INFO  [org.ovirt.engine.core.bll.storage.pool.SetStoragePoolStatusCommand] (default task-76) [13279c2b] Running command: SetStoragePoolStatusCommand internal: true. Entities affected :  ID: 1edf774c-e65f-422c-8d6f-de2623d60545 Type: StoragePool
2019-02-04 09:38:30,116+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain '114c6492-0331-4883-9633-0a5beb567fda' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,116+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain '2ed446e7-30a2-4f11-993c-c03ab82867d6' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,117+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain 'ff8fd019-1a64-45d3-846d-1d817d2f9832' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,117+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain '37f59475-0371-47e9-9aef-997e6429d9ea' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,117+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain 'e86ef26d-671d-48cb-874b-65ff33194cfe' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,117+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain 'c4580ad6-8997-4968-adba-cde96cdd2c52' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,117+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain 'd660ed4a-4b2c-4da7-8688-0a6c12802add' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,118+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain '0e0366a3-25d2-4652-b169-1672fddd769e' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,118+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain 'bf439e40-9cea-49dc-a6ad-2817073940ff' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,118+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain '965928c0-317d-44ff-8eea-9e9b41a943ae' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,118+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain '7050c6ec-6cb6-4daf-8d17-579bb080d9e2' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,118+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain '363b1236-96ba-4819-bfe3-fe118159704d' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,118+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain '9513ce5c-b650-4165-991d-9d0a400d220f' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,119+02 INFO  [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' - Updating Storage Domain 'ef54696c-22ee-45db-aa94-4a1928b5504c' status from 'Active' to 'Unknown', reason: null
2019-02-04 09:38:30,149+02 WARN  [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (default task-76) [13279c2b] EVENT_ID: SYSTEM_CHANGE_STORAGE_POOL_STATUS_PROBLEMATIC_SEARCHING_NEW_SPM(986), Data Center is being initialized, please wait for initialization to complete.

....

2019-02-04 09:38:39,240+02 WARN  [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksInfoVDSCommand] (EE-ManagedThreadFactory-engine-Thread-13705) [13279c2b] The task type in the vdsm response is 'create_lease' and does not  appear in the AsyncTaskType enum
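The warning above shows the engine receiving a vdsm task type ('create_lease') that its AsyncTaskType enum does not know. A defensive mapping falls back to a generic value instead of failing; this Python sketch only illustrates the pattern (the engine's enum is Java, and the member names below are invented):

```python
import enum

# Illustrative stand-in for the engine's AsyncTaskType enum; the
# real member list differs, 'unknown' is the fallback of interest.
class AsyncTaskType(enum.Enum):
    copyImage = "copyImage"
    createVolume = "createVolume"
    unknown = "unknown"

def task_type_from_vdsm(name):
    """Map a vdsm-reported task type to the enum, tolerating
    task types the enum does not list yet."""
    try:
        return AsyncTaskType(name)
    except ValueError:
        return AsyncTaskType.unknown

unmatched = task_type_from_vdsm("create_lease")  # not in the enum
```

With such a fallback the unrecognized task is still tracked (and can be cleared) rather than silently dropped from task bookkeeping.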




*** From SPM log (create_lease task id = 3b72ca01-d07b-43e5-8fe0-6f436cf0907b):

2019-02-04 09:38:38,583+0200 DEBUG (jsonrpc/4) [storage.TaskManager.Task] (Task='3b72ca01-d07b-43e5-8fe0-6f436cf0907b') moving from state preparing -> state preparing (task:602)
2019-02-04 09:38:38,583+0200 INFO  (jsonrpc/4) [vdsm.api] START create_lease(lease={u'sd_id': u'0e0366a3-25d2-4652-b169-1672fddd769e', u'lease_id': u'902224d0-77e4-4478-af66-8d2e0db5d46c'}) from=::ffff:10.46.16.255,43276, flow_id=13279c2b, task_id=3b72ca01-d07b-43e5-8fe0-6f436cf0907b (api:48)

....

2019-02-04 09:38:38,876+0200 DEBUG (tasks/5) [storage.TaskManager.Task] (Task='3b72ca01-d07b-43e5-8fe0-6f436cf0907b') Task.run: running job 0: create_lease: <bound method StoragePool.create_lease of <vdsm.storage.sp.StoragePool object at 0x7f6d482e1990>> (args: (<vdsm.storage.types.lease_id object at 0x7f6d30756390>,) kwargs: {}) (task:914)
2019-02-04 09:38:38,876+0200 DEBUG (tasks/5) [storage.TaskManager.Task] (Task='3b72ca01-d07b-43e5-8fe0-6f436cf0907b') Job.run: running create_lease: <bound method StoragePool.create_lease of <vdsm.storage.sp.StoragePool object at 0x7f6d482e1990>> (args: (<vdsm.storage.types.lease_id object at 0x7f6d30756390>,) kwargs: {}) callback None (task:333)

....

2019-02-04 09:38:38,894+0200 DEBUG (tasks/5) [storage.TaskManager.Task] (Task='3b72ca01-d07b-43e5-8fe0-6f436cf0907b') moving from state finished -> state finished (task:602)

....

2019-02-04 09:38:39,221+0200 DEBUG (jsonrpc/7) [storage.TaskManager] Entry. taskID: 3b72ca01-d07b-43e5-8fe0-6f436cf0907b (taskManager:103)
2019-02-04 09:38:39,221+0200 DEBUG (jsonrpc/7) [storage.TaskManager] Return. Response: {'code': 0, 'message': '1 jobs completed successfully', 'taskState': 'finished', 'taskResult': 'success', 'taskID': '3b72ca01-d07b-43e5-8fe0-6f436cf0907b'} (taskManager:106)

....

2019-02-04 09:39:39,572+0200 INFO  (jsonrpc/4) [vdsm.api] START clearTask(taskID=u'3b72ca01-d07b-43e5-8fe0-6f436cf0907b', spUUID=None, options=None) from=::ffff:10.46.16.255,34620, task_id=330d9dd1-bedb-46ec-8ff8-b1ae027c97d6 (api:48)
2019-02-04 09:39:39,572+0200 DEBUG (jsonrpc/4) [storage.TaskManager] Entry. taskID: 3b72ca01-d07b-43e5-8fe0-6f436cf0907b (taskManager:171)

....

2019-02-04 09:53:59,332+0200 INFO  (jsonrpc/4) [vdsm.api] FINISH lease_info error=No such lease 7f5c77da-b4eb-4170-9a2e-43fa81982c48 from=::ffff:10.46.16.255,43276, flow_id=vms_create_4738c5fc-4156-4d9e, task_id=f4fadab4-37bb-42dc-8ed9-1d618dee6a0d (api:52)
2019-02-04 09:53:59,332+0200 ERROR (jsonrpc/4) [storage.TaskManager.Task] (Task='f4fadab4-37bb-42dc-8ed9-1d618dee6a0d') Unexpected error (task:875)
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/vdsm/storage/task.py", line 882, in _run
    return fn(*args, **kargs)
  File "<string>", line 2, in lease_info
  File "/usr/lib/python2.7/site-packages/vdsm/common/api.py", line 50, in method
    ret = func(*args, **kwargs)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/hsm.py", line 3701, in lease_info
    info = dom.lease_info(lease.lease_id)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/sd.py", line 670, in lease_info
    return vol.lookup(lease_id)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/xlease.py", line 553, in lookup
    raise NoSuchLease(lease_id)
NoSuchLease: No such lease 7f5c77da-b4eb-4170-9a2e-43fa81982c48
2019-02-04 09:53:59,332+0200 DEBUG (jsonrpc/4) [storage.TaskManager.Task] (Task='f4fadab4-37bb-42dc-8ed9-1d618dee6a0d') Task._run: f4fadab4-37bb-42dc-8ed9-1d618dee6a0d ({u'sd_id': u'0e0366a3-25d2-4652-b169-1672fddd769e', u'lease_id': u'7f5c77da-b4eb-4170-9a2e-43fa81982c48'},) {} failed - stopping task (task:894)
2019-02-04 09:53:59,332+0200 DEBUG (jsonrpc/4) [storage.TaskManager.Task] (Task='f4fadab4-37bb-42dc-8ed9-1d618dee6a0d') stopping in state failed (force False) (task:1256)
2019-02-04 09:53:59,332+0200 DEBUG (jsonrpc/4) [storage.TaskManager.Task] (Task='f4fadab4-37bb-42dc-8ed9-1d618dee6a0d') ref 1 aborting True (task:1002)
2019-02-04 09:53:59,332+0200 INFO  (jsonrpc/4) [storage.TaskManager.Task] (Task='f4fadab4-37bb-42dc-8ed9-1d618dee6a0d') aborting: Task is aborted: u'No such lease 7f5c77da-b4eb-4170-9a2e-43fa81982c48' - code 100 (task:1181)
2019-02-04 09:53:59,332+0200 DEBUG (jsonrpc/4) [storage.TaskManager.Task] (Task='f4fadab4-37bb-42dc-8ed9-1d618dee6a0d') Prepare: aborted: No such lease 7f5c77da-b4eb-4170-9a2e-43fa81982c48 (task:1186)
2019-02-04 09:53:59,332+0200 DEBUG (jsonrpc/4) [storage.TaskManager.Task] (Task='f4fadab4-37bb-42dc-8ed9-1d618dee6a0d') ref 0 aborting True (task:1002)
2019-02-04 09:53:59,332+0200 DEBUG (jsonrpc/4) [storage.TaskManager.Task] (Task='f4fadab4-37bb-42dc-8ed9-1d618dee6a0d') Task._doAbort: force False (task:937)
2019-02-04 09:53:59,332+0200 DEBUG (jsonrpc/4) [storage.ResourceManager.Owner] Owner.cancelAll requests {} (resourceManager:948)
2019-02-04 09:53:59,332+0200 DEBUG (jsonrpc/4) [storage.TaskManager.Task] (Task='f4fadab4-37bb-42dc-8ed9-1d618dee6a0d') moving from state failed -> state aborting (task:602)
2019-02-04 09:53:59,332+0200 DEBUG (jsonrpc/4) [storage.TaskManager.Task] (Task='f4fadab4-37bb-42dc-8ed9-1d618dee6a0d') _aborting: recover policy none (task:557)
2019-02-04 09:53:59,332+0200 DEBUG (jsonrpc/4) [storage.TaskManager.Task] (Task='f4fadab4-37bb-42dc-8ed9-1d618dee6a0d') moving from state failed -> state failed (task:602)
2019-02-04 09:53:59,332+0200 DEBUG (jsonrpc/4) [storage.ResourceManager.Owner] Owner.releaseAll requests {} resources {} (resourceManager:911)
2019-02-04 09:53:59,332+0200 DEBUG (jsonrpc/4) [storage.ResourceManager.Owner] Owner.cancelAll requests {} (resourceManager:948)
2019-02-04 09:53:59,333+0200 ERROR (jsonrpc/4) [storage.Dispatcher] FINISH lease_info error=No such lease 7f5c77da-b4eb-4170-9a2e-43fa81982c48 (dispatcher:85)
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/vdsm/storage/dispatcher.py", line 72, in wrapper
    result = ctask.prepare(func, *args, **kwargs)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/task.py", line 108, in wrapper
    return m(self, *a, **kw)
  File "/usr/lib/python2.7/site-packages/vdsm/storage/task.py", line 1189, in prepare
    raise self.error
NoSuchLease: No such lease 7f5c77da-b4eb-4170-9a2e-43fa81982c48

Comment 8 Yosi Ben Shimon 2019-02-10 10:05:35 UTC
Created attachment 1528720 [details]
logs

Comment 9 Yosi Ben Shimon 2019-02-10 10:09:52 UTC
Environment details regarding comment #7:
- HE env
- ovirt-engine-4.3.0.4-0.1.el7.noarch
- vdsm-4.30.8-2.el7ev.x86_64
- TestCase17618 executed on FC

Comment 10 Eyal Shenitzky 2019-02-11 08:41:01 UTC
(In reply to Yosi Ben Shimon from comment #7)
> Happened again during automation execution.
> Failed to create VM due to creation of lease failure.
> * attached relevant logs
> 
> *** From art log:
> Request to create the VM at 2019-02-04 09:38:29,670 using correlation-Id:
> vms_create_b658132a-53bd-4186
> 
> 
> *** From engine log:
> 
> 
> 2019-02-04 09:38:29,765+02 INFO  [org.ovirt.engine.core.bll.AddVmCommand]
> (default task-76) [vms_create_b658132a-53bd-4186] Lock Acquired to object
> 'EngineLock:{exclusiveLocks='[vm_0_TestCase17618_0409382851=VM_NAME]',
> sharedLocks='[62828ecd-a85f-4457-96b3-4ab56bd24e52=TEMPLATE,
> 862c8030-a7d4-41cb-b414-acec7cf4da7e=DISK]'}'
> 2019-02-04 09:38:29,820+02 INFO  [org.ovirt.engine.core.bll.AddVmCommand]
> (default task-76) [vms_create_b658132a-53bd-4186] Running command:
> AddVmCommand internal: false. Entities affected :  ID:
> 3390da29-630d-4094-9871-1f2b2a855741 Type: ClusterAction group CREATE_VM
> with role type USER,  ID: 62828ecd-a85f-4457-96b3-4ab56bd24e52 Type:
> VmTemplateAction group CREATE_VM with role type USER,  ID:
> 0e0366a3-25d2-4652-b169-1672fddd769e Type: StorageAction group CREATE_DISK
> with role type USER
> 2019-02-04 09:38:29,872+02 INFO 
> [org.ovirt.engine.core.bll.storage.lease.AddVmLeaseCommand] (default
> task-76) [vms_create_b658132a-53bd-4186] Running command: AddVmLeaseCommand
> internal: true. Entities affected :  ID:
> 0e0366a3-25d2-4652-b169-1672fddd769e Type: Storage
> 2019-02-04 09:38:29,916+02 INFO 
> [org.ovirt.engine.core.bll.storage.lease.GetVmLeaseInfoCommand] (default
> task-76) [vms_create_b658132a-53bd-4186] Running command:
> GetVmLeaseInfoCommand internal: true. Entities affected :  ID:
> 0e0366a3-25d2-4652-b169-1672fddd769e Type: Storage
> 2019-02-04 09:38:29,920+02 INFO 
> [org.ovirt.engine.core.vdsbroker.vdsbroker.GetVmLeaseInfoVDSCommand]
> (default task-76) [vms_create_b658132a-53bd-4186] START,
> GetVmLeaseInfoVDSCommand(
> VmLeaseVDSParameters:{storagePoolId='1edf774c-e65f-422c-8d6f-de2623d60545',
> ignoreFailoverLimit='false', leaseId='902224d0-77e4-4478-af66-8d2e0db5d46c',
> storageDomainId='0e0366a3-25d2-4652-b169-1672fddd769e'}), log id: 580483c
> 2019-02-04 09:38:29,942+02 ERROR
> [org.ovirt.engine.core.vdsbroker.vdsbroker.GetVmLeaseInfoVDSCommand]
> (default task-76) [vms_create_b658132a-53bd-4186] Failed in
> 'GetVmLeaseInfoVDS' method
> 2019-02-04 09:38:29,954+02 ERROR
> [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
> (default task-76) [vms_create_b658132a-53bd-4186] EVENT_ID:
> IRS_BROKER_COMMAND_FAILURE(10,803), VDSM command GetVmLeaseInfoVDS failed:
> No such lease 902224d0-77e4-4478-af66-8d2e0db5d46c
> 2019-02-04 09:38:29,954+02 ERROR
> [org.ovirt.engine.core.vdsbroker.irsbroker.IrsBrokerCommand] (default
> task-76) [vms_create_b658132a-53bd-4186]
> IrsBroker::Failed::GetVmLeaseInfoVDS: IRSGenericException:
> IRSErrorException: Failed to GetVmLeaseInfoVDS, error = No such lease
> 902224d0-77e4-4478-af66-8d2e0db5d46c, code = 100
> 2019-02-04 09:38:29,994+02 INFO 
> [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStopVDSCommand] (default
> task-76) [vms_create_b658132a-53bd-4186] START, SpmStopVDSCommand(HostName =
> host_mixed_3,
> SpmStopVDSCommandParameters:{hostId='33d753d5-41eb-4955-94e3-8a3608965245',
> storagePoolId='1edf774c-e65f-422c-8d6f-de2623d60545'}), log id: 1dbbd1b5
> 2019-02-04 09:38:30,001+02 ERROR
> [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand]
> (default task-76) [vms_create_b658132a-53bd-4186] Failed in
> 'HSMGetAllTasksStatusesVDS' method
> 2019-02-04 09:38:30,002+02 ERROR
> [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStopVDSCommand] (default
> task-76) [vms_create_b658132a-53bd-4186] SpmStopVDSCommand::Not stopping SPM
> on vds 'host_mixed_3', pool id '1edf774c-e65f-422c-8d6f-de2623d60545' as
> there are uncleared tasks 'Task 'f5a37282-725f-4557-b955-002221314089',
> status 'finished'
> Task 'd28fcc5d-19f4-41b5-8617-526aa43f9839', status 'finished'
> Task '0a4eaedd-a6f5-4242-ad27-e66cba7ba46d', status 'running'
> Task '9bb8e164-0acd-43cb-855b-c64362a2175a', status 'finished'
> Task '14675551-c22a-4f3c-8b6c-820fd3ab8c94', status 'finished'
> Task '3ca33109-5477-4747-9869-45826063a83f', status 'finished'
> Task '3482e4b9-af12-4701-a5dc-be9dd9305634', status 'running''
> 2019-02-04 09:38:30,002+02 INFO 
> [org.ovirt.engine.core.vdsbroker.vdsbroker.SpmStopVDSCommand] (default
> task-76) [vms_create_b658132a-53bd-4186] FINISH, SpmStopVDSCommand, return:
> , log id: 1dbbd1b5
> 2019-02-04 09:38:30,002+02 ERROR
> [org.ovirt.engine.core.vdsbroker.irsbroker.IrsProxy] (default task-76)
> [vms_create_b658132a-53bd-4186] IRS failover failed - can't allocate vds
> server
> 2019-02-04 09:38:30,002+02 INFO 
> [org.ovirt.engine.core.vdsbroker.vdsbroker.GetVmLeaseInfoVDSCommand]
> (default task-76) [vms_create_b658132a-53bd-4186] FINISH,
> GetVmLeaseInfoVDSCommand, return: , log id: 580483c
> 2019-02-04 09:38:30,002+02 ERROR
> [org.ovirt.engine.core.bll.storage.lease.GetVmLeaseInfoCommand] (default
> task-76) [vms_create_b658132a-53bd-4186] Failure in getting lease info for
> VM '902224d0-77e4-4478-af66-8d2e0db5d46c' from storage domains
> '0e0366a3-25d2-4652-b169-1672fddd769e', message: EngineException:
> org.ovirt.engine.core.vdsbroker.irsbroker.IRSErrorException:
> IRSGenericException: IRSErrorException: Failed to GetVmLeaseInfoVDS, error =
> No such lease 902224d0-77e4-4478-af66-8d2e0db5d46c, code = 100 (Failed with
> error GeneralException and code 100)

This error is expected due to the latest optimization in the VM leases flow.
Before creating a new lease, the engine verifies that the lease doesn't already exist.

So this isn't the reason for the 'Unknown' situation.

You can see later that the operation for creating the lease was initiated:
[org.ovirt.engine.core.vdsbroker.vdsbroker.AddVmLeaseVDSCommand] (default
task-76) [vms_create_b658132a-53bd-4186] START, AddVmLeaseVDSCommand(
VmLeaseVDSParameters:{storagePoolId='1edf774c-e65f-422c-8d6f-de2623d60545',
ignoreFailoverLimit='false', leaseId='902224d0-77e4-4478-af66-8d2e0db5d46c',
storageDomainId='0e0366a3-25d2-4652-b169-1672fddd769e'}), log id: 349f06c6
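
The check-then-create flow described above can be sketched as follows. This is a minimal illustration only; the names (`NoSuchLease`, `lease_info`, `create_lease`, `FakeDomain`) are hypothetical and do not reflect the real ovirt-engine or vdsm API. The point is that a "No such lease" reply to the pre-create query is the expected, non-fatal outcome.

```python
class NoSuchLease(Exception):
    """Raised when the queried lease does not exist on the storage domain."""


def create_lease_if_missing(domain, lease_id):
    """Return the lease info, creating the lease first if it is absent."""
    try:
        return domain.lease_info(lease_id)
    except NoSuchLease:
        # Expected on first creation: the engine queried before creating,
        # so this error is part of the normal flow, not a failure.
        domain.create_lease(lease_id)
        return domain.lease_info(lease_id)


class FakeDomain:
    """In-memory stand-in for a storage domain, for illustration only."""

    def __init__(self):
        self._leases = {}

    def lease_info(self, lease_id):
        if lease_id not in self._leases:
            raise NoSuchLease(lease_id)
        return self._leases[lease_id]

    def create_lease(self, lease_id):
        self._leases[lease_id] = {"lease_id": lease_id}
```
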

There are earlier errors in the logs that indicate failures on the same host:
2019-02-04 09:38:01,577+02 ERROR [org.ovirt.engine.core.vdsbroker.vdsbroker.HSMGetAllTasksStatusesVDSCommand] (EE-ManagedThreadFactory-engineScheduled-Thread-5) [] Failed in 'HSMGetAllTasksStatusesVDS' method
2019-02-04 09:38:01,587+02 ERROR [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (EE-ManagedThreadFactory-engineScheduled-Thread-5) [] EVENT_ID: VDS_BROKER_COMMAND_FAILURE(10,802), VDSM host_mixed_3 command HSMGetAllTasksStatusesVDS failed: low level Image copy failed: (u'Destination volume b49d32fb-d3c5-4281-a834-e02a31a47f16 error: Failed reload: 57c361bb-af8b-4ab4-a7f2-dd4de9dada07',)
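
The SpmStop refusal quoted earlier follows a guard of roughly this shape: the engine will not stop the SPM while the host still reports uncleared tasks, which is why a task stuck in state 'finished' blocks SPM failover. This is a hypothetical sketch, not the real ovirt-engine logic; the function name and status strings are illustrative.

```python
def can_stop_spm(task_statuses):
    """Given a mapping of task id -> status, return (allowed, blocking).

    Any task still reported by the host (whether 'running' or 'finished'
    but not yet cleared) blocks stopping the SPM.
    """
    blocking = {tid: status for tid, status in task_statuses.items()
                if status in ("running", "finished")}
    return (not blocking, blocking)
```

With the task list from the log above, the guard would refuse to stop the SPM until every 'finished' task is explicitly cleared.
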
 

> 
> 
> *** after that, all storage domain switched to 'Unknown':
> 
> 
> 2019-02-04 09:38:30,093+02 INFO 
> [org.ovirt.engine.core.vdsbroker.vdsbroker.AddVmLeaseVDSCommand] (default
> task-76) [vms_create_b658132a-53bd-4186] START, AddVmLeaseVDSCommand(
> VmLeaseVDSParameters:{storagePoolId='1edf774c-e65f-422c-8d6f-de2623d60545',
> ignoreFailoverLimit='false', leaseId='902224d0-77e4-4478-af66-8d2e0db5d46c',
> storageDomainId='0e0366a3-25d2-4652-b169-1672fddd769e'}), log id: 349f06c6
> 2019-02-04 09:38:30,114+02 INFO 
> [org.ovirt.engine.core.bll.storage.pool.SetStoragePoolStatusCommand]
> (default task-76) [13279c2b] Running command: SetStoragePoolStatusCommand
> internal: true. Entities affected :  ID:
> 1edf774c-e65f-422c-8d6f-de2623d60545 Type: StoragePool
> 2019-02-04 09:38:30,116+02 INFO 
> [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default
> task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' -
> Updating Storage Domain '114c6492-0331-4883-9633-0a5beb567fda' status from
> 'Active' to 'Unknown', reason: null
> 2019-02-04 09:38:30,116+02 INFO 
> [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default
> task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' -
> Updating Storage Domain '2ed446e7-30a2-4f11-993c-c03ab82867d6' status from
> 'Active' to 'Unknown', reason: null
> 2019-02-04 09:38:30,117+02 INFO 
> [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default
> task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' -
> Updating Storage Domain 'ff8fd019-1a64-45d3-846d-1d817d2f9832' status from
> 'Active' to 'Unknown', reason: null
> 2019-02-04 09:38:30,117+02 INFO 
> [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default
> task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' -
> Updating Storage Domain '37f59475-0371-47e9-9aef-997e6429d9ea' status from
> 'Active' to 'Unknown', reason: null
> 2019-02-04 09:38:30,117+02 INFO 
> [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default
> task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' -
> Updating Storage Domain 'e86ef26d-671d-48cb-874b-65ff33194cfe' status from
> 'Active' to 'Unknown', reason: null
> 2019-02-04 09:38:30,117+02 INFO 
> [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default
> task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' -
> Updating Storage Domain 'c4580ad6-8997-4968-adba-cde96cdd2c52' status from
> 'Active' to 'Unknown', reason: null
> 2019-02-04 09:38:30,117+02 INFO 
> [org.ovirt.engine.core.vdsbroker.storage.StoragePoolDomainHelper] (default
> task-76) [13279c2b] Storage Pool '1edf774c-e65f-422c-8d6f-de2623d60545' -

Comment 11 Tal Nisan 2019-03-11 15:19:22 UTC
A corner case, definitely not a blocker.
Also Yosi, please answer the needinfo

Comment 12 Tal Nisan 2019-03-18 09:35:51 UTC
Needinfo hanging for more than a month, closing as insufficient data

Comment 13 Yosi Ben Shimon 2019-04-02 13:23:01 UTC
(In reply to Tal Nisan from comment #12)
> Needinfo hanging for more than a month, closing as insufficient data

There is no needinfo pending from my side. I only asked Eyal for an explanation regarding the logs I attached.
If this is not a bug, then please close it as NOTABUG.
If it's fixed, please close it as CURRENT_RELEASE.
