Bug 1203891
| Summary: | RHEV disconnects when remotely connecting to Windows machines. | | |
|---|---|---|---|
| Product: | Red Hat Enterprise Virtualization Manager | Reporter: | Robert McSwain <rmcswain> |
| Component: | vdsm | Assignee: | Vinzenz Feenstra [evilissimo] <vfeenstr> |
| Status: | CLOSED ERRATA | QA Contact: | sefi litmanovich <slitmano> |
| Severity: | high | Docs Contact: | |
| Priority: | high | ||
| Version: | 3.5.0 | CC: | bazulay, gklein, lpeer, lsurette, lveyde, mavital, mgoldboi, michal.skrivanek, pstehlik, rbalakri, Rhev-m-bugs, rmcswain, vfeenstr, ycui, yeylon, ykaul |
| Target Milestone: | ovirt-3.6.0-rc | ||
| Target Release: | 3.6.0 | ||
| Hardware: | x86_64 | ||
| OS: | Windows | ||
| Whiteboard: | |||
| Fixed In Version: | v4.17.2 | Doc Type: | Bug Fix |
| Doc Text: | On the public internet there are frequent random attempts to connect to well-known ports on servers, such as VNC ports, in order to gain access to the machine behind them. These remote connection attempts triggered disconnect events that locked the console screen, even when the connection had never established a valid VNC or SPICE session. Now only the client IP and port of the current known session can disconnect the console. | Story Points: | --- |
| Clone Of: | Environment: | ||
| Last Closed: | 2016-03-09 19:33:26 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | Category: | --- | |
| oVirt Team: | Virt | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | |||
| Bug Depends On: | 1236581, 1236585 | ||
| Bug Blocks: | 1262958 | ||
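The fix summarized in the Doc Text can be sketched roughly as follows. This is a hypothetical illustration, not the actual vdsm change: the names `ConsoleSession`, `on_connect`, and `should_lock_on_disconnect` are invented for the sketch; the real patch lives in vdsm's VM/graphics-event handling.

```python
# Hypothetical sketch of the fix described in the Doc Text (not the actual
# vdsm code): a disconnect event may lock the desktop only when it comes
# from the client IP and port of the currently established console session.

class ConsoleSession:
    """Tracks the remote endpoint of the one known console session."""

    def __init__(self):
        self._client = None  # (ip, port) of the established session, if any

    def on_connect(self, remote_ip, remote_port):
        # Record the endpoint once a session is fully established
        # (a libvirt graphics CONNECTED event, in vdsm terms).
        self._client = (remote_ip, remote_port)

    def should_lock_on_disconnect(self, remote_ip, remote_port):
        # Ignore disconnects from endpoints that never owned the session,
        # e.g. internet scanners that probed the port without authenticating.
        return self._client == (remote_ip, remote_port)


session = ConsoleSession()
session.on_connect("10.0.0.5", 51234)
assert not session.should_lock_on_disconnect("211.219.77.166", 26605)  # probe
assert session.should_lock_on_disconnect("10.0.0.5", 51234)            # real client
```

A disconnect from any other endpoint, such as a scanner that connected to the port but never authenticated, no longer triggers the lock.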
Description
Robert McSwain
2015-03-19 21:20:31 UTC
I am not sure what the use case is here; however, if you are using SPICE and RDP side by side at the same time, then this is the expected behaviour. Also, if you use multiple RDP sessions on the same machine, it usually locks the previously logged-on session and logs in the new one. The guest agent won't randomly lock the desktop; there must have been a reason for it, triggered from the host side. For example, someone may have used SPICE and disconnected.

Robert: Please try to get more detailed information about what exactly they are doing. Right now this issue has insufficient data to understand what is going on. Thank you.

Vinzenz, I've asked the customer and will let you know as soon as I have more information for us.

I think you'd have to rule out the possibility of someone else connecting via SPICE *after* you connected over RDP. So try to reproduce on a VM which no one else has access to.

Michal, here's what the customer had to say:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Yes I am aware of that, and it looks to be the same behavior, however there is no one pulling up the console when that happens. The RHEVH hosts are on public ip's is it possible that maby someone is trying to make a malicious connection and even tho they dont authenticate it is locking the session?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

I'll be pressing them to investigate the logins to their Windows systems. If they've been compromised, that's definitely outside of our need to investigate.

More updates from the customer after asking some more questions about this:
1. There is no one logging into the VM. The VM is not public; however, the host is. The lock is coming from VDSM. He did some testing and checked the vdsm log when it happened: VDSM is calling desktopLock, while no one was in the RHEV-M interface and no changes were being made:
Thread-21::DEBUG::2015-06-02 14:35:08,358::fileSD::261::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '0+1 records in\n0+1 records out\n337 bytes (337 B) copied, 0.000307218 s, 1.1 MB/s\n'; <rc> = 0
Thread-19::DEBUG::2015-06-02 14:35:08,378::fileSD::261::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '0+1 records in\n0+1 records out\n345 bytes (345 B) copied, 0.000396301 s, 871 kB/s\n'; <rc> = 0
libvirtEventLoop::DEBUG::2015-06-02 14:35:08,383::clientIF::555::vm.Vm::(dispatchLibvirtEvents) vmId=`c199573a-db43-42ba-a0bc-03f5cb7967d5`::graphics event phase 2 localAddr {'node': '107.150.183.127', 'service': '5900', 'family': 0} remoteAddr {'node': '211.219.77.166', 'service': '26605', 'family': 0}authScheme vnc subject []
Thread-18::DEBUG::2015-06-02 14:35:08,404::fileSD::261::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '0+1 records in\n0+1 records out\n560 bytes (560 B) copied, 0.0016427 s, 341 kB/s\n'; <rc> = 0
Thread-173393::INFO::2015-06-02 14:35:08,545::logUtils::47::dispatcher::(wrapper) Run and protect: getVolumeSize, Return response: {'truesize': '217345686016', 'apparentsize': '250059350016'}
Thread-173393::DEBUG::2015-06-02 14:35:08,561::task::1191::Storage.TaskManager.Task::(prepare) Task=`0bec65df-c5cb-474c-9093-0b1f07ec5368`::finished: {'truesize': '217345686016', 'apparentsize': '250059350016'}
Thread-173393::DEBUG::2015-06-02 14:35:08,568::task::595::Storage.TaskManager.Task::(_updateState) Task=`0bec65df-c5cb-474c-9093-0b1f07ec5368`::moving from state preparing -> state finished
Thread-173393::DEBUG::2015-06-02 14:35:08,576::resourceManager::940::Storage.ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-173393::DEBUG::2015-06-02 14:35:08,583::resourceManager::977::Storage.ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-173393::DEBUG::2015-06-02 14:35:08,592::task::993::Storage.TaskManager.Task::(_decref) Task=`0bec65df-c5cb-474c-9093-0b1f07ec5368`::ref 0 aborting False
Thread-173393::DEBUG::2015-06-02 14:35:08,713::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-284425::DEBUG::2015-06-02 14:35:10,340::task::595::Storage.TaskManager.Task::(_updateState) Task=`81d92c83-9042-4ddd-acb7-565d7a386a11`::moving from state init -> state preparing
Thread-284425::INFO::2015-06-02 14:35:10,341::logUtils::44::dispatcher::(wrapper) Run and protect: getVolumeSize(sdUUID='a3ebdb74-5120-4406-92a5-da34ec2d3089', spUUID='3664c1fc-469a-11e2-a679-077225634133', imgUUID='e82124ac-b90f-4765-aa6c-134630e846d1', volUUID='c4370e6e-088c-417a-a3ad-3091553dba48', options=None)
Thread-284425::INFO::2015-06-02 14:35:10,369::logUtils::47::dispatcher::(wrapper) Run and protect: getVolumeSize, Return response: {'truesize': '10024629248', 'apparentsize': '107374182400'}
Thread-284425::DEBUG::2015-06-02 14:35:10,369::task::1191::Storage.TaskManager.Task::(prepare) Task=`81d92c83-9042-4ddd-acb7-565d7a386a11`::finished: {'truesize': '10024629248', 'apparentsize': '107374182400'}
Thread-284425::DEBUG::2015-06-02 14:35:10,370::task::595::Storage.TaskManager.Task::(_updateState) Task=`81d92c83-9042-4ddd-acb7-565d7a386a11`::moving from state preparing -> state finished
Thread-284425::DEBUG::2015-06-02 14:35:10,370::resourceManager::940::Storage.ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-284425::DEBUG::2015-06-02 14:35:10,370::resourceManager::977::Storage.ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-284425::DEBUG::2015-06-02 14:35:10,370::task::993::Storage.TaskManager.Task::(_decref) Task=`81d92c83-9042-4ddd-acb7-565d7a386a11`::ref 0 aborting False
Thread-284425::DEBUG::2015-06-02 14:35:10,477::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-388631::DEBUG::2015-06-02 14:35:10,492::guestagent::356::vm.Vm::(desktopLock) vmId=`c199573a-db43-42ba-a0bc-03f5cb7967d5`::desktopLock called
Thread-382562::DEBUG::2015-06-02 14:35:13,353::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-12935::DEBUG::2015-06-02 14:35:14,191::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-346966::DEBUG::2015-06-02 14:35:14,408::task::595::Storage.TaskManager.Task::(_updateState) Task=`64040735-db79-4e1d-b598-ba29a7fc7318`::moving from state init -> state preparing
Thread-15608::DEBUG::2015-06-02 14:35:14,405::task::595::Storage.TaskManager.Task::(_updateState) Task=`57aa5c8d-3bf0-4ca8-8efe-f472a340b828`::moving from state init -> state preparing
Thread-346966::INFO::2015-06-02 14:35:14,455::logUtils::44::dispatcher::(wrapper) Run and protect: getVolumeSize(sdUUID='a3ebdb74-5120-4406-92a5-da34ec2d3089', spUUID='3664c1fc-469a-11e2-a679-077225634133', imgUUID='104a9993-6d26-44a5-9bd7-830dad17e6a4', volUUID='b571df60-036a-4cbd-9a92-f0cc13ece43f', options=None)
Thread-15608::INFO::2015-06-02 14:35:14,464::logUtils::44::dispatcher::(wrapper) Run and protect: getVolumeSize(sdUUID='a3ebdb74-5120-4406-92a5-da34ec2d3089', spUUID='3664c1fc-469a-11e2-a679-077225634133', imgUUID='e3035913-7c66-4da4-aded-a36971e7459e', volUUID='28fde3da-6caa-41bf-a914-7efe0965dda1', options=None)
Thread-337638::DEBUG::2015-06-02 14:35:14,491::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-159174::DEBUG::2015-06-02 14:35:14,503::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-346966::INFO::2015-06-02 14:35:14,656::logUtils::47::dispatcher::(wrapper) Run and protect: getVolumeSize, Return response: {'truesize': '24137152512', 'apparentsize': '107374182400'}
Thread-346966::DEBUG::2015-06-02 14:35:14,660::task::1191::Storage.TaskManager.Task::(prepare) Task=`64040735-db79-4e1d-b598-ba29a7fc7318`::finished: {'truesize': '24137152512', 'apparentsize': '107374182400'}
Thread-346966::DEBUG::2015-06-02 14:35:14,664::task::595::Storage.TaskManager.Task::(_updateState) Task=`64040735-db79-4e1d-b598-ba29a7fc7318`::moving from state preparing -> state finished
Thread-346966::DEBUG::2015-06-02 14:35:14,670::resourceManager::940::Storage.ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-346966::DEBUG::2015-06-02 14:35:14,678::resourceManager::977::Storage.ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-15608::INFO::2015-06-02 14:35:14,678::logUtils::47::dispatcher::(wrapper) Run and protect: getVolumeSize, Return response: {'truesize': '12123011584', 'apparentsize': '536870912000'}
Thread-346966::DEBUG::2015-06-02 14:35:14,685::task::993::Storage.TaskManager.Task::(_decref) Task=`64040735-db79-4e1d-b598-ba29a7fc7318`::ref 0 aborting False
Thread-15608::DEBUG::2015-06-02 14:35:14,687::task::1191::Storage.TaskManager.Task::(prepare) Task=`57aa5c8d-3bf0-4ca8-8efe-f472a340b828`::finished: {'truesize': '12123011584', 'apparentsize': '536870912000'}
Thread-15608::DEBUG::2015-06-02 14:35:14,691::task::595::Storage.TaskManager.Task::(_updateState) Task=`57aa5c8d-3bf0-4ca8-8efe-f472a340b828`::moving from state preparing -> state finished
Thread-15608::DEBUG::2015-06-02 14:35:14,694::resourceManager::940::Storage.ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-15608::DEBUG::2015-06-02 14:35:14,696::resourceManager::977::Storage.ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-15608::DEBUG::2015-06-02 14:35:14,700::task::993::Storage.TaskManager.Task::(_decref) Task=`57aa5c8d-3bf0-4ca8-8efe-f472a340b828`::ref 0 aborting False
Thread-347193::DEBUG::2015-06-02 14:35:14,716::task::595::Storage.TaskManager.Task::(_updateState) Task=`49afd9c4-6529-47a6-94b9-0a765068bb42`::moving from state init -> state preparing
Thread-347193::INFO::2015-06-02 14:35:14,720::logUtils::44::dispatcher::(wrapper) Run and protect: repoStats(options=None)
Thread-347193::INFO::2015-06-02 14:35:14,726::logUtils::47::dispatcher::(wrapper) Run and protect: repoStats, Return response: {'a3ebdb74-5120-4406-92a5-da34ec2d3089': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.0016427', 'lastCheck': '6.1', 'valid': True}, 'a3f64f81-6c83-40c8-a229-91b63b750b39': {'code': 0, 'version': 0, 'acquired': True, 'delay': '0.000396301', 'lastCheck': '6.2', 'valid': True}, '3d330965-7312-4600-8635-9280b11439a9': {'code': 0, 'version': 0, 'acquired': True, 'delay': '0.000307218', 'lastCheck': '6.2', 'valid': True}}
Thread-347193::DEBUG::2015-06-02 14:35:14,729::task::1191::Storage.TaskManager.Task::(prepare) Task=`49afd9c4-6529-47a6-94b9-0a765068bb42`::finished: {'a3ebdb74-5120-4406-92a5-da34ec2d3089': {'code': 0, 'version': 3, 'acquired': True, 'delay': '0.0016427', 'lastCheck': '6.1', 'valid': True}, 'a3f64f81-6c83-40c8-a229-91b63b750b39': {'code': 0, 'version': 0, 'acquired': True, 'delay': '0.000396301', 'lastCheck': '6.2', 'valid': True}, '3d330965-7312-4600-8635-9280b11439a9': {'code': 0, 'version': 0, 'acquired': True, 'delay': '0.000307218', 'lastCheck': '6.2', 'valid': True}}
Thread-347193::DEBUG::2015-06-02 14:35:14,731::task::595::Storage.TaskManager.Task::(_updateState) Task=`49afd9c4-6529-47a6-94b9-0a765068bb42`::moving from state preparing -> state finished
Thread-347193::DEBUG::2015-06-02 14:35:14,732::resourceManager::940::Storage.ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-347193::DEBUG::2015-06-02 14:35:14,732::resourceManager::977::Storage.ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-347193::DEBUG::2015-06-02 14:35:14,733::task::993::Storage.TaskManager.Task::(_decref) Task=`49afd9c4-6529-47a6-94b9-0a765068bb42`::ref 0 aborting False
Thread-346966::DEBUG::2015-06-02 14:35:14,803::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-15608::DEBUG::2015-06-02 14:35:14,815::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-75508::DEBUG::2015-06-02 14:35:16,410::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-354388::DEBUG::2015-06-02 14:35:16,411::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-176899::DEBUG::2015-06-02 14:35:16,420::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-379601::DEBUG::2015-06-02 14:35:17,391::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-387300::DEBUG::2015-06-02 14:35:17,395::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-386587::DEBUG::2015-06-02 14:35:17,408::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-21::DEBUG::2015-06-02 14:35:20,196::fileSD::261::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '0+1 records in\n0+1 records out\n337 bytes (337 B) copied, 0.000334745 s, 1.0 MB/s\n'; <rc> = 0
Thread-388632::DEBUG::2015-06-02 14:35:23,536::guestagent::356::vm.Vm::(desktopLock) vmId=`c199573a-db43-42ba-a0bc-03f5cb7967d5`::desktopLock called
Thread-19::DEBUG::2015-06-02 14:35:23,541::fileSD::261::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '0+1 records in\n0+1 records out\n345 bytes (345 B) copied, 0.000252761 s, 1.4 MB/s\n'; <rc> = 0
Thread-284425::DEBUG::2015-06-02 14:35:27,469::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-382562::DEBUG::2015-06-02 14:35:30,428::task::595::Storage.TaskManager.Task::(_updateState) Task=`0c7f7ade-9bf4-4748-8b2f-725d6a0ac6ca`::moving from state init -> state preparing
Thread-382562::DEBUG::2015-06-02 14:35:30,453::resourceManager::977::Storage.ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-382562::DEBUG::2015-06-02 14:35:30,453::task::993::Storage.TaskManager.Task::(_decref) Task=`0c7f7ade-9bf4-4748-8b2f-725d6a0ac6ca`::ref 0 aborting False
Thread-12935::INFO::2015-06-02 14:35:32,633::logUtils::47::dispatcher::(wrapper) Run and protect: getVolumeSize, Return response: {'truesize': '10091293184', 'apparentsize': '107374182400'}
Thread-12935::DEBUG::2015-06-02 14:35:32,636::task::1191::Storage.TaskManager.Task::(prepare) Task=`184ab8dd-ab1c-4b4a-a89e-6accbf80bad4`::finished: {'truesize': '10091293184', 'apparentsize': '107374182400'}
Thread-12935::DEBUG::2015-06-02 14:35:32,638::task::595::Storage.TaskManager.Task::(_updateState) Task=`184ab8dd-ab1c-4b4a-a89e-6accbf80bad4`::moving from state preparing -> state finished
Thread-12935::DEBUG::2015-06-02 14:35:32,639::resourceManager::940::Storage.ResourceManager.Owner::(releaseAll) Owner.releaseAll requests {} resources {}
Thread-12935::DEBUG::2015-06-02 14:35:32,642::resourceManager::977::Storage.ResourceManager.Owner::(cancelAll) Owner.cancelAll requests {}
Thread-12935::DEBUG::2015-06-02 14:35:32,645::task::993::Storage.TaskManager.Task::(_decref) Task=`184ab8dd-ab1c-4b4a-a89e-6accbf80bad4`::ref 0 aborting False
libvirtEventLoop::DEBUG::2015-06-02 14:35:32,735::clientIF::555::vm.Vm::(dispatchLibvirtEvents) vmId=`c199573a-db43-42ba-a0bc-03f5cb7967d5`::graphics event phase 0 localAddr {'node': '107.150.183.127', 'service': '5900', 'family': 0} remoteAddr {'node': '175.126.104.121', 'service': '34179', 'family': 0}authScheme vnc subject []
Thread-12935::DEBUG::2015-06-02 14:35:32,794::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
libvirtEventLoop::DEBUG::2015-06-02 14:35:33,282::clientIF::555::vm.Vm::(dispatchLibvirtEvents) vmId=`c199573a-db43-42ba-a0bc-03f5cb7967d5`::graphics event phase 2 localAddr {'node': '107.150.183.127', 'service': '5900', 'family': 0} remoteAddr {'node': '175.126.104.121', 'service': '34179', 'family': 0}authScheme vnc subject []
Thread-176899::DEBUG::2015-06-02 14:35:34,373::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-75508::DEBUG::2015-06-02 14:35:34,378::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-15608::DEBUG::2015-06-02 14:35:34,378::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-346966::DEBUG::2015-06-02 14:35:34,380::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-354388::DEBUG::2015-06-02 14:35:34,381::libvirtconnection::143::root::(wrapper) Unknown libvirterror: ecode: 80 edom: 20 level: 2 message: metadata not found: Requested metadata element is not present
Thread-21::DEBUG::2015-06-02 14:35:34,472::fileSD::261::Storage.Misc.excCmd::(getReadDelay) /bin/dd if=/rhev/data-center/mnt/10.10.10.10:_tank_isos/3d330965-7312-4600-8635-9280b11439a9/dom_md/metadata iflag=direct of=/dev/null bs=4096 count=1 (cwd None)
Thread-388633::DEBUG::2015-06-02 14:35:37,523::guestagent::356::vm.Vm::(desktopLock) vmId=`c199573a-db43-42ba-a0bc-03f5cb7967d5`::desktopLock called
Thread-379601::DEBUG::2015-06-02 14:35:37,603::task::595::Storage.TaskManager.Task::(_updateState) Task=`561a17b9-242c-4708-9db4-fcaffc21ffee`::moving from state init -> state preparing
Thread-387300::DEBUG::2015-06-02 14:35:37,603::task::595::Storage.TaskManager.Task::(_updateState) Task=`bc94fb3b-deb0-416e-a814-5d8d3eeca392`::moving from state init -> state preparing
Thread-21::DEBUG::2015-06-02 14:35:37,613::fileSD::261::Storage.Misc.excCmd::(getReadDelay) SUCCESS: <err> = '0+1 records in\n0+1 records out\n337 bytes (337 B) copied, 0.00466438 s, 72.2 kB/s\n'; <rc> = 0
Thread-379601::INFO::2015-06-02 14:35:37,665::logUtils::44::dispatcher::(wrapper) Run and protect: getVolumeSize(sdUUID='a3ebdb74-5120-4406-92a5-da34ec2d3089', spUUID='3664c1fc-469a-11e2-a679-077225634133', imgUUID='db1eb225-36a9-482b-ae5e-1a96c9677715', volUUID='d96bf344-e854-4752-891f-9188422e9f34', options=None)
Thread-387300::INFO::2015-06-02 14:35:37,676::logUtils::44::dispatcher::(wrapper) Run and protect: getVolumeSize(sdUUID='a3ebdb74-5120-4406-92a5-da34ec2d3089', spUUID='3664c1fc-469a-11e2-a679-077225634133', imgUUID='132de174-f84f-45c7-9b93-d110098ef850', volUUID='afaf4228-6344-4205-aa79-285eacd5aa35', options=None)
2. He checked the security log in Windows and there are no entries created when the machine becomes locked.
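The log excerpt above can be triaged mechanically: each `desktopLock` call follows a libvirt graphics event whose `remoteAddr` names the probing client. The following sketch is a hypothetical diagnostic helper, not a vdsm tool; its regex is derived from the log lines quoted above.

```python
import re

# Scan vdsm.log lines for desktopLock calls and report the remote endpoint
# of the graphics event most recently seen before each lock. The pattern
# below matches the 'remoteAddr {...}' dict format shown in the excerpt.
GRAPHICS_RE = re.compile(
    r"graphics event phase \d+ .*remoteAddr \{'node': '([\d.]+)', 'service': '(\d+)'"
)

def lock_suspects(lines):
    """Yield the (ip, port) from the last graphics event before each lock."""
    last_remote = None
    for line in lines:
        m = GRAPHICS_RE.search(line)
        if m:
            last_remote = m.groups()
        elif "(desktopLock)" in line:
            yield last_remote

sample = [
    "libvirtEventLoop::DEBUG::...::vm.Vm::(dispatchLibvirtEvents) "
    "graphics event phase 2 localAddr {'node': '107.150.183.127', 'service': "
    "'5900', 'family': 0} remoteAddr {'node': '211.219.77.166', 'service': "
    "'26605', 'family': 0}",
    "Thread-388631::DEBUG::...::vm.Vm::(desktopLock) desktopLock called",
]
suspects = list(lock_suspects(sample))
print(suspects)  # the scanner's address immediately precedes the lock
```

Run against the full log, this would show each desktopLock paired with an unfamiliar public address, matching the customer's suspicion.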
Verified with rhevm-3.6.0.1-0.1.el6.noarch.
host: RHEL 7.2
vdsm-4.17.9-1.el7ev.noarch
libvirt-daemon-driver-nwfilter-1.2.17-12.el7.x86_64
libvirt-daemon-driver-storage-1.2.17-12.el7.x86_64
libvirt-daemon-config-nwfilter-1.2.17-12.el7.x86_64
libvirt-client-1.2.17-12.el7.x86_64
libvirt-daemon-driver-secret-1.2.17-12.el7.x86_64
libvirt-lock-sanlock-1.2.17-12.el7.x86_64
libvirt-daemon-1.2.17-12.el7.x86_64
libvirt-daemon-driver-nodedev-1.2.17-12.el7.x86_64
libvirt-daemon-kvm-1.2.17-12.el7.x86_64
libvirt-daemon-driver-interface-1.2.17-12.el7.x86_64
libvirt-daemon-driver-network-1.2.17-12.el7.x86_64
libvirt-python-1.2.17-2.el7.x86_64
libvirt-daemon-driver-qemu-1.2.17-12.el7.x86_64
Steps:
1. Created a Windows 7 guest with the latest guest tools installed.
2. While a SPICE console was open for the VM, connected to another host via SSH and initiated a telnet connection to the host's secured SPICE port:
>>telnet {host_ip} {SPICE_tls-port}
3. The guest desktop was not locked, and the console session continued as expected with this patch.
4. Closed the console.
5. Started a new console session -> the desktop is locked, as expected for a new valid session.
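The telnet probe in step 2 can also be scripted: opening a raw TCP connection to the console port and dropping it without completing any handshake imitates the random internet probes from the original report. This is a hedged sketch; `HOST` and `PORT` below are placeholders, not values from the bug.

```python
import socket

# Placeholder host IP and console port (VNC default 5900, or the SPICE
# tls-port from the verification steps above). Replace before running.
HOST = "192.0.2.10"
PORT = 5900

def probe(host, port, timeout=5.0):
    """Connect to the console port, then disconnect without authenticating."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.settimeout(timeout)
        try:
            banner = s.recv(64)  # a VNC server greets with e.g. b'RFB 003.008\n'
        except socket.timeout:
            banner = b""
    # The socket closes here. With the fix, an open console session on the
    # guest must survive this unauthenticated connect/disconnect.
    return banner

# probe(HOST, PORT)  # run against a test host; the console should stay unlocked
```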
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://rhn.redhat.com/errata/RHBA-2016-0362.html