Note: This bug is displayed in read-only format because the product is no longer active in Red Hat Bugzilla.

Bug 1198493

Summary: [AAA] reduce logout fail log/event message "null@N/A"
Product: Red Hat Enterprise Virtualization Manager
Reporter: Yuri Obshansky <yobshans>
Component: ovirt-engine
Assignee: Ravi Nori <rnori>
Status: CLOSED CURRENTRELEASE
QA Contact: Ondra Machacek <omachace>
Severity: low
Docs Contact:
Priority: low
Version: 3.5.0
CC: bcholler, ecohen, iheim, lpeer, lsurette, manfred.landauer, mburman, mtessun, oourfali, pstehlik, rbalakri, Rhev-m-bugs, rnori, yeylon, yobshans
Target Milestone: ovirt-3.6.2
Target Release: 3.6.0
Hardware: x86_64
OS: Linux
Whiteboard: infra
Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2016-01-13 15:47:13 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: Infra
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:
Bug Depends On: 1224285
Bug Blocks:
Attachments:
Screenshot (flags: none)
engine.log file (flags: none)

Description Yuri Obshansky 2015-03-04 09:29:06 UTC
Description of problem:
Error "Failed to log User null@N/A out." appears in the Admin Portal UI Events tab
when I log in to RHEV-M as the admin user.
Errors in the engine log:
2015-03-04 10:02:54,910 INFO
[org.ovirt.engine.core.bll.aaa.LogoutUserCommand]
(ajp-/127.0.0.1:8702-1) [42dcae22] Running command: LogoutUserCommand
internal: false.
2015-03-04 10:02:54,915 ERROR
[org.ovirt.engine.core.bll.aaa.LogoutUserCommand]
(ajp-/127.0.0.1:8702-1) [42dcae22] Transaction rolled-back for command:
org.ovirt.engine.core.bll.aaa.LogoutUserCommand.
2015-03-04 10:02:54,929 ERROR
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(ajp-/127.0.0.1:8702-1) [42dcae22] Correlation ID: 42dcae22, Call Stack:
null, Custom Event ID: -1, Message: Failed to log User null@N/A out.
2015-03-04 10:02:54,981 INFO
[org.ovirt.engine.ui.frontend.server.gwt.plugin.PluginDataManager]
(ajp-/127.0.0.1:8702-2) Reading UI plugin descriptor
[/usr/share/ovirt-engine/ui-plugins/redhat_support_plugin_rhev.json]
2015-03-04 10:02:54,985 INFO
[org.ovirt.engine.ui.frontend.server.gwt.plugin.PluginDataManager]
(ajp-/127.0.0.1:8702-2) Reading UI plugin configuration
[/etc/ovirt-engine/ui-plugins/redhat_support_plugin_rhev-config.json]
2015-03-04 10:02:57,799 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetHardwareInfoVDSCommand]
(DefaultQuartzScheduler_Worker-6) START,
GetHardwareInfoVDSCommand(HostName = HOST-2, HostId =
e8a6ad5a-0f67-4041-b701-dcb1628aaae1,
vds=Host[HOST-2,e8a6ad5a-0f67-4041-b701-dcb1628aaae1]), log id: 460ca11c
2015-03-04 10:02:57,881 INFO
[org.ovirt.engine.core.vdsbroker.vdsbroker.GetHardwareInfoVDSCommand]
(DefaultQuartzScheduler_Worker-6) FINISH, GetHardwareInfoVDSCommand, log
id: 460ca11c
2015-03-04 10:03:01,010 INFO
[org.ovirt.engine.core.bll.aaa.LoginAdminUserCommand]
(ajp-/127.0.0.1:8702-5) Running command: LoginAdminUserCommand internal:
false.
2015-03-04 10:03:01,020 INFO
[org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector]
(ajp-/127.0.0.1:8702-5) Correlation ID: null, Call Stack: null, Custom
Event ID: -1, Message: User admin@internal logged in.
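The literal "null@N/A" in the audit message suggests the engine could not resolve the session's user at logout time, so a null user name and a placeholder domain ended up in the formatted message. A minimal, hypothetical Java sketch of that failure mode (formatUser and the "N/A" fallback are assumptions for illustration, not the actual oVirt code):

```java
// Hypothetical illustration only (not oVirt engine source): how a missing
// session user can yield the literal "null@N/A" in an audit-log message.
public class LogoutMessageDemo {

    // Builds the "user@domain" token for the audit message. When the session
    // has expired, the user name reference is null; Java string concatenation
    // renders a null reference as the text "null", and an unknown domain is
    // assumed here to fall back to the placeholder "N/A".
    static String formatUser(String userName, String domain) {
        return userName + "@" + (domain == null ? "N/A" : domain);
    }

    public static void main(String[] args) {
        // Expired/absent session: no user name, no domain resolved.
        System.out.println("Failed to log User " + formatUser(null, null) + " out.");
        // Normal case, matching the later log line for admin@internal.
        System.out.println("User " + formatUser("admin", "internal") + " logged in.");
    }
}
```

Running this prints `Failed to log User null@N/A out.`, matching the AuditLogDirector message in the excerpt above, which is consistent with a logout attempted against an already-expired session.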

Version-Release number of selected component (if applicable):
RHEV-M 3.5.1-0.1
(build vt14)

How reproducible:
Log in to the RHEV-M Admin Portal with the admin user

Steps to Reproduce:
1.
2.
3.

Actual results:
Errors appear in the Admin Portal Events tab and in the engine log

Expected results:
No errors

Additional info:

Comment 1 Yuri Obshansky 2015-03-04 09:29:43 UTC
Created attachment 997794 [details]
Screenshot

Comment 2 Yuri Obshansky 2015-03-04 09:32:53 UTC
Created attachment 997796 [details]
engine.log file

Comment 3 Oved Ourfali 2015-03-08 09:10:56 UTC
*** Bug 1189016 has been marked as a duplicate of this bug. ***

Comment 6 Ravi Nori 2015-10-22 20:02:43 UTC
Unable to reproduce this on master, please retest

Comment 12 Pavel Stehlik 2016-01-13 15:47:13 UTC
Closing old bugs. In case it still happens, please reopen.