Bug 1483401
| Summary: | Frequent traceback on MetadataDiskDescriptionHandler | | |
|---|---|---|---|
| Product: | Red Hat Enterprise Virtualization Manager | Reporter: | Germano Veit Michel <gveitmic> |
| Component: | ovirt-engine | Assignee: | Idan Shaby <ishaby> |
| Status: | CLOSED ERRATA | QA Contact: | Kevin Alon Goldblatt <kgoldbla> |
| Severity: | medium | Docs Contact: | |
| Priority: | unspecified | | |
| Version: | 4.1.4 | CC: | gveitmic, lsurette, ratamir, rbalakri, Rhev-m-bugs, srevivo, tnisan, ykaul, ylavi |
| Target Milestone: | ovirt-4.2.0 | Keywords: | ZStream |
| Target Release: | --- | Flags: | lsvaty: testing_plan_complete- |
| Hardware: | x86_64 | | |
| OS: | Linux | | |
| Whiteboard: | | | |
| Fixed In Version: | | Doc Type: | No Doc Update |
| Doc Text: | | | |
| Story Points: | --- | | |
| Clone Of: | | | |
| : | 1486293 (view as bug list) | Environment: | |
| Last Closed: | 2018-05-15 17:43:37 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | Storage | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
| Bug Depends On: | | | |
| Bug Blocks: | 1486293 | | |
Commit e84631956b02ce355e9b0654a6451f6e8ad747be (RHV 4.0) introduced this. It seems we need to change the logging there to DEBUG, as not having a JSON description is quite common.

Verified with the following code:

- ovirt-engine-4.2.0-0.5.master.el7.noarch
- vdsm-4.20.8-53.gitc3edfc0.el7.centos.x86_64

Verified with the following scenario:

Steps to Reproduce: several ways; the easiest is to deploy Hosted-Engine and trigger the auto-import. Or do this on an existing env:

1. Create an NFS share
2. Create an Export Domain on the NFS share
3. Move the Export Domain to maintenance and detach it
4. Use virt-v2v to populate it: `virt-v2v -o rhev -os 10.64.24.33:/exports/data7 rhel7.3`
5. Attach the Export Domain

The warning tracebacks reported in the description are no longer written to the log; they have been replaced with "Could not parse the description". Moved to VERIFY.

Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2018:1488
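The change described in the first comment can be sketched as follows. This is an illustrative stand-in, not the actual oVirt patch: the real code lives in `MetadataDiskDescriptionHandler.enrichDiskByJsonDescription` and uses Jackson and the engine's logger, while the sketch below uses `java.util.logging` and a crude `isJson()` heuristic in place of the real parse. The key point is the one the comment makes: a non-JSON description is the common case, so it is logged at a debug-level severity with a short message instead of a WARN with a full stack trace.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class MetadataDescriptionSketch {
    private static final Logger log =
            Logger.getLogger("MetadataDiskDescriptionHandler");

    /**
     * Sketch of the fixed behavior: if the description is not JSON,
     * log at FINE (debug) and fall back to treating it as plain text,
     * instead of emitting a WARN traceback.
     */
    static String parseDescription(String description) {
        if (isJson(description)) {
            // The real code parses the JSON map and enriches the disk here.
            return "parsed";
        }
        // Matches the short message QA observed after the fix.
        log.log(Level.FINE, "Could not parse the description: {0}", description);
        return "plain-text";
    }

    // Crude stand-in for a real JSON parse: a JSON object description
    // starts with '{' and ends with '}'.
    static boolean isJson(String description) {
        String t = description == null ? "" : description.trim();
        return t.startsWith("{") && t.endsWith("}");
    }
}
```

With this shape, hosted-engine and virt-v2v descriptions take the quiet branch, and only genuinely JSON-formatted metadata goes through the parser.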
Description of problem:

Engine logs contain several of these warnings when attaching Data/Export Domains, when importing the Hosted-Engine Storage Domain, etc.:

```
2017-08-21 13:41:45,435+10 WARN [org.ovirt.engine.core.bll.storage.disk.image.GetUnregisteredDiskQuery] (org.ovirt.thread.pool-6-thread-48) [b56aacc1-c32e-49be-8c67-15897edb9783] Exception while parsing JSON for disk. Exception: '{}': org.codehaus.jackson.JsonParseException: Unexpected character ('g' (code 103)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
 at [Source: java.io.StringReader@3556c5d1; line: 1, column: 2]
	at org.codehaus.jackson.JsonParser._constructError(JsonParser.java:1433) [jackson-core-asl.jar:1.9.13.redhat-3]
	at org.codehaus.jackson.impl.JsonParserMinimalBase._reportError(JsonParserMinimalBase.java:521) [jackson-core-asl.jar:1.9.13.redhat-3]
	at org.codehaus.jackson.impl.JsonParserMinimalBase._reportUnexpectedChar(JsonParserMinimalBase.java:442) [jackson-core-asl.jar:1.9.13.redhat-3]
	at org.codehaus.jackson.impl.ReaderBasedParser._handleUnexpectedValue(ReaderBasedParser.java:1198) [jackson-core-asl.jar:1.9.13.redhat-3]
	at org.codehaus.jackson.impl.ReaderBasedParser.nextToken(ReaderBasedParser.java:485) [jackson-core-asl.jar:1.9.13.redhat-3]
	at org.codehaus.jackson.map.ObjectMapper._initForReading(ObjectMapper.java:2770) [jackson-mapper-asl.jar:1.9.13.redhat-3]
	at org.codehaus.jackson.map.ObjectMapper._readMapAndClose(ObjectMapper.java:2718) [jackson-mapper-asl.jar:1.9.13.redhat-3]
	at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:1877) [jackson-mapper-asl.jar:1.9.13.redhat-3]
	at org.ovirt.engine.core.utils.JsonHelper.jsonToMap(JsonHelper.java:41) [utils.jar:]
	at org.ovirt.engine.core.bll.storage.disk.image.MetadataDiskDescriptionHandler.enrichDiskByJsonDescription(MetadataDiskDescriptionHandler.java:248) [bll.jar:]
	at org.ovirt.engine.core.bll.storage.disk.image.GetUnregisteredDiskQuery.executeQueryCommand(GetUnregisteredDiskQuery.java:105) [bll.jar:]
	at org.ovirt.engine.core.bll.QueriesCommandBase.executeCommand(QueriesCommandBase.java:110) [bll.jar:]
	at org.ovirt.engine.core.dal.VdcCommandBase.execute(VdcCommandBase.java:33) [dal.jar:]
	at org.ovirt.engine.core.bll.executor.DefaultBackendQueryExecutor.execute(DefaultBackendQueryExecutor.java:14) [bll.jar:]
	at org.ovirt.engine.core.bll.Backend.runQueryImpl(Backend.java:582) [bll.jar:]
	at org.ovirt.engine.core.bll.Backend.runInternalQuery(Backend.java:545) [bll.jar:]
```

It happens when the metadata for the disk is not in JSON format, like any of these:

```
DESCRIPTION=HostedEngineConfigurationImage
DESCRIPTION=hosted-engine.lockspace
DESCRIPTION=hosted-engine.metadata
DESCRIPTION=Hosted Engine Image
DESCRIPTION=generated by virt-v2v 1.36.5fedora_26,release_1.fc26,libvirt
```

Version-Release number of selected component (if applicable):
rhevm-4.1.4.2-0.1.el7.noarch

How reproducible: 100%

Steps to Reproduce: several ways; the easiest is to deploy Hosted-Engine and trigger the auto-import. Or do this on an existing env:

1. Create an NFS share
2. Create an Export Domain on the NFS share
3. Move the Export Domain to maintenance and detach it
4. Use virt-v2v to populate it: `virt-v2v -o rhev -os 10.64.24.33:/exports/data7 rhel7.3`
5. Attach the Export Domain

Actual results: the operation succeeds, but the logs are polluted with warning tracebacks that can distract the user/support from the actual problem.
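As the Jackson error message itself hints ("expected a valid value (number, String, array, object, 'true', 'false' or 'null')"), none of the plain-text descriptions listed above begins with a legal JSON first token, so every one of them triggered the WARN traceback. The helper below is illustrative only (not engine code); it applies that same first-token rule to the sample descriptions:

```java
public class JsonFirstTokenCheck {
    // A JSON document must begin with one of the tokens the Jackson error
    // message lists: an object, array, string, number, or true/false/null.
    static boolean looksLikeJson(String s) {
        String t = s == null ? "" : s.trim();
        if (t.isEmpty()) {
            return false;
        }
        char c = t.charAt(0);
        return c == '{' || c == '[' || c == '"' || c == '-'
                || Character.isDigit(c)
                || t.startsWith("true") || t.startsWith("false")
                || t.startsWith("null");
    }

    public static void main(String[] args) {
        String[] descriptions = {
            "HostedEngineConfigurationImage",
            "hosted-engine.lockspace",
            "hosted-engine.metadata",
            "Hosted Engine Image",
            "generated by virt-v2v 1.36.5fedora_26,release_1.fc26,libvirt",
        };
        for (String d : descriptions) {
            // None of these starts with a valid JSON token, so each one
            // used to produce a WARN traceback before the fix.
            System.out.println(looksLikeJson(d) + " <- " + d);
        }
    }
}
```

The "Unexpected character ('g' (code 103))" in the logged exception is exactly this: the virt-v2v description starts with the letter 'g', which is not a valid start of any JSON value.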