Bug 956878 - Snappy compression cannot be used with IBM JRE
Status: CLOSED CURRENTRELEASE
Product: RHQ Project
Classification: Other
Component: Core Server, Installer, Launch Scripts, Plugins, Monitoring
Version: 4.6
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: urgent
Target Milestone: RHQ 4.8
Assigned To: John Sanda
QA Contact: Mike Foley
Depends On:
Blocks: 951619
Reported: 2013-04-25 16:48 EDT by John Sanda
Modified: 2013-09-11 05:52 EDT (History)
2 users

Doc Type: Bug Fix
Last Closed: 2013-09-11 05:52:17 EDT
Type: Bug


Attachments: None
Description John Sanda 2013-04-25 16:48:42 EDT
Description of problem:
The DataStax Cassandra client driver is used in several places, including the installer, the Cassandra agent plugin, and the core server. The driver supports snappy compression via the snappy-java library. When running on IBM JRE versions 1.6 or 1.7, snappy fails to load its native library. Wherever we create a com.datastax.driver.core.Session object in code, we need to make sure we do not use compression when running on an IBM JRE.
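A minimal sketch of the vendor check described above. The class and method names are ours for illustration, not from the RHQ codebase; the assumption is that IBM JREs identify themselves via the java.vendor system property:

```java
// Decide whether to request snappy compression from the driver, based on
// the JVM vendor, since snappy-java fails to load its native library on
// IBM JREs. (Helper names are illustrative; not RHQ code.)
public class CompressionCheck {

    // IBM JREs report a vendor string containing "IBM"
    // (e.g. "IBM Corporation") in the java.vendor property.
    static boolean isIbmJre() {
        String vendor = System.getProperty("java.vendor", "");
        return vendor.toUpperCase().contains("IBM");
    }

    // The compression to request when building the driver's Cluster:
    // "SNAPPY" normally, "NONE" on IBM JREs.
    static String chooseCompression() {
        return isIbmJre() ? "NONE" : "SNAPPY";
    }

    public static void main(String[] args) {
        System.out.println("compression = " + chooseCompression());
    }
}
```

The chosen value would then be passed when building the driver's Cluster, e.g. via the driver's ProtocolOptions.Compression enum, so that Session objects created from it never negotiate snappy on an IBM JRE.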


Comment 1 John Sanda 2013-04-25 16:51:23 EDT
See https://bugzilla.redhat.com/show_bug.cgi?id=907485#c6 for additional info on where testing for this originated.
Comment 2 John Sanda 2013-04-26 22:01:02 EDT
Armine, I am moving this to ON_QA. I know that you have already tested it, but I will let you close it out.
Comment 3 John Sanda 2013-05-02 16:14:20 EDT
This also impacts SSTable compression. By default, SSTable compression uses snappy. Cassandra also provides out-of-the-box support for Java zip (Deflate) and LZ4 compression. Using snappy for SSTable compression with an IBM JRE causes Cassandra to throw exceptions like this:

INFO [SSTableBatchOpen:1] 2013-05-02 14:42:42,436 SSTableReader.java (line 164) Opening /var/lib/rhq/storage/data/system/schema_keyspaces/system-schema_keyspaces-ib-1 (260 bytes)
ERROR [SSTableBatchOpen:1] 2013-05-02 14:42:42,485 CassandraDaemon.java (line 132) Exception in thread Thread[SSTableBatchOpen:1,5,main]
java.lang.RuntimeException: Cannot create CompressionParameters for stored parameters
        at org.apache.cassandra.io.compress.CompressionMetadata.<init>(CompressionMetadata.java:99)
        at org.apache.cassandra.io.compress.CompressionMetadata.create(CompressionMetadata.java:63)
        at org.apache.cassandra.io.util.CompressedSegmentedFile$Builder.complete(CompressedSegmentedFile.java:51)
        at org.apache.cassandra.io.sstable.SSTableReader.load(SSTableReader.java:404)
        at org.apache.cassandra.io.sstable.SSTableReader.open(SSTableReader.java:198)
        at org.apache.cassandra.io.sstable.SSTableReader.open(SSTableReader.java:149)
        at org.apache.cassandra.io.sstable.SSTableReader$1.run(SSTableReader.java:238)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:482)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:345)
        at java.util.concurrent.FutureTask.run(FutureTask.java:177)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1156)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:626)
        at java.lang.Thread.run(Thread.java:780)
Caused by: org.apache.cassandra.exceptions.ConfigurationException: SnappyCompressor.create() threw an error: java.lang.NoClassDefFoundError org.xerial.snappy.Snappy (initialization failure)
        at org.apache.cassandra.io.compress.CompressionParameters.createCompressor(CompressionParameters.java:179)
        at org.apache.cassandra.io.compress.CompressionParameters.<init>(CompressionParameters.java:71)
        at org.apache.cassandra.io.compress.CompressionMetadata.<init>(CompressionMetadata.java:95)
        ... 12 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:88)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
        at java.lang.reflect.Method.invoke(Method.java:613)
        at org.apache.cassandra.io.compress.CompressionParameters.createCompressor(CompressionParameters.java:156)
        ... 14 more
Caused by: java.lang.NoClassDefFoundError: org.xerial.snappy.Snappy (initialization failure)
        at java.lang.J9VMInternals.initialize(J9VMInternals.java:176)
        at org.apache.cassandra.io.compress.SnappyCompressor.create(SnappyCompressor.java:45)
        ... 19 more
Caused by: org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] null
        at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:229)
        at org.xerial.snappy.Snappy.<clinit>(Snappy.java:44)
        at java.lang.J9VMInternals.initializeImpl(Native Method)
        at java.lang.J9VMInternals.initialize(J9VMInternals.java:236)
        at org.apache.cassandra.service.CassandraDaemon.setup(CassandraDaemon.java:150)
        at org.apache.cassandra.service.CassandraDaemon.activate(CassandraDaemon.java:366)
        at org.apache.cassandra.service.CassandraDaemon.main(CassandraDaemon.java:409)
 INFO [SSTableBatchOpen:1] 2013-05-02 14:42:42,537 SSTableReader.java (line 164) Opening /var/lib/rhq/storage/data/system/schema_columnfamilies/system-schema_columnfamilies-ib-5 (5427 bytes)
ERROR [SSTableBatchOpen:1] 2013-05-02 14:42:42,545 CassandraDaemon.java (line 132) Exception in thread Thread[SSTableBatchOpen:1,5,main]
java.lang.RuntimeException: Cannot create CompressionParameters for stored parameters
        at org.apache.cassandra.io.compress.CompressionMetadata.<init>(CompressionMetadata.java:99)
        at org.apache.cassandra.io.compress.CompressionMetadata.create(CompressionMetadata.java:63)
        at org.apache.cassandra.io.util.CompressedSegmentedFile$Builder.complete(CompressedSegmentedFile.java:51)
        at org.apache.cassandra.io.sstable.SSTableReader.load(SSTableReader.java:404)
        at org.apache.cassandra.io.sstable.SSTableReader.open(SSTableReader.java:198)
        at org.apache.cassandra.io.sstable.SSTableReader.open(SSTableReader.java:149)
        at org.apache.cassandra.io.sstable.SSTableReader$1.run(SSTableReader.java:238)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:482)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:345)
        at java.util.concurrent.FutureTask.run(FutureTask.java:177)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1156)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:626)
        at java.lang.Thread.run(Thread.java:780)


I am moving this back to ON_DEV.
Comment 4 Armine Hovsepyan 2013-05-02 16:36:24 EDT
Note: Cassandra storage (with server and agent) installation worked correctly in an x64 environment - 10.34.131.9 (IBM JRE).
Comment 5 John Sanda 2013-05-02 19:55:08 EDT
There is an issue filed against the snappy-java project that I believe will address the problem - https://github.com/xerial/snappy-java/issues/34. The issue is not just whether we are running on a 32-bit vs. a 64-bit arch. It has more to do with how the JVM reports it. snappy-java relies on the os.arch system property to get the arch. On the machine that produced the exception in comment 3, the IBM JVM reports the arch as x86. On that same machine, OpenJDK reports i386.

After some investigation, I have learned that tables in the system keyspace cannot be altered. And since snappy is the default compression used for sstables, this means our only option at the moment to work around this is to fix snappy-java.

I want to reiterate that, as far as we know, this issue is limited to IBM Java on a 32-bit arch. Furthermore, I am not sure if it affects all 32-bit arches. The arch shell command returns i686 on the machine from comment 3.
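The reporting discrepancy described in this comment can be inspected directly on any machine. This is just a probe of the standard JVM system properties that snappy-java consults when resolving its native library; the values shown will vary by vendor (e.g. os.arch of "x86" on the IBM JVM vs. "i386" on OpenJDK on the same 32-bit host):

```java
// Print the JVM-reported properties that snappy-java uses to pick its
// bundled native library. Run this under each JRE to compare vendors.
public class ArchProbe {
    public static void main(String[] args) {
        System.out.println("java.vendor = " + System.getProperty("java.vendor"));
        System.out.println("os.name     = " + System.getProperty("os.name"));
        System.out.println("os.arch     = " + System.getProperty("os.arch"));
    }
}
```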
Comment 6 John Sanda 2013-05-02 19:57:27 EDT
On the 64-bit machine that Armine mentioned in comment 4, IBM Java returns amd64 for the os.arch property. The arch shell command returns x86_64.
Comment 7 John Sanda 2013-05-03 14:18:03 EDT
After investigation, I think the error in comment 3 is a false positive. The only way I am able to reproduce it is by installing and starting Cassandra with OpenJDK (or Oracle JDK - just not IBM JDK), stopping Cassandra, and then starting it back up with IBM Java. When Cassandra is initially deployed with a Java runtime that can load the snappy library, the system tables are created and configured with compression. When we then start Cassandra back up with IBM Java, Cassandra cannot read those data files, since snappy cannot be loaded with IBM Java.

Had IBM Java been used all the way through, this would not have happened. If for whatever reason snappy cannot be loaded, Cassandra disables compression; so the system tables would not have been created with compression.
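The fallback behavior described above amounts to an availability probe: try to initialize the snappy class and treat any linkage failure as "snappy unavailable". A rough sketch (the class name org.xerial.snappy.Snappy comes from the stack trace in comment 3; the probe method itself is our illustration, not Cassandra's actual code):

```java
// Probe whether snappy-java can actually be used on this JVM. Forcing
// class initialization triggers the native library load, which is what
// fails on IBM 32-bit JREs (SnappyError: FAILED_TO_LOAD_NATIVE_LIBRARY).
public class SnappyProbe {

    static boolean snappyAvailable() {
        try {
            // Class.forName initializes the class, running its static
            // initializer, which loads the native snappy library.
            Class.forName("org.xerial.snappy.Snappy");
            return true;
        } catch (ClassNotFoundException | LinkageError e) {
            // NoClassDefFoundError / ExceptionInInitializerError land here
            // when the jar is missing or the native library cannot load.
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(snappyAvailable()
                ? "using snappy compression"
                : "snappy unavailable - compression disabled");
    }
}
```

This also explains the repro scenario: the probe succeeds under OpenJDK at table-creation time, so compression gets baked into the SSTable metadata, and the later IBM-Java restart fails when reading it back.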

In summary, I do not think there is a problem here with respect to comment 3. I am moving this back to ON_QA.
Comment 8 Armine Hovsepyan 2013-05-03 16:15:57 EDT
Since the investigation was done with John, I re-tested with a new environment (10.16.23.163) - non-reproducible.

Marking bug as verified.

Thank you.
Comment 9 Heiko W. Rupp 2013-09-11 05:52:17 EDT
Bulk closing of old issues now that RHQ 4.9 is around the corner.

If you think the issue has not been solved, then please open a new bug and mention this one in the description.
