The following error occurs when running a Spark application:
16/01/08 15:25:48 INFO SchemaMetadata: Entering synchronized block to initiate
16/01/08 15:25:48 INFO SchemaMetadata: Initializing SchemaMetadata
16/01/08 15:25:48 INFO SchemaMetadata: Schema initialized from database
16/01/08 15:25:48 INFO SchemaMetadata: Registering for notifications
16/01/08 15:25:49 WARN HConnectionManager$HConnectionImplementation: Encountered problems when prefetch hbase:meta table:
org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString
at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:210)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:121)
at org.apache.hadoop.hbase.client.HTable.getRowOrBefore(HTable.java:714)
at org.apache.hadoop.hbase.client.MetaScanner.metaScan(MetaScanner.java:144)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.prefetchRegionCache(HConnectionManager.java:1159)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1223)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1111)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1068)
at org.apache.hadoop.hbase.client.AsyncProcess.findDestLocation(AsyncProcess.java:361)
at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:306)
at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:964)
at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1252)
at org.apache.hadoop.hbase.client.HTable.put(HTable.java:924)
Caused by: java.lang.IllegalAccessError: class com.google.protobuf.HBaseZeroCopyByteString cannot access its superclass com.google.protobuf.LiteralByteString
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.hadoop.hbase.protobuf.RequestConverter.buildRegionSpecifier(RequestConverter.java:930)
at org.apache.hadoop.hbase.protobuf.RequestConverter.buildGetRowOrBeforeRequest(RequestConverter.java:133)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1497)
at org.apache.hadoop.hbase.client.HTable$2.call(HTable.java:710)
at org.apache.hadoop.hbase.client.HTable$2.call(HTable.java:708)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:114)
... 31 more
I found that the problem is that the class loader first loads the LiteralByteString class from the Spark assembly jar already on the JVM classpath, and then loads the HBaseZeroCopyByteString class from the user jar; because the two classes come from different class loaders, the subclass cannot access its superclass and the IllegalAccessError is thrown.
Spark provides the property spark.executor.userClassPathFirst to make the JVM load user jars first, but unfortunately that did not help either.
If anyone has run into and solved this problem, please share the workaround.
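For context, the attempt described above would look something like the sketch below. The application jar and main class are placeholders, and the exact property name depends on the Spark version (spark.executor.userClassPathFirst on newer builds; older 1.x releases used spark.files.userClassPathFirst):

```shell
# Sketch: ask executors to prefer classes from the user jar over Spark's assembly.
# com.example.MyApp and myapp.jar are hypothetical placeholders.
spark-submit \
  --class com.example.MyApp \
  --conf spark.executor.userClassPathFirst=true \
  myapp.jar
```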
Answer 0 (score: 0)
It looks like the hbase-protocol jar is not on the classpath when the Spark job runs.
Please try the following options to resolve the issue.
Add the following properties to your spark-submit command:

--driver-java-options "-Dspark.driver.extraClassPath=/usr/hdp/2.5.4.0-121/hbase/lib/hbase-protocol-1.1.2.2.5.4.0-121.jar" \
--conf "spark.executor.extraClassPath=/usr/hdp/2.5.4.0-121/hbase/lib/hbase-protocol-1.1.2.2.5.4.0-121.jar" \
Hope this resolves your problem. If you have any questions, please let me know.
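Putting the answer's suggestion together, a complete invocation might look like the sketch below. The hbase-protocol path and version match the HDP 2.5.4 layout quoted in the answer, but the main class and application jar are hypothetical placeholders you would replace with your own:

```shell
# Sketch: prepend hbase-protocol to both the driver and executor classpaths
# so HBaseZeroCopyByteString and LiteralByteString load from the same jar.
HBASE_PROTOCOL_JAR=/usr/hdp/2.5.4.0-121/hbase/lib/hbase-protocol-1.1.2.2.5.4.0-121.jar

spark-submit \
  --class com.example.MyHBaseApp \
  --driver-java-options "-Dspark.driver.extraClassPath=${HBASE_PROTOCOL_JAR}" \
  --conf "spark.executor.extraClassPath=${HBASE_PROTOCOL_JAR}" \
  myapp.jar
```

The key point of this workaround is that both JVMs (driver and executors) see the hbase-protocol jar before Spark's bundled protobuf classes, so the superclass and subclass are defined by the same class loader.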