Spark 2.3 java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric

Date: 2018-05-17 10:15:43

Tags: apache-spark

Spark 2.3 is throwing the following exception. Can anyone please help!! I have tried adding the JARs.

308 [Driver] ERROR org.apache.spark.deploy.yarn.ApplicationMaster - User class threw exception: java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
    at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
    at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
    at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at com.voicebase.etl.HBasePhoenixPerformance2.main(HBasePhoenixPerformance2.java:55)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:706)
315 [main] ERROR org.apache.spark.deploy.yarn.ApplicationMaster - Uncaught exception:
org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:486)
    at org.apache.spark.deploy.yarn.ApplicationMaster.org$apache$spark$deploy$yarn$ApplicationMaster$$runImpl(ApplicationMaster.scala:345)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply$mcV$sp(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$5.run(ApplicationMaster.scala:800)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
    at org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:799)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:259)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:824)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.util.concurrent.ExecutionException: Boxed Error

3 Answers:

Answer 0: (score: 1)

This problem arises because the Netty versions that Hadoop and Spark were compiled against do not match. You can take one of the following approaches.

Similar Issue, solved by manually compiling Spark against a specific Netty version.

Another approach, recommended by Suhas, is to copy the contents of the SPARK_HOME/jars folder into the various lib folders, or just into the yarn folder under HADOOP_HOME/share/hadoop. But that is a dirty workaround, so it is probably better to use the latest versions of both, or to compile them manually.
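
Before (or instead of) copying jars around, it can help to confirm which Netty jar actually wins on the classpath and whether it has the method Spark 2.3 calls. The following is only an illustrative Java sketch (class name and messages are made up); run it with the same classpath as the failing job:

    // Hypothetical diagnostic: which jar provides PooledByteBufAllocator,
    // and does it expose the metric() method that Spark 2.3 expects?
    public class NettyVersionCheck {
        public static void main(String[] args) {
            try {
                Class<?> alloc = Class.forName("io.netty.buffer.PooledByteBufAllocator");
                java.security.CodeSource src = alloc.getProtectionDomain().getCodeSource();
                // Shows the jar the class was actually loaded from
                System.out.println("Loaded from: " + (src != null ? src.getLocation() : "unknown"));
                // Older Netty 4.0.x jars do not have metric(); newer 4.1.x jars do
                alloc.getMethod("metric");
                System.out.println("metric() is present - this Netty jar is new enough");
            } catch (ClassNotFoundException e) {
                System.out.println("Netty 4 is not on the classpath at all");
            } catch (NoSuchMethodException e) {
                System.out.println("metric() is missing - an older Netty jar wins on the classpath");
            }
        }
    }

If the jar it prints comes from the Hadoop installation rather than from Spark, that points to the version mismatch described above.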

Answer 1: (score: 1)

aws-java-sdk requires an older version of Netty. Deleting all the Netty jars and removing aws-java-sdk from the project solved the problem.
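
To see which Netty jars are actually present before deleting anything, a quick classpath scan can help. This is just a sketch (hypothetical class name), and it only inspects the JVM it is run in, so start it with the same classpath as the application; with Maven, mvn dependency:tree -Dincludes=io.netty gives the same answer at build time.

    // Illustrative sketch: list every classpath entry containing "netty"
    // so duplicates pulled in by aws-java-sdk (or anything else) are easy to spot.
    public class FindNettyJars {
        public static void main(String[] args) {
            String cp = System.getProperty("java.class.path");
            for (String entry : cp.split(java.io.File.pathSeparator)) {
                if (entry.toLowerCase().contains("netty")) {
                    System.out.println(entry);
                }
            }
        }
    }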

Answer 2: (score: 0)

I found a solution. It is because the Hadoop binaries were compiled with an older version, and we need to replace those jars. After replacing them I have not faced any issues with Hadoop.

You need to replace netty-3.6.2.Final.jar and netty-all-4.0.23.Final.jar under the path $HADOOP_HOME\share\hadoop with netty-3.9.9.Final.jar and netty-all-4.1.17.Final.jar.
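
After swapping the jars, one way to sanity-check which Netty build ends up on the classpath is to ask Netty itself. The sketch below (hypothetical wrapper class) assumes a Netty 4.x jar is present and uses its io.netty.util.Version.identify() helper:

    import io.netty.util.Version;
    import java.util.Map;

    // Prints the version each Netty artifact on the classpath reports about itself.
    public class PrintNettyVersion {
        public static void main(String[] args) {
            for (Map.Entry<String, Version> e : Version.identify().entrySet()) {
                // e.g. "netty-all -> 4.1.17.Final" if the replacement took effect
                System.out.println(e.getKey() + " -> " + e.getValue().artifactVersion());
            }
        }
    }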

This solved my problem. If you have another solution, please do share it.