java.lang.NoSuchMethodError when running spark-submit on a YARN cluster

Date: 2018-07-26 16:35:54

Tags: scala apache-spark sbt yarn

I have a Spark application that runs correctly in local mode. When I run spark-submit on a YARN cluster, I get the following error:

18/07/26 18:12:38 ERROR ApplicationMaster: User class threw exception: java.lang.NoSuchMethodError: org.apache.http.impl.client.HttpClientBuilder.setSSLContext(Ljavax/net/ssl/SSLContext;)Lorg/apache/http/impl/client/HttpClientBuilder;
java.lang.NoSuchMethodError: org.apache.http.impl.client.HttpClientBuilder.setSSLContext(Ljavax/net/ssl/SSLContext;)Lorg/apache/http/impl/client/HttpClientBuilder;
        at fr.test.ssl.SecrityHttpClient$.getHttpClientWithoutSSL(SecrityHttpClient.scala:23)
        at fr.test.processor.HttpProcessor$.execute(HttpProcessor.scala:36)
        at fr.test.engine.RequestEngine$$anonfun$executeHttpRequest$2.apply(RequestEngine.scala:28)
        at fr.test.engine.RequestEngine$$anonfun$executeHttpRequest$2.apply(RequestEngine.scala:21)
        at scala.collection.immutable.List.foreach(List.scala:318)
        at fr.test.engine.RequestEngine$.executeHttpRequest(RequestEngine.scala:21)
        at fr.test.launcher.Launcher$.executeRequestList(Launcher.scala:20)
        at fr.test.launcher.Launcher$.main(Launcher.scala:10)
        at fr.test.launcher.Launcher.main(Launcher.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:559)
18/07/26 18:12:38 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.NoSuchMethodError: org.apache.http.impl.client.HttpClientBuilder.setSSLContext(Ljavax/net/ssl/SSLContext;)Lorg/apache/http/impl/client/HttpClientBuilder;)

It seems the httpclient dependency cannot be found. Here is my build:

import aether.AetherKeys._

name := "my_app"

organization := "fr.test"
version := "0.1"
scalaVersion := "2.10.6"

val httpclientVersion = "4.5.6"
val slickVersion = "3.1.1"
val hikariCPVersion = "2.4.6"

libraryDependencies += "com.google.code.gson" % "gson" % "2.8.5"
libraryDependencies += "com.typesafe.slick" %% "slick-hikaricp" % slickVersion exclude("com.zaxxer", "HikariCP-java6")
libraryDependencies += "com.zaxxer" % "HikariCP" % hikariCPVersion

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.2"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.6.2"
libraryDependencies += "com.springml" %% "spark-sftp" % "1.0.2"
// logging
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.7"
libraryDependencies += "com.sndyuk" % "logback-more-appenders" % "1.4.2"

// https://mvnrepository.com/artifact/com.databricks/spark-csv
libraryDependencies += "com.databricks" %% "spark-csv" % "1.5.0"

libraryDependencies += "org.apache.httpcomponents" % "httpclient" % httpclientVersion

libraryDependencies += "org.postgresql" % "postgresql" % "9.4.1208"

aetherOldVersionMethod := true
overridePublishSettings

mainClass in assembly := Some("fr.test.Launcher")
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case "reference.conf" => MergeStrategy.concat
  case x => MergeStrategy.first
}

// Configure assembly artifact to be published
artifact in(Compile, assembly) := {
  val art = (artifact in(Compile, assembly)).value
  art.withClassifier(Some("assembly"))
}
addArtifact(artifact in(Compile, assembly), assembly)
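
A minimal sketch for inspecting what actually gets packaged into the assembled jar (the jar path below is illustrative, assuming sbt-assembly's default output location):

import java.util.jar.JarFile
import scala.collection.JavaConverters._

// List the httpclient classes that ended up in the fat jar
val jar = new JarFile("target/scala-2.10/my_app-assembly-0.1.jar")
jar.entries.asScala
  .map(_.getName)
  .filter(_.startsWith("org/apache/http/impl/client"))
  .foreach(println)
jar.close()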

I opened the fat jar generated by the assembly command and found the dependency inside it, so I don't understand why I get this error. We use Scala and Spark versions matching the ones installed on the cluster. Here is my spark-submit:

spark-submit --class fr.test.Launcher \
--master yarn-cluster \
--num-executors 4 \
--driver-memory 10g \
--executor-memory 5g \
--queue dlk_dev \
--files /home/my_user/my_app_2.10/-SNAPSHOT/application--SNAPSHOT.conf#app.conf \
--conf "spark.driver.extraJavaOptions=-verbose:class" \
--conf "spark.executor.extraJavaOptions=-verbose:class" \
/home/my_user/my_app_2.10/-SNAPSHOT/my_app_2.10--SNAPSHOT.jar 

I also found this line in the YARN logs:

[Loaded org.apache.http.impl.client.HttpClientBuilder from file:/data/5/yarn/local/filecache/216/spark-hdp-assembly.jar]
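
This suggests the JVM is resolving HttpClientBuilder from the cluster's Spark assembly (spark-hdp-assembly.jar) rather than from my fat jar. A minimal runtime check, using the standard JDK ProtectionDomain API, to print the jar a class was actually loaded from (the println placement is illustrative):

import org.apache.http.impl.client.HttpClientBuilder

// Print the code source (jar) the class was actually resolved from
val codeSource = classOf[HttpClientBuilder].getProtectionDomain.getCodeSource
println(s"HttpClientBuilder loaded from: ${Option(codeSource).map(_.getLocation).getOrElse("unknown")}")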

Do you have any idea?

1 answer:

Answer 0 (score: 1)

Spark ships with its own HTTP client. The httpclient version you are using is probably different from the one deployed on the YARN cluster. You can set the Spark config option spark.executor.userClassPathFirst to true so that your user-provided jars (in this case, just your uber jar) are placed first on the classpath. This should let your version of httpclient be picked up first.
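
A minimal sketch of what that looks like on the submit command from the question; spark.driver.userClassPathFirst is added as well, since the trace shows the failure inside the ApplicationMaster, which hosts the driver in yarn-cluster mode (both options exist in Spark 1.x but are marked experimental):

spark-submit --class fr.test.Launcher \
--master yarn-cluster \
--conf "spark.driver.userClassPathFirst=true" \
--conf "spark.executor.userClassPathFirst=true" \
/home/my_user/my_app_2.10/-SNAPSHOT/my_app_2.10--SNAPSHOT.jar   # other options unchanged

If putting user classes first causes different conflicts (for example with Hadoop's own use of httpclient), an alternative is to shade the org.apache.http packages in the fat jar with sbt-assembly's ShadeRule support, so the application always uses its own renamed copy regardless of what the cluster provides.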