spark-version-info.properties not found in Jenkins

Asked: 2017-03-12 19:06:41

Tags: java apache-spark maven-3 jenkins-plugins sparkcore

I am developing a Jenkins plugin that uses the spark-core lib. When I run it as a plain Java application it works fine, but when I run it as a plugin inside Jenkins it fails with the following error:

    java.lang.ExceptionInInitializerError
    at org.apache.spark.package$.<init>(package.scala:91)
    at org.apache.spark.package$.<clinit>(package.scala)
    at org.apache.spark.SparkContext$$anonfun$3.apply(SparkContext.scala:185)
    at org.apache.spark.SparkContext$$anonfun$3.apply(SparkContext.scala:185)
    at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
    at org.apache.spark.SparkContext.logInfo(SparkContext.scala:74)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:185)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2275)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
    at com.plugin.goettingen_plugin.HelloWorldBuilder.perform(HelloWorldBuilder.java:88)
    at hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:75)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
    at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:785)
    at hudson.model.Build$BuildExecution.build(Build.java:205)
    at hudson.model.Build$BuildExecution.doRun(Build.java:162)
    at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:537)
    at hudson.model.Run.execute(Run.java:1741)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
    at hudson.model.ResourceController.execute(ResourceController.java:98)
    at hudson.model.Executor.run(Executor.java:408)
Caused by: org.apache.spark.SparkException: Error while locating file spark-version-info.properties
    at org.apache.spark.package$SparkBuildInfo$.liftedTree1$1(package.scala:75)
    at org.apache.spark.package$SparkBuildInfo$.<init>(package.scala:61)
    at org.apache.spark.package$SparkBuildInfo$.<clinit>(package.scala)
    ... 23 more
Caused by: java.lang.NullPointerException
    at java.util.Properties$LineReader.readLine(Properties.java:434)
    at java.util.Properties.load0(Properties.java:353)
    at java.util.Properties.load(Properties.java:341)
    at org.apache.spark.package$SparkBuildInfo$.liftedTree1$1(package.scala:64)
    ... 25 more

I am starting a Spark session with the following code:

    SparkSession sparkSession = SparkSession.builder().appName("DP-App").master("local[2]").getOrCreate();

Inside the spark-core lib, the class org.apache.spark.package (package.scala) looks the file up with the following code, and in my case the call returns null:

    InputStream resourceStream = Thread.currentThread().getContextClassLoader().getResourceAsStream("spark-version-info.properties");
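
As a quick diagnostic (a hypothetical snippet of mine, not part of the plugin code), the following compares what the thread's context classloader and the classloader that actually loaded spark-core can each see:

    // Hypothetical diagnostic: checks which classloader can see the resource.
    public class SparkResourceCheck {
        public static void main(String[] args) {
            ClassLoader contextLoader = Thread.currentThread().getContextClassLoader();
            ClassLoader sparkLoader = org.apache.spark.SparkContext.class.getClassLoader();

            // Spark resolves the file through the context classloader; inside a
            // Jenkins executor thread this is usually not the plugin's classloader.
            System.out.println("context loader sees it: "
                    + (contextLoader.getResourceAsStream("spark-version-info.properties") != null));
            // The classloader that loaded spark-core can normally see it, because
            // the file ships inside the spark-core jar.
            System.out.println("spark-core loader sees it: "
                    + (sparkLoader.getResourceAsStream("spark-version-info.properties") != null));
        }
    }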

Since spark-version-info.properties is present inside the spark-core lib, I tried moving the file to WEB-INF, but it still cannot be loaded. Is there any other way to load the file and bypass the code above in the library?

My dependencies are:

  <dependencies>
    <dependency>
      <groupId>org.jenkins-ci.plugins</groupId>
      <artifactId>credentials</artifactId>
      <version>1.9.4</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-mllib_2.11</artifactId>
      <version>2.0.1</version>
    </dependency>
  </dependencies>

3 Answers:

Answer 0 (score: 2)

You are missing the spark-version-info.properties file.

So just create one under ./core/target/extra-resources:

    λ ~/workspace/big_data/spark/ master* ./build/spark-build-info ./core/target/extra-resources 2.1.1
    λ ~/workspace/big_data/spark/ master* cat ./core/target/extra-resources/spark-version-info.properties
    version=2.1.1
    user=chanhle
    revision=dec9aa3b37c01454065a4d8899859991f43d4c66
    branch=master
    date=2017-06-07T15:12:48Z
    url=https://github.com/apache/spark

I ran into the same problem while debugging Spark in IntelliJ.

Answer 1 (score: 1)

I had the same problem using a worksheet in the IntelliJ IDE. I solved it by disabling "Run worksheet in the compiler process" in the worksheet settings.


Answer 2 (score: 0)

I found that if you load Apache Spark through a classloader different from the one returned by Thread.currentThread().getContextClassLoader(), this version file will not be found. If you are doing custom classloading, make sure you set the context classloader to the one Spark was loaded through, using Thread.currentThread().setContextClassLoader(myCustomLoader). This solved the problem for me.
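
A minimal sketch of that workaround, assuming the session is created from a plugin's build step (the class and method names here are hypothetical, not from the original post):

    import org.apache.spark.sql.SparkSession;

    // Hypothetical sketch: temporarily point the context classloader at the
    // classloader that loaded spark-core so Spark can find
    // spark-version-info.properties, then restore the original loader.
    public class SparkSessionFactory {
        public static SparkSession create() {
            ClassLoader original = Thread.currentThread().getContextClassLoader();
            try {
                // Use the classloader that actually loaded the Spark classes
                // (in a Jenkins plugin, the plugin's own classloader).
                Thread.currentThread().setContextClassLoader(SparkSession.class.getClassLoader());
                return SparkSession.builder()
                        .appName("DP-App")
                        .master("local[2]")
                        .getOrCreate();
            } finally {
                // Restore the executor thread's previous context classloader.
                Thread.currentThread().setContextClassLoader(original);
            }
        }
    }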