I am following these instructions to build Spark. When I run:
mvn -DskipTests clean package -e
My system details:
Processor: Intel® Core™ i5-5300U CPU @ 2.30GHz × 4
OS type: 64-bit Ubuntu 15.04
JVM: OpenJDK 64-Bit Server VM (24.79-b02)
Java: version 1.7.0_79
Scala version: 2.11.4
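(For reference, these details can be confirmed with commands along the following lines; this is a rough sketch and the exact output format varies between systems:)

# Java / JVM version and VM details
java -version
# Scala version
scala -version
# OS release and architecture
lsb_release -a
uname -m
# CPU model
grep "model name" /proc/cpuinfo | head -n 1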
I get the following error:
[INFO] Spark Project Parent POM ........................... SUCCESS [ 4.121 s]
[INFO] Spark Project Test Tags ............................ SUCCESS [ 2.811 s]
[INFO] Spark Project Sketch ............................... SUCCESS [ 11.832 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 10.080 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 8.214 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 6.070 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 16.173 s]
[INFO] Spark Project Core ................................. SUCCESS [04:26 min]
[INFO] Spark Project GraphX ............................... SUCCESS [05:40 min]
[INFO] Spark Project Streaming ............................ SUCCESS [12:02 min]
[INFO] Spark Project Catalyst ............................. FAILURE [11:01 min]
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project Docker Integration Tests ............. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Project External Akka ........................ SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External MQTT Assembly ............... SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 34:11 min
[INFO] Finished at: 2016-02-18T11:31:49+05:30
[INFO] Final Memory: 77M/758M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-catalyst_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. NullPointerException -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-catalyst_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:145)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
... 20 more
Caused by: java.lang.NullPointerException
at scala.tools.nsc.typechecker.Typers$Typer.typedFunction$1(Typers.scala:5280)
at scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1(Typers.scala:5320)
at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5352)
at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5359)
at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5395)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5422)
at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5369)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5373)
at scala.tools.nsc.typechecker.Typers$Typer.typedArg(Typers.scala:3164)
at scala.tools.nsc.typechecker.PatternTypers$PatternTyper$class.typedArgWithFormal$1(PatternTypers.scala:112)
at scala.tools.nsc.typechecker.PatternTypers$PatternTyper$$anonfun$2.apply(PatternTypers.scala:115)
at scala.tools.nsc.typechecker.PatternTypers$PatternTyper$$anonfun$2.apply(PatternTypers.scala:115)
at scala.runtime.Tuple2Zipped$$anonfun$map$extension$1.apply(Tuple2Zipped.scala:46)
at scala.runtime.Tuple2Zipped$$anonfun$map$extension$1.apply(Tuple2Zipped.scala:44)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.runtime.Tuple2Zipped$.map$extension(Tuple2Zipped.scala:44)
at scala.tools.nsc.typechecker.PatternTypers$PatternTyper$class.typedArgsForFormals(PatternTypers.scala:115)
at scala.tools.nsc.typechecker.Typers$Typer.typedArgsForFormals(Typers.scala:111)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$handleMonomorphicCall$1(Typers.scala:3470)
Answer 0 (score: 2):
Take a look at the official docs on building Spark.
If you want to build Spark with Scala 2.11, try the following (assuming you are at the root of the source directory):
./dev/change-scala-version.sh 2.11
./build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests -Dscala-2.11 clean package
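The official build docs also note that Maven needs extra memory to compile Spark. The bundled build/mvn script configures this for you, but if you invoke a system-wide mvn directly (as in the command that failed), you may need to export MAVEN_OPTS first. The values below are the ones suggested in the Spark 1.x docs for Java 7; treat them as a starting point and adjust for your machine:

# Give Maven more heap, permgen, and code-cache space (Java 7 settings from the Spark 1.x build docs)
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"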