When I compile the Spark 1.6.1 source code, the following error occurs:
Running org.apache.spark.JavaAPISuite
Exception in thread "Executor task launch worker-0" java.lang.IllegalStateException: RpcEnv already stopped.
at org.apache.spark.rpc.netty.Dispatcher.postMessage(Dispatcher.scala:159)
at org.apache.spark.rpc.netty.Dispatcher.postOneWayMessage(Dispatcher.scala:131)
at org.apache.spark.rpc.netty.NettyRpcEnv.send(NettyRpcEnv.scala:192)
at org.apache.spark.rpc.netty.NettyRpcEndpointRef.send(NettyRpcEnv.scala:516)
at org.apache.spark.scheduler.local.LocalBackend.statusUpdate(LocalBackend.scala:151)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:317)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Does anyone know what causes this problem and how to fix it?
Answer 0 (score: 0)
You don't need to run the unit tests to compile Spark.
If you are using SBT, just run `build/sbt assembly`.
If you are using Maven, just run `build/mvn package -DskipTests`.
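Put together, the two build options look like this. This is a sketch assuming you run the commands from the root of the Spark 1.6.1 source checkout, where the bundled `build/sbt` and `build/mvn` wrapper scripts live:

```shell
# From the root of the Spark 1.6.1 source tree.

# Option 1: build with the bundled SBT wrapper
# (produces the assembly jar; does not run the test suites):
build/sbt assembly

# Option 2: build with the bundled Maven wrapper,
# explicitly skipping unit tests like JavaAPISuite:
build/mvn package -DskipTests
```

Either way, the failing `org.apache.spark.JavaAPISuite` test is never executed, so the `RpcEnv already stopped` failure no longer blocks the build.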