Spark submit: local executor cannot fetch the application jar

Date: 2016-08-28 18:28:31

Tags: scala apache-spark

I am trying to run a Spark example from the official documentation:

https://spark.apache.org/docs/1.2.0/quick-start.html

Whenever I try to run it as a self-contained application, I get the output shown below.
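
For context, the self-contained application in that guide is SimpleApp.scala. My own program (com.ooyala.uploader.Main, packaged as AssetBreakdownUploader-0.1-SNAPSHOT.jar) follows the same pattern; the log below shows a textFile call at Main.scala:12 and a filter/count at Main.scala:13. For reference, the guide's version looks like this:

    /* SimpleApp.scala -- the self-contained example from the quick-start guide */
    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._
    import org.apache.spark.SparkConf

    object SimpleApp {
      def main(args: Array[String]) {
        // Any text file on the local machine works here
        val logFile = "YOUR_SPARK_HOME/README.md"
        val conf = new SparkConf().setAppName("Simple Application")
        val sc = new SparkContext(conf)
        val logData = sc.textFile(logFile, 2).cache()
        val numAs = logData.filter(line => line.contains("a")).count()
        val numBs = logData.filter(line => line.contains("b")).count()
        println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
      }
    }

I package the project with sbt and launch it the way the guide shows, swapping in my own main class and jar:

    # Commands as given in the quick-start guide; I substitute my own
    # main class and target/scala-2.10/AssetBreakdownUploader-0.1-SNAPSHOT.jar
    $ sbt package
    $ YOUR_SPARK_HOME/bin/spark-submit \
      --class "SimpleApp" \
      --master local[4] \
      target/scala-2.10/simple-project_2.10-1.0.jar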

When I execute it, everything starts up normally, but then I end up waiting for a while:

16/08/28 13:18:30 INFO SparkContext: Running Spark version 1.5.1
16/08/28 13:18:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/08/28 13:18:31 INFO SecurityManager: Changing view acls to: alejandrohernandez
16/08/28 13:18:31 INFO SecurityManager: Changing modify acls to: alejandrohernandez
16/08/28 13:18:31 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(alejandrohernandez); users with modify permissions: Set(alejandrohernandez)
16/08/28 13:18:31 INFO Slf4jLogger: Slf4jLogger started
16/08/28 13:18:31 INFO Remoting: Starting remoting
16/08/28 13:18:31 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.15.3:56988]
16/08/28 13:18:31 INFO Utils: Successfully started service 'sparkDriver' on port 56988.
16/08/28 13:18:31 INFO SparkEnv: Registering MapOutputTracker
16/08/28 13:18:31 INFO SparkEnv: Registering BlockManagerMaster
16/08/28 13:18:31 INFO DiskBlockManager: Created local directory at /private/var/folders/lb/78w91_l123n0cvprhmldkxhc0000gp/T/blockmgr-be8bedf7-96fe-425b-8344-c668110905eb
16/08/28 13:18:31 INFO MemoryStore: MemoryStore started with capacity 530.0 MB
16/08/28 13:18:31 INFO HttpFileServer: HTTP File server directory is /private/var/folders/lb/78w91_l123n0cvprhmldkxhc0000gp/T/spark-a122037d-3228-4e53-b3dd-6d7213187df0/httpd-e3388b36-1605-4cc5-a4c1-def1b7660570
16/08/28 13:18:31 INFO HttpServer: Starting HTTP Server
16/08/28 13:18:31 INFO Utils: Successfully started service 'HTTP file server' on port 56989.
16/08/28 13:18:31 INFO SparkEnv: Registering OutputCommitCoordinator
16/08/28 13:18:31 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/08/28 13:18:31 INFO SparkUI: Started SparkUI at http://192.168.15.3:4040
16/08/28 13:18:31 INFO SparkContext: Added JAR file:/Users/alejandrohernandez/repos/AssetBreakdownUploader/target/scala-2.10/AssetBreakdownUploader-0.1-SNAPSHOT.jar at http://192.168.15.3:56989/jars/AssetBreakdownUploader-0.1-SNAPSHOT.jar with timestamp 1472408311863
16/08/28 13:18:31 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
16/08/28 13:18:31 INFO Executor: Starting executor ID driver on host localhost
16/08/28 13:18:31 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 56990.
16/08/28 13:18:31 INFO NettyBlockTransferService: Server created on 56990
16/08/28 13:18:31 INFO BlockManagerMaster: Trying to register BlockManager
16/08/28 13:18:31 INFO BlockManagerMasterEndpoint: Registering block manager localhost:56990 with 530.0 MB RAM, BlockManagerId(driver, localhost, 56990)
16/08/28 13:18:31 INFO BlockManagerMaster: Registered BlockManager
16/08/28 13:18:32 INFO MemoryStore: ensureFreeSpace(108600) called with curMem=0, maxMem=555755765
16/08/28 13:18:32 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 106.1 KB, free 529.9 MB)
16/08/28 13:18:32 INFO MemoryStore: ensureFreeSpace(11386) called with curMem=108600, maxMem=555755765
16/08/28 13:18:32 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 11.1 KB, free 529.9 MB)
16/08/28 13:18:32 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:56990 (size: 11.1 KB, free: 530.0 MB)
16/08/28 13:18:32 INFO SparkContext: Created broadcast 0 from textFile at Main.scala:12
16/08/28 13:18:33 INFO FileInputFormat: Total input paths to process : 1
16/08/28 13:18:33 INFO SparkContext: Starting job: count at Main.scala:13
16/08/28 13:18:33 INFO DAGScheduler: Got job 0 (count at Main.scala:13) with 1 output partitions
16/08/28 13:18:33 INFO DAGScheduler: Final stage: ResultStage 0(count at Main.scala:13)
16/08/28 13:18:33 INFO DAGScheduler: Parents of final stage: List()
16/08/28 13:18:33 INFO DAGScheduler: Missing parents: List()
16/08/28 13:18:33 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at filter at Main.scala:13), which has no missing parents
16/08/28 13:18:33 INFO MemoryStore: ensureFreeSpace(3224) called with curMem=119986, maxMem=555755765
16/08/28 13:18:33 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 3.1 KB, free 529.9 MB)
16/08/28 13:18:33 INFO MemoryStore: ensureFreeSpace(1925) called with curMem=123210, maxMem=555755765
16/08/28 13:18:33 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 1925.0 B, free 529.9 MB)
16/08/28 13:18:33 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:56990 (size: 1925.0 B, free: 530.0 MB)
16/08/28 13:18:33 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:861
16/08/28 13:18:33 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at filter at Main.scala:13)
16/08/28 13:18:33 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
16/08/28 13:18:33 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 2258 bytes)
16/08/28 13:18:33 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
16/08/28 13:18:33 INFO Executor: Fetching http://192.168.15.3:56989/jars/AssetBreakdownUploader-0.1-SNAPSHOT.jar with timestamp 1472408311863
16/08/28 13:19:33 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.net.SocketTimeoutException: connect timed out
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:589)
    at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
    at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
    at sun.net.www.http.HttpClient.New(HttpClient.java:308)
    at sun.net.www.http.HttpClient.New(HttpClient.java:326)
    at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
    at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:555)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:369)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:397)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
16/08/28 13:19:33 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.net.SocketTimeoutException: connect timed out
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:589)
    at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
    at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
    at sun.net.www.http.HttpClient.New(HttpClient.java:308)
    at sun.net.www.http.HttpClient.New(HttpClient.java:326)
    at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
    at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:555)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:369)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:397)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

16/08/28 13:19:33 ERROR TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
16/08/28 13:19:33 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
16/08/28 13:19:33 INFO TaskSchedulerImpl: Cancelling stage 0
16/08/28 13:19:33 INFO DAGScheduler: ResultStage 0 (count at Main.scala:13) failed in 60.069 s
16/08/28 13:19:33 INFO DAGScheduler: Job 0 failed: count at Main.scala:13, took 60.144276 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.net.SocketTimeoutException: connect timed out
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:589)
    at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
    at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
    at sun.net.www.http.HttpClient.New(HttpClient.java:308)
    at sun.net.www.http.HttpClient.New(HttpClient.java:326)
    at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
    at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:555)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:369)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:397)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1283)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1271)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1270)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1270)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1496)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1458)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1447)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1822)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1835)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1848)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1919)
    at org.apache.spark.rdd.RDD.count(RDD.scala:1121)
    at com.ooyala.uploader.Main$.main(Main.scala:13)
    at com.ooyala.uploader.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.SocketTimeoutException: connect timed out
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:589)
    at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
    at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
    at sun.net.www.http.HttpClient.New(HttpClient.java:308)
    at sun.net.www.http.HttpClient.New(HttpClient.java:326)
    at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1105)
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999)
    at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:555)
    at org.apache.spark.util.Utils$.fetchFile(Utils.scala:369)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)
    at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)
    at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
    at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
    at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:397)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:193)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
16/08/28 13:19:33 INFO SparkContext: Invoking stop() from shutdown hook
16/08/28 13:19:33 INFO SparkUI: Stopped Spark web UI at http://192.168.15.3:4040
16/08/28 13:19:33 INFO DAGScheduler: Stopping DAGScheduler
16/08/28 13:19:33 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/08/28 13:19:33 INFO MemoryStore: MemoryStore cleared
16/08/28 13:19:33 INFO BlockManager: BlockManager stopped
16/08/28 13:19:33 INFO BlockManagerMaster: BlockManagerMaster stopped
16/08/28 13:19:33 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/08/28 13:19:33 INFO SparkContext: Successfully stopped SparkContext
16/08/28 13:19:33 INFO ShutdownHookManager: Shutdown hook called
16/08/28 13:19:33 INFO ShutdownHookManager: Deleting directory /private/var/folders/lb/78w91_l123n0cvprhmldkxhc0000gp/T/spark-a122037d-3228-4e53-b3dd-6d7213187df0
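
One detail that stands out to me: the jar is served by the driver's own HTTP file server (the "Added JAR ... at http://192.168.15.3:56989/jars/..." line), and since this is local mode the executor runs on the same machine, yet the connect still times out. As a sanity check, something like the following, run while the application is still up, should show whether that URL is reachable at all (a hypothetical debugging command using the URL from the log; I have not captured its output here):

    # Try to fetch the jar from the driver's HTTP file server by hand,
    # capping the wait at 10 seconds (-m). URL taken from the log above.
    $ curl -m 10 -O http://192.168.15.3:56989/jars/AssetBreakdownUploader-0.1-SNAPSHOT.jar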

After about a minute the jar fetch times out and the job is aborted. Any ideas about what might be happening?

0 Answers:

There are no answers yet.