Not seeing the output of spark-submit

Date: 2019-02-02 11:02:21

Tags: apache-spark

I am running some of the examples that ship with Spark via spark-submit, on an Ubuntu virtual machine. The class I am trying to run is the following:

import scala.math.random

import org.apache.spark.sql.SparkSession

object SparkPi {
  def main(args: Array[String]) {
    val spark = SparkSession
      .builder
      .appName("Spark Pi")
      .getOrCreate()
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = math.min(100000L * slices, Int.MaxValue).toInt // avoid overflow
    val count = spark.sparkContext.parallelize(1 until n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x*x + y*y <= 1) 1 else 0
    }.reduce(_ + _)
    println(s"Pi is roughly ${4.0 * count / (n - 1)}")
    spark.stop()
  }
}

To run the code above, I use the spark-submit script as follows:

manu@manu-VirtualBox:~/spark-2.4.0-bin-hadoop2.7$ ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master local ./examples/jars/spark-examples_2.11-2.4.0.jar 10

What I see is the following (apologies for the large trace dump), but I cannot find the printed line "Pi is roughly ...". I don't see any errors either. Why can't I see the output?

2019-02-02 10:56:43 WARN  Utils:66 - Your hostname, manu-VirtualBox resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface enp0s3)
2019-02-02 10:56:43 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2019-02-02 10:56:44 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-02-02 10:56:45 INFO  SparkContext:54 - Running Spark version 2.4.0
2019-02-02 10:56:45 INFO  SparkContext:54 - Submitted application: Spark Pi
2019-02-02 10:56:45 INFO  SecurityManager:54 - Changing view acls to: manu
2019-02-02 10:56:45 INFO  SecurityManager:54 - Changing modify acls to: manu
2019-02-02 10:56:45 INFO  SecurityManager:54 - Changing view acls groups to: 
2019-02-02 10:56:45 INFO  SecurityManager:54 - Changing modify acls groups to: 
2019-02-02 10:56:45 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(manu); groups with view permissions: Set(); users  with modify permissions: Set(manu); groups with modify permissions: Set()
2019-02-02 10:56:46 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 32995.
2019-02-02 10:56:46 INFO  SparkEnv:54 - Registering MapOutputTracker
2019-02-02 10:56:46 INFO  SparkEnv:54 - Registering BlockManagerMaster
2019-02-02 10:56:46 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-02-02 10:56:46 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2019-02-02 10:56:46 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-13d95f47-51a8-4d27-8ebd-15cb0ee3d61a
2019-02-02 10:56:46 INFO  MemoryStore:54 - MemoryStore started with capacity 413.9 MB
2019-02-02 10:56:46 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2019-02-02 10:56:46 INFO  log:192 - Logging initialized @4685ms
2019-02-02 10:56:47 INFO  Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2019-02-02 10:56:47 INFO  Server:419 - Started @5030ms
2019-02-02 10:56:47 INFO  AbstractConnector:278 - Started ServerConnector@46c6297b{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-02-02 10:56:47 INFO  Utils:54 - Successfully started service 'SparkUI' on port 4040.
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3f2049b6{/jobs,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6b85300e{/jobs/json,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3aaf4f07{/jobs/job,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@18e8473e{/jobs/job/json,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5a2f016d{/stages,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1a38ba58{/stages/json,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3ad394e6{/stages/stage,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1deb2c43{/stages/stage/json,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3bb9efbc{/stages/pool,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1cefc4b3{/stages/pool/json,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2b27cc70{/storage,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6f6a7463{/storage/json,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1bdaa23d{/storage/rdd,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@79f227a9{/storage/rdd/json,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6ca320ab{/environment,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@50d68830{/environment/json,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1e53135d{/executors,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7674a051{/executors/json,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3a7704c{/executors/threadDump,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6754ef00{/executors/threadDump/json,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@619bd14c{/static,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@106faf11{/,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@70f43b45{/api,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2c282004{/jobs/job/kill,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@22ee2d0{/stages/stage/kill,null,AVAILABLE,@Spark}
2019-02-02 10:56:47 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://10.0.2.15:4040
2019-02-02 10:56:47 INFO  SparkContext:54 - Added JAR file:/home/manu/spark-2.4.0-bin-hadoop2.7/./examples/jars/spark-examples_2.11-2.4.0.jar at spark://10.0.2.15:32995/jars/spark-examples_2.11-2.4.0.jar with timestamp 1549105007905
2019-02-02 10:56:48 INFO  Executor:54 - Starting executor ID driver on host localhost
2019-02-02 10:56:48 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42123.
2019-02-02 10:56:48 INFO  NettyBlockTransferService:54 - Server created on 10.0.2.15:42123
2019-02-02 10:56:48 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2019-02-02 10:56:48 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, 10.0.2.15, 42123, None)
2019-02-02 10:56:48 INFO  BlockManagerMasterEndpoint:54 - Registering block manager 10.0.2.15:42123 with 413.9 MB RAM, BlockManagerId(driver, 10.0.2.15, 42123, None)
2019-02-02 10:56:48 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, 10.0.2.15, 42123, None)
2019-02-02 10:56:48 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, 10.0.2.15, 42123, None)
2019-02-02 10:56:49 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7e46d648{/metrics/json,null,AVAILABLE,@Spark}
2019-02-02 10:56:49 INFO  SparkContext:54 - Starting job: reduce at SparkPi.scala:38
2019-02-02 10:56:50 INFO  DAGScheduler:54 - Got job 0 (reduce at SparkPi.scala:38) with 10 output partitions
2019-02-02 10:56:50 INFO  DAGScheduler:54 - Final stage: ResultStage 0 (reduce at SparkPi.scala:38)
2019-02-02 10:56:50 INFO  DAGScheduler:54 - Parents of final stage: List()
2019-02-02 10:56:50 INFO  DAGScheduler:54 - Missing parents: List()
2019-02-02 10:56:50 INFO  DAGScheduler:54 - Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34), which has no missing parents
2019-02-02 10:56:50 INFO  MemoryStore:54 - Block broadcast_0 stored as values in memory (estimated size 1936.0 B, free 413.9 MB)
2019-02-02 10:56:50 INFO  MemoryStore:54 - Block broadcast_0_piece0 stored as bytes in memory (estimated size 1256.0 B, free 413.9 MB)
2019-02-02 10:56:50 INFO  BlockManagerInfo:54 - Added broadcast_0_piece0 in memory on 10.0.2.15:42123 (size: 1256.0 B, free: 413.9 MB)
2019-02-02 10:56:50 INFO  SparkContext:54 - Created broadcast 0 from broadcast at DAGScheduler.scala:1161
2019-02-02 10:56:50 INFO  DAGScheduler:54 - Submitting 10 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9))
2019-02-02 10:56:50 INFO  TaskSchedulerImpl:54 - Adding task set 0.0 with 10 tasks
2019-02-02 10:56:51 INFO  TaskSetManager:54 - Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7866 bytes)
2019-02-02 10:56:51 INFO  Executor:54 - Running task 0.0 in stage 0.0 (TID 0)
2019-02-02 10:56:51 INFO  Executor:54 - Fetching spark://10.0.2.15:32995/jars/spark-examples_2.11-2.4.0.jar with timestamp 1549105007905
2019-02-02 10:56:51 INFO  TransportClientFactory:267 - Successfully created connection to /10.0.2.15:32995 after 110 ms (0 ms spent in bootstraps)
2019-02-02 10:56:51 INFO  Utils:54 - Fetching spark://10.0.2.15:32995/jars/spark-examples_2.11-2.4.0.jar to /tmp/spark-3c47ed54-5a7a-4785-84e3-4b834b94b238/userFiles-f31cec9c-5bb9-41d0-b8c3-e18abe2be54a/fetchFileTemp4213110830681726950.tmp
2019-02-02 10:56:51 INFO  Executor:54 - Adding file:/tmp/spark-3c47ed54-5a7a-4785-84e3-4b834b94b238/userFiles-f31cec9c-5bb9-41d0-b8c3-e18abe2be54a/spark-examples_2.11-2.4.0.jar to class loader
2019-02-02 10:56:52 INFO  Executor:54 - Finished task 0.0 in stage 0.0 (TID 0). 910 bytes result sent to driver
2019-02-02 10:56:52 INFO  TaskSetManager:54 - Starting task 1.0 in stage 0.0 (TID 1, localhost, executor driver, partition 1, PROCESS_LOCAL, 7866 bytes)
2019-02-02 10:56:52 INFO  Executor:54 - Running task 1.0 in stage 0.0 (TID 1)
2019-02-02 10:56:52 INFO  TaskSetManager:54 - Finished task 0.0 in stage 0.0 (TID 0) in 1113 ms on localhost (executor driver) (1/10)
2019-02-02 10:56:52 INFO  Executor:54 - Finished task 1.0 in stage 0.0 (TID 1). 867 bytes result sent to driver
2019-02-02 10:56:52 INFO  TaskSetManager:54 - Starting task 2.0 in stage 0.0 (TID 2, localhost, executor driver, partition 2, PROCESS_LOCAL, 7866 bytes)
2019-02-02 10:56:52 INFO  Executor:54 - Running task 2.0 in stage 0.0 (TID 2)
2019-02-02 10:56:52 INFO  TaskSetManager:54 - Finished task 1.0 in stage 0.0 (TID 1) in 271 ms on localhost (executor driver) (2/10)
2019-02-02 10:56:52 INFO  Executor:54 - Finished task 2.0 in stage 0.0 (TID 2). 824 bytes result sent to driver
2019-02-02 10:56:52 INFO  TaskSetManager:54 - Starting task 3.0 in stage 0.0 (TID 3, localhost, executor driver, partition 3, PROCESS_LOCAL, 7866 bytes)
2019-02-02 10:56:52 INFO  Executor:54 - Running task 3.0 in stage 0.0 (TID 3)
2019-02-02 10:56:52 INFO  TaskSetManager:54 - Finished task 2.0 in stage 0.0 (TID 2) in 199 ms on localhost (executor driver) (3/10)
2019-02-02 10:56:52 INFO  Executor:54 - Finished task 3.0 in stage 0.0 (TID 3). 867 bytes result sent to driver
2019-02-02 10:56:52 INFO  TaskSetManager:54 - Starting task 4.0 in stage 0.0 (TID 4, localhost, executor driver, partition 4, PROCESS_LOCAL, 7866 bytes)
2019-02-02 10:56:52 INFO  Executor:54 - Running task 4.0 in stage 0.0 (TID 4)
2019-02-02 10:56:52 INFO  TaskSetManager:54 - Finished task 3.0 in stage 0.0 (TID 3) in 204 ms on localhost (executor driver) (4/10)
2019-02-02 10:56:52 INFO  Executor:54 - Finished task 4.0 in stage 0.0 (TID 4). 824 bytes result sent to driver
2019-02-02 10:56:52 INFO  TaskSetManager:54 - Starting task 5.0 in stage 0.0 (TID 5, localhost, executor driver, partition 5, PROCESS_LOCAL, 7866 bytes)
2019-02-02 10:56:52 INFO  TaskSetManager:54 - Finished task 4.0 in stage 0.0 (TID 4) in 178 ms on localhost (executor driver) (5/10)
2019-02-02 10:56:52 INFO  Executor:54 - Running task 5.0 in stage 0.0 (TID 5)
2019-02-02 10:56:53 INFO  Executor:54 - Finished task 5.0 in stage 0.0 (TID 5). 824 bytes result sent to driver
2019-02-02 10:56:53 INFO  TaskSetManager:54 - Starting task 6.0 in stage 0.0 (TID 6, localhost, executor driver, partition 6, PROCESS_LOCAL, 7866 bytes)
2019-02-02 10:56:53 INFO  TaskSetManager:54 - Finished task 5.0 in stage 0.0 (TID 5) in 145 ms on localhost (executor driver) (6/10)
2019-02-02 10:56:53 INFO  Executor:54 - Running task 6.0 in stage 0.0 (TID 6)
2019-02-02 10:56:53 INFO  Executor:54 - Finished task 6.0 in stage 0.0 (TID 6). 867 bytes result sent to driver
2019-02-02 10:56:53 INFO  TaskSetManager:54 - Starting task 7.0 in stage 0.0 (TID 7, localhost, executor driver, partition 7, PROCESS_LOCAL, 7866 bytes)
2019-02-02 10:56:53 INFO  TaskSetManager:54 - Finished task 6.0 in stage 0.0 (TID 6) in 212 ms on localhost (executor driver) (7/10)
2019-02-02 10:56:53 INFO  Executor:54 - Running task 7.0 in stage 0.0 (TID 7)
2019-02-02 10:56:53 INFO  Executor:54 - Finished task 7.0 in stage 0.0 (TID 7). 867 bytes result sent to driver
2019-02-02 10:56:53 INFO  TaskSetManager:54 - Starting task 8.0 in stage 0.0 (TID 8, localhost, executor driver, partition 8, PROCESS_LOCAL, 7866 bytes)
2019-02-02 10:56:53 INFO  TaskSetManager:54 - Finished task 7.0 in stage 0.0 (TID 7) in 152 ms on localhost (executor driver) (8/10)
2019-02-02 10:56:53 INFO  Executor:54 - Running task 8.0 in stage 0.0 (TID 8)
2019-02-02 10:56:53 INFO  Executor:54 - Finished task 8.0 in stage 0.0 (TID 8). 867 bytes result sent to driver
2019-02-02 10:56:53 INFO  TaskSetManager:54 - Starting task 9.0 in stage 0.0 (TID 9, localhost, executor driver, partition 9, PROCESS_LOCAL, 7866 bytes)
2019-02-02 10:56:53 INFO  TaskSetManager:54 - Finished task 8.0 in stage 0.0 (TID 8) in 103 ms on localhost (executor driver) (9/10)
2019-02-02 10:56:53 INFO  Executor:54 - Running task 9.0 in stage 0.0 (TID 9)
2019-02-02 10:56:53 INFO  Executor:54 - Finished task 9.0 in stage 0.0 (TID 9). 867 bytes result sent to driver
2019-02-02 10:56:53 INFO  TaskSetManager:54 - Finished task 9.0 in stage 0.0 (TID 9) in 79 ms on localhost (executor driver) (10/10)
2019-02-02 10:56:53 INFO  TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose tasks have all completed, from pool 
2019-02-02 10:56:53 INFO  DAGScheduler:54 - ResultStage 0 (reduce at SparkPi.scala:38) finished in 3.287 s
2019-02-02 10:56:53 INFO  DAGScheduler:54 - Job 0 finished: reduce at SparkPi.scala:38, took 3.700842 s
Pi is roughly 3.142931142931143
2019-02-02 10:56:53 INFO  AbstractConnector:318 - Stopped Spark@46c6297b{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-02-02 10:56:53 INFO  SparkUI:54 - Stopped Spark web UI at http://10.0.2.15:4040
2019-02-02 10:56:53 INFO  MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2019-02-02 10:56:53 INFO  MemoryStore:54 - MemoryStore cleared
2019-02-02 10:56:53 INFO  BlockManager:54 - BlockManager stopped
2019-02-02 10:56:53 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
2019-02-02 10:56:53 INFO  OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2019-02-02 10:56:53 INFO  SparkContext:54 - Successfully stopped SparkContext
2019-02-02 10:56:53 INFO  ShutdownHookManager:54 - Shutdown hook called
2019-02-02 10:56:53 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-3c47ed54-5a7a-4785-84e3-4b834b94b238
2019-02-02 10:56:54 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-2dcaa58b-d605-40dc-8dd8-df55607f1a59

2 answers:

Answer 0 (score: 0):

Rather than printing to the console, try saving the result to a file; with this much logging around the job's standard output it is hard to pick the result out. That said, I can actually see the result in the output you posted.
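A lighter-weight alternative to writing a file: Spark's default log4j configuration sends framework logging to stderr, while the application's own println output goes to stdout, so the two streams can be separated at the shell. A minimal sketch of the idea, using a stand-in function in place of a real Spark job:

```shell
# Spark's stock log4j.properties targets System.err for console logging, while
# println from the driver goes to stdout. run_job here simulates that split:
run_job() {
  echo "Pi is roughly 3.14159"              # stdout: the application's own output
  echo "INFO SparkContext:54 - ..." 1>&2    # stderr: framework logging
}

# Discard stderr; only the application's stdout line remains visible.
run_job 2>/dev/null
```

Applied to the command in the question, this would be `./bin/spark-submit ... 10 2>/dev/null` (or `2>spark.log` to keep the logs in a file for inspection).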

Answer 1 (score: 0):

As far as I can see, and as @ruslangm said, the expected output is actually right there:

(screenshot from the question, showing the line "Pi is roughly 3.142931142931143" in the trace dump)

Perhaps we are misunderstanding the question.
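If the issue is simply that the result drowns in INFO logging, another common approach is to lower Spark's console log level to WARN. A sketch of the idea, assuming the stock spark-2.4.0-bin-hadoop2.7 layout from the question, where conf/log4j.properties.template ships next to the binaries; the demo below edits a stand-in file under /tmp so it can run anywhere:

```shell
# In a real Spark install you would first copy the shipped template:
#   cp conf/log4j.properties.template conf/log4j.properties
# and then raise the console threshold from INFO to WARN. Simulated here
# on a stand-in file containing the relevant line:
printf 'log4j.rootCategory=INFO, console\n' > /tmp/log4j.properties

# Raise the root logging threshold so INFO chatter is suppressed.
sed -i 's/^log4j.rootCategory=INFO, console/log4j.rootCategory=WARN, console/' /tmp/log4j.properties

cat /tmp/log4j.properties
```

With that change in conf/log4j.properties, subsequent spark-submit runs print only WARN-and-above messages, so the "Pi is roughly" line is easy to spot. The same effect is available programmatically via `spark.sparkContext.setLogLevel("WARN")`, though that only takes effect after the SparkContext starts.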