Spark not connecting to the master IP address in a standalone cluster

Date: 2019-11-23 11:58:41

Tags: scala apache-spark

I am setting up a Spark standalone cluster and creating a Spark session. The following Scala code is used to create the session:

 import org.apache.spark.sql.SparkSession

 val session = SparkSession.builder()
               .master("spark://master_ip:7077")
               .getOrCreate()

I have also edited spark-env.sh to set SPARK_MASTER_HOST on both the master and the worker machines. However, the code does not run and throws the following error/stack trace:

19/11/23 12:46:53 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://master_ip:7077...
19/11/23 12:46:53 INFO TransportClientFactory: Successfully created connection to /master_ip:7077 after 15 ms (0 ms spent in bootstraps)
19/11/23 12:47:13 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://master_ip:7077...
19/11/23 12:47:33 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://master_ip:7077...
19/11/23 12:47:53 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
19/11/23 12:47:53 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
19/11/23 12:47:53 INFO SparkUI: Stopped Spark web UI at http://localhost:4040
19/11/23 12:47:53 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46455.
19/11/23 12:47:53 INFO NettyBlockTransferService: Server created on localhost:46455
19/11/23 12:47:53 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/11/23 12:47:53 INFO StandaloneSchedulerBackend: Shutting down all executors
19/11/23 12:47:53 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
19/11/23 12:47:53 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
19/11/23 12:47:53 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/11/23 12:47:53 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, localhost, 46455, None)
19/11/23 12:47:53 INFO MemoryStore: MemoryStore cleared
19/11/23 12:47:53 INFO BlockManager: BlockManager stopped
19/11/23 12:47:53 INFO BlockManagerMasterEndpoint: Registering block manager localhost:46455 with 1929.9 MB RAM, BlockManagerId(driver, localhost, 46455, None)
19/11/23 12:47:53 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, localhost, 46455, None)
19/11/23 12:47:53 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, localhost, 46455, None)
19/11/23 12:47:53 INFO BlockManagerMaster: BlockManagerMaster stopped
19/11/23 12:47:53 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/11/23 12:47:53 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
    at scala.Predef$.require(Predef.scala:281)
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$5(SparkSession.scala:935)
    at scala.Option.getOrElse(Option.scala:138)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at query.rewrite.QueryRewriteDemo1$.main(QueryRewriteDemo1.scala:12)
    at query.rewrite.QueryRewriteDemo1.main(QueryRewriteDemo1.scala)
19/11/23 12:47:53 INFO SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
    at scala.Predef$.require(Predef.scala:281)
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$5(SparkSession.scala:935)
    at scala.Option.getOrElse(Option.scala:138)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
    at query.rewrite.QueryRewriteDemo1$.main(QueryRewriteDemo1.scala:12)
    at query.rewrite.QueryRewriteDemo1.main(QueryRewriteDemo1.scala)

I even checked the solution here. But that does not seem to be the problem, since the Spark versions on both ends are the same for me (2.4.4). Can someone help me find the issue here?
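For reference, the spark-env.sh change described above is roughly the following (a sketch, not my exact file; `master_ip` is a placeholder for the real routable IP address of the master, and the port assumes the default standalone master port 7077 seen in the logs):

```shell
# conf/spark-env.sh (sketch; master_ip is a placeholder for the actual address)
# Bind the standalone master to the machine's routable IP instead of localhost
SPARK_MASTER_HOST=master_ip
# Default standalone master port, matching spark://master_ip:7077 in the driver
SPARK_MASTER_PORT=7077
```

The same file is present on the worker machines as well, as mentioned above.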

0 Answers:

No answers