Livy session stuck in the starting state after the Spark context is successfully created

Posted: 2020-06-03 14:37:39

Tags: apache-spark ubuntu livy apache-spark-standalone

I have been trying to create a new Spark session using a Livy 0.7 server running on Ubuntu 18.04. On the same machine I have a running Spark cluster with two workers, and I am able to create a normal Spark session against it.

My problem is that after running the following request against the Livy server, the session remains stuck in the starting state:

import json
import requests

# Create a new interactive Spark session on the Livy server
host = 'http://localhost:8998'
data = {'kind': 'spark'}
headers = {'Content-Type': 'application/json'}
r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
r.json()
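
Polling the session over the REST API keeps reporting the starting state (a minimal sketch, continuing from the snippet above; GET /sessions/{id}/state and the id field of the creation response are part of Livy's REST API, and the printed output is just an example):

# Poll the state of the session created above
session_id = r.json()['id']
state = requests.get(host + '/sessions/{}/state'.format(session_id), headers=headers)
print(state.json())  # e.g. {'id': 0, 'state': 'starting'}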

I can see that the session is starting and that the Spark session gets created, both from the session log:

20/06/03 13:52:31 INFO SparkEntries: Spark context finished initialization in 5197ms
20/06/03 13:52:31 INFO SparkEntries: Created Spark session.
20/06/03 13:52:46 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (xxx.xx.xx.xxx:1828) with ID 0
20/06/03 13:52:47 INFO BlockManagerMasterEndpoint: Registering block manager xxx.xx.xx.xxx:1830 with 434.4 MB RAM, BlockManagerId(0, xxx.xx.xx.xxx, 1830, None)

and from the Spark master UI:

[Screenshot: Spark master UI showing the running application]

After livy.rsc.server.idle-timeout is reached, the session log outputs:

20/06/03 14:28:04 WARN RSCDriver: Shutting down RSC due to idle timeout (10m).
20/06/03 14:28:04 INFO SparkUI: Stopped Spark web UI at http://172.17.52.209:4040
20/06/03 14:28:04 INFO StandaloneSchedulerBackend: Shutting down all executors
20/06/03 14:28:04 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
20/06/03 14:28:04 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/06/03 14:28:04 INFO MemoryStore: MemoryStore cleared
20/06/03 14:28:04 INFO BlockManager: BlockManager stopped
20/06/03 14:28:04 INFO BlockManagerMaster: BlockManagerMaster stopped
20/06/03 14:28:04 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/06/03 14:28:04 INFO SparkContext: Successfully stopped SparkContext
20/06/03 14:28:04 INFO SparkContext: SparkContext already stopped.

and then the session dies :(

I have already tried increasing the driver timeout, with no luck, and I haven't found any similar known issues. My guess is that it has something to do with the Spark driver's connectivity back to the RSC, but I have no idea where to configure that.
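
In case it helps with diagnosing, the RSC/driver log shown above can also be pulled through the REST API while the session sits in starting (a sketch; GET /sessions/{id}/log with the from/size query parameters is part of Livy's documented API, and the session id 0 here is assumed to be the one returned on creation):

import requests

host = 'http://localhost:8998'
session_id = 0  # assumed: the id returned when the session was created

# Fetch the most recent log lines of the stuck session
resp = requests.get(host + '/sessions/{}/log'.format(session_id),
                    params={'from': 0, 'size': 100})
for line in resp.json().get('log', []):
    print(line)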

Does anyone know the reason or a solution for this?

Thanks!

0 Answers:

There are no answers yet.