Cannot connect from Java to Spark running in Kubernetes

Asked: 2018-03-19 11:44:51

Tags: java apache-spark kubernetes remote-access remote-server

I have installed Kubernetes (minikube for Windows 10) and added Spark to it using Helm:

.\helm.exe install --name spark-test stable/spark
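As a quick sanity check (assuming standard Helm v2 and kubectl tooling), the release and its pods can be verified with:

# the release should show STATUS: DEPLOYED
.\helm.exe ls
# the master and worker pods should be Running
.\kubectl.exe get pods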

Then I exposed the Spark master port 7077 using:

.\kubectl.exe expose deployment spark-test-master --port=7070 --name=spark-master-ext --type=NodePort
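To see which NodePort was actually assigned to the service (an extra check, assuming standard kubectl):

# shows the service and its <port>:<nodePort> mapping
.\kubectl.exe get svc spark-master-ext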

So, for example, my UI runs at http://<MINIKUBE_IP>:31905/ and the Spark master is exposed at <MINIKUBE_IP>:32473. To check, I ran:

.\minikube-windows-amd64.exe service spark-master-ext

But when I do this in Java:

SparkConf conf = new SparkConf().setMaster("spark://192.168.1.168:32473").setAppName("Data Extractor");
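For context, a minimal sketch of how that conf is used; the surrounding class and SparkContext creation are assumed, not shown in the original post:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class DataExtractor {
    public static void main(String[] args) {
        // point the driver at the NodePort-exposed standalone master
        SparkConf conf = new SparkConf()
                .setMaster("spark://192.168.1.168:32473")
                .setAppName("Data Extractor");
        // connecting here is where the association failure below occurs
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            System.out.println("Connected, Spark version: " + sc.version());
        }
    }
}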

I get:

18/03/19 13:57:29 WARN AppClient$ClientEndpoint: Could not connect to 192.168.1.168:32473: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@192.168.1.168:32473]
18/03/19 13:57:29 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://sparkMaster@192.168.1.168:32473] has failed, address is now gated for [5000] ms. Reason: [Association failed with [akka.tcp://sparkMaster@192.168.1.168:32473]] Caused by: [Connection refused: no further information: /192.168.1.168:32473]
18/03/19 13:57:29 WARN AppClient$ClientEndpoint: Failed to connect to master 192.168.1.168:32473
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Anchor(akka.tcp://sparkMaster@192.168.1.168:32473/), Path(/user/Master)]

Any ideas on how to run a Java Spark job against Spark running on Minikube?

1 Answer:

Answer 0 (score: 1)

It looks like the Spark Helm chart really is outdated (it ships Spark 1.5.1), so I installed Spark 2.3.0 locally and it runs without any problems. Case closed, sorry :)
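For completeness, a sketch of what "installed 2.3.0 locally" might look like on Windows; the standalone-master approach and paths are assumptions, not stated in the answer:

# from the unpacked Spark 2.3.0 distribution, start a local standalone master
.\bin\spark-class.cmd org.apache.spark.deploy.master.Master
# then point the Java driver at it instead of the Minikube NodePort, e.g.:
# new SparkConf().setMaster("spark://localhost:7077").setAppName("Data Extractor")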