In my Kubernetes cluster, my spark-submit produces this error:
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: localhost/127.0.0.1:7078
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:714)
at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:327)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:688)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:635)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:552)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:514)
at io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
Part of my spark-submit command is:
spark-submit \
--master $K8S_MASTER_URL \
--deploy-mode cluster \
--conf spark.executor.instances=$N_EXECUTORS \
--conf spark.kubernetes.container.image=$SPARK_IMAGE \
--conf spark.kubernetes.authenticate.driver.serviceAccountName=$SPARK_SERVICE_ACCOUNT \
--conf spark.kubernetes.node.selector.spark=true \
--conf spark.kubernetes.container.image.pullPolicy=Always \
--conf spark.kubernetes.namespace=$SPARK_NAMESPACE \
--conf spark.driver.memory=$DRIVER_MEMORY \
--conf spark.executor.memory=$EXECUTOR_MEMORY \
--conf spark.kubernetes.driver.pod.name=dbloader-events-$TS \
--conf spark.kubernetes.pyspark.pythonVersion="3" \
--conf spark.executorEnv.PYSPARK_PYTHON=/usr/bin/python3 \
--conf spark.driverEnv.PYSPARK_PYTHON=/usr/bin/python3 \
--conf spark.driverEnv.PYTHONPATH=/usr/bin \
--conf spark.executorEnv.PYTHONPATH=/usr/bin \
--conf spark.driverEnv.PYSPARK_DRIVER_PYTHON=/usr/bin \
--jars $JAR_DEPENDENCIES
With this spark-submit, I create a pod that launches a PySpark job. But the job fails with the error above. I need help.
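One thing I am unsure about: `spark.driverEnv.*` does not appear to be a documented Spark property namespace; in the Spark-on-Kubernetes docs, driver environment variables use the form `spark.kubernetes.driverEnv.[EnvironmentVariableName]`, and `PYSPARK_DRIVER_PYTHON` is normally a path to a Python binary rather than a directory. Could my problem be that these settings are silently ignored? The lines I suspect would instead look something like this (a sketch of what I think the documented form is, not a confirmed fix):

```shell
# Documented Kubernetes form for driver env vars is
# spark.kubernetes.driverEnv.[EnvironmentVariableName];
# spark.executorEnv.[Name] is already the correct form for executors.
--conf spark.kubernetes.driverEnv.PYSPARK_PYTHON=/usr/bin/python3 \
--conf spark.kubernetes.driverEnv.PYTHONPATH=/usr/bin \
# Pointing at the interpreter binary, not the /usr/bin directory:
--conf spark.kubernetes.driverEnv.PYSPARK_DRIVER_PYTHON=/usr/bin/python3 \
```

Port 7078 in the stack trace is, as far as I can tell, the default driver port Spark uses in Kubernetes mode, and the executor is resolving the driver as `localhost` instead of the driver pod's service, which is why I suspect a configuration problem on my side.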