pyspark does not work when the istio sidecar is injected

Asked: 2019-02-20 14:51:04

Tags: kubernetes pyspark istio

We run pyspark inside a Pod, where we start Spark in standalone mode. When the istio sidecar is injected, the driver cannot connect to the executors; when we disable sidecar injection, everything works fine.
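
For context, this is roughly how the driver session is created inside the Pod (a minimal sketch; the master URL, app name, and the triggering action are placeholders, not our exact code):

    from pyspark.sql import SparkSession

    # Minimal sketch of the setup described above: a driver connecting to a
    # standalone master started in the same Pod. Names/URLs are placeholders.
    spark = (
        SparkSession.builder
        .master("spark://localhost:7077")  # standalone master running in the Pod
        .appName("istio-sidecar-repro")
        .getOrCreate()
    )

    # Any action that ships jars/tasks to the executors reproduces the failure below.
    spark.range(1000).count()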

Does istio also block communication inside the Pod? If so, how can we fix this? Is there a way to whitelist this traffic?
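
If port-level whitelisting is an option, this is the rough idea we had in mind (untested; it assumes Istio's traffic-interception annotations apply to this kind of intra-Pod RPC, and all port numbers are placeholders): pin Spark's normally random RPC ports to fixed values, then exclude those ports from sidecar interception.

    from pyspark import SparkConf
    from pyspark.sql import SparkSession

    # Sketch (untested): pin the driver's RPC ports so they could be excluded
    # from sidecar interception via Pod annotations such as
    #   traffic.sidecar.istio.io/excludeInboundPorts: "7078,7079"
    #   traffic.sidecar.istio.io/excludeOutboundPorts: "7078,7079"
    # The port numbers here are arbitrary placeholders.
    conf = (
        SparkConf()
        .set("spark.driver.port", "7078")        # driver RPC endpoint
        .set("spark.blockManager.port", "7079")  # block manager / shuffle traffic
        .set("spark.port.maxRetries", "0")       # fail fast instead of hopping to random ports
    )

    spark = SparkSession.builder.config(conf=conf).getOrCreate()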

The error when running the pyspark job with the sidecar injected:

        at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:1070)
        at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
        at java.lang.Thread.run(Thread.java:748)
    Caused by: java.nio.channels.ClosedChannelException
        at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)

    Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1887)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1875)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1874)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1874)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2108)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2057)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2046)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
        at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:166)
        ... 32 more
    Caused by: java.io.IOException: Failed to send RPC /jars/com.fasterxml.jackson.core_jackson-databind-2.6.6.jar to default-scheduler-59cb49d65f-ht7j6/192.168.1.197:35688: java.nio.channels.ClosedChannelException
        at org.apache.spark.network.client.TransportClient$2.handleFailure(TransportClient.java:163)
        at org.apache.spark.network.client.TransportClient$StdChannelListener.operationComplete(TransportClient.java:334)
        at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
        at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
        at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
        at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:122)

default-scheduler-59cb49d65f-ht7j6 is the Pod name.
