When I submit a Spark application to the Spark REST URL, I always get the following exception:
18/04/13 11:54:29 ERROR TransportResponseHandler: Still have 1 requests outstanding when connection from /10.11.9.2:6066 is closed
18/04/13 11:54:29 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master 10.11.9.2:6066
org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100)
at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108)
at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
18/04/13 11:55:09 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
at scala.Predef$.require(Predef.scala:224)
at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:91)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:524)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2516)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:918)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:910)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:910)
at io.kf.etl.context.Context$$anonfun$getSparkSession$2.apply(Context.scala:76)
at io.kf.etl.context.Context$$anonfun$getSparkSession$2.apply(Context.scala:59)
at scala.Option.map(Option.scala:146)
at io.kf.etl.context.Context$.getSparkSession(Context.scala:59)
at io.kf.etl.context.Context$.sparkSession$lzycompute(Context.scala:20)
at io.kf.etl.context.Context$.sparkSession(Context.scala:20)
at io.kf.etl.processors.common.inject.ProcessorInjectModule.sparkSession$lzycompute(ProcessorInjectModule.scala:8)
at io.kf.etl.processors.common.inject.ProcessorInjectModule.sparkSession(ProcessorInjectModule.scala:8)
at io.kf.etl.processors.download.inject.DownloadInjectModule.getContext(DownloadInjectModule.scala:40)
at io.kf.etl.processors.download.inject.DownloadInjectModule.getProcessor(DownloadInjectModule.scala:54)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.google.inject.internal.ProviderMethod.get(ProviderMethod.java:104)
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:40)
at com.google.inject.internal.InjectorImpl$4$1.call(InjectorImpl.java:978)
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1024)
at com.google.inject.internal.InjectorImpl$4.get(InjectorImpl.java:974)
at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1013)
at io.kf.etl.ETLMain$.delayedEndpoint$io$kf$etl$ETLMain$1(ETLMain.scala:42)
at io.kf.etl.ETLMain$delayedInit$body.apply(ETLMain.scala:17)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at io.kf.etl.ETLMain$.main(ETLMain.scala:17)
at io.kf.etl.ETLMain.main(ETLMain.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58)
at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
18/04/13 11:55:09 INFO SparkContext: SparkContext already stopped.
18/04/13 11:55:09 INFO SparkContext: Successfully stopped SparkContext
I am running Spark 2.2.1 on macOS.
The configuration is as follows:
SPARK_LOCAL_IP=10.11.9.2
SPARK_MASTER_HOST=10.11.9.2
The submit command line is:
${SPARK_HOME}/bin/spark-submit --master spark://10.11.9.2:6066 --deploy-mode cluster --class ....
If I submit the application to port 7077 instead, everything works fine.
Answer 0 (score: 0)
The hidden REST API is not meant to be used with spark-submit. Instead, all parameters and the job definition should be submitted as an HTTP request to http://rest-ip:6066/v1/submissions/create.
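For illustration, here is a minimal sketch (not this project's actual submission code) of such a request sent from Scala with the JDK HTTP classes; the jar path and app name are placeholders, and the JSON body is assumed to follow the CreateSubmissionRequest shape the standalone REST server expects:

import java.net.{HttpURLConnection, URL}
import java.nio.charset.StandardCharsets

object RestSubmitSketch {
  def main(args: Array[String]): Unit = {
    // Placeholder jar path and app name; mainClass taken from the stack trace above.
    val payload =
      """{
        |  "action": "CreateSubmissionRequest",
        |  "appResource": "file:/path/to/your-app.jar",
        |  "clientSparkVersion": "2.2.1",
        |  "mainClass": "io.kf.etl.ETLMain",
        |  "appArgs": [],
        |  "environmentVariables": { "SPARK_ENV_LOADED": "1" },
        |  "sparkProperties": {
        |    "spark.app.name": "kf-etl",
        |    "spark.master": "spark://10.11.9.2:6066",
        |    "spark.submit.deployMode": "cluster",
        |    "spark.jars": "file:/path/to/your-app.jar",
        |    "spark.driver.supervise": "false"
        |  }
        |}""".stripMargin

    // POST the JSON to the REST submission endpoint instead of using spark-submit.
    val conn = new URL("http://10.11.9.2:6066/v1/submissions/create")
      .openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("POST")
    conn.setRequestProperty("Content-Type", "application/json;charset=UTF-8")
    conn.setDoOutput(true)
    conn.getOutputStream.write(payload.getBytes(StandardCharsets.UTF_8))
    conn.getOutputStream.close()

    // On success the server replies with a JSON body containing a submissionId.
    val response = scala.io.Source.fromInputStream(conn.getInputStream).mkString
    println(response)
  }
}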
Answer 1 (score: 0)
I figured it out myself.
The submit command line is fine, but when initializing the SparkSession I was also passing spark://10.11.9.2:6066 as the master string.
If spark://10.11.9.2:7077 is passed instead, everything works well.
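For reference, a minimal sketch of the fixed session construction (the app name is a placeholder): the point is that the master URL inside the application uses the legacy 7077 port, while the REST 6066 port is only given to spark-submit.

import org.apache.spark.sql.SparkSession

// Sketch: inside the application, point the builder at the legacy master
// port (7077), not the REST submission port (6066).
val spark = SparkSession
  .builder()
  .appName("kf-etl")                    // placeholder name
  .master("spark://10.11.9.2:7077")     // NOT spark://10.11.9.2:6066
  .getOrCreate()

Alternatively, omitting .master() in the application code and letting spark-submit supply the master avoids this kind of mismatch entirely.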