I'm running my application with Apache Spark and everything has been stable. However, the SparkUI executors endpoint at http://X.X.X.X:4040/executors/ throws a java.io.FileNotFoundException: it cannot find /opt/x/x.jar!/BOOT-INF/lib/spark-core_2.11-2.2.0.jar. I checked, and the inner JAR is present. The problem only occurs on Linux; on Windows it works fine.
2019-04-23 07:01:24,038 WARN [org.spark_project.jetty.servlet.ServletHandler] [SparkUI-36] -
org.spark_project.jetty.servlet.ServletHolder$1: org.glassfish.jersey.server.internal.scanning.ResourceFinderException: java.io.FileNotFoundException: /opt/x/x.jar!/BOOT-INF/lib/spark-core_2.11-2.2.0.jar (No such file or directory)
at org.spark_project.jetty.servlet.ServletHolder.makeUnavailable(ServletHolder.java:594)
at org.spark_project.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:658)
at org.spark_project.jetty.servlet.ServletHolder.getServlet(ServletHolder.java:496)
at org.spark_project.jetty.servlet.ServletHolder.ensureInstance(ServletHolder.java:788)
at org.spark_project.jetty.servlet.ServletHolder.prepare(ServletHolder.java:773)
at org.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:578)
at org.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
at org.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)
at org.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
at org.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.spark_project.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:461)
at org.spark_project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
at org.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at org.spark_project.jetty.server.Server.handle(Server.java:524)
at org.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:319)
at org.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:253)
at org.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
at org.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:95)
at org.spark_project.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
at org.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
at org.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.glassfish.jersey.server.internal.scanning.ResourceFinderException: java.io.FileNotFoundException: /opt/x/x.jar!/BOOT-INF/lib/spark-core_2.11-2.2.0.jar (No such file or directory)
at org.glassfish.jersey.server.internal.scanning.JarZipSchemeResourceFinderFactory.create(JarZipSchemeResourceFinderFactory.java:90)
Answer (score: 0):
The cause of the problem is a limitation of Jersey: it cannot cope with nested JAR files. You need to configure Spring Boot to automatically unpack any JARs containing spark-core resources when the application starts. In my case (I'm using spark-core_2.12), the solution was to add the following section to the pom.xml file:
<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <requiresUnpack>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.12</artifactId>
            </dependency>
        </requiresUnpack>
    </configuration>
</plugin>
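For illustration, here is a minimal Java sketch of why the scanner fails (using the questioner's placeholder path, which does not exist on a real machine). The stack trace shows Jersey reducing the nested-JAR URL to a plain path containing "!" and opening it as an ordinary file; the filesystem treats "!" as a literal character, so the lookup fails exactly as in the log:

```java
import java.io.File;

// Minimal sketch: a nested-JAR location like
// /opt/x/x.jar!/BOOT-INF/lib/spark-core_2.11-2.2.0.jar is not a valid
// filesystem path. The "!" separator is only meaningful inside a
// jar:file:...!/ URL handled by a JAR-aware loader; to the filesystem it is
// just a character in a file name that does not exist.
public class NestedJarPathDemo {

    // Returns whether the given string resolves to a real file on disk.
    static boolean nestedPathExists(String path) {
        return new File(path).exists();
    }

    public static void main(String[] args) {
        String nested = "/opt/x/x.jar!/BOOT-INF/lib/spark-core_2.11-2.2.0.jar";
        // Prints false: the path with "!" inside it is treated literally,
        // mirroring the FileNotFoundException in the Spark UI log above.
        System.out.println(nestedPathExists(nested));
    }
}
```

After adding the plugin configuration, rebuild the fat JAR (mvn clean package) so the spark-core entry is flagged for unpacking; Spring Boot's launcher then extracts it to a temporary directory at startup, where it can be scanned as a regular JAR file.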