I'm sending requests with Play's WsClient to a Spray server endpoint that sits in front of a Spark driver program. When the request executes, I get the error shown in the stack trace below; it looks like WSClient is picking up a version of Netty that doesn't contain the relevant method.

The problem appears when I compile the application against the 2.2-SNAPSHOT version of Spark, but not when I compile against the 2.1 release, and I don't know why that change should make a difference. The Spark driver program is a separate project in my sbt build.

I suspect this has something to do with how the application and its dependencies are packaged (what I've tried in sbt is described below). The call in question is here:
def serializeDataset(requestUrl: String, recipe: Recipe): Future[(Option[String], String, Int)] = {
  ws.url(requestUrl).post(Json.toJson(recipe)).map { response =>
    val code = (response.json \ "code").as[Int]
    code match {
      case OK => ((response.json \ "uuid").asOpt[String], (response.json \ "schema").as[String], code)
      case _ => ((response.json \ "message").asOpt[String], "", code)
    }
  }
}
One thing I tried in sbt was adding exclude statements to the Spark imports, but it made no difference. When the call executes, I get this error:
Caused by: java.lang.NoSuchMethodError: io.netty.util.internal.PlatformDependent.newAtomicIntegerFieldUpdater(Ljava/lang/Class;Ljava/lang/String;)Ljava/util/concurrent/atomic/AtomicIntegerFieldUpdater;
at org.asynchttpclient.netty.NettyResponseFuture.<clinit>(NettyResponseFuture.java:52)
at org.asynchttpclient.netty.request.NettyRequestSender.newNettyResponseFuture(NettyRequestSender.java:311)
at org.asynchttpclient.netty.request.NettyRequestSender.newNettyRequestAndResponseFuture(NettyRequestSender.java:193)
at org.asynchttpclient.netty.request.NettyRequestSender.sendRequestWithCertainForceConnect(NettyRequestSender.java:129)
at org.asynchttpclient.netty.request.NettyRequestSender.sendRequest(NettyRequestSender.java:107)
at org.asynchttpclient.DefaultAsyncHttpClient.execute(DefaultAsyncHttpClient.java:216)
at org.asynchttpclient.DefaultAsyncHttpClient.executeRequest(DefaultAsyncHttpClient.java:184)
at play.api.libs.ws.ahc.AhcWSClient.executeRequest(AhcWS.scala:45)
at play.api.libs.ws.ahc.AhcWSRequest$.execute(AhcWS.scala:90)
at play.api.libs.ws.ahc.AhcWSRequest$$anon$2.execute(AhcWS.scala:166)
at play.api.libs.ws.ahc.AhcWSRequest.execute(AhcWS.scala:168)
at play.api.libs.ws.WSRequest$class.post(WS.scala:510)
at play.api.libs.ws.ahc.AhcWSRequest.post(AhcWS.scala:107)
at webservices.DataFrameService.serializeDataset(DataFrameService.scala:36)
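A quick way to confirm which Netty jar actually ends up on the runtime classpath is to ask the classloader where the offending class was loaded from. For example (an illustrative helper, not part of the project in question):

```scala
// Sketch: print which jar a class is loaded from, to confirm which
// Netty version wins on the runtime classpath.
object WhichJar {
  def locate(className: String): String = {
    // Turn "io.netty.Foo" into the resource path "io/netty/Foo.class"
    val resource = className.replace('.', '/') + ".class"
    val url = getClass.getClassLoader.getResource(resource)
    if (url == null) s"$className not found on classpath" else url.toString
  }

  def main(args: Array[String]): Unit =
    println(locate("io.netty.util.internal.PlatformDependent"))
}
```

The printed URL includes the jar file name, which usually carries the version number, so a mismatch with the version play-ws expects is immediately visible.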
I also tried changing the order in which the play-ws module is added to the project's dependencies (moving it last, moving it first), with no effect.
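For concreteness, an exclude attempt of this kind in sbt typically looks like the snippet below; the artifact coordinates and version numbers here are assumptions, not the exact ones from my build:

```scala
// build.sbt (sketch): strip Netty out of the Spark dependency so that the
// Netty version required by play-ws / async-http-client wins.
libraryDependencies +=
  "org.apache.spark" %% "spark-core" % "2.2.0-SNAPSHOT" excludeAll (
    ExclusionRule(organization = "io.netty")
  )

// Another option is to pin a single Netty version for the whole build:
dependencyOverrides += "io.netty" % "netty-all" % "4.0.43.Final"
```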
Any help is greatly appreciated.
Answer 0 (score: 0)
On further review, I found that there was a lingering dependency on the Spark libraries in the Play project. I removed it, and it appears to be working now.
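In case it helps others, the leftover was a direct Spark dependency in the Play project's own build, along the lines of the sketch below (the coordinates are illustrative assumptions); deleting that line resolved the Netty clash:

```scala
// Play project's build.sbt (sketch). A leftover spark-core dependency like
// this drags in its own Netty and clashes with the one play-ws needs;
// removing the line fixed the NoSuchMethodError.
libraryDependencies ++= Seq(
  "com.typesafe.play" %% "play-ws" % "2.5.10",
  "org.apache.spark" %% "spark-core" % "2.1.0" // leftover, delete this
)
```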