I have a strange error with Alluxio on Spark. I read 20,000 files from Alluxio with Spark and it works, but when I read 40,000 files from Alluxio with Spark it does not. I am using Alluxio 1.2 with Spark 1.6.0 and read the data with the file API:
FileSystem fs = FileSystem.Factory.get();
AlluxioURI path = new AlluxioURI("/partition0");
...
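A minimal, self-contained sketch of what the read path does (the real Main.java does more; the /partition0 path, buffer size, and per-file loop are only illustrative):

import alluxio.AlluxioURI;
import alluxio.client.file.FileInStream;
import alluxio.client.file.FileSystem;
import alluxio.client.file.URIStatus;

import java.util.List;

public class Main {
  public static void main(String[] args) throws Exception {
    FileSystem fs = FileSystem.Factory.get();
    AlluxioURI dir = new AlluxioURI("/partition0");
    // listStatus is the RPC that fails in the stack trace below; with 40,000
    // children the serialized response apparently exceeds the 16 MB Thrift frame limit
    List<URIStatus> statuses = fs.listStatus(dir);
    byte[] buffer = new byte[64 * 1024];
    for (URIStatus status : statuses) {
      try (FileInStream in = fs.openFile(new AlluxioURI(status.getPath()))) {
        while (in.read(buffer) != -1) {
          // process the bytes here
        }
      }
    }
  }
}

The run over the directory with 40,000 files fails with: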
16/08/19 16:08:40 INFO logger.type: Client registered with FileSystemMasterClient master @ master/127.0.0.1:19998
16/08/19 16:08:41 ERROR logger.type: Frame size (17277505) larger than max length (16777216)!
org.apache.thrift.transport.TTransportException: Frame size (17277505) larger than max length (16777216)!
at org.apache.thrift.transport.TFramedTransport.readFrame(TFramedTransport.java:137)
at org.apache.thrift.transport.TFramedTransport.read(TFramedTransport.java:101)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
at org.apache.thrift.protocol.TProtocolDecorator.readMessageBegin(TProtocolDecorator.java:135)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
at alluxio.thrift.FileSystemMasterClientService$Client.recv_listStatus(FileSystemMasterClientService.java:503)
at alluxio.thrift.FileSystemMasterClientService$Client.listStatus(FileSystemMasterClientService.java:489)
at alluxio.client.file.FileSystemMasterClient$8.call(FileSystemMasterClient.java:220)
at alluxio.client.file.FileSystemMasterClient$8.call(FileSystemMasterClient.java:216)
at alluxio.AbstractClient.retryRPC(AbstractClient.java:324)
at alluxio.client.file.FileSystemMasterClient.listStatus(FileSystemMasterClient.java:216)
at alluxio.client.file.BaseFileSystem.listStatus(BaseFileSystem.java:195)
at alluxio.client.file.BaseFileSystem.listStatus(BaseFileSystem.java:186)
at Main.main(Main.java:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Exception in thread "main" java.io.IOException: Failed after 32 retries.
at alluxio.AbstractClient.retryRPC(AbstractClient.java:334)
at alluxio.client.file.FileSystemMasterClient.listStatus(FileSystemMasterClient.java:216)
at alluxio.client.file.BaseFileSystem.listStatus(BaseFileSystem.java:195)
at alluxio.client.file.BaseFileSystem.listStatus(BaseFileSystem.java:186)
at Main.main(Main.java:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
It is not an alluxio.security.authentication.type problem, because I run Alluxio locally and the Alluxio master address is correct. I don't understand why it fails with 40,000 files when it works with 20,000.
I also changed alluxio.network.thrift.frame.size.bytes.max, but it did not change the result.
Answer (score: 0)
This problem can have a few different causes:
Double-check that the port in the Alluxio master address is correct. The Alluxio master listens on port 19998 by default, and a common mistake that produces this error message is using the wrong port in the master address (for example port 19999, which is the default port of the Alluxio master web UI).
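For example, with a local master, the client-side settings (picked up from an alluxio-site.properties on the classpath; property names are those of Alluxio 1.x and the hostname is whatever your master actually uses) should point at the RPC port, not the web UI port:

# alluxio-site.properties on the client / Spark side
# 19998 is the master RPC port; 19999 is only the web UI port
alluxio.master.hostname=localhost
alluxio.master.port=19998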
Make sure the security settings of the Alluxio client and the master are consistent. Alluxio authenticates users in different ways depending on the property alluxio.security.authentication.type. If this property has different values on the server and on the client (for example, one side uses the default NOSASL while the other is customized to SIMPLE), this error occurs. See the configuration settings documentation for how to customize Alluxio clusters and applications.
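For example, the same value (SIMPLE here is only an illustration) has to appear both in the master's alluxio-site.properties and in the one the client reads:

# identical on the master and on the client
alluxio.security.authentication.type=SIMPLE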
The configuration between Apache Spark and Alluxio. Spark's JVM has to pick up the alluxio.network.thrift.frame.size.bytes.max value you set in alluxio/conf/alluxio-site.properties. To do that, add export SPARK_CLASSPATH=${ALLUXIO_HOME}/conf:${SPARK_CLASSPATH} to spark-env.sh, or pass --driver-class-path pathAlluxio/conf to the spark-submit command, as sketched below.
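A sketch of that setup (the 32MB value is only an example that comfortably covers the 17,277,505-byte frame from the log; adjust the paths to your installation):

# alluxio/conf/alluxio-site.properties
alluxio.network.thrift.frame.size.bytes.max=32MB

# spark-env.sh: put the Alluxio conf directory on Spark's classpath
export SPARK_CLASSPATH=${ALLUXIO_HOME}/conf:${SPARK_CLASSPATH}

# or per job, on the spark-submit command line
spark-submit --driver-class-path ${ALLUXIO_HOME}/conf ...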
For me, it was the third solution (the Spark classpath configuration).