In my Spark application I save a DataFrame as a Parquet file like this:
comp_df.write.mode("overwrite").saveAsTable("cdr_step1", format="parquet", path="/data/intermediate_data/cdr_step1/")
This works fine when the DataFrame is small, but as the dataset grows I get the error below. I searched for this problem online; in most cases people resolved it by redesigning their code. In my case there is only this single write operation, and I don't see what I would need to change.
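For context, the same write restated as a small function — a minimal sketch, assuming an active SparkSession and an existing DataFrame `comp_df` (only the table name and path from above are real; the function name is a placeholder):

```python
def write_step1(comp_df):
    # Single write: overwrite the external Parquet table "cdr_step1"
    # backed by the given HDFS path. This is the only write in the app.
    comp_df.write.mode("overwrite").saveAsTable(
        "cdr_step1",
        format="parquet",
        path="/data/intermediate_data/cdr_step1/",
    )
```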
17/02/02 13:22:56 ERROR datasources.DefaultWriterContainer: Job job_201702021228_0000 aborted.
17/02/02 13:22:56 INFO yarn.YarnAllocator: Driver requested a total number of 0 executor(s).
17/02/02 13:22:56 WARN scheduler.TaskSetManager: Lost task 1979.0 in stage 3.0 (TID 1984, slv3.cdh-prod.xxxx.com, executor 86): org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException): No lease on /data/intermediate_data/cdr_step1/_temporary/0/_temporary/attempt_201702021322_0003_m_001979_0/part-r-01979-9fe33b7c-0b14-4e63-8e96-6e83aabbe807.gz.parquet (inode 2144221): File does not exist. Holder DFSClient_NONMAPREDUCE_-1523564925_148 does not have any open files.
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkLease(FSNamesystem.java:3635)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.analyzeFileState(FSNamesystem.java:3438)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3294)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:679)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.addBlock(AuthorizationProviderProxyClientProtocol.java:214)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:489)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080)