I want to write text files to HDFS. The path the file is written to must be generated dynamically. If the file path (including the file name) is new, the file should be created and the text written to it. If the file path (including the file) already exists, the string must be appended to the existing file.
I used the following code. File creation works fine, but appending text to an existing file does not.
def writeJson(uri: String, Json: JValue, time: Time): Unit = {
  val path = new Path(generateFilePath(Json, time))
  val conf = new Configuration()
  conf.set("fs.defaultFS", uri)
  conf.set("dfs.replication", "1")
  conf.set("dfs.support.append", "true")
  conf.set("dfs.client.block.write.replace-datanode-on-failure.enable", "false")
  val Message = compact(render(Json)) + "\n"
  try {
    val fileSystem = FileSystem.get(conf)
    if (fileSystem.exists(path)) {
      println("File exists.")
      val outputStream = fileSystem.append(path)
      val bufferedWriter = new BufferedWriter(new OutputStreamWriter(outputStream))
      bufferedWriter.write(Message)
      bufferedWriter.close()
      println("Appended to file in path : " + path)
    } else {
      println("File does not exist.")
      val outputStream = fileSystem.create(path, true)
      val bufferedWriter = new BufferedWriter(new OutputStreamWriter(outputStream))
      bufferedWriter.write(Message)
      bufferedWriter.close()
      println("Created file in path : " + path)
    }
  } catch {
    case e: Exception =>
      e.printStackTrace()
  }
}
Hadoop version: 2.7.0
Whenever an append has to be performed, the following error is generated:
org.apache.hadoop.ipc.RemoteException(java.lang.ArrayIndexOutOfBoundsException)
Answer 0 (score: 1)
I can see 3 possibilities:

1. use the external commands provided by hdfs, see: https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/HDFSCommands.html. Or even the WebHDFS REST functionality: https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/WebHDFS.html (rough sketches of both follow after this list)
2. use the hdfs API provided by the hadoop-hdfs library: http://mvnrepository.com/artifact/org.apache.hadoop/hadoop-hdfs/2.7.1
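For the first possibility, here is a minimal sketch of shelling out to the hdfs CLI from Scala; it assumes the hdfs command is on the PATH of the machine running the code, and appendToHdfs is just a made-up helper name for illustration:

import java.io.ByteArrayInputStream
import java.nio.charset.StandardCharsets
import scala.sys.process._

// Pipes `message` to `hdfs dfs -appendToFile - <dst>`; with "-" as the source,
// the command reads the data to append from stdin.
def appendToHdfs(dst: String, message: String): Int = {
  val in = new ByteArrayInputStream(message.getBytes(StandardCharsets.UTF_8))
  (Seq("hdfs", "dfs", "-appendToFile", "-", dst) #< in).! // exit code of the command
}

Under the hood the shell command still performs an HDFS append when the file exists, so the same cluster-side append settings apply; it is a convenience wrapper rather than a workaround for the error.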
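And a rough sketch of the WebHDFS route using plain java.net.HttpURLConnection; the NameNode address, port 50070 and the user.name value are placeholders, and response-code checks and error handling are left out. Both CREATE and APPEND are two-step calls: the NameNode replies with a 307 redirect whose Location header points at a DataNode, and the data is sent there in a second request:

import java.net.{HttpURLConnection, URL}

// Step 1: ask the NameNode where to send the data (307 + Location header).
// Step 2: send the payload to the returned DataNode location.
def webHdfsWrite(nameNode: String, file: String, query: String,
                 method: String, data: Array[Byte]): Int = {
  val first = new URL(s"$nameNode/webhdfs/v1$file?$query")
    .openConnection().asInstanceOf[HttpURLConnection]
  first.setRequestMethod(method)
  first.setInstanceFollowRedirects(false) // keep the 307 so the Location header can be read
  val location = first.getHeaderField("Location")
  first.disconnect()

  val second = new URL(location).openConnection().asInstanceOf[HttpURLConnection]
  second.setRequestMethod(method)
  second.setDoOutput(true)
  val out = second.getOutputStream
  try out.write(data) finally out.close()
  second.getResponseCode // 201 is expected for CREATE, 200 for APPEND
}

// Usage sketch: create the file when it is new, append otherwise.
// val bytes = Message.getBytes("UTF-8")
// webHdfsWrite("http://namenode:50070", "/some/dir/file.json", "op=CREATE&overwrite=false&user.name=hdfs", "PUT", bytes)
// webHdfsWrite("http://namenode:50070", "/some/dir/file.json", "op=APPEND&user.name=hdfs", "POST", bytes)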