How can I convert a space-delimited file to a CSV file in Scala Spark?

Asked: 2019-06-25 02:41:02

Tags: scala apache-spark-sql

I have a space-delimited file. This is my input:

a _ \_ \ b_c b\_c "

Now I want to convert this space-delimited file to a CSV file. How can I do that?

An unspecified field is treated as "String 0" and is not enclosed in quotes.

Here is the specification:

1. The string "_" by itself is converted to a null string.
   (the -n option changes "_")

2. The string \c is converted to c.

3. The backslash character \ by itself is converted to a space.

4. The underscore is converted to a space if it occurs in a string.
   (the -s option changes "_")

5. \n at the end of a line is converted automatically to \r\n.

6. Within String 1, " is converted to "".

I want the expected output shown below. Please help me.

"a","","_"," ","b c","b_c",""""

1 Answer:

Answer 0: (score: 1)

The requirements confuse me a bit, but you can try the following approach (it produces the expected output):

import scala.util.matching.Regex

val input = "a _ \\_ \\ b_c b\\_c \""

// List of required replacements (applied in order, first to last)
val replacements: List[(Regex, String)] = List(
  ("""^_$""".r,         ""),
  ("""(?<!\\)_""".r,    " "),
  ("""\\(.)""".r,       "$1"),
  ("""\\""".r,          " "),
  (""""""".r,           "\"\""))

// Apply each (regex, replacement) pair to the input string, in list order
def applyReplacements(inputString: String, replacements: List[(Regex, String)]): String =
  replacements match {
    case Nil =>
      inputString
    case (regex, replacement) :: tail =>
      applyReplacements(regex.replaceAllIn(inputString, replacement), tail)
  }

def processLine(input: String): String = {
  val inputArray = input.split(" ")
  val outputArray = inputArray.map(x => applyReplacements(x, replacements))
  val finalLine = outputArray.map(x => s"""\"${x}\"""").mkString(",")

  // Use s"${finalLine}\r\n" instead if you need the '\r\n' ending
  finalLine
}

processLine(input)
// output:
// String = "a","","_"," ","b c","b_c",""""
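As a side note, the recursive applyReplacements helper above is equivalent to a left fold over the replacement list. A minimal self-contained sketch (the two rules shown are just a subset of the full list, for illustration):

```scala
import scala.util.matching.Regex

// Equivalent non-recursive form of applyReplacements: fold the replacement
// list over the input string, applying each regex substitution in order.
def applyReplacements(input: String, replacements: List[(Regex, String)]): String =
  replacements.foldLeft(input) { case (acc, (regex, replacement)) =>
    regex.replaceAllIn(acc, replacement)
  }

// Subset of the rules: unescaped "_" becomes a space, then "\c" becomes "c"
val rules: List[(Regex, String)] = List(
  ("""(?<!\\)_""".r, " "),
  ("""\\(.)""".r,    "$1"))

println(applyReplacements("b\\_c", rules)) // b_c
println(applyReplacements("b_c", rules))   // b c
```

The fold makes the "apply first replacement first" ordering explicit without the manual recursion.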

You will probably have to make some modifications to fully adapt it to your requirements (which are not entirely clear to me).

If you need to apply this to a Spark RDD, put processLine inside a map so that it processes every line of the RDD.
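For example, a sketch of the per-line usage (the spark session and the "input.txt"/"output" paths in the comment are hypothetical; a local Seq stands in for the RDD so the snippet runs without a Spark cluster):

```scala
import scala.util.matching.Regex

// Same replacement pipeline as above, condensed into a fold
val replacements: List[(Regex, String)] = List(
  ("""^_$""".r,      ""),
  ("""(?<!\\)_""".r, " "),
  ("""\\(.)""".r,    "$1"),
  ("""\\""".r,       " "),
  ("\"".r,           "\"\""))

def processLine(line: String): String =
  line.split(" ")
    .map(tok => replacements.foldLeft(tok) { case (s, (re, rep)) => re.replaceAllIn(s, rep) })
    .map(tok => "\"" + tok + "\"")
    .mkString(",")

// On a real RDD (hypothetical session and path names):
//   spark.sparkContext.textFile("input.txt").map(processLine).saveAsTextFile("output")
// Locally, a Seq of lines stands in for the RDD:
val lines = Seq("a _ \\_ \\ b_c b\\_c \"")
lines.map(processLine).foreach(println)
// prints: "a","","_"," ","b c","b_c",""""
```

Since processLine is a pure String => String function, the same code works unchanged whether you map it over a Seq, an RDD, or a Dataset[String].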

I hope this helps.