Why doesn't "\\s" work with "rlike" in Spark SQL?

Asked: 2018-01-08 07:47:22

Tags: apache-spark-sql

"\s" does not work, as the following code shows:

scala> case class Item(id: Int, name:String)
defined class Item

scala> val df = Seq((1, "hello"), (2, " "))
df: Seq[(Int, String)] = List((1,hello), (2," "))

scala> df.toDF("id", "name").as[Item].filter("name rlike '\\s'").show()
+---+----+
| id|name|
+---+----+
+---+----+

1 Answer:

Answer 0 (score: 0)

As described in LanguageManual UDF, rlike needs the double-escaped pattern \\s.

In the spark-shell, Scala unescapes the string once more, so additional backslashes are needed:

scala> df.toDF("id", "name").as[Item].filter("name rlike '\\\\s'").show()
+---+----+
| id|name|
+---+----+
|  2|    |
+---+----+

scala> df.toDF("id", "name").as[Item].filter("name rlike '\\\\w'").show()
+---+-----+
| id| name|
+---+-----+
|  1|hello|
+---+-----+
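The two escaping layers can be traced in plain Scala, with no Spark involved. A minimal sketch: the Scala compiler turns the source literal "\\\\s" into the three characters \\s, Spark's SQL string parser then turns \\ into \, and the regex engine finally sees \s:

```scala
object EscapeLayers extends App {
  // Layer 1: the Scala compiler. The source literal "\\\\s"
  // denotes the three characters backslash, backslash, s.
  val afterScala = "\\\\s"
  assert(afterScala == """\\s""")

  // Layer 2: Spark's SQL string parser unescapes \\ to \,
  // leaving the two-character regex \s for the engine.
  // (Modeled here with a plain string replace.)
  val afterSqlParser = afterScala.replace("\\\\", "\\")
  assert(afterSqlParser == """\s""")

  // That regex matches whitespace, which is why row 2 survives the filter.
  assert(" ".matches(afterSqlParser))
  assert(!"hello".matches(afterSqlParser))
  println("escape layers verified")
}
```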

Alternatively, use a triple-quoted string, which Scala does not unescape, for example:

scala> val er = """name rlike '\\s'"""
er: String = name rlike '\\s'

scala> df.toDF("id", "name").as[Item].filter(er).show()
+---+----+
| id|name|
+---+----+
|  2|    |
+---+----+