I have an Array of Maps that looks like this:
test: Array[scala.collection.immutable.Map[String,Any]] = Array(
Map(_c3 -> "foobar", _c5 -> "impt", _c0 -> Key1, _c4 -> 20.0, _c1 -> "next", _c2 -> 1.0),
Map(_c3 -> "high", _c5 -> "low", _c0 -> Key2, _c4 -> 19.0, _c1 -> "great", _c2 -> 0.0),
Map(_c3 -> "book", _c5 -> "game", _c0 -> Key3, _c4 -> 42.0, _c1 -> "name", _c2 -> 0.5)
)
How can I convert this into (key, value) pairs keyed by the _c0 entry, keeping only the values that are Strings?
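For reference, here is a minimal reconstruction of the input above as a plain Scala value (an assumption based on the printed output: the quoted values and the Key* entries are Strings, the numeric ones are Doubles), so the answers below can be tried directly:

val test: Array[Map[String, Any]] = Array(
  Map("_c0" -> "Key1", "_c1" -> "next",  "_c2" -> 1.0, "_c3" -> "foobar", "_c4" -> 20.0, "_c5" -> "impt"),
  Map("_c0" -> "Key2", "_c1" -> "great", "_c2" -> 0.0, "_c3" -> "high",   "_c4" -> 19.0, "_c5" -> "low"),
  Map("_c0" -> "Key3", "_c1" -> "name",  "_c2" -> 0.5, "_c3" -> "book",   "_c4" -> 42.0, "_c5" -> "game")
)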
Answer 0 (score: 0)
Take a look at the following:
test.map(
  // keep only the entries whose value does not look like a number
  _.filter(!_._2.toString.matches("[+-]?\\d+(\\.\\d+)?"))
).flatMap { data =>
  // the _c0 entry becomes the key for every remaining value of the row
  val key = data.getOrElse("_c0", "key_not_found")
  data
    .filter(_._1 != "_c0")
    .map(key + " " + _._2.toString)
}
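Note that this produces space-separated "key value" Strings. If actual (key, value) tuples are preferred, a small variation of the same idea (just a sketch, using a type pattern instead of the regex to keep only the String values):

val pairs: Array[(String, String)] = test.flatMap { row =>
  // pair the _c0 entry with every remaining String value of the row
  val key = row.getOrElse("_c0", "key_not_found").toString
  row.toSeq.collect { case (k, v: String) if k != "_c0" => key -> v }
}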
Answer 1 (score: 0)
Try this approach:
import org.apache.spark.sql.functions._
// first extract, per row, the key and all values that are Strings
val rdd = sc.parallelize(test).map { x =>
  x.getOrElse("_c0", "no key").toString ->
    (x - "_c0").values.collect { case s: String => s }.toList
}
val df = spark.createDataFrame(rdd).toDF("key", "vals")
// use the explode function to turn every element of the list into its own row
df.withColumn("vals", explode(col("vals"))).show()
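If the result is needed back on the driver as plain (key, value) pairs rather than as a DataFrame, the exploded rows can be collected (a sketch, reusing the df defined above):

val pairs: Array[(String, String)] = df
  .withColumn("vals", explode(col("vals")))
  .collect()
  .map(r => (r.getAs[String]("key"), r.getAs[String]("vals")))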
Answer 2 (score: 0)
How about:
test
  .map(row => row.getOrElse("_c0", "") -> (row - "_c0").values.filter(_.isInstanceOf[String]))
  .flatMap { case (key, innerList) => innerList.map(key -> _) }
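Since getOrElse and the isInstanceOf filter both leave the static element type at Any, the result above is an Array[(Any, Any)]. If properly typed (String, String) pairs are wanted, a collect with a type pattern narrows it (a minimal sketch of the same approach):

test
  .map(row => row.getOrElse("_c0", "").toString -> (row - "_c0").values.collect { case s: String => s })
  .flatMap { case (key, strings) => strings.map(key -> _) }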