Convert a List to a DataFrame in Spark Scala

Asked: 2017-01-26 04:57:51

Tags: scala apache-spark apache-spark-sql spark-dataframe

I have a list containing more than 30 strings. How do I convert the list to a DataFrame? What I have tried:

For example:

val list = List("a","b","v","b").toDS().toDF()

Output:


+-------+
|  value|
+-------+
|a      |
|b      |
|v      |
|b      |
+-------+


Expected output:


  +---+---+---+---+
| _1| _2| _3| _4|
+---+---+---+---+
|  a|  b|  v|  a|
+---+---+---+---+

Any help with this is appreciated.

3 Answers:

Answer 0 (score: 4):

List("a","b","c","d")表示包含一个字段的记录,因此结果集在每行中显示一个元素。

To get the expected output, the row should contain four fields/elements. So we wrap the list as List(("a","b","c","d")), representing one row with four fields. In the same way, a list with two rows would be List(("a1","b1","c1","d1"), ("a2","b2","c2","d2")).

scala> val list = sc.parallelize(List(("a", "b", "c", "d"))).toDF()
list: org.apache.spark.sql.DataFrame = [_1: string, _2: string, _3: string, _4: string]

scala> list.show
+---+---+---+---+
| _1| _2| _3| _4|
+---+---+---+---+
|  a|  b|  c|  d|
+---+---+---+---+


scala> val list = sc.parallelize(List(("a1","b1","c1","d1"),("a2","b2","c2","d2"))).toDF
list: org.apache.spark.sql.DataFrame = [_1: string, _2: string, _3: string, _4: string]

scala> list.show
+---+---+---+---+
| _1| _2| _3| _4|
+---+---+---+---+
| a1| b1| c1| d1|
| a2| b2| c2| d2|
+---+---+---+---+
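
Note that Scala tuples are limited to 22 elements, so a list of 30+ strings (as in the question) cannot be wrapped into a single tuple this way. A rough sketch of the general case, building a single Row with an explicit schema instead (the column names _1 ... _n and the sample strings are just illustrative, and a running SparkSession named spark is assumed):

import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Stand-in for the question's 30+ strings.
val strings = (1 to 30).map(i => s"v$i").toList

// One StructField per element; every column is treated as a string here.
val schema = StructType(strings.indices.map(i => StructField(s"_${i + 1}", StringType)))

val singleRowDf = spark.createDataFrame(
  spark.sparkContext.parallelize(Seq(Row.fromSeq(strings))),
  schema)

singleRowDf.show()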

Answer 1 (score: 3):

In order to use toDF, we have to import

import spark.sqlContext.implicits._

See the code below:

import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder
  .master("local[*]")
  .appName("Simple Application")
  .getOrCreate()

import spark.sqlContext.implicits._

// Pattern-match each inner List into a (String, Int) tuple so it becomes two columns.
val lstData = List(List("vks", 30), List("harry", 30))
val mapLst = lstData.map { case List(a: String, b: Int) => (a, b) }
val lstToDf = spark.sparkContext.parallelize(mapLst).toDF("name", "age")
lstToDf.show

// A local Seq of tuples can also be converted directly, without parallelize.
val llist = Seq(("bob", "2015-01-13", 4), ("alice", "2015-04-23", 10)).toDF("name", "date", "duration")
llist.show

Answer 2 (score: 0):

This will do it:

import spark.implicits._
import org.apache.spark.sql.functions._

val data = List(("Value1", "Cvalue1", 123, 2254, 22), ("Value1", "Cvalue2", 124, 2255, 23))
val df = spark.sparkContext.parallelize(data).toDF("Col1", "Col2", "Expend1", "Expend2", "Expend3")
val cols = Array("Expend1", "Expend2", "Expend3")

// Unpivot the Expend columns: build parallel key/value arrays, zip them into a map,
// then explode the map into one (key, value) row per Expend column (requires Spark 2.4+).
val df1 = df
  .withColumn("keys", array(cols.map(lit): _*))
  .withColumn("values", array($"Expend1", $"Expend2", $"Expend3"))
  .select($"Col1", $"Col2", explode_outer(map_from_arrays($"keys", $"values")))
df1.show(false)