Creating a random feature array in Spark DataFrames

Date: 2018-03-13 21:51:42

Tags: arrays scala apache-spark dataframe vector

When creating an ALS model, we can extract a userFactors DataFrame and an itemFactors DataFrame. Each of these DataFrames contains a column holding an Array.

I would like to generate some random data and union it with the userFactors DataFrame.

Here is my code:

import org.apache.spark.ml.recommendation.ALS
import org.apache.spark.sql.DataFrame
import spark.implicits._  // assumes a SparkSession named spark is in scope

val df1: DataFrame = Seq((123, 456, 4.0), (123, 789, 5.0), (234, 456, 4.5), (234, 789, 1.0))
  .toDF("user", "item", "rating")

// ALS defaults to userCol = "user", itemCol = "item", ratingCol = "rating"
val model1 = new ALS()
  .setImplicitPrefs(true)
  .fit(df1)

val iF = model1.itemFactors
val uF = model1.userFactors
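
For reference, the schemas of the factor DataFrames can be inspected directly; userFactors and itemFactors each expose an integer id column and a features column of type array<float>:

// Inspect the factor schemas: an integer id column plus an array<float> features column
uF.printSchema()
iF.printSchema()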

Then I create a random DataFrame with a VectorAssembler, using this function:

import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.sql.functions.rand

def makeNew(df: DataFrame, rank: Int): DataFrame = {
    var df_dummy = df
    var i: Int = 0
    var inputCols: Array[String] = Array()
    // 0 to rank is inclusive, so this adds rank + 1 random feature columns
    for (i <- 0 to rank) {
       df_dummy = df_dummy.withColumn("feature".concat(i.toString), rand())
       inputCols = inputCols :+ "feature".concat(i.toString)
      }
    // assemble the individual feature columns into a single Vector column
    val assembler = new VectorAssembler()
      .setInputCols(inputCols)
      .setOutputCol("userFeatures")
    val output = assembler.transform(df_dummy)
    output.select("user", "userFeatures")
  }

I then create a DataFrame with the new user IDs and add the random vectors and a bias:

val usersDf: DataFrame = Seq((567), (678)).toDF("user")
var usersFactorsNew: DataFrame = makeNew(usersDf, 20)

The problem arises when I union the two DataFrames.

usersFactorsNew.union(uF) produces the error:

 org.apache.spark.sql.AnalysisException: Union can only be performed on tables with the compatible column types. struct<type:tinyint,size:int,indices:array<int>,values:array<double>> <> array<float> at the second column of the second table;;

If I print the schemas, the features column of the uF DataFrame is of type Array[Float], while in the usersFactorsNew DataFrame it is of type Vector.

My question is how to change the type of the Vector into an Array so the union can be performed.

I tried writing this UDF, but with little success:

val toArr: org.apache.spark.ml.linalg.Vector => Array[Double] = _.toArray
val toArrUdf = udf(toArr)
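
For context, applying it would look roughly like this (a sketch based on the makeNew output above):

import org.apache.spark.sql.functions.col

// replace the assembled Vector column with its plain array form
val usersFactorsArr = usersFactorsNew
  .withColumn("userFeatures", toArrUdf(col("userFeatures")))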

Perhaps a VectorAssembler is not the best choice for this task, but for now it is the only one I have found. I would welcome better suggestions.

2 Answers:

Answer 0 (score: 1)

Instead of creating a dummy DataFrame and using a VectorAssembler, you can generate the random feature vector directly with a UDF. The userFactors from the ALS model will return an Array[Float], so the output of the UDF should match that.

import scala.util.Random
import org.apache.spark.sql.functions.{lit, udf}

val createRandomArray = udf((rank: Int) => {
  Array.fill(rank)(Random.nextFloat)
})

Note that this gives numbers in the interval [0.0, 1.0) (the same as the rand() used in the question); adjust as needed if a different range is required.
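
For instance, a variant scaled to [-1.0, 1.0) could look like this (a sketch; scale and shift however the application requires):

// same idea, but with values drawn from [-1.0, 1.0)
val createRandomArraySigned = udf((rank: Int) => {
  Array.fill(rank)(Random.nextFloat * 2f - 1f)
})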

Using a rank of 3 and the usersDf from the question:

val usersFactorsNew = usersDf.withColumn("userFeatures", createRandomArray(lit(3)))

will give a DataFrame like this (with random feature values, of course):

+----+----------------------------------------------------------+
|user|userFeatures                                              |
+----+----------------------------------------------------------+
|567 |[0.6866711267486822,0.7257031656127676,0.983562255688249] |
|678 |[0.7013908820314967,0.41029552817665327,0.554591149586789]|
+----+----------------------------------------------------------+

It should now be possible to union this DataFrame with the uF DataFrame.
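
A brief sketch of that union (union matches columns by position, so the differing column names do not matter as long as the types line up; in practice the rank passed to createRandomArray should match the ALS rank so the arrays have the same length):

// both second columns are array<float>, so the union succeeds
val combined = usersFactorsNew.union(uF)
combined.show(false)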

The reason your UDF did not work is that it gives an Array[Double], while you need an Array[Float] for the union. It should be possible to fix it with a map(_.toFloat):

val toArr: org.apache.spark.ml.linalg.Vector => Array[Float] = _.toArray.map(_.toFloat)
val toArrUdf = udf(toArr)
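
Applied to the output of the question's makeNew function, that could look something like this (a sketch; column names follow the question's code):

import org.apache.spark.sql.functions.col

// convert the assembled Vector column to array<float>, then union with uF
val usersFactorsFloat = usersFactorsNew
  .select(col("user"), toArrUdf(col("userFeatures")).as("features"))

usersFactorsFloat.union(uF).show(false)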

Answer 1 (score: 1)

Your whole process is correct, and even your udf function runs successfully. All you need to do is change the last part of the makeNew function to:

def makeNew(df: DataFrame, rank: Int): DataFrame = {
  var df_dummy = df
  var i: Int = 0
  var inputCols: Array[String] = Array()
  for (i <- 0 to rank) {
    df_dummy = df_dummy.withColumn("feature".concat(i.toString), rand())
    inputCols = inputCols :+ "feature".concat(i.toString)
  }
  val assembler = new VectorAssembler()
    .setInputCols(inputCols)
    .setOutputCol("userFeatures")
  val output = assembler.transform(df_dummy)
  // convert the Vector column to an array and rename the columns to id/features, matching uF
  output.select(col("id"), toArrUdf(col("userFeatures")).as("features"))
}

And you should be good to go. Note that I created usersDf with an id column rather than a user column:

val usersDf: DataFrame = Seq((567), (678)).toDF("id")
var usersFactorsNew: DataFrame = makeNew(usersDf, 20)
usersFactorsNew.union(uF).show(false)

you should get:

+---+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|id |features                                                                                                                                                                                                                                                                                                                                                                                                                                  |
+---+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|567|[0.8259185719733708, 0.327713892339658, 0.049547223031371046, 0.056661808506210054, 0.5846626163454274, 0.038497936270104005, 0.8970865088803417, 0.8840660648882804, 0.837866669938156, 0.9395263094918058, 0.09179528484355126, 0.4915430644129799, 0.11083447052043116, 0.5122858182953718, 0.4302683812966408, 0.3862741815833828, 0.6189322403095068, 0.3000371006293433, 0.09331299668168902, 0.7421838728601371, 0.855867963988993]|
|678|[0.7686514248005568, 0.5473580740023187, 0.072945344124282, 0.36648594574355287, 0.9780202082328863, 0.5289221651923784, 0.3719451099963028, 0.2824660794505932, 0.4873197501260199, 0.9364676464120849, 0.011539929543513794, 0.5240615794930654, 0.6282546154521298, 0.995256022569878, 0.6659179561266975, 0.8990775317754092, 0.08650071017556926, 0.5190186149992805, 0.056345335742325475, 0.6465357505620791, 0.17913532817943245] |
|123|[0.04177388548851013, 0.26762014627456665, -0.19617630541324615, 0.34298020601272583, 0.19632814824581146, -0.2748605012893677, 0.07724890112876892, 0.4277132749557495, 0.1927199512720108, -0.40271613001823425]                                                                                                                                                                                                                        |
|234|[0.04139673709869385, 0.26520395278930664, -0.19440513849258423, 0.3398836553096771, 0.1945556253194809, -0.27237895131111145, 0.07655145972967148, 0.42385169863700867, 0.19098000228405, -0.39908021688461304]                                                                                                                                                                                                                          |
+---+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+