Spark 1.6 - remove itemsets that contain only 1 item

Date: 2016-12-20 11:40:21

Tags: apache-spark market-basket-analysis

I have the following code:

val df = sqlContext.sql("SELECT Transaction_ID,Product_ID FROM Transactions as tmp")
val rawDict = df.select('Product_ID).distinct().sort('Product_ID)
val dictCounts = rawDict.groupBy('Product_ID).count().filter(col("count") >= 2)
val sigCounts = dictCounts.filter('count === 1)
val dupCounts = dictCounts.filter('count > 1) 
val sigDescs = rawDict.join(sigCounts, "Product_ID").drop('count)
val invoiceToStockCode = df.select('Transaction_ID, 'Product_ID).distinct()
val baskets = invoiceToStockCode.groupBy('Transaction_ID).agg(collect_list('Product_ID).as('StockCodes)).cache()

I am trying to extract association rules. For that, I need every transaction to contain more than one product, but with my code I still end up with transactions that contain only a single product.

How can I filter these out?

Thanks!
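For reference, the intended filtering can be sketched with plain Scala collections (the sample rows are hypothetical stand-ins for the `Transaction_ID`/`Product_ID` DataFrame): group rows into baskets and keep only the transactions that contain more than one distinct product.

```scala
// Hypothetical (Transaction_ID, Product_ID) rows standing in for the DataFrame.
val rows = List((1, "A"), (1, "B"), (2, "C"), (3, "D"), (3, "D"))

// Group into baskets of distinct products, then drop single-product baskets.
val baskets = rows
  .groupBy(_._1)                                        // Transaction_ID -> rows
  .map { case (txn, rs) => txn -> rs.map(_._2).distinct } // basket of distinct Product_IDs
  .filter { case (_, products) => products.size > 1 }   // keep multi-product baskets only

println(baskets) // only transaction 1 (products A and B) remains
```

In Spark terms, the same idea would mean filtering *after* the `collect_list` aggregation, on the size of each collected basket, rather than filtering individual products beforehand.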

1 answer:

Answer 0 (score: 0)

While I'm not 100% sure I understood what you're asking for, I think this should work:

val rawDict = df.select('Product_ID).sort('Product_ID)
val dictCounts = rawDict.groupBy('Product_ID).count()
val valid = dictCounts.filter('count > 1).drop('count) // only consider counts > 1
val singleRemoved = df.join(valid, Seq("Product_ID")).distinct()
val groupedTransactions = singleRemoved.groupBy("Transaction_ID").agg(collect_list("Product_ID"))
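To see what this pipeline does, here is the same logic mirrored with plain Scala collections on hypothetical sample data (count occurrences per product, keep products with count > 1, join back, regroup by transaction):

```scala
// Hypothetical (Transaction_ID, Product_ID) rows standing in for df.
val df = List((1, "A"), (1, "B"), (2, "A"), (3, "C"))

// Count occurrences per product, mirroring groupBy('Product_ID).count().
val dictCounts = df.groupBy(_._2).map { case (p, rs) => p -> rs.size }

// Keep only products whose count is > 1, mirroring filter('count > 1).
val valid = dictCounts.filter { case (_, n) => n > 1 }.keySet

// Join back and regroup by transaction, mirroring the last two lines.
val grouped = df
  .filter { case (_, p) => valid(p) }
  .distinct
  .groupBy(_._1)
  .map { case (txn, rs) => txn -> rs.map(_._2) }

println(grouped) // transactions 1 and 2 each keep only product A
```

Note that on this sample the filter removes products B and C (each occurs only once in the whole data set), so it prunes *rare products* rather than *single-product baskets*; a basket can still end up with one product after pruning, as transaction 1 does here. If the goal is strictly "every basket has more than one product", filter on the size of the collected list after the final `groupBy` instead.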