I'm getting the following error:
found : org.apache.spark.sql.Dataset[(Double, Double)]
required: org.apache.spark.rdd.RDD[(Double, Double)]
val testMetrics = new BinaryClassificationMetrics(testScoreAndLabel)
in the following code:
val testScoreAndLabel = testResults.
select("Label","ModelProbability").
map{ case Row(l:Double,p:Vector) => (p(1),l) }
val testMetrics = new BinaryClassificationMetrics(testScoreAndLabel)
From the error, it seems that testScoreAndLabel has type sql.Dataset, but BinaryClassificationMetrics requires an RDD.
How do I convert a sql.Dataset to an RDD?
Answer 0 (score: 1)
Do something like this:
val testScoreAndLabel = testResults.
select("Label","ModelProbability").
map{ case Row(l:Double,p:Vector) => (p(1),l) }
Now just call testScoreAndLabel.rdd to convert it to an RDD.
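For completeness, here is a minimal sketch of the whole pipeline, assuming (as in the question) that testResults is a DataFrame with "Label" and "ModelProbability" columns. Dataset's .rdd method is the standard conversion; calling it before the map also sidesteps the need for an implicit Encoder on the (Double, Double) tuple:

```scala
import org.apache.spark.sql.Row
import org.apache.spark.ml.linalg.Vector
import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics

// Call .rdd before .map so the result is an RDD[(Double, Double)],
// which is what BinaryClassificationMetrics expects.
val testScoreAndLabel = testResults
  .select("Label", "ModelProbability")
  .rdd
  .map { case Row(l: Double, p: Vector) => (p(1), l) }

val testMetrics = new BinaryClassificationMetrics(testScoreAndLabel)
println(testMetrics.areaUnderROC)
```

Mapping on the Dataset first and calling .rdd afterwards works too, as long as an Encoder for the tuple type is in scope (e.g. via spark.implicits._).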