Cannot be accessed in package org.apache.spark.util.random

Date: 2015-12-13 16:12:51

Tags: apache-spark

I am trying to use Spark's XORShiftRandom to generate random numbers. The code is very simple:

  import org.apache.spark._
  import org.apache.spark.util.random.XORShiftRandom

  object randomTest {
    def main(args: Array[String]) = {
      val x = new XORShiftRandom()
    }
  }

The build.sbt is as follows:

  name := "randomTest"
  version := "0.01"
  scalaVersion := "2.10.4"
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.5.0" withSources() withJavadoc(),
    "org.apache.spark" %% "spark-mllib" % "1.5.0" withSources() withJavadoc()
  )

But I get the error message shown in the title.

1 answer:

Answer 0: (score: 0)

The class is marked private[spark], so you cannot call its constructor from outside Spark's own packages. You may still be able to use some of the public methods documented here.

Offending line in source code
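
A commonly cited workaround, not suggested in the answer above and offered here only as an assumption, is to compile your own code inside the org.apache.spark package hierarchy, since private[spark] members are visible from there. A minimal sketch, assuming Spark 1.5's XORShiftRandom(init: Long) constructor:

  package org.apache.spark.util.random

  // Unsupported trick: because XORShiftRandom is declared private[spark],
  // code placed inside the org.apache.spark package hierarchy can still reach it.
  // This relies on Spark internals and may break between versions.
  object RandomTest {
    def main(args: Array[String]): Unit = {
      val rng = new XORShiftRandom(42L)                      // seeded constructor
      println(Seq.fill(5)(rng.nextDouble()).mkString(", "))  // a few pseudo-random doubles
    }
  }

If you only need ordinary pseudo-random numbers, scala.util.Random (or java.util.Random) avoids depending on Spark internals altogether.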