sryza/spark-timeseries: NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;

Date: 2016-09-29 16:27:24

Tags: scala sbt

I have a Scala project built with sbt. It uses the sryza/spark-timeseries library.

I am trying to run the following simple piece of code:

  // Imports inferred from the stack trace and the spark-mllib dependency:
  import com.cloudera.sparkts.models.HoltWinters
  import org.apache.spark.mllib.linalg.DenseVector

  val tsAirPassengers = new DenseVector(Array(
    112.0,118.0,132.0,129.0,121.0,135.0,148.0,148.0,136.0,119.0,104.0,118.0,115.0,126.0,
    141.0,135.0,125.0,149.0,170.0,170.0,158.0,133.0,114.0,140.0,145.0,150.0,178.0,163.0,
    172.0,178.0,199.0,199.0,184.0,162.0,146.0,166.0,171.0,180.0,193.0,181.0,183.0,218.0,
    230.0,242.0,209.0,191.0,172.0,194.0,196.0,196.0,236.0,235.0,229.0,243.0,264.0,272.0,
    237.0,211.0,180.0,201.0,204.0,188.0,235.0,227.0,234.0,264.0,302.0,293.0,259.0,229.0,
    203.0,229.0,242.0,233.0,267.0,269.0,270.0,315.0,364.0,347.0,312.0,274.0,237.0,278.0,
    284.0,277.0,317.0,313.0,318.0,374.0,413.0,405.0,355.0,306.0,271.0,306.0,315.0,301.0,
    356.0,348.0,355.0,422.0,465.0,467.0,404.0,347.0,305.0,336.0,340.0,318.0,362.0,348.0,
    363.0,435.0,491.0,505.0,404.0,359.0,310.0,337.0,360.0,342.0,406.0,396.0,420.0,472.0,
    548.0,559.0,463.0,407.0,362.0,405.0,417.0,391.0,419.0,461.0,472.0,535.0,622.0,606.0,
    508.0,461.0,390.0,432.0
  ))

val period = 12
val model = HoltWinters.fitModel(tsAirPassengers, period, "additive", "BOBYQA")

It builds fine, but when I try to run it, I get this error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;
    at com.cloudera.sparkts.models.HoltWintersModel.convolve(HoltWinters.scala:252)
    at com.cloudera.sparkts.models.HoltWintersModel.initHoltWinters(HoltWinters.scala:277)
    at com.cloudera.sparkts.models.HoltWintersModel.getHoltWintersComponents(HoltWinters.scala:190)
    ...

The error occurs on this line:

val model = HoltWinters.fitModel(tsAirPassengers, period, "additive", "BOBYQA")

My build.sbt includes:

name := "acme-project"
version := "0.0.1"
scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-hive" % "1.6.0",
  "net.liftweb" %% "lift-json" % "2.5+",
  "com.github.seratch" %% "awscala" % "0.3.+",
  "org.apache.spark" % "spark-mllib_2.10" % "1.6.2"
)
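For comparison, a build.sbt with the Scala and Spark versions aligned to what the library apparently expects might look like the sketch below. The exact version numbers (2.11.8, 2.0.0) are assumptions taken from the error context, not from my project; verify them against the library's pom.xml before using them:

```
// Hypothetical sketch: align Scala and Spark with what the sparkts jar
// was compiled against (2.11.x / 2.0.x assumed here).
name := "acme-project"
version := "0.0.1"
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix (_2.11) automatically,
  // so the explicit "_2.10" artifact name is dropped.
  "org.apache.spark" %% "spark-hive"  % "2.0.0",
  "org.apache.spark" %% "spark-mllib" % "2.0.0"
)
```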

I have put sparkts-0.4.0-SNAPSHOT.jar into my project's lib folder. (I would have preferred to add a libraryDependency, but spark-ts does not appear to be on Maven Central.)

What is causing this runtime error?

1 Answer:

Answer 0 (score: 2)

The library requires Scala 2.11, not 2.10, and Spark 2.0, not 1.6.2. You can see this from

<scala.minor.version>2.11</scala.minor.version>
<scala.complete.version>${scala.minor.version}.8</scala.complete.version>
<spark.version>2.0.0</spark.version>

in its pom.xml. You can try changing these and see whether it still compiles, look for an older version of sparkts that is compatible with your versions, or upgrade your project's Scala and Spark versions (in that case, don't forget to update spark-mllib_2.10 as well).
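One quick sanity check, sketched below under the assumption that the standard scala.util.Properties API is available: print the Scala standard library version the application actually runs on. As far as I know, scala.runtime.IntRef.create(Int) only appeared in Scala 2.11, so a 2.10 runtime cannot supply it to a library compiled against 2.11:

```scala
// Print which Scala standard library is actually on the runtime classpath.
// A NoSuchMethodError on scala.runtime.IntRef.create is the typical symptom
// of running 2.11-compiled bytecode on a 2.10 scala-library.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    println(s"Running on Scala ${scala.util.Properties.versionNumberString}")
  }
}
```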

Also, if you put a jar into the lib folder, you also have to put its dependencies (and their dependencies, and so on) there, or into libraryDependencies. Instead, publish sparkts to your local repository with mvn install (IIRC) and add it to libraryDependencies; that will let SBT resolve its dependencies.
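A sketch of the build.sbt side of that publish-locally route is below. The coordinates "com.cloudera.sparkts" % "sparkts" are my assumption based on the package names in the stack trace; check the library's pom.xml for the real groupId/artifactId before relying on them:

```
// After running `mvn install` in the sparkts source tree, the artifact
// (plus its pom, which lists its transitive dependencies) lands in ~/.m2.
// Point SBT at that local Maven repository, then depend on it by coordinates.
resolvers += Resolver.mavenLocal

libraryDependencies += "com.cloudera.sparkts" % "sparkts" % "0.4.0-SNAPSHOT"
```

With this, SBT pulls in the library's own dependencies automatically, which the lib-folder approach cannot do.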