Converting Java code to Scala

Time: 2017-03-08 07:29:06

Tags: java scala apache-spark

While converting Java code to Scala I am facing a strange problem. A minimal example can be found here: https://gist.github.com/geoHeil/895260a04d3673b9848b345edf388a2d The error is

[error] src/main/scala/myOrg/CustomInputMapperWKT.scala:17: overriding method call in trait FlatMapFunction of type (x$1: String)java.util.Iterator[Any];
[error]  method call has incompatible type
[error]   override def call(line: String): Iterator[_] = {

While trying to move from the Spark Java API to the Spark Scala API, I am struggling to port this Java class https://github.com/DataSystemsLab/GeoSpark/blob/master/src/main/java/org/datasyslab/geospark/showcase/UserSuppliedPolygonMapper.java#L59-L81 to Scala.

where

class CustomInputMapperWKT extends FlatMapFunction[String, Any] {
  ....
  override def call(line: String): Iterator[_] = {
    val result: collection.Seq[Polygon] with Growable[Polygon] = mutable.Buffer[Polygon]()
    result.iterator
  }
}

is a minimal sample that reproduces the problem.

Edit

Trying to fix a possible typing problem, I replaced Any with the respective concrete type, Polygon. But that did not help to fix the problem.

1 Answer:

Answer 0 (score: 2)

Have you tried this signature?

override def call(t: String): java.util.Iterator[Any] = {
...

Because this code sample compiles successfully:

import org.apache.spark.api.java.function.FlatMapFunction
import collection.JavaConverters._

class CustomInputMapperWKT extends FlatMapFunction[String, Any] {
  override def call(t: String): java.util.Iterator[Any] = {
    ...
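    // result is a Scala collection here; asJava turns its iterator into the java.util.Iterator Spark expects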
    result.iterator.asJava
  }
}
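
For reference, here is a self-contained sketch of what the completed mapper could look like. It assumes the JTS Polygon and WKTReader types from the com.vividsolutions.jts packages that GeoSpark builds on, and a made-up one-geometry-per-line WKT input format; the parsing logic is only illustrative and not the original UserSuppliedPolygonMapper implementation.

import java.util

import com.vividsolutions.jts.geom.Polygon
import com.vividsolutions.jts.io.WKTReader
import org.apache.spark.api.java.function.FlatMapFunction

import scala.collection.JavaConverters._
import scala.collection.mutable

// Hypothetical completed version: parses each input line as WKT and emits any polygon it yields.
class CustomInputMapperWKT extends FlatMapFunction[String, Polygon] {
  override def call(line: String): util.Iterator[Polygon] = {
    val result = mutable.Buffer[Polygon]()
    val geometry = new WKTReader().read(line) // may throw ParseException on malformed WKT
    geometry match {
      case polygon: Polygon => result += polygon
      case _                => // ignore non-polygon geometries in this sketch
    }
    // The Java FlatMapFunction contract requires a java.util.Iterator, hence asJava.
    result.iterator.asJava
  }
}

The key point is that org.apache.spark.api.java.function.FlatMapFunction belongs to Spark's Java API, so call must return a java.util.Iterator rather than a scala.collection.Iterator. When staying purely in the Scala API, one would usually pass an ordinary function to rdd.flatMap instead of implementing FlatMapFunction at all.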