Converting Scala code to Java for a Spark Partitioner

Asked: 2015-05-09 08:19:46

Tags: java scala apache-spark partitioning

So I'm trying to implement a custom partitioner in Spark using the Java API. I found a good example online of how to do this, but it's written in Scala, and I can't for the life of me figure out how to translate it correctly into Java so that I can try implementing it myself. Can anyone help? Here is the example code I found in Scala:

class DomainNamePartitioner(numParts: Int) extends Partitioner {
  override def numPartitions: Int = numParts
  override def getPartition(key: Any): Int = {
    val domain = new java.net.URL(key.toString).getHost()
    val code = (domain.hashCode % numPartitions)
    if (code < 0) {
      code + numPartitions  // Make it non-negative
    } else {
      code
    }
  } 
  // Java equals method to let Spark compare our Partitioner objects
  override def equals(other: Any): Boolean = other match {
    case dnp: DomainNamePartitioner =>
      dnp.numPartitions == numPartitions
    case _ =>
      false
  }
}

1 Answer:

Answer 0 (score: 2):

First of all, Scala is the preferred language for writing Spark code.

That said, here is the corresponding Java code (it is not the only possible version):

See: https://spark.apache.org/docs/latest/api/java/index.html

import java.net.MalformedURLException;
import java.net.URL;

import org.apache.spark.Partitioner;

public class DomainNamePartitioner extends Partitioner {

    private final int numParts;

    public DomainNamePartitioner(int numParts) {
        this.numParts = numParts;
    }

    @Override
    public int numPartitions() {
        return numParts;
    }

    @Override
    public int getPartition(Object key) {
        try {
            // Partition by the host name of the URL so that keys from the
            // same domain end up in the same partition.
            String domain = new URL(key.toString()).getHost();
            int code = domain.hashCode() % numPartitions();
            if (code < 0) {
                return code + numPartitions(); // make it non-negative
            } else {
                return code;
            }
        } catch (MalformedURLException e) {
            // new URL(...) throws a checked exception that the overridden
            // method does not declare, so it has to be handled here.
            throw new IllegalArgumentException("Key is not a valid URL: " + key, e);
        }
    }

    // equals (and hashCode) let Spark compare two partitioners so it can skip
    // a shuffle when the data is already partitioned the same way.
    @Override
    public boolean equals(Object obj) {
        if (obj instanceof DomainNamePartitioner) {
            DomainNamePartitioner other = (DomainNamePartitioner) obj;
            return other.numPartitions() == this.numPartitions();
        } else {
            return false;
        }
    }

    @Override
    public int hashCode() {
        return numPartitions();
    }
}
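
For reference, here is a minimal sketch of how such a partitioner might be wired into a job with the Java API. The class name PartitionerExample, the sample URL data, and the choice of 4 partitions are placeholders for illustration, not part of the original question:

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class PartitionerExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("DomainPartitionerExample");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Pair RDD keyed by URL strings (placeholder data).
        JavaPairRDD<String, Integer> hits = sc.parallelizePairs(Arrays.asList(
                new Tuple2<>("http://example.com/a", 1),
                new Tuple2<>("http://example.org/b", 1)));

        // Repartition so that all keys with the same domain land in the same partition.
        JavaPairRDD<String, Integer> byDomain = hits.partitionBy(new DomainNamePartitioner(4));

        System.out.println(byDomain.count());
        sc.stop();
    }
}

Because equals() is overridden, a second partitionBy with an equal DomainNamePartitioner on the same RDD would not trigger another shuffle.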