How do I sort a list by a field value in Spark (Java)?

Date: 2016-12-01 00:18:15

Tags: java apache-spark

I want to sort houses by price in ascending order.

public class House implements Serializable {

    private double price = Math.random() * 1000;

    public double getPrice() {
        return price;
    }
}

This is how I sort it in plain Java:

ArrayList<House> city; // assume it is initialized with some values
Collections.sort(city, new Comparator<House>() {
    public int compare(House o1, House o2) {
        // ascending order: cheaper houses come first
        return Double.compare(o1.getPrice(), o2.getPrice());
    }
});
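On Java 8 and later the same ascending sort can be written more compactly; a minimal equivalent, assuming the getPrice() getter shown above:

// equivalent ascending sort using a key extractor
city.sort(Comparator.comparingDouble(House::getPrice));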

Now I want to do the same sort with Apache Spark in Java.

  

Method 1:

JavaRDD<House> r2 = houseRDD.sortBy(i -> i.getPrice(), true, 1);
  

Method 2:

JavaRDD<House> r = houseRDD.sortBy(new Function<House, Double>() {
    private static final long serialVersionUID = 1L;

    @Override
    public Double call(House value) throws Exception {
        return value.getPrice();
    }
}, true, 1);

What is wrong with the approaches above? I am getting the following exception:


java.lang.ClassCastException: House cannot be cast to java.lang.Comparable
    at org.spark_project.guava.collect.NaturalOrdering.compare(NaturalOrdering.java:28)
    at scala.math.LowPriorityOrderingImplicits$$anon$7.compare(Ordering.scala:153)
    at scala.math.Ordering$$anon$4.compare(Ordering.scala:111)
    at org.apache.spark.util.collection.Utils$$anon$1.compare(Utils.scala:35)
    at org.spark_project.guava.collect.Ordering.max(Ordering.java:551)
    at org.spark_project.guava.collect.Ordering.leastOf(Ordering.java:667)
    at org.apache.spark.util.collection.Utils$.takeOrdered(Utils.scala:37)
    at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$30.apply(RDD.scala:1393)
    at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$30.apply(RDD.scala:1390)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
    at org.apache.spark.scheduler.Task.run(Task.scala:86)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
  

New House class based on the comments:

public class House implements Serializable, Comparable<House> {

    private double price = Math.random() * 1000;

    public double getPrice() {
        return price;
    }

    @Override
    public int compareTo(House o) {
        return Double.compare(this.getPrice(), o.getPrice());
    }
}

2 Answers:

Answer 0 (score: 1)

It looks like House does not implement the Comparable interface.

How to implement the Java comparable interface?
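Note that the stack trace points at takeOrdered (and top, which delegates to it), not at sortBy: those actions fall back to the natural ordering when no comparator is passed, which is why House must be Comparable. An alternative to implementing Comparable is to hand them an explicit comparator. A minimal sketch, assuming houseRDD is the JavaRDD<House> from the question and PriceComparator is just an illustrative name:

import java.io.Serializable;
import java.util.Comparator;
import java.util.List;

// The comparator must be Serializable so Spark can ship it to the executors.
class PriceComparator implements Comparator<House>, Serializable {
    @Override
    public int compare(House a, House b) {
        return Double.compare(a.getPrice(), b.getPrice());
    }
}

List<House> cheapest = houseRDD.takeOrdered(4, new PriceComparator()); // 4 lowest prices
List<House> priciest = houseRDD.top(4, new PriceComparator());         // 4 highest prices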

Answer 1 (score: 0)

List<House> houses; // initialize it with some data
JavaRDD<House> houseRDD = SparkUtil.getSparkContext().parallelize(houses);

public class House implements Serializable, Comparable<House> {

    private double price = Math.random() * 1000;

    public double getPrice() {
        return price;
    }

    @Override
    public int compareTo(House o) {
        return Double.compare(this.getPrice(), o.getPrice());
    }
}
  

Now try the same code:

JavaRDD<House> sortedRDD = houseRDD.sortBy(i -> i.getPrice(), true, 1);

sortedRDD.top(4); // returns the 4 houses with the highest prices
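Since top(n) uses the natural ordering defined by compareTo above, it returns the most expensive houses; takeOrdered is the counterpart for the cheapest ones, and it does not require the RDD to be sorted first:

List<House> cheapest = houseRDD.takeOrdered(4); // 4 houses with the lowest prices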