I have searched a lot, but I could not find an example of performing aggregateByKey in Java code.
I want to find the number of rows in a JavaPairRDD reduced by key.
I read that aggregateByKey is the best way to do it, but I am using Java rather than Scala and I cannot figure out how to use it from Java.
Please help!!!
For example:
input: [(key1,[name:abc,email:def,address:ghi]),(key1,[name:abc,email:def,address:ghi]),(key2,[name:abc,email:def,address:ghi])]
output: [(key1,[name:abc,email:def,address:ghi, count:2]),(key2,[name:abc,email:def,address:ghi, count:1])]
I want exactly what my example shows: an extra column appended to each reduced output row holding the row count.
Thanks!!!
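(For context on what is being asked: aggregateByKey takes a zero value, a seqOp that folds one record into the accumulator, and a combOp that merges two accumulators. The counting part of the question can be sketched without Spark in plain Java; in Spark itself the same idea would be roughly `pairs.aggregateByKey(0, (acc, v) -> acc + 1, (a, b) -> a + b)`. The class and method names below are illustrative, not Spark API.)

```java
import java.util.*;

public class CountPerKey {

    // Simulates the aggregateByKey contract for counting:
    // zero value 0, seqOp increments, combOp sums two partial counts.
    static Map<String, Integer> countPerKey(List<String> keys) {
        Map<String, Integer> acc = new HashMap<>();
        for (String k : keys) {
            // seqOp: fold one record into the running count for its key.
            // combOp would merge two such maps with the same Integer::sum rule.
            acc.merge(k, 1, Integer::sum);
        }
        return acc;
    }

    public static void main(String[] args) {
        List<String> keys = Arrays.asList("key1", "key1", "key2");
        System.out.println(countPerKey(keys)); // key1 -> 2, key2 -> 1
    }
}
```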
Answer 0 (score: 4)
Here is my example of aggregating by key in Java.
// Fields such as schemaSize, matchKey, finalSchema, idIndex, arr and
// CommonConstant come from the surrounding class and are not shown here.
JavaPairRDD<String, Row> result = inputDataFrame.javaRDD().mapToPair(new PairFunction<Row, String, Row>() {
    private static final long serialVersionUID = 1L;

    public Tuple2<String, Row> call(Row tblRow) throws Exception {
        String strID = CommonConstant.BLANKSTRING;
        Object[] newRow = new Object[schemaSize];
        // Build the key by concatenating the normalized match columns.
        for (String s : matchKey) {
            if (tblRow.apply(finalSchema.get(s)) != null) {
                strID += tblRow.apply(finalSchema.get(s)).toString().trim().toLowerCase();
            }
        }
        int rowSize = tblRow.length();
        for (int itr = 0; itr < rowSize; itr++) {
            if (tblRow.apply(itr) != null) {
                newRow[itr] = tblRow.apply(itr);
            }
        }
        newRow[idIndex] = Utils.generateKey(strID);
        return new Tuple2<String, Row>(strID, RowFactory.create(newRow));
    }
}).aggregateByKey(RowFactory.create(arr), new Function2<Row, Row, Row>() {
    private static final long serialVersionUID = 1L;

    // seqOp: merge the next row into the accumulator row;
    // the last column holds the running count.
    public Row call(Row argRow1, Row argRow2) throws Exception {
        Object[] newRow = new Object[schemaSize];
        int rowSize = argRow1.length();
        for (int itr = 0; itr < rowSize; itr++) {
            if (argRow1.apply(itr) != null && argRow2.apply(itr) != null) {
                if (itr == rowSize - 1) {
                    // Sum the count column.
                    newRow[itr] = Integer.parseInt(argRow1.apply(itr).toString())
                            + Integer.parseInt(argRow2.apply(itr).toString());
                } else {
                    newRow[itr] = argRow2.apply(itr);
                }
            }
        }
        return RowFactory.create(newRow);
    }
}, new Function2<Row, Row, Row>() {
    private static final long serialVersionUID = 1L;

    // combOp: merges partial results from different partitions.
    // Note that returning v1 unchanged discards v2's counts whenever
    // a key appears in more than one partition.
    public Row call(Row v1, Row v2) throws Exception {
        return v1;
    }
});

JavaRDD<Row> result1 = result.map(new Function<Tuple2<String, Row>, Row>() {
    private static final long serialVersionUID = -5480405270683046298L;

    public Row call(Tuple2<String, Row> rddRow) throws Exception {
        return rddRow._2();
    }
});
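Stripped of the Spark Row types, the merge rule that seqOp implements is simple: keep the incoming row's value for every data column and add the two count columns (assumed here to be the last cell). A minimal plain-Java sketch of just that rule:

```java
import java.util.*;

public class RowMergeDemo {

    // Merge rule from the seqOp above: take the incoming row's values,
    // but sum the count columns, which sit in the last cell of each row.
    static Object[] merge(Object[] accRow, Object[] newRow) {
        Object[] out = new Object[accRow.length];
        for (int i = 0; i < accRow.length; i++) {
            if (i == accRow.length - 1) {
                out[i] = (Integer) accRow[i] + (Integer) newRow[i]; // sum the counts
            } else {
                out[i] = newRow[i]; // last write wins for data columns
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Object[] merged = merge(new Object[]{"abc", "def", 1},
                                new Object[]{"abc", "def", 1});
        System.out.println(Arrays.toString(merged)); // [abc, def, 2]
    }
}
```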
Answer 1 (score: 0)
Data file: average.txt
student name,subject,marks
ss,english,80
ss,maths,60
GG,english,180
PP,english,80
PI,english,80
GG,maths,100
PP,maths,810
PI,maths,800
The problem is to find the subject-wise average using the aggregateByKey Spark transformation in Java 8.
Here is one approach:
// Assumes average.txt has no header row; a header would break Integer.parseInt.
JavaRDD<String> baseRDD = jsc.textFile("average.txt");
JavaPairRDD<String, Integer> studentRDD = baseRDD.mapToPair(
        s -> new Tuple2<String, Integer>(s.split(",")[1], Integer.parseInt(s.split(",")[2])));
JavaPairRDD<String, Avg> avgRDD = studentRDD.aggregateByKey(
        new Avg(0, 0),                                                      // zero value
        (v, x) -> new Avg(v.getSum() + x, v.getNum() + 1),                  // seqOp
        (v1, v2) -> new Avg(v1.getSum() + v2.getSum(), v1.getNum() + v2.getNum())); // combOp

Map<String, Avg> mapAvg = avgRDD.collectAsMap();
for (Entry<String, Avg> entry : mapAvg.entrySet()) {
    System.out.println(entry.getKey() + "::" + entry.getValue().getAvg());
}
import java.io.Serializable;

public class Avg implements Serializable {
    private static final long serialVersionUID = 1L;

    private int sum;
    private int num;

    public Avg(int sum, int num) {
        this.sum = sum;
        this.num = num;
    }

    // Cast before dividing; otherwise integer division truncates the average.
    public double getAvg() { return (double) this.sum / this.num; }
    public int getSum() { return this.sum; }
    public int getNum() { return this.num; }
}
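The seqOp/combOp pair can be checked without a Spark cluster: the sketch below folds a few sample rows with the same (sum, count) accumulator logic, inlined into one self-contained class since the Avg class above lives in its own file.

```java
import java.util.*;

public class AvgFoldDemo {

    // Same accumulator shape as the Avg class: a running sum and a count.
    static final class Acc {
        final int sum, num;
        Acc(int sum, int num) { this.sum = sum; this.num = num; }
        double avg() { return (double) sum / num; }
    }

    // Folds (subject, marks) pairs the way aggregateByKey's seqOp would.
    static Map<String, Acc> bySubject(List<String[]> rows) {
        Map<String, Acc> acc = new HashMap<>();
        for (String[] r : rows) {
            Acc a = acc.getOrDefault(r[0], new Acc(0, 0)); // zero value
            acc.put(r[0], new Acc(a.sum + Integer.parseInt(r[1]), a.num + 1)); // seqOp
        }
        return acc;
    }

    public static void main(String[] args) {
        List<String[]> rows = Arrays.asList(
                new String[]{"english", "80"}, new String[]{"maths", "60"},
                new String[]{"english", "180"}, new String[]{"maths", "100"});
        bySubject(rows).forEach((k, v) -> System.out.println(k + "::" + v.avg()));
        // english::130.0, maths::80.0
    }
}
```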
Answer 2 (score: -1)
I am not sure what you are trying to do, but I can offer a solution that produces the output you need. aggregateByKey does not do what you expect it to; it is only a way of combining an RDD, whereas on a DataFrame it works more like you would expect. In any case, the code below gives you the required output.
JavaPairRDD<String, Iterable<String>> groups = pairs.groupByKey();
// Note: the declared key type Integer does not match arg0._1, which is a String;
// the raw "new Tuple2" below hides that mismatch from the compiler.
JavaPairRDD<Integer, String> counts = groups.mapToPair(
        new PairFunction<Tuple2<String, Iterable<String>>, Integer, String>() {
    public Tuple2<Integer, String> call(Tuple2<String, Iterable<String>> arg0) throws Exception {
        // Count how many times each distinct value occurs for this key.
        HashMap<String, Integer> counts = new HashMap<String, Integer>();
        Iterator<String> itr = arg0._2.iterator();
        String val = null;
        while (itr.hasNext()) {
            val = itr.next();
            if (counts.get(val) == null) {
                counts.put(val, 1);
            } else {
                counts.put(val, counts.get(val) + 1);
            }
        }
        return new Tuple2(arg0._1, counts.toString());
    }
});
You can try it and let me know. Note that, frankly, this is not a combine, because a combine would not do this kind of thing.
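The inner while-loop in that answer is just a frequency map over the values grouped under one key. A standalone sketch of the same loop, so it can be tried outside Spark:

```java
import java.util.*;

public class ValueFrequency {

    // Mirrors the while-loop above: count occurrences of each distinct value.
    static Map<String, Integer> frequency(Iterable<String> values) {
        Map<String, Integer> counts = new HashMap<>();
        for (String val : values) {
            counts.merge(val, 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(frequency(Arrays.asList("a", "b", "a"))); // a -> 2, b -> 1
    }
}
```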