How to iterate over a JavaRDD and return a HashMap?

Asked: 2016-09-25 00:34:54

Tags: java apache-spark

I want to build a HashMap from the output of Hive's "show table extended". I am trying the approach below:

    JavaRDD<HashMap<String,String>> hiveOutput = hiveContext
        .sql("show table extended like sourcing_trg_tbl")
        .toJavaRDD()
        .map(new Function<String, Map<String,String>>() {
            @Override
            public Map<String,String> call(String row) throws Exception {
                return splitStringToMap(row);
            }

            private Map<String,String> splitStringToMap(String s) {
                String[] fields = s.toString().split(Pattern.quote(":"), -1);
                Map<String,String> hiveMap = new HashMap<String,String>();
                hiveMap.put(fields[0], fields[1]);
                return hiveMap;
            }
        });

    Map<String,String> hiveOutputMap = hiveOutput.collect();

On the .map call I get the following error:

The method map(Function<Row,R>) in the type AbstractJavaRDDLike<Row,JavaRDD<Row>> is not applicable for the arguments (new Function<String,Map<String,String>>(){})

Is it not possible to return a Java Map with this approach?
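For reference, here is a minimal sketch of how the same idea could be written against Row instead of String (the element type that toJavaRDD() actually produces here). It assumes each output row of "show table extended" carries one "key:value" line in its first column, and it swaps in mapToPair/collectAsMap in place of map/collect so that a single java.util.Map comes back to the driver; this is a suggested rework, not the original poster's code.

    import java.util.Map;
    import java.util.regex.Pattern;

    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.function.PairFunction;
    import org.apache.spark.sql.Row;

    import scala.Tuple2;

    JavaPairRDD<String, String> pairs = hiveContext
        .sql("show table extended like sourcing_trg_tbl")
        .toJavaRDD()
        .mapToPair(new PairFunction<Row, String, String>() {
            @Override
            public Tuple2<String, String> call(Row row) throws Exception {
                // toJavaRDD() yields Row elements, so read the line from the first column
                // (assumes the "key:value" text is in column 0)
                String[] fields = row.getString(0).split(Pattern.quote(":"), 2);
                String value = fields.length > 1 ? fields[1] : "";
                return new Tuple2<String, String>(fields[0], value);
            }
        });

    // collectAsMap() returns one java.util.Map on the driver,
    // whereas collect() on an RDD of maps would return a List of per-row maps.
    Map<String, String> hiveOutputMap = pairs.collectAsMap();

The key difference is the function's input type: the error in the question arises because map() on this RDD expects Function<Row, R>, not Function<String, R>.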

0 Answers:

No answers