Creating a JanusGraph instance inside Spark RDD.foreachPartition

Asked: 2018-06-25 04:31:28

Tags: apache-spark serialization janusgraph

I am trying to create a JanusGraph instance inside Spark's RDD.foreachPartition, but I am getting a serialization error. The code is below.

    nodesRDD.foreachPartition(new VoidFunction<Iterator<String>>() {
        @Override
        public void call(Iterator<String> stringIterator) throws Exception {
            String vertexLabel = "person";
            String s = "/home/chgy/janusgraph-0.2.0-hadoop2/conf/gremlin-server/janusgraph-hbase.properties";

            // One JanusGraph instance per partition, with a batch-loading transaction
            JanusGraph graph = JanusGraphFactory.open(s);
            TransactionBuilder builder = graph.buildTransaction();
            JanusGraphTransaction tx = builder.enableBatchLoading().consistencyChecks(false).start();

            while (stringIterator.hasNext()) {
                String[] colVals = stringIterator.next().split(",");
                linecount.add(1); // counter defined outside this snippet

                List<Object> keyValues = new ArrayList<Object>();                  // currently unused
                HashMap<String, List<Object>> propertyHasValues = new HashMap<>(); // currently unused

                tx.addVertex(vertexLabel).property("id", Long.parseLong(colVals[0]));
            }
            tx.commit();
            tx.close();
            graph.close();
        }
    });
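
For context, a common shape for this kind of loader is to move the per-partition work into a static helper and invoke it from a Java 8 lambda, so the closure only captures serializable values (such as a Spark LongAccumulator) rather than the enclosing class, which anonymous inner classes often drag into the serialized task. The sketch below is only illustrative: the PartitionLoader class, the load/loadPartition method names, and the accumulator name are assumptions, while the properties path, label, and transaction settings are taken from the code above.

    import java.util.Iterator;

    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.util.LongAccumulator;
    import org.janusgraph.core.JanusGraph;
    import org.janusgraph.core.JanusGraphFactory;
    import org.janusgraph.core.JanusGraphTransaction;

    public class PartitionLoader {

        // Same properties file as in the question
        private static final String CONF =
                "/home/chgy/janusgraph-0.2.0-hadoop2/conf/gremlin-server/janusgraph-hbase.properties";

        public static void load(JavaSparkContext jsc, JavaRDD<String> nodesRDD) {
            // Register the counter on the driver; LongAccumulator is serializable
            LongAccumulator linecount = jsc.sc().longAccumulator("linecount");

            // The lambda captures only linecount, not the enclosing class
            nodesRDD.foreachPartition(rows -> loadPartition(rows, linecount));
        }

        // Runs on the executor: one JanusGraph instance per partition
        private static void loadPartition(Iterator<String> rows, LongAccumulator linecount) {
            JanusGraph graph = JanusGraphFactory.open(CONF);
            JanusGraphTransaction tx = graph.buildTransaction()
                    .enableBatchLoading()
                    .consistencyChecks(false)
                    .start();
            try {
                while (rows.hasNext()) {
                    String[] colVals = rows.next().split(",");
                    linecount.add(1);
                    tx.addVertex("person").property("id", Long.parseLong(colVals[0]));
                }
                tx.commit();
            } finally {
                if (tx.isOpen()) {
                    tx.rollback();
                }
                graph.close();
            }
        }
    }

The try/finally rolls back any still-open transaction and closes the graph even when a row fails to parse, so executor-side HBase connections are not leaked.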

I am new to all of this. Any suggestions or guidance are welcome!

0 Answers:

No answers yet.