Best strategy for handling large data sets in Apache Camel

Asked: 2017-07-31 21:33:49

Tags: java-8 apache-camel

I am generating a monthly report with Apache Camel. I have a MySQL query that returns roughly 5 million records (20 columns each) when run against my database; the query alone takes about 70 minutes to execute.

To speed this up, I created five seda (worker) routes and used multicast().parallelProcessing() so that different time ranges are queried against the database in parallel, then merged the results with an aggregator.

I can now see all 5 million records (as a List<HashMap<String, Object>>) in my exchange body. But when I try to format this with Camel Bindy to produce the CSV file, I get a GC overhead limit exceeded error. I tried increasing the Java heap size, but then the conversion takes forever.

Is there another way to convert this raw data into a well-formatted CSV file? Can Java 8 streams be used?
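
For reference, one alternative that avoids materializing the full list at all is to let the JDBC component stream the result set and append rows to the CSV file incrementally. A minimal sketch, not the actual route below: it assumes Camel 2.18+, where camel-jdbc supports outputType=StreamList, and a hypothetical CsvRow.toCsvLine bean that renders one row as one CSV line.

from("direct://logs/testLogsStreaming")
    .bean(Logs.class, "buildLogsQuery")
    .to("jdbc:oss-ro-ds?outputType=StreamList")         // iterator over the result set, not a full List
    .split(body()).streaming()                          // only one row in flight at a time
        .bean(CsvRow.class, "toCsvLine")                // hypothetical HashMap -> CSV-line mapper
        .to("file://reports?fileName=TestLogs-${date:now:yyyyMMddHHmm}.csv&fileExist=Append")
    .end();

With the splitter in streaming mode, heap usage stays flat regardless of the result-set size, so neither Bindy nor a bigger heap has to cope with the whole data set at once.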

Code

from("direct://logs/testLogs")
    .routeId("Test_Logs_Route")
    .setProperty("Report", simple("TestLogs-${date:now:yyyyMMddHHmm}"))
    .bean(Logs.class, "buildLogsQuery")                             // bean that generates the logs query
    .multicast()
    .parallelProcessing()
    .to("seda:worker1?waitForTaskToComplete=Always&timeout=0",      // worker routes
        "seda:worker2?waitForTaskToComplete=Always&timeout=0",
        "seda:worker3?waitForTaskToComplete=Always&timeout=0",
        "seda:worker4?waitForTaskToComplete=Always&timeout=0",
        "seda:worker5?waitForTaskToComplete=Always&timeout=0");

All my worker routes look like this:

from("seda:worker4?waitForTaskToComplete=Always")
    .routeId("ParallelProcessingWorker4")
    .log(LoggingLevel.INFO, "Parallel Processing Worker 4 Flow Started")
    .setHeader("WorkerId", constant(4))
    .bean(Logs.class, "testBean")                                   // appends a time clause to the query based on WorkerId
    .to("jdbc:oss-ro-ds")
    .to("seda:resultAggregator?waitForTaskToComplete=Always&timeout=0");

Aggregation

from("seda:resultAggregator?waitForTaskToComplete=Always&timeout=0")
    .routeId("Aggregator_ParallelProcessing")
    .log(LoggingLevel.INFO, "Aggregation triggered for processor ${header.WorkerId}")
    .aggregate(header("Report"), new ParallelProcessingAggregationStrategy())
    .completionSize(5)
    .to("direct://logs/processResultSet")


from("direct://logs/processResultSet")
    .routeId("Process_Result_Set")
    .bean(Test.class, "buildLogReport")
    .marshal(myLogBindy)
    .to("direct://deliver/ooma");

The buildLogReport method

public void buildLogReport(List<HashMap<String, Object>> resultEntries, Exchange exchange) throws Exception {
        Map<String, Object> headerMap = exchange.getIn().getHeaders();
        ArrayList<MyLogEntry> reportList = new ArrayList<>();

        while (!resultEntries.isEmpty()) {
            HashMap<String, Object> resultEntry = resultEntries.get(0);
            MyLogEntry logEntry = new MyLogEntry();

            logEntry.setA((String) resultEntry.get("A"));
            logEntry.setB((String) resultEntry.get("B"));
            logEntry.setC(((BigDecimal) resultEntry.get("C")).toString());
            if (null != resultEntry.get("D"))
                logEntry.setD(((BigInteger) resultEntry.get("D")).toString());
            logEntry.setE((String) resultEntry.get("E"));
            logEntry.setF((String) resultEntry.get("F"));
            logEntry.setG(((BigDecimal) resultEntry.get("G")).toString());
            logEntry.setH((String) resultEntry.get("H"));
            logEntry.setI(((Long) resultEntry.get("I")).toString());
            logEntry.setJ((String) resultEntry.get("J"));
            logEntry.setK(TimeUtils.convertDBToTZ((Date) resultEntry.get("K"), (String) headerMap.get("TZ")));
            logEntry.setL(((BigDecimal) resultEntry.get("L")).toString());
            logEntry.setM((String) resultEntry.get("M"));
            logEntry.setN((String) resultEntry.get("State"));
            logEntry.setO((String) resultEntry.get("Zip"));
            logEntry.setP("\"" + (String) resultEntry.get("Type") + "\"");
            logEntry.setQ((String) resultEntry.get("Gate"));

            reportList.add(logEntry);
            resultEntries.remove(resultEntry);
        }

        // Transform The Exchange Message
        exchange.getIn().setBody(reportList);
    }
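
On the Java 8 streams part of the question: the loop above can be written as a stream pipeline. This is only a sketch, with a hypothetical toLogEntry helper holding the field-by-field conversion shown above; it tidies the code and avoids the repeated remove() shuffling of the ArrayList, but collect() still materializes all 5 million entries, so it does not by itself cure the GC overhead problem.

public void buildLogReport(List<HashMap<String, Object>> resultEntries, Exchange exchange) {
    Map<String, Object> headerMap = exchange.getIn().getHeaders();
    List<MyLogEntry> reportList = resultEntries.stream()
            .map(entry -> toLogEntry(entry, headerMap))   // hypothetical per-row converter
            .collect(Collectors.toList());                // still builds the whole list in memory
    exchange.getIn().setBody(reportList);
}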

0 Answers