I am getting a "java.lang.OutOfMemoryError: Java heap space" error while parsing an XML file that is larger than 350 MB.
Please find below the code snippet:
File file = new File("D:\\WS\\data.xml");
InputSource source = new InputSource(new FileInputStream(file));
XPathFactory xPathFact = XPathFactory.newInstance();
XPath xPath = xPathFact.newXPath();
XPathExpression expr = xPath.compile("//person");
NodeList nodeList = (NodeList)expr.evaluate(source, XPathConstants.NODESET);
The error occurs on the last line, where the expression is evaluated.
Answer 0 (score: 0):
You could try an approach that consumes less memory by processing the target nodes on the fly: instead of holding all the nodes in memory (which leads to the OOME when they cannot all fit in your heap), it keeps only the current node.
Here is how to do it with XMLDog:
XMLDog dog = new XMLDog(null, null, null);
dog.addXPath("//person");
Event event = dog.createEvent();
event.setXMLBuilder(new DOMBuilder());
event.setListener(new InstantEvaluationListener(){
    @Override
    public void onNodeHit(Expression expression, NodeItem nodeItem){
        Node personNode = (Node) nodeItem.xml;
        // Treat your person node here
    }
    @Override
    public void finishedNodeSet(Expression expression){}
    @Override
    public void onResult(Expression expression, Object result){}
});
File file = new File("D:\\WS\\data.xml");
InputSource source = new InputSource(new FileInputStream(file));
dog.sniff(event, source, false);
More details on XMLDog here.
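If adding the jlibs dependency is not an option, the same idea of keeping only the current record in memory can also be sketched with the JDK's built-in StAX parser (javax.xml.stream) instead of XMLDog. This is only a minimal illustration, not the answer's method: it assumes each record of interest is a <person> element, and the class name and processing placeholder are hypothetical.

import java.io.File;
import java.io.FileInputStream;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class PersonStreamer {
    public static void main(String[] args) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        // Stream the file instead of loading it into a DOM tree.
        try (FileInputStream in = new FileInputStream(new File("D:\\WS\\data.xml"))) {
            XMLStreamReader reader = factory.createXMLStreamReader(in);
            try {
                while (reader.hasNext()) {
                    int event = reader.next();
                    // React only to <person> start tags; everything else is skipped,
                    // so memory use stays roughly constant regardless of file size.
                    if (event == XMLStreamConstants.START_ELEMENT
                            && "person".equals(reader.getLocalName())) {
                        // Treat your person element here, e.g. read its attributes
                        // or pull its children with further reader.next() calls.
                    }
                }
            } finally {
                reader.close();
            }
        }
    }
}

Unlike the //person NODESET evaluation in the question, this never materializes all matching nodes at once, which is what avoids exhausting the heap on a 350 MB file.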