I have over a million rows in the DB. I need to read 1000 rows at a time from the DB and write them to an OutputStream until no more data comes back. I'm not sure how much data there will be, so I won't know the length of the file until I reach the end of the data.
Is it possible to write the data to the output stream in batches and have it transferred over the network at the same time? What I'm trying to achieve here is to avoid running out of memory: if I hold a million rows in the output stream, it will OOM. So to avoid OOM, I want to write the data to the network as soon as I get it, without keeping it all in JVM memory. The code I have written so far is below.
Note: this is standalone code.
protected void doGet(HttpServletRequest req, HttpServletResponse resp)
        throws ServletException, IOException {
    String header = "Doc Timestamp,Internal DocID,DocType,Sender,Receiver,Routing Status,User Status,External DocID\n";
    resp.setHeader("Content-Disposition", "attachment; filename=File.csv");
    resp.setContentType("text/csv");

    OutputStream stream = resp.getOutputStream();
    stream.write(header.getBytes());
    stream.flush();

    for (int i = 0; i < 10000000; i++) {
        String docTimeStamp = new Date().toString();
        String docid = UUID.randomUUID().toString();
        String docType = "ICSCSRin";
        String sender = "ACC15796_CWS";
        String receiver = "ACC15794_CWS";
        String routingStatus = "DONE W/ ERRORS";
        String userStatus = "CWS POST_PROCESS";
        String snrf = "CWS_TES";

        StringBuilder str = new StringBuilder();
        str.append(docTimeStamp).append(',')
           .append(docid).append(',')
           .append(docType).append(',')
           .append(sender).append(',')
           .append(receiver).append(',')
           .append(routingStatus).append(',')
           .append(userStatus).append(',')
           .append(snrf).append('\n');
        stream.write(str.toString().getBytes());
    }
    stream.close();
}
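
To illustrate what I mean by "write in batches": below is a minimal sketch of the idea, decoupled from the servlet API so it can run standalone. `ChunkedCsvWriter`, `writeCsv`, the batch size of 1000, and the sample rows are all placeholders I made up; the real rows would come from the DB query instead of a `List`. The point is flushing after every batch so each chunk goes out to the network rather than accumulating in memory:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.*;

public class ChunkedCsvWriter {
    static final int BATCH_SIZE = 1000; // assumed batch size, matching the 1000-row reads

    // Writes rows as CSV lines, flushing after each batch so the bytes are
    // pushed downstream (e.g. to the HTTP response) instead of buffering them all.
    static void writeCsv(Iterator<String[]> rows, OutputStream out) throws IOException {
        Writer w = new BufferedWriter(new OutputStreamWriter(out, StandardCharsets.UTF_8));
        int count = 0;
        while (rows.hasNext()) {
            w.write(String.join(",", rows.next()));
            w.write('\n');
            if (++count % BATCH_SIZE == 0) {
                w.flush(); // push this batch out before reading the next one
            }
        }
        w.flush(); // push the final partial batch
    }

    public static void main(String[] args) throws IOException {
        // Fake data source standing in for the DB cursor.
        List<String[]> rows = new ArrayList<>();
        for (int i = 0; i < 2500; i++) {
            rows.add(new String[]{"id" + i, "DONE"});
        }
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        writeCsv(rows.iterator(), buf);
        String csv = new String(buf.toByteArray(), StandardCharsets.UTF_8);
        System.out.println(csv.split("\n").length); // prints 2500
    }
}
```

In a servlet, `out` would be `resp.getOutputStream()`, and the iterator would be backed by repeated 1000-row DB fetches, so at most one batch of rows is ever held in memory.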