I am trying to fetch a large full table (10M+ records) from Hive with the SelectHiveQL processor, and I found that the convertToCSVStream() method in the source code takes far longer than fetching the result set itself. Looking at the code: it iterates over the result set row by row and then writes each row to the output stream.
When the table is small it completes the whole process in a few seconds, but with data this large it takes much longer. Is there any way to optimize the conversion? I have tried fetch sizes of 100000/1000/10000/1000.
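For reference, the fetch size I mention is applied to the JDBC statement before the query runs. A minimal sketch of that call (the applyFetchSize helper is hypothetical, the Statement here is a reflective stub because there is no live Hive connection in this snippet, and it assumes the Hive JDBC driver honors setFetchSize):

```java
import java.lang.reflect.Proxy;
import java.sql.Statement;

public class FetchSizeSketch {
    // Hypothetical helper: hint the driver to pull this many rows per round trip.
    static void applyFetchSize(Statement st, int rows) throws Exception {
        st.setFetchSize(rows);
    }

    public static void main(String[] args) throws Exception {
        final int[] captured = {0};
        // Stub Statement that just records the fetch size it was given.
        Statement st = (Statement) Proxy.newProxyInstance(
                Statement.class.getClassLoader(),
                new Class<?>[]{Statement.class},
                (proxy, method, a) -> {
                    if (method.getName().equals("setFetchSize")) {
                        captured[0] = (Integer) a[0];
                    }
                    return null; // setFetchSize is void
                });
        applyFetchSize(st, 10000);
        System.out.println(captured[0]);
    }
}
```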
Here is the code:
while (rs.next()) {
    //logger.info("+++++++++++++Inside the While loop+++++++++++++++++");
    if (callback != null) {
        callback.processRow(rs);
    }

    List<String> rowValues = new ArrayList<>(nrOfColumns);

    for (int i = 1; i <= nrOfColumns; i++) {
        final int javaSqlType = meta.getColumnType(i);
        final Object value = rs.getObject(i);
        //logger.info("+++++++++++++Entering the Switch at +++++++++++++++++");
        switch (javaSqlType) {
            case CHAR:
            case LONGNVARCHAR:
            case LONGVARCHAR:
            case NCHAR:
            case NVARCHAR:
            case VARCHAR:
                String valueString = rs.getString(i);
                if (valueString != null) {
                    // Removed extra quotes as those are a part of the escapeCsv when required.
                    StringBuilder sb = new StringBuilder();
                    if (outputOptions.isQuote()) {
                        sb.append("\"");
                        if (outputOptions.isEscape()) {
                            sb.append(StringEscapeUtils.escapeCsv(valueString));
                        } else {
                            sb.append(valueString);
                        }
                        sb.append("\"");
                        rowValues.add(sb.toString());
                    } else {
                        if (outputOptions.isEscape()) {
                            rowValues.add(StringEscapeUtils.escapeCsv(valueString));
                        } else {
                            rowValues.add(valueString);
                        }
                    }
                } else {
                    rowValues.add("");
                }
                break;
            case ARRAY:
            case STRUCT:
            case JAVA_OBJECT:
                String complexValueString = rs.getString(i);
                if (complexValueString != null) {
                    rowValues.add(StringEscapeUtils.escapeCsv(complexValueString));
                } else {
                    rowValues.add("");
                }
                break;
            default:
                if (value != null) {
                    rowValues.add(value.toString());
                } else {
                    rowValues.add("");
                }
        }
        //logger.info("+++++++++++++Exiting the Switch at +++++++++++++++++" + System.currentTimeMillis());
    }

    // Write row values
    //logger.info("+++++++++++++Writing Row value at+++++++++++++++++" + System.currentTimeMillis());
    outStream.write(StringUtils.join(rowValues, outputOptions.getDelimiter()).getBytes(StandardCharsets.UTF_8));
    outStream.write("\n".getBytes(StandardCharsets.UTF_8));
    nrOfRows++;
    if (maxRows > 0 && nrOfRows == maxRows) {
        break;
    }
}