How to do batch inserts into an Oracle database

Posted: 2019-03-28 15:11:41

Tags: java spring oracle jdbc

I need to bulk-insert log records into an Oracle DB, but I am confused about how to do the batching.

String INSERT = "INSERT INTO LOGS(METHOD,USER,START_DATE,RESPONSE_TIME,IS_ERROR) VALUES (?,?,?,?,?)";

private synchronized void saveToDBAndClear(ConcurrentHashMap<Long, Logs> logs) {
    List<Logs> list = new ArrayList<>(logs.values());
    logService.insertLog(list);
    initLogMap();
}


public void insertLog(List<Logs> logsList) {

    int[] insertedLog = jdbcTemplate.batchUpdate(INSERT, new BatchPreparedStatementSetter() {

        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            ps.setString(1, logsList.get(i).getMethod());
            ps.setString(2, logsList.get(i).getUser());
            ps.setTimestamp(3, logsList.get(i).getStartDate());
            ps.setLong(4, logsList.get(i).getResponseTime());
            ps.setString(5, logsList.get(i).getIsError());
        }

        @Override
        public int getBatchSize() {
            return logsList.size();
        }
    });
    logger.info("Inserted {} logs into LOGS", insertedLog.length);
}

I have to do

List<Logs> list = new ArrayList<>(logs.values());

because I don't know how to use jdbcTemplate.batchUpdate() directly with a ConcurrentHashMap, or how to collect batches of 100 or 1000 records to push to the database.

Can anyone help me?
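For reference, JdbcTemplate also has an overload of batchUpdate that takes a whole collection plus a batch size and splits it into sub-batches internally, so the map's values can be passed straight through without manual chunking. Below is a sketch, not a drop-in fix: the Spring call is shown only as a comment (it needs a configured DataSource), and the subBatchCount helper is my own illustration, not part of the question's code.

```java
import java.util.Collection;

// Sketch: JdbcTemplate.batchUpdate(sql, collection, batchSize, setter) chunks
// the collection into sub-batches of `batchSize` internally, so no manual
// "collect 100 or 1000" bookkeeping is needed on the caller's side.
public class BatchInsertSketch {

    // Illustrative helper (mine): how many JDBC round trips a collection
    // of `total` rows produces at a given batch size.
    static int subBatchCount(int total, int batchSize) {
        return (total + batchSize - 1) / batchSize; // ceiling division
    }

    /* With a real JdbcTemplate the call would look like this
       (requires a DataSource, so it is not executed here):

    int[][] counts = jdbcTemplate.batchUpdate(
            INSERT,              // same SQL as in the question
            logMap.values(),     // pass the map's values directly
            1000,                // internal sub-batch size
            (ps, log) -> {
                ps.setString(1, log.getMethod());
                ps.setString(2, log.getUser());
                ps.setTimestamp(3, log.getStartDate());
                ps.setLong(4, log.getResponseTime());
                ps.setString(5, log.getIsError());
            });
    // counts.length == subBatchCount(logMap.size(), 1000)
    */

    public static void main(String[] args) {
        System.out.println(subBatchCount(2500, 1000)); // 3 round trips
    }
}
```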

PS

I have already tried:

public class LogProcessor {
    private int batchSize = 10;
    private final double initialCapacity = 1.26;
    private ConcurrentHashMap<Long, Logs> logMap;
    private AtomicLong logMapSize;

    private void initLogMap() {
        this.logMap = new ConcurrentHashMap<>((int) (batchSize * initialCapacity));
        this.logMapSize = new AtomicLong();
    }

    public void process(LogKeeper keeper) {
        LogHandler log = keeper.getLog();
        Long i = logMapSize.incrementAndGet();
        logMap.put(i, log.toJdbc());
        System.out.println("Lines inside the map = " + logMap.size());
        if (i % batchSize == 0) {
            System.out.println("Reached batchSize and = " + batchSize);
            saveToDBAndClear(logMap);
        }
    }

    private synchronized void saveToDBAndClear(ConcurrentHashMap<Long, Logs> logs) {
        List<Logs> list = new ArrayList<>(logs.values());
        System.out.println("Created list with size = " + list.size());
        logService.insertLog(list);
        initLogMap();
        System.out.println("Now size of map = " + logs.size() + " and AtomicLong = " + logMapSize.intValue());
    }

    @TransactionalRollback
    public void insertLog(List<Logs> logsList) {
        System.out.println("Inside insertLog method");
        int[][] insertedLog = jdbcTemplate.batchUpdate(INSERT, logsList, 15, (ps, arg) -> {
            ps.setString(1, arg.getMethod());
            ps.setString(2, arg.getClient());
            ps.setTimestamp(3, arg.getStartDate());
            ps.setLong(4, arg.getResponseTime());
            ps.setString(5, arg.getIsError());
        });
        System.out.println("Inserted " + insertedLog[0].length + " logs into DB");
    }
}

There is also some log output (not shown here).

Now, as you can see, my batchSize field is 10, while in batchUpdate I passed 15. I thought that if I sent insertLog a list (of size 1 or 100, for example), it would accumulate records until a batch of 15 was full and only then send that batch to the DB; instead it simply inserted exactly as many rows as the list contained.
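For what it's worth, that is how the batchSize argument behaves: it only controls how an already-complete list is split into sub-batches on the wire; it never waits for later calls to top a batch up. A list of 100 at batchSize 15 goes out as six sub-batches of 15 plus one of 10, and a list of 1 goes out as a single batch of 1. A tiny pure-Java sketch of that split (the class and method names are mine, for illustration only):

```java
import java.util.ArrayList;
import java.util.List;

public class SubBatchSizes {

    // Sizes of the sub-batches that a list of `total` rows is split into
    // when batchUpdate(sql, list, batchSize, setter) sends it to the DB.
    static List<Integer> split(int total, int batchSize) {
        List<Integer> sizes = new ArrayList<>();
        for (int sent = 0; sent < total; sent += batchSize) {
            sizes.add(Math.min(batchSize, total - sent));
        }
        return sizes;
    }

    public static void main(String[] args) {
        // A 100-element list with batchSize 15: six full batches and a tail.
        System.out.println(split(100, 15)); // [15, 15, 15, 15, 15, 15, 10]
        // A 1-element list is sent as a single batch of 1 - exactly what
        // the question observed: only as many rows as the list contains.
        System.out.println(split(1, 15));   // [1]
    }
}
```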

So I have to collect exactly the required batchSize in the Map and only then send it to the insertLog method. Ideally, I would like to pass a single Logs record to insertLog and write:

public void insertLog(Logs logs) {
    jdbcTemplate.batchUpdate(INSERT, logs, 1000, (ps, arg) -> (...));
}

Can someone then help me wire up the batchSize? Is it possible to do a batch insert or batch update without a manual check like i % batchSize == 0?
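One way to avoid hand-rolling the i % batchSize == 0 check around a ConcurrentHashMap and AtomicLong is to keep a plain buffer behind a lock and flush it to the insert method whenever it fills up. A minimal sketch under stated assumptions: the LogBuffer name and the flush callback are mine, and in the real class the callback would be something like logService::insertLog.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical buffer that accumulates records and hands a full batch
// to a flush callback; thread-safe via synchronized methods.
public class LogBuffer<T> {
    private final int batchSize;
    private final Consumer<List<T>> flush;
    private List<T> buffer = new ArrayList<>();

    public LogBuffer(int batchSize, Consumer<List<T>> flush) {
        this.batchSize = batchSize;
        this.flush = flush;
    }

    public synchronized void add(T record) {
        buffer.add(record);
        if (buffer.size() >= batchSize) {
            flush.accept(buffer);       // e.g. logService::insertLog
            buffer = new ArrayList<>(); // start a fresh batch
        }
    }

    // Call on shutdown so a partially filled batch is not lost.
    public synchronized void drain() {
        if (!buffer.isEmpty()) {
            flush.accept(buffer);
            buffer = new ArrayList<>();
        }
    }

    public static void main(String[] args) {
        List<Integer> flushedSizes = new ArrayList<>();
        LogBuffer<String> buf = new LogBuffer<>(3, batch -> flushedSizes.add(batch.size()));
        for (int i = 0; i < 7; i++) buf.add("log-" + i);
        buf.drain();
        System.out.println(flushedSizes); // [3, 3, 1]
    }
}
```

The caller then only ever calls add(record); the size check lives in one place instead of being threaded through the map-and-counter logic.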

1 Answer:

Answer 0 (score: 0)

Take a look at PLSQLSample.java for reference.