I have a csv file with about 1 million records. Each row represents a student object.
In the service layer I read the csv file and convert each row into an entity, like this:
File file = new File("test.csv");
try
{
    String csvFile = "test.csv";
    BufferedReader br = null;
    String line = "";
    String cvsSplitBy = ",";
    try
    {
        br = new BufferedReader(new FileReader(csvFile));
        while ((line = br.readLine()) != null)
        {
            // use comma as separator
            String[] student = line.split(cvsSplitBy);
            System.out.println("Student [emplyoee= " + student[0] + " , name=" + student[1] + "]");
            StudentTable st = new StudentTable();
            st.setEmplyoee(student[0]);
            st.setName(student[1]);
            springbootDao.saveStudent(st);
        }
    }
    catch (FileNotFoundException e)
    {
        e.printStackTrace();
    }
    catch (IOException e)
    {
        e.printStackTrace();
    }
    finally
    {
        if (br != null)
        {
            try
            {
                br.close();
            }
            catch (IOException e)
            {
                e.printStackTrace();
            }
        }
    }
}
catch (Exception e)
{
    e.printStackTrace();
}
DAO layer:
void saveStudent(StudentTable st)
{
    getSession();
    session.save(st);
    session.beginTransaction().commit();
}

private Session getSession()
{
    if (session == null)
    {
        session = entity.unwrap(SessionFactory.class).openSession();
    }
    return session;
}
When I save it this way, save is called a million times, once for every row, which is exactly what the loop does.
Obviously that is not a good approach. Is there any way to speed this up with multiple threads, since a single thread clearly takes much longer, or does Hibernate offer any way to dump the data in bulk to improve the speed?
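For reference, the Hibernate-side way to "dump data in bulk" is session-level batching: keep saving inside the read loop, but flush and clear the session every N rows so the pending inserts go out as JDBC batches and the session never holds a million entities at once. A minimal sketch under those assumptions follows; the StudentCsvBatchImporter class, the saveAllFromCsv method and the batch size of 50 are illustrative names and values, and hibernate.jdbc.batch_size is assumed to be set to a matching value in the Hibernate configuration.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import org.hibernate.Session;
import org.hibernate.Transaction;

public class StudentCsvBatchImporter
{
    // should match hibernate.jdbc.batch_size in the Hibernate configuration (assumed value)
    private static final int BATCH_SIZE = 50;

    void saveAllFromCsv(String csvFile, Session session) throws IOException
    {
        Transaction tx = session.beginTransaction();
        try (BufferedReader br = new BufferedReader(new FileReader(csvFile)))
        {
            String line;
            int count = 0;
            while ((line = br.readLine()) != null)
            {
                String[] student = line.split(",");
                StudentTable st = new StudentTable();
                st.setEmplyoee(student[0]);
                st.setName(student[1]);
                session.save(st);
                if (++count % BATCH_SIZE == 0)
                {
                    session.flush(); // send the pending inserts as one JDBC batch
                    session.clear(); // detach the flushed entities so the heap stays flat
                }
            }
        }
        tx.commit();
    }
}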
Update:
I changed it to
while ((line = br.readLine()) != null)
{
    // use comma as separator
    String[] student = line.split(cvsSplitBy); // java.lang.OutOfMemoryError: Java heap space
    StudentTable st = new StudentTable();
    st.setEmplyoee(student[0]);
    st.setName(student[1]);
    stList.add(st);
}
springbootDao.saveStudent(stList);
Now I am running into an out-of-memory error.
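One way to keep a bulk insert without ever holding the full million rows in memory is to read and save in fixed-size chunks, handing each chunk to a saveStudent(List<StudentTable>) DAO method like the one used above. A minimal sketch, where the ChunkedCsvLoader class, the SpringbootDao type name and the 10,000-row chunk size are assumptions for illustration:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class ChunkedCsvLoader
{
    // rows buffered in memory before each DAO call; 10 000 is an illustrative value
    private static final int CHUNK_SIZE = 10_000;

    void load(String csvFile, SpringbootDao springbootDao) throws IOException
    {
        List<StudentTable> chunk = new ArrayList<StudentTable>(CHUNK_SIZE);
        try (BufferedReader br = new BufferedReader(new FileReader(csvFile)))
        {
            String line;
            while ((line = br.readLine()) != null)
            {
                String[] student = line.split(",");
                StudentTable st = new StudentTable();
                st.setEmplyoee(student[0]);
                st.setName(student[1]);
                chunk.add(st);
                if (chunk.size() == CHUNK_SIZE)
                {
                    springbootDao.saveStudent(chunk); // persist this chunk
                    chunk.clear();                    // drop the references so they can be collected
                }
            }
        }
        if (!chunk.isEmpty())
        {
            springbootDao.saveStudent(chunk); // persist the remaining rows
        }
    }
}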
Answer 0 (score: 0)
Done it this way.
Service layer:
String csvFile = "test.csv";
BufferedReader br = null;
String line = "";
String cvsSplitBy = ",";
try
{
    List<StudentTable> stList = new ArrayList<StudentTable>();
    br = new BufferedReader(new FileReader(csvFile));
    while ((line = br.readLine()) != null)
    {
        // use comma as separator
        String[] student = line.split(cvsSplitBy);
        StudentTable st = new StudentTable(); // create a new entity per row
        st.setEmplyoee(student[0]);
        st.setName(student[1]);
        stList.add(st);
    }
    springbootDao.saveStudent(stList);
}
catch (FileNotFoundException e)
{
    e.printStackTrace();
}
catch (IOException e)
{
    e.printStackTrace();
}
finally
{
    if (br != null)
    {
        try
        {
            br.close();
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }
    }
}
DAO layer:
// System.out.println("dao layer" + st);
getSession();
Transaction tx = session.beginTransaction();
session.doWork(new Work()
{
    @Override
    public void execute(Connection conn)
        throws SQLException
    {
        PreparedStatement pstmt = null;
        try
        {
            String sqlInsert = "insert into student (name,emplyoee) values (?,?) ";
            pstmt = conn.prepareStatement(sqlInsert);
            int i = 0;
            for (StudentTable name : st)
            {
                pstmt.setString(1, name.getName());
                pstmt.setString(2, name.getEmplyoee());
                pstmt.addBatch();
                // 200000 : JDBC batch size
                if (i > 0 && i % 200000 == 0)
                {
                    System.out.println(name.getEmplyoee() + "-------------------");
                    pstmt.executeBatch();
                }
                i++;
            }
            // execute whatever is left in the final, partially filled batch
            pstmt.executeBatch();
        }
        finally
        {
            if (pstmt != null)
            {
                pstmt.close();
            }
        }
    }
});
tx.commit();
session.close();
// session.save(st);
// session.beginTransaction().commit();