I need to read a large amount of blob data (over 300 GB) from one database and insert it into another. I am using the following code to read the data:
if (dr.HasRows)
{
    while (dr.Read())
    {
        media m = new media
        {
            docid = Convert.ToInt32(dr["Id"]),
            Content = Convert.ToByte(dr["BlobData"]),
            madiaName = Convert.ToString(dr["Name"])
        };
    }
    InsertInNewDb(m);
}
I am reading the data row by row and inserting it into the other database. The problem is that after sending some of the data an OutOfMemoryException is thrown, because I am not disposing of the objects. How can I dispose of the objects after each iteration?
Answer 0 (score: 1)
To combine several of the answers and comments, try the following:
// Requires: using System.Data.SqlClient;
// The SqlConnection, SqlCommand and SqlDataReader need to be in using blocks
// so that they are disposed of in a timely manner. This does not clean up
// memory; it cleans up unmanaged resources like handles.
using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlCommand cmd = new SqlCommand("SELECT * FROM OldTable", conn))
    {
        using (SqlDataReader dr = cmd.ExecuteReader())
        {
            if (dr.HasRows)
            {
                while (dr.Read())
                {
                    media m = new media
                    {
                        // Don't convert - cast instead. These are already
                        // the correct type.
                        docid = (int)dr["Id"],
                        // There are more efficient ways to do this, but
                        // Convert.ToByte was copying only a single byte.
                        Content = (byte[])dr["BlobData"],
                        madiaName = (string)dr["Name"]
                    };
                    // You probably want to insert _all_ of the rows.
                    // Your code was only inserting the last.
                    InsertInNewDb(m);
                }
            }
        }
    }
}
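
That fixes the compile errors, but with 300 GB of blobs each row is still materialized as a full byte[] in memory. If individual blobs are large, streaming them in chunks avoids that. Below is a minimal sketch using CommandBehavior.SequentialAccess and SqlDataReader.GetBytes; the table and column names are taken from your query, while WriteChunkToNewDb is a hypothetical stand-in for however you append a chunk to the destination row.

// Sketch: stream each blob in fixed-size chunks instead of loading it whole.
// Requires: using System.Data; (for CommandBehavior) and using System.Data.SqlClient;
// WriteChunkToNewDb is hypothetical - replace with your destination write.
using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlCommand cmd = new SqlCommand(
        "SELECT Id, Name, BlobData FROM OldTable", conn))
    // SequentialAccess tells the reader not to buffer whole rows in memory.
    using (SqlDataReader dr = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        byte[] buffer = new byte[81920]; // 80 KB chunks
        while (dr.Read())
        {
            // With SequentialAccess, columns must be read in ordinal order.
            int docid = dr.GetInt32(0);
            string name = dr.GetString(1);

            long offset = 0;
            long bytesRead;
            while ((bytesRead = dr.GetBytes(2, offset, buffer, 0, buffer.Length)) > 0)
            {
                WriteChunkToNewDb(docid, name, buffer, (int)bytesRead); // hypothetical
                offset += bytesRead;
            }
        }
    }
}

This way the peak memory per row is one 80 KB buffer rather than the full blob.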
Answer 1 (score: 0)
You can try paging through the data instead of holding one long-lived DataReader. After every batch of rows, close and reopen the source and destination connections. Remember to wrap the disposable objects in using blocks so their resources are released promptly. A sketch of the idea follows.
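
A minimal sketch of that batching approach, assuming OldTable has an increasing integer Id to key the pages on; InsertInNewDb is the method from the question, and the batch size is an arbitrary choice:

// Sketch: read the table in Id-keyed pages, disposing connections per batch.
// Requires: using System.Data.SqlClient; assumes Id is an increasing integer key.
int batchSize = 1000; // arbitrary; tune to your row sizes
int lastId = 0;
bool more = true;

while (more)
{
    more = false;
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (SqlCommand cmd = new SqlCommand(
            "SELECT TOP (@n) Id, Name, BlobData FROM OldTable " +
            "WHERE Id > @lastId ORDER BY Id", conn))
        {
            cmd.Parameters.AddWithValue("@n", batchSize);
            cmd.Parameters.AddWithValue("@lastId", lastId);
            using (SqlDataReader dr = cmd.ExecuteReader())
            {
                while (dr.Read())
                {
                    more = true;
                    lastId = dr.GetInt32(0);
                    media m = new media
                    {
                        docid = lastId,
                        madiaName = dr.GetString(1),
                        Content = (byte[])dr["BlobData"]
                    };
                    InsertInNewDb(m);
                }
            }
        }
    } // connection disposed here, so resources are freed after every batch
}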