Normally I run this loop with fewer than 50 items and it works fine. Now I'm trying to scale up, and with 5k+ items it takes several minutes.
Public Shared Sub Add(type As BufferType, objIDs As List(Of Integer), userToken As String)
    Dim timeInUTCSeconds As Integer = Misc.UTCDateToSeconds(Date.Now)
    For Each objID As Integer In objIDs
        Dim insertStmt As String = "IF NOT EXISTS (SELECT ObjectID FROM " & TableName(type) & " WHERE ObjectID = " & objID & " AND UserToken = '" & userToken & "')" & _
            " BEGIN INSERT INTO " & TableName(type) & " (ObjectID, UserToken, Time) VALUES (" & objID & ", '" & userToken & "', " & timeInUTCSeconds & ") END" & _
            " ELSE BEGIN UPDATE " & TableName(type) & " SET Time = " & timeInUTCSeconds & " WHERE ObjectID = " & objID & " AND UserToken = '" & userToken & "' END"
        DAL.SQL.Insert(insertStmt)
    Next
End Sub
Answer 0 (score: 1)
Your best option is to move the whole job to the SQL side: pass the objIDs list to a stored procedure as a table-valued parameter and do the upsert with a single MERGE statement.
If you need to keep it on the VB side, there are still several optimizations you can make.
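A minimal T-SQL sketch of the table-valued-parameter approach the answer describes. The type name `dbo.ObjectIdList`, procedure name `dbo.UpsertBuffer`, target table `dbo.Buffer`, and column sizes are all assumptions for illustration; substitute the real table produced by `TableName(type)`:

```sql
-- Assumed names throughout; adjust to the real schema.
CREATE TYPE dbo.ObjectIdList AS TABLE (ObjectID int NOT NULL PRIMARY KEY);
GO
CREATE PROCEDURE dbo.UpsertBuffer
    @ids       dbo.ObjectIdList READONLY,  -- the whole objIDs list in one call
    @userToken varchar(64),
    @time      int
AS
BEGIN
    SET NOCOUNT ON;
    -- One MERGE replaces the per-row IF EXISTS ... INSERT/UPDATE round trips.
    MERGE dbo.Buffer AS target
    USING @ids AS source
        ON target.ObjectID = source.ObjectID
       AND target.UserToken = @userToken
    WHEN MATCHED THEN
        UPDATE SET target.Time = @time
    WHEN NOT MATCHED THEN
        INSERT (ObjectID, UserToken, Time)
        VALUES (source.ObjectID, @userToken, @time);
END
```

On the VB side you would fill a `DataTable` with one `ObjectID` column and pass it as a `SqlParameter` with `SqlDbType.Structured`, so the 5k+ IDs travel to the server in a single round trip instead of 5k+.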
Answer 1 (score: 1)
(I'm assuming a reasonably recent version of SQL Server, since this is a .NET question.)
Answer 2 (score: 1)
You should use SqlBulkCopy whenever you insert large amounts of data. See here for what is generally considered best practice for bulk-inserting data into a table.
The demo code below, taken from here, should make this clear:
private static void PerformBulkCopy()
{
    string connectionString =
        @"Server=localhost;Database=Northwind;Trusted_Connection=true";

    // get the source data
    using (SqlConnection sourceConnection = new SqlConnection(connectionString))
    {
        SqlCommand myCommand =
            new SqlCommand("SELECT * FROM tablename", sourceConnection);
        sourceConnection.Open();
        SqlDataReader reader = myCommand.ExecuteReader();

        // open the destination connection
        using (SqlConnection destinationConnection = new SqlConnection(connectionString))
        {
            destinationConnection.Open();

            using (SqlBulkCopy bulkCopy =
                new SqlBulkCopy(destinationConnection.ConnectionString))
            {
                bulkCopy.BatchSize = 500;
                bulkCopy.NotifyAfter = 1000;
                bulkCopy.SqlRowsCopied +=
                    new SqlRowsCopiedEventHandler(bulkCopy_SqlRowsCopied);
                bulkCopy.DestinationTableName = "Tablename";
                bulkCopy.WriteToServer(reader);
            }
        }
        reader.Close();
    }
}
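Applied to the question's scenario, a VB.NET sketch of the same idea: build the rows in memory and write them in one shot instead of issuing one statement per ID. The connection string and table name are placeholders, and note that SqlBulkCopy only inserts; to keep the original insert-or-update semantics you would bulk-load into a staging table first and then MERGE into the target:

```vb
' Sketch, not the asker's actual code: bulk-load (ObjectID, UserToken, Time)
' rows in a single round trip. Connection string and table name are placeholders.
Public Shared Sub AddBulk(tableName As String, objIDs As List(Of Integer),
                          userToken As String, timeInUTCSeconds As Integer)
    Dim table As New DataTable()
    table.Columns.Add("ObjectID", GetType(Integer))
    table.Columns.Add("UserToken", GetType(String))
    table.Columns.Add("Time", GetType(Integer))
    For Each objID As Integer In objIDs
        table.Rows.Add(objID, userToken, timeInUTCSeconds)
    Next

    Using connection As New SqlConnection("Server=...;Database=...;Trusted_Connection=true")
        connection.Open()
        Using bulkCopy As New SqlBulkCopy(connection)
            bulkCopy.DestinationTableName = tableName
            bulkCopy.BatchSize = 500
            bulkCopy.WriteToServer(table)
        End Using
    End Using
End Sub
```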
Also, don't forget to read the documentation related to BatchSize.
Hope that helps.