Performance issues when using multiple parallel queries - SqlClient

Asked: 2019-07-05 10:37:55

Tags: c# .net sql-server task-parallel-library

After switching servers and trying to increase the number of worker threads for some database-intensive tasks, I noticed performance problems in the application.

After some testing I found that the problem is reading data from the DataReader. Executing a simple query on 30 threads is at least 15x slower than executing it on a single thread. Using PerfView I found that most of the time is spent in BLOCKED_TIME.

For testing I used a Ryzen Threadripper (32 cores / 64 threads) with a local instance of SQL Server. On a production server with similar specs the results are the same.

I tried running up to 30 instances of the application: there is almost no difference in performance between 2-3 instances and 30 instances, so the server is fast enough to handle 30 parallel queries.

I tried some connection-string changes, such as increasing/decreasing the min/max pool size, disabling pooling, and switching from LPC (shared memory) to TCP, with no result.
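For illustration, the connection-string variants I tried looked roughly like this (server name, database name, and values here are placeholders, not my real configuration):

    -- larger pool
    Server=localhost;Database=TestDb;Integrated Security=true;Min Pool Size=30;Max Pool Size=200

    -- pooling disabled entirely
    Server=localhost;Database=TestDb;Integrated Security=true;Pooling=false

    -- force TCP instead of shared memory (LPC)
    Server=tcp:localhost,1433;Database=TestDb;Integrated Security=true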


Is there any way to improve performance and make my application scale with the number of threads?


edit. DB structure and an example to reproduce:

    class Program
    {
        private const string ConnectionString = "..."; // real connection string goes here
        private const string SqlQuery = "..."; // the select query shown below

        static void Main(string[] args)
        {
            var ids = new List<Guid>() { ... }; // filled with ids from the database
            var stats = new ConcurrentBag<long>();

            // warmup
            stats.Add(TestMethod(ids[0]));

            Console.WriteLine(String.Format("|{0}|{1,5}ms|", "warmup", stats.Average()));

            // start 1 to 30 threads (tested on a server with 32 cores / 64 threads)
            for (int i = 1; i <= 30; i++)
            {
                stats = new ConcurrentBag<long>();
                var tasks = Enumerable.Range(0, i).Select(idx =>
                {
                    var id = ids[idx]; // separate ids, to be sure we're not reading the same records from disk
                    return Task.Run(() =>
                    {
                        for (int j = 0; j < 20; j++)
                        {
                            stats.Add(TestMethod(id));
                        }
                    });
                }).ToArray();

                Task.WaitAll(tasks);
                Console.WriteLine(String.Format("|{0,2}|{1,5}ms|", i, (int)stats.Average()));
            }

            Console.WriteLine("End");
            Console.ReadLine();
        }

        private static long TestMethod(Guid id)
        {
            var records = new List<object[]>();
            var sw = new Stopwatch();
            using (var connection = new SqlConnection(ConnectionString))
            {
                connection.Open();
                using (var transaction = connection.BeginTransaction())
                using (var command = connection.CreateCommand())
                {
                    command.Transaction = transaction;
                    command.CommandText = SqlQuery;
                    command.Parameters.Add(new SqlParameter("id", id));

                    // measure only the dataReader time
                    sw.Start();
                    using (var dataReader = command.ExecuteReader())
                    {
                        // the query returns ~2000 rows
                        while (dataReader.Read())
                        {
                            // read all data from the row; all columns are Guids in this test
                            var values = new object[6];
                            dataReader.GetValues(values);
                            records.Add(values);
                        }
                    }
                    sw.Stop();
                }
            }
            return sw.ElapsedMilliseconds;
        }
    }
/****** Object:  Table [dbo].[Table_1]    Script Date: 05.07.2019 14:08:15 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Table_1](
    [Id] [uniqueidentifier] NOT NULL,
    [Ref1] [uniqueidentifier] NULL,
    [Field1] [uniqueidentifier] NULL,
    [Field2] [uniqueidentifier] NULL,
 CONSTRAINT [PK_Table_1] PRIMARY KEY CLUSTERED 
(
    [Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO
/****** Object:  Table [dbo].[Table_2]    Script Date: 05.07.2019 14:08:15 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Table_2](
    [Id] [uniqueidentifier] NOT NULL,
    [Field1] [uniqueidentifier] NULL,
 CONSTRAINT [PK_Table_2] PRIMARY KEY CLUSTERED 
(
    [Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO
/****** Object:  Index [IDX_Table_1_Ref1]    Script Date: 05.07.2019 14:08:15 ******/
CREATE NONCLUSTERED INDEX [IDX_Table_1_Ref1] ON [dbo].[Table_1]
(
    [Ref1] ASC
)
INCLUDE (   [Field1],
    [Field2]) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = OFF) ON [PRIMARY]
GO
ALTER TABLE [dbo].[Table_1]  WITH CHECK ADD  CONSTRAINT [FK_Table_1_Table_2] FOREIGN KEY([Ref1])
REFERENCES [dbo].[Table_2] ([Id])
GO
ALTER TABLE [dbo].[Table_1] CHECK CONSTRAINT [FK_Table_1_Table_2]
GO

Now ids contains 30 records from T2, and T1 contains 2000 * 30 records, so each thread works on its own dataset of ~2000 records. The data was filled with random newid() values.


edit2.

Just in case, I also compared this scenario against the same SQL Server: 30 separate single-threaded processes vs. 1 process with 30 threads. The 30 separate processes work fine: total execution time is roughly 150% of the single-threaded baseline, not 1500%. The server-side stats differ the most: with 30 separate processes I see about 14 waiting tasks and ~20k batch requests/sec, while with a single process and 30 threads I see 30+ waiting tasks (mostly waiting on network I/O) and ~2k batch requests/sec.


edit3.

For completeness, the SqlQuery used in TestMethod:

    select  
        t2.id as Id,
        t2.Field1 as Field1,
        t1.Id as T1_Id,
        t1.Ref1 as T1_T2,
        t1.Field1 as T1_Field1,
        t1.Field2 as T1_Field2
    from dbo.Table_2 t2
    join dbo.Table_1 t1 on t1.Ref1 = t2.Id
    where t2.id = @id

Setting ServerGarbageCollection = true and ConcurrentGarbageCollection = false, as suggested in the answer below, solved my problem; the application now scales up to the maximum number of threads available on the server. Thanks for your help!

1 Answer:

Answer 0 (score: 0)

Check your GC settings.

https://www.dotnetcurry.com/csharp/1471/garbage-collection-csharp-dotnet-core

Setting the parameters

ServerGarbageCollection = true
ConcurrentGarbageCollection = false

may help. :)
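For a .NET Core app, a minimal sketch of how to set these is two properties in the project file (these are the standard MSBuild GC properties; adjust to your project):

    <PropertyGroup>
      <ServerGarbageCollection>true</ServerGarbageCollection>
      <ConcurrentGarbageCollection>false</ConcurrentGarbageCollection>
    </PropertyGroup>

For .NET Framework the equivalent is `<gcServer enabled="true"/>` and `<gcConcurrent enabled="false"/>` inside the `<runtime>` element of app.config. Server GC uses one dedicated heap and collection thread per core instead of a single shared heap, which reduces cross-thread allocation contention in workloads like the one above.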