Using the BCP utility to split a large file into several smaller files by row number

Date: 2015-11-18 15:40:52

Tags: sql-server csv bcp

I'm hoping someone can help me, or point me to a resource that will solve this.

I'm using the BCP utility to generate a large CSV file (about 200 million rows) from SQL Server.
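For context, the export I run today is a single command roughly like the sketch below (the database, table, file, and server names are placeholders):

bcp [your_database].[dbo].[your_table] out [your_file_name].csv -c -t, -T -S [your_server]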

I'd like to know whether BCP can split the output while it runs (say, every 10 million rows), so that instead of one huge file I end up with 20 smaller ones.

I know I could split the file after it's created, but that would take up extra storage because I'd be duplicating the original file.

Any help is greatly appreciated!

Many thanks.

Mick

1 Answer:

Answer 0 (score: 1)

Here's an option for you: use a WHILE loop that iterates over dynamic SQL, building bcp commands that are executed through xp_cmdshell. You should be able to take my code below and adapt it to your needs.

As an example, I've included a PRINT of the bcp commands that the SQL code below generates. Hopefully this is a workable solution for you.

declare @c int = 1          -- file number suffix
declare @s int = 1          -- id range start
declare @e int = 10000000   -- id range end
declare @sql varchar(8000)  -- bcp command to execute

while @s <= 200000000
begin

    -- build the bcp command for the current id range
    select @sql =
    'bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= ' + convert(varchar(10), @s) + ' and [your_id] <= ' + convert(varchar(10), @e) + '" queryout "[your_directory]\[your_file_name]' + convert(varchar(10), @c) + '.txt" -c -t, -T -S [your_server]'

    print @sql                        -- show the generated command
    exec master..xp_cmdshell @sql     -- run bcp via the command shell

    -- move on to the next 10-million-row range
    set @s = @e + 1
    set @e = @s + 9999999
    set @c = @c + 1

end

Here is what the PRINT outputs, i.e. the @sql that exec master..xp_cmdshell @sql executes on each pass:
bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 1 and [your_id] <= 10000000" queryout "[your_directory]\[your_file_name]1.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 10000001 and [your_id] <= 20000000" queryout "[your_directory]\[your_file_name]2.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 20000001 and [your_id] <= 30000000" queryout "[your_directory]\[your_file_name]3.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 30000001 and [your_id] <= 40000000" queryout "[your_directory]\[your_file_name]4.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 40000001 and [your_id] <= 50000000" queryout "[your_directory]\[your_file_name]5.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 50000001 and [your_id] <= 60000000" queryout "[your_directory]\[your_file_name]6.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 60000001 and [your_id] <= 70000000" queryout "[your_directory]\[your_file_name]7.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 70000001 and [your_id] <= 80000000" queryout "[your_directory]\[your_file_name]8.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 80000001 and [your_id] <= 90000000" queryout "[your_directory]\[your_file_name]9.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 90000001 and [your_id] <= 100000000" queryout "[your_directory]\[your_file_name]10.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 100000001 and [your_id] <= 110000000" queryout "[your_directory]\[your_file_name]11.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 110000001 and [your_id] <= 120000000" queryout "[your_directory]\[your_file_name]12.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 120000001 and [your_id] <= 130000000" queryout "[your_directory]\[your_file_name]13.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 130000001 and [your_id] <= 140000000" queryout "[your_directory]\[your_file_name]14.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 140000001 and [your_id] <= 150000000" queryout "[your_directory]\[your_file_name]15.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 150000001 and [your_id] <= 160000000" queryout "[your_directory]\[your_file_name]16.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 160000001 and [your_id] <= 170000000" queryout "[your_directory]\[your_file_name]17.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 170000001 and [your_id] <= 180000000" queryout "[your_directory]\[your_file_name]18.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 180000001 and [your_id] <= 190000000" queryout "[your_directory]\[your_file_name]19.txt" -c -t, -T -S [your_server]

bcp "select * from [your_database].[dbo].[your_table] where [your_id] >= 190000001 and [your_id] <= 200000000" queryout "[your_directory]\[your_file_name]20.txt" -c -t, -T -S [your_server]
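One caveat with this approach: xp_cmdshell is disabled by default on SQL Server, so if the loop fails with an error saying the procedure is blocked, it has to be enabled first via sp_configure (requires sysadmin rights):

exec sp_configure 'show advanced options', 1;   -- expose advanced settings
reconfigure;
exec sp_configure 'xp_cmdshell', 1;             -- allow xp_cmdshell to run
reconfigure;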