So I'm trying to make a backup script that downloads a CSV from my MSSQL server, compresses the file, and then uploads the backup to Amazon S3.
The problem is that the table averages around 20 million rows, and I run the script daily. It just sits there looking frozen until it finishes about 20 minutes later. I'd like to know whether there is a way to show a progress bar specifically for Invoke-Sqlcmd. I did some research, but every example I could find only shows a progress bar around a for loop, not progress for a single command.
Here is my code:
ECHO "Starting Download"
Import-Module sqlps
#$SQLquery="SELECT * FROM dbo.$PREFIX$i"
$SQLquery="SELECT * FROM dbo.events"
ECHO "Executing query = $SQLquery"
$hostname = "."
$pass = "test"
$usern = "test"
$database = "theDB"
$result=invoke-sqlcmd -ServerInstance $hostname -query $SQLquery -HostName $hostname -Password $pass -Username $usern -Database $database -verbose
#echo $result
pause
$result |export-csv -path $CSVPATH -notypeinformation
pause
ECHO "Starting Zip:"
Compress-Archive -LiteralPath $CSVPATH -CompressionLevel Optimal -DestinationPath $ZIPPATH
ECHO "Starting Delete: $CSVPATH "
del "$CSVPATH"
echo "Removed $CSVNAME"
aws s3 cp $ZIPPATH s3://test_$ZIPNAME
pause
This script works, but as I said, I would like to add a progress bar to Invoke-Sqlcmd so it doesn't look frozen while downloading the huge result set.
This is what I have found so far, but it only shows progress for a loop:
$VerbosePreference = "Continue"
Write-Verbose "Test Message"
for ($a=1; $a -lt 100; $a++) {
    Write-Progress -Activity "Working..." -PercentComplete $a -CurrentOperation "$a% complete" -Status "Please wait."
    Start-Sleep -Milliseconds 100
}
Answer (score: 1)
Considering the huge dataset of ~20 million records, it's probably a good idea to use some of the .NET classes from the System.Data.Common namespace. I'm not sure how Export-Csv is implemented, but System.IO.StreamWriter is quite efficient at writing large files.
A simple tested/working example with inline comments:
# replace $tableName with yours
$sqlCount  = "SELECT COUNT(*) FROM dbo.$($tableName)";
$sqlSelect = "SELECT * FROM dbo.$($tableName)";

$provider   = [System.Data.Common.DbProviderFactories]::GetFactory('System.Data.SqlClient');
$connection = $provider.CreateConnection();
# replace $connectionString with yours, e.g.:
# "Data Source=$($INSTANCE-NAME);Initial Catalog=$($DATABASE-NAME);Integrated Security=True;";
$connection.ConnectionString = $connectionString;
$command = $connection.CreateCommand();

# get total record count for Write-Progress
$command.CommandText = $sqlCount;
$connection.Open();
$reader = $command.ExecuteReader();
$totalRecords = 0;
while ($reader.Read()) {
    $totalRecords = $reader[0];
}
$reader.Dispose();

# select CSV data
$command.CommandText = $sqlSelect;
$reader = $command.ExecuteReader();

# get CSV field names
$columnNames = @();
for ($i = 0; $i -lt $reader.FieldCount; $i++) {
    $columnNames += $reader.GetName($i);
}

# read and populate data one row at a time
$values = New-Object object[] $columnNames.Length;
$currentCount = 0;
# replace $CSVPATH with yours
$writer = New-Object System.IO.StreamWriter($CSVPATH);
$writer.WriteLine(($columnNames -join ','));
while ($reader.Read()) {
    $null = $reader.GetValues($values);
    $writer.WriteLine(($values -join ','));
    if (++$currentCount % 1000 -eq 0) {
        Write-Progress -Activity 'Reading data' `
            -Status "Finished reading $currentCount out of $totalRecords records." `
            -PercentComplete ($currentCount / $totalRecords * 100);
    }
}

$command.Dispose();
$reader.Dispose();
$connection.Dispose();
$writer.Dispose();
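One caveat, in case your data can contain commas, double quotes, or newlines: the plain -join ',' above writes field values verbatim, so such values would produce malformed CSV rows. A rough sketch of RFC 4180-style quoting that could replace the $writer.WriteLine(($values -join ',')) line:

# quote every field and double up embedded quotes (RFC 4180 style);
# assumes the $values and $writer variables from the example above
$escaped = foreach ($v in $values) {
    '"' + ([string]$v -replace '"', '""') + '"'
}
$writer.WriteLine(($escaped -join ','));

This adds some per-row overhead, so with ~20 million rows it's worth measuring whether the extra work is acceptable before adopting it.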