Checking whether a file exists in S3 using PowerShell

Date: 2018-03-15 01:11:03

Tags: powershell amazon-web-services amazon-s3

I am writing a script that uses PowerShell to run Copy-S3Object against S3, but I need to check for a .ready file first. The bucket has a folder /test/*.ready. I know how to check my local files, but I can't figure out how to check S3:

Initialize-AWSDefaultConfiguration -AccessKey $AKey -SecretKey $SKey -Region $region

Set-Location $source
$files = Get-ChildItem 'test/*.ready' | Select-Object -Property Name
try {
   if(Test-S3Bucket -BucketName $bucket) {
      foreach($file in $files) {
         if(!(Get-S3Object -BucketName $bucket -Key $file.Name)) { ## verify if exist
            Copy-S3Object -BucketName $bucket -Key $s3File -Region $region -AccessKey $Akey -SecretKey $SKey -LocalFolder $localpath
         } 
      }
   } Else {
      Write-Host "The bucket $bucket does not exist."
   }
} catch {
   Write-Host "Error uploading file $file"
}

3 Answers:

Answer 0 (score: 3)

You can use the "Head Object" API to check whether an S3 file/object has been created. The PowerShell equivalent of HeadObject is:

Get-S3ObjectMetadata


The HEAD operation retrieves metadata from the object without returning the object itself. This operation is useful if you're only interested in the object's metadata. To use HEAD, you must have READ access to the object.

Example:

try {
    $metadata = Get-S3ObjectMetadata -BucketName bucket-name -Key someFile.txt
    "Found"
} catch {
    "Not Found"
}
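Applied to the asker's scenario, this pattern can be wrapped in a small helper. This is only a sketch: the Test-S3Object function name and the key used below are my own illustrations, not part of the AWS module.

```powershell
# Hypothetical helper: wraps Get-S3ObjectMetadata so that a missing key
# returns $false instead of throwing.
function Test-S3Object {
    param([string]$BucketName, [string]$Key)
    try {
        Get-S3ObjectMetadata -BucketName $BucketName -Key $Key | Out-Null
        $true
    } catch {
        $false
    }
}

# Example: only copy once the .ready marker is present.
if (Test-S3Object -BucketName $bucket -Key 'test/data.ready') {
    Copy-S3Object -BucketName $bucket -Key $s3File -LocalFolder $localpath
}
```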

Answer 1 (score: 1)

You can also check it like this (if you want to test for a file rather than a folder, remove the trailing forward slash in the snippet below):

if(Get-S3Object -BucketName abhibucketsss | where{$_.Key -like "test/*.ready/"}){
"Folder found"
}
else{
"Folder Not found"
}
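One caveat: the snippet above lists every object in the bucket and filters on the client. Assuming the same AWS Tools for PowerShell module (the bucket name is the answerer's placeholder), the -KeyPrefix parameter pushes the prefix filtering to S3 instead:

```powershell
# List only keys under test/ server-side, then match the .ready suffix locally.
$ready = Get-S3Object -BucketName abhibucketsss -KeyPrefix 'test/' |
    Where-Object { $_.Key -like '*.ready' }

if ($ready) {
    "Folder found"
} else {
    "Folder Not found"
}
```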

Answer 2 (score: 1)

I was in the same situation, so I started working on a PowerShell script that actually syncs the data instead of overwriting it.

By default, the Write-S3Object cmdlet overwrites, and as far as I could tell there is no option to tell it not to overwrite an existing file.
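Under that constraint, one way to approximate a no-overwrite upload is to probe the key with Get-S3ObjectMetadata first. This is a sketch, not a Write-S3Object option, and the variable names are illustrative:

```powershell
# Sketch: upload only when the key does not already exist.
# Note the check-then-write pair is not atomic; a concurrent writer can still race it.
try {
    Get-S3ObjectMetadata -BucketName $BucketName -Key $Key | Out-Null
    Write-Host "Skipping $Key - it already exists in s3://$BucketName"
} catch {
    Write-S3Object -BucketName $BucketName -File $LocalFilePath -Key $Key
}
```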

Here is how I check whether a local folder already exists on S3:

if ((Get-S3Object -BucketName $BucketName -KeyPrefix $Destination -MaxKey 2).count -lt "2") {

And here is how I check whether a file exists and whether the S3 copy is at least as large as the local file:

$fFile = Get-ChildItem -Force -File -Path "LocalFileName"


$S3file = Get-S3Object -BucketName "S3BucketName" -Key "S3FilePath"
$s3obj = ($S3file.key -split "/")[-1]

if ($fFile.name -eq $s3obj -and $S3file.size -ge $fFile.Length) {
    WriteWarning "File exists: $s3obj"

}

Here is the full script:

Function Sync-ToS3Bucket {
[CmdletBinding()]
param (
    [Parameter(Mandatory=$True,Position=1)]
    [string]$BucketName,
    [Parameter(Mandatory=$True,Position=2)]
    [string]$LocalFolderPath,
    [string]$S3DestinationFolder,
    [string]$S3ProfileName,
    [string]$AccessKey,
    [string]$SecretKey,
    [switch]$ShowProgress

)

Function WriteInfo ($msg) {
    Write-Host "[$(get-date)]:: $msg"
}

Function WriteAction ($msg) {
    Write-Host "[$(get-date)]:: $msg" -ForegroundColor Cyan
}

Function WriteWarning ($msg) {
    Write-Host "[$(get-date)]:: $msg" -ForegroundColor Yellow
}

Function WriteError ($msg) {
    Write-Host "[$(get-date)]:: $msg" -ForegroundColor Red
}

Function WriteLabel ($msg) {
    "`n`n`n"
    Write-Host ("*" * ("[$(get-date)]:: $msg").Length)
    $msg
    Write-Host( "*" * ("[$(get-date)]:: $msg").Length)
}

function Calculate-TransferSpeed ($size, $eTime) {
    writeInfo "Total Data: $size bytes, Total Time: $eTime seconds"
    if ($size -ge "1000000") {

        WriteInfo ("Upload speed         : " + [math]::round($($size / 1MB)/$eTime, 2) + " MB/Sec")
    }
    Elseif ($size -ge "1000" -and $size -lt "1000000" ) {

        WriteInfo ("Upload speed         : " + [math]::round($($size / 1kb)/$eTime,2)+ " KB/Sec")
    }
    Else {
        if ($size -ne $null -and $size) {
            WriteInfo ("Upload speed         : " + [math]::round($size/$eTime,2) + " Bytes/Sec")
        }
        else {
            WriteInfo ("Upload speed         : 0 Bytes/Sec")

        }
    }
}

function Get-ItemSize ($size, $msg) {
    if ($size -ge "1000000000") {
        WriteInfo "Upload $msg Size   : $([math]::round($($size /1gb),2)) GB"

    }
    Elseif ($size -ge "1000000" -and $size -lt "1000000000" ) {
        WriteInfo "Upload $msg Size   : $([math]::round($($size / 1MB),2)) MB"

    }
    Elseif ($size -ge "1000" -and $size -lt "1000000" )  {
        WriteInfo "Upload $msg Size   : $([math]::round($($size / 1kb),2)) KB"

    }
    Else {
        if ($size -ne $null -and $size) {
            WriteInfo "Upload $msg Size   : $([string]$size) Bytes"
        }

        else {
            WriteInfo "Upload $msg Size   : 0 Bytes"

        }
    }
}


clear
"`n`n`n`n`n`n`n`n`n`n"
$OstartTime = get-date


if ($LocalFolderPath.Substring($LocalFolderPath.Length -1) -eq '\') {
    #$LocalFolderPath =  $LocalFolderPath + '\'
    $LocalFolderPath =  $Localfolderpath.Substring(0,$Localfolderpath.Length -1)
}
if ($S3DestinationFolder.Substring($S3DestinationFolder.Length -1) -eq '\') {
    #$LocalFolderPath =  $LocalFolderPath + '\'
    $S3DestinationFolder =  $S3DestinationFolder.Substring(0,$S3DestinationFolder.Length -1)
}
set-location $LocalFolderPath
$LocalFolderPath = $PWD.Path

Start-Transcript "AWS-S3Upload.log" -Append
"`n`n`n`n`n`n`n`n`n`n"
WriteLabel "Script start time: $OstartTime"

WriteAction "Getting sub directories"
$Folders = Get-ChildItem -Path $LocalFolderPath -Directory -Recurse -Force | select FullName

WriteAction "Getting list of all files"
$allFiles = Get-ChildItem -Path $LocalFolderPath -File -Recurse -Force | select FullName

WriteAction "Getting folder count"
$FoldersCount = $Folders.count

WriteAction "Getting file count"
$allFilesCount = $allFiles.count

$i = 0

foreach ($Folder in $Folders.fullname) {


    $UploadFolder = $Folder.Substring($LocalFolderPath.length)
    $Source  = $Folder

    $Destination = $S3DestinationFolder + $UploadFolder



    if ($ShowProgress) {
        $i++
        $Percent = [math]::Round($($($i/$FoldersCount*100)))
        Write-Progress -Activity "Processing folder: $i out of $FoldersCount" -Status "Overall Upload Progress: $Percent`%     ||     Current Upload Folder Name: $UploadFolder" -PercentComplete $Percent
    }
    "`n`n"
    "_" * $("[$(get-date)]:: Local Folder Name    : $UploadFolder".Length)
    WriteInfo "Local Folder Name    : $UploadFolder"
    WriteInfo "S3 Folder path       : $Destination"

    WriteAction "Getting folder size"
    $Files = Get-ChildItem -Force -File -Path $Source | Measure-Object -sum Length

    Get-ItemSize $Files.sum "Folder"

    if ((Get-S3Object -BucketName $BucketName -KeyPrefix $Destination -MaxKey 2).count -lt "2") {

        WriteAction "Folder does not exist"
        WriteAction "Uploading all files"


        WriteInfo ("Upload File Count    : " + $files.count)

        $startTime = get-date
        WriteInfo "Upload Start Time    : $startTime"
        Write-S3Object -BucketName $BucketName -KeyPrefix $Destination -Folder $Source -Verbose -ConcurrentServiceRequest 100

        $stopTime = get-date
        WriteInfo "Upload Finished Time : $stopTime"

        $elapsedTime = $stopTime - $StartTime
        WriteInfo ("Time Elapsed         : " + $elapsedTime.days + " Days, " + $elapsedTime.hours + " Hours, "  + $elapsedTime.minutes + " Minutes, " + $elapsedTime.seconds+ " Seconds")

        Calculate-TransferSpeed $files.Sum $elapsedTime.TotalSeconds

        #sleep 10
    }
    else {
        WriteAction "Getting list of local files in local folder to transfer"
        $fFiles = Get-ChildItem -Force -File -Path $Source

        WriteAction "Counting files"
        $fFilescount = $ffiles.count
        WriteInfo "Upload File Count    : $fFilescount"
        $j=0
        foreach ($fFile in $fFiles) {

            if ($ShowProgress) {
                $j++
                $fPercent = [math]::Round($($($j/$fFilescount*100)))
                Write-Progress -Activity "Processing File: $j out of $fFilescount" -Id 1 -Status "Current Progress: $fPercent`%           ||     Processing File: $ffile" -PercentComplete $fPercent
            }
            #WriteAction "Getting S3 bucket objects"

            $S3file = Get-S3Object -BucketName $BucketName -Key "$Destination\$ffile"
            $s3obj = $S3file.key -replace "/","\"

            if ("$S3DestinationFolder$UploadFolder\$ffile" -eq $s3obj -and $S3file.size -ge $ffile.Length) {
                WriteWarning "File exists: $s3obj"

            }
            else {

                WriteAction "Uploading file       : $ffile"

                Get-ItemSize $fFile.Length "File"
                $startTime = get-date
                WriteInfo   "Upload Start Time    : $startTime"

                Write-S3Object -BucketName $BucketName -File $fFile.fullname -Key "$Destination\$fFile" -ConcurrentServiceRequest 100 -Verbose

                $stopTime = get-date
                WriteInfo "Upload Finished Time : $stopTime"

                $elapsedTime = $stopTime - $StartTime

                WriteInfo ("Time Elapsed         : " + $elapsedTime.days + " Days, " + $elapsedTime.hours + " Hours, "  + $elapsedTime.minutes + " Minutes, " + $elapsedTime.seconds+ " Seconds")
                Calculate-TransferSpeed $fFile.Length $elapsedTime.TotalSeconds

            }

        }

    }

}

$OstopTime = get-date
"Script Finished Time : $OstopTime"

$elapsedTime = $OstopTime - $OStartTime

"Time Elapsed         : " + $elapsedTime.days + " Days, " + $elapsedTime.hours + " Hours, "  + $elapsedTime.minutes + " Minutes, " + $elapsedTime.seconds+ " Seconds"

stop-transcript
}

Run the script in your AWS PowerShell session and it will create the function as a command you can call.

You can use it like this:

Sync-ToS3Bucket -BucketName YouS3BucketName -LocalFolderPath "C:\AmazonDrive\" -S3DestinationFolder YourDestinationS3folderName -ShowProgress:$true

  • Make sure you use relative paths
  • Make sure the default AWS configuration has been initialized by running Initialize-AWSDefaultConfiguration
  • By default the script does not show progress, which performs better, but you can turn it on with the switch -ShowProgress:$true
  • The script also creates the folder structure on S3
  • You can use it to sync a local folder to S3. If the folder does not exist on S3, the script uploads the entire folder; if it does exist, the script iterates over every file in the local folder and makes sure it exists on S3.

I am still improving the script and will post it on my GitHub profile. Let me know if you have any feedback.
