PowerShell script to move IIS logs to an AWS S3 bucket

Asked: 2014-08-21 02:56:01

Tags: powershell, amazon-web-services

I am trying to get a script working that moves and archives the IIS logs from my instance to an S3 bucket (for example, logs). Desired S3 layout: logs/iislogs/instance-ID/W3SVC1, /W3SVC2, and so on.

Import-Module 'C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1'
$bucket = 'logs'
$source = "c:\inetpub\logs\LogFiles"

# Query the EC2 instance metadata service for this instance's ID
$wc = New-Object System.Net.WebClient
$instanceIdResult = $wc.DownloadString("http://IP/latest/meta-data/instance-id")

foreach ($i in Get-ChildItem $source)
{
    if ($i.CreationTime -lt (Get-Date).AddDays(-1))
    {
        Write-S3Object -BucketName $bucket -File $i.FullName -Key iislogs/$instanceIdResult/$i
    }
}

As a result I get this error:

Write-S3Object : The file indicated by the FilePath property does not exist!
At line:12 char:15
+ Write-S3Object <<<< -BucketName $bucket -File $i.FullName -Key iislogs/$instanceIdResult/$i
    + CategoryInfo          : InvalidOperation: (Amazon.PowerShe...eS3ObjectCmdlet:WriteS3ObjectCmdlet) [Write-S3Object], InvalidOperationException
    + FullyQualifiedErrorId : System.ArgumentException,Amazon.PowerShell.Cmdlets.S3.WriteS3ObjectCmdlet

Also, in S3 all the files copied from the subfolders end up flat under logs/iislogs/instance-ID/ (the W3SVC1/W3SVC2 structure is lost).

Please help.
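A likely cause of the error: without -Recurse, Get-ChildItem on the log root returns the W3SVC1, W3SVC2, ... directories themselves, and Write-S3Object fails when -File is given a directory path. A minimal sketch that uploads only files, reusing the variables from the script above:

foreach ($i in Get-ChildItem $source -Recurse | Where-Object { -not $_.PSIsContainer })
{
    if ($i.CreationTime -lt (Get-Date).AddDays(-1))
    {
        # $i.Name keeps the key flat; see further below for preserving the folder structure
        Write-S3Object -BucketName $bucket -File $i.FullName -Key iislogs/$instanceIdResult/$($i.Name)
    }
}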

After some research I am able to copy log files older than one day to S3 and then delete them from the source machine. But there is still a problem: the S3 key contains the local path, ...\c:\inetpub\logs\LogFiles. How do I cut that off so the files are copied to logs/iislogs/instance-ID/W3SVC1, /W3SVC2? (One approach is sketched after the script below.)

Import-Module 'C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1'
$bucket = 'logs'
$source = "c:\inetpub\logs\LogFiles\*"

# Query the EC2 instance metadata service for this instance's ID
$wc = New-Object System.Net.WebClient
$instanceIdResult = $wc.DownloadString("http://IP/latest/meta-data/instance-id")

foreach ($i in Get-ChildItem $source -Include *.txt -Recurse)
{
    if ($i.CreationTime -lt (Get-Date).AddDays(-1))
    {
        Write-S3Object -BucketName $bucket -Key iislogs/$instanceIdResult/$i -File $i
    }
}

# Delete the files that have just been copied
Get-ChildItem -Path $source -Recurse -Force |
    Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt (Get-Date).AddDays(-1) } |
    Remove-Item -Force
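One way to strip the local prefix is to build each key from the portion of the file's path below the log root. A minimal sketch, reusing $bucket and $instanceIdResult from the script above ($root is a hypothetical name for the log folder):

$root = "c:\inetpub\logs\LogFiles\"
foreach ($i in Get-ChildItem $root -Include *.log -Recurse)
{
    # e.g. "W3SVC1\u_ex140820.log" becomes "W3SVC1/u_ex140820.log"
    $relative = $i.FullName.Substring($root.Length).Replace('\', '/')
    Write-S3Object -BucketName $bucket -Key "iislogs/$instanceIdResult/$relative" -File $i.FullName
}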

2 Answers:

Answer 0 (score: 3):

The answer to the question above. This script does what I need, saves the PowerShell console output to a file, and sends that file to me as an email attachment.

# This script copies all log files from $source older than 3 days into the AWS S3 bucket
# under iislogs/<instance-id>/<W3SVCn>/, using the $AKey and $SKey credentials.
# It then deletes the copied files and sends an email report.
# !!! Install AWSToolsAndSDKForNet (the AWS SDK for .NET) before running !!!
# Make sure that you have access to the C:\inetpub\logs\LogFiles\... folders
# Created 26 Aug 2014 by Nick Sinyakov


Import-Module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
$bucket="YOUR AWS S3 BUSKET"
$source="C:\inetpub\logs\LogFiles\*"
$outputpath="C:\temp\log.txt"
$wc = New-Object System.Net.WebClient
$instanceId = $wc.DownloadString("http://IP/latest/meta-data/instance-id")
$AKey="AWS access key"
$SKey="AWS secret key"

Set-AWSCredentials -AccessKey $AKey -SecretKey $SKey -StoreAs For_Move
Initialize-AWSDefaults -ProfileName For_Move -Region YOUR-AWS-REGION

Start-Transcript -path $outputpath -Force
foreach ($i in Get-ChildItem $source -include *.log -recurse)
{
if ($i.CreationTime -lt ($(Get-Date).AddDays(-3)))
{
$fileName = (Get-ChildItem $i).Name
$parentFolderName = Split-Path (Split-Path $i -Parent) -Leaf
Write-S3Object -BucketName $bucket -Key iislogs/$instanceId/$parentFolderName/$fileName -File $i
}
}
Stop-Transcript
Send-MailMessage -To email@domain.com -From email@domain.com -Subject "IIS Log move to S3 report" -SmtpServer yoursmtpserver -Attachments $outputpath
Get-ChildItem -Path $source -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt ($(Get-Date).AddDays(-3))} | Remove-Item -Force
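To run this unattended it can be registered as a daily scheduled task, for example with schtasks (task name, start time, and script path below are hypothetical):

schtasks /Create /TN "MoveIISLogsToS3" /TR "powershell.exe -ExecutionPolicy Bypass -File C:\scripts\Move-IisLogsToS3.ps1" /SC DAILY /ST 03:00 /RU SYSTEM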

Hope it helps someone.

Answer 1 (score: 0):

Here is a script I run daily to archive the IIS logs. It scans all IIS websites, finds their log folders, pushes the logs to S3, and marks processed log file names with an underscore (when the rename line is enabled). Hope it helps.

# WebAdministration provides Get-Website; AWSPowerShell provides the S3 cmdlets
Import-Module WebAdministration
Import-Module AWSPowerShell
# Set the script variables
$accessKey = "XXXXXXXXXXXXXXXXXXXX"
$secretKey = "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
$bucketName = "bucketname"
$today = get-date

# Add a function for iterating log files in a directory and pushing them to S3
function processLogDirectory {
  Param($logDir, $bucketName)
  # Log directories are only created after a site is first accessed,
  # so check that the directory exists before touching it
  if(Test-Path $logDir) {
    # Get all log files from the folder except the ones processed previously
    # (processed files carry a trailing underscore: *.log_)
    $logs = Get-ChildItem -Path $logDir -Exclude "*.log_"

    # Iterate the logs and push them to S3
    foreach($log in $logs) {
      # Skip today's log file: IIS still holds a lock on it
      # ($log_today and $site are taken from the script scope)
      if($log.name -ne $log_today) {
        # Push the log file to the S3 bucket, under a folder named after the site
        Write-S3Object -BucketName $bucketName -Key "$($site.name)/$($log.name)" -File $log.FullName
        # Uncomment exactly one of the two lines below.
        # As a safety, renaming marks files as processed instead of deleting them;
        # files left with their original name are re-uploaded on the next run,
        # overwriting the copies in S3 (or adding versions if bucket versioning is on).
        # Rename-Item $log.FullName "$($log.name)_"
        # To delete the logs permanently instead, use the following line
        # (remove -WhatIf to really delete; don't enable both lines, or Remove-Item
        # will fail because the file has already been renamed):
        # Remove-Item $log.FullName -WhatIf
      }
    }
  }
}

# Set the AWS credentials for this session
Set-AWSCredentials -AccessKey $accessKey -SecretKey $secretKey

# Get the file name of today's log (u_exYYMMDD.log);
# it can't be uploaded because IIS holds a lock on it
$log_today = "u_ex$($today.ToString('yyMMdd')).log"

# Get All websites
$websites = (Get-Website) 

# Iterate through the sites
foreach($site in $websites) {
  # Check if there is an FTP site started
  if($site.ftpserver.state -eq "started") {
    # Get the FTP site's log directory
    $log_dir = $site.ftpserver.logfile.directory.replace("%SystemDrive%",$env:SystemDrive)
    $svc = "FTPSVC$($site.id)"
    # Add trailing slash if needed - needed more often than you would expect 
    if($log_dir[-1] -ne "\") {
      $log_dir = "$($log_dir)\"
    }

    # Concatenate the full log directory
    $svclog_dir = "$($log_dir)$($svc)"
    Write-Host "processing $($site.name)"
    processLogDirectory -logDir $svclog_dir -bucketName $bucketName
  } else {
    # Process the W3 site
    if($site.elementtagname -eq "site") {
      # Get the W3 site's log directory
      $log_dir = $site.logfile.directory.replace("%SystemDrive%",$env:SystemDrive)
      $svc = "W3SVC$($site.id)"
      # Add trailing slash if needed - needed more often than you would expect 
      if($log_dir[-1] -ne "\") {
        $log_dir = "$($log_dir)\"
      }

      # Concatenate the full log directory
      $svclog_dir = "$($log_dir)$($svc)"
      Write-Host "processing $($site.name)"
      processLogDirectory -logDir $svclog_dir -bucketName $bucketName
    }
  }
}