AWS CloudWatch logs to Azure Log Analytics

Asked: 2019-09-29 00:24:58

Tags: amazon-web-services azure-log-analytics aws-cloudwatch-log-insights

I know the HTTP Data Collector API can be used to ingest data into Azure Log Analytics. My question is about shipping AWS CloudWatch data to Azure. We have applications hosted in Azure, plus serverless Lambda functions hosted externally in AWS, and we want to bring the logs of these 13 serverless functions into Azure. From the documentation I know there is a Python function that can be used as an AWS Lambda function; that Python sample is in the Microsoft docs. What I can't figure out is what JSON format the AWS log collector needs to produce so it can be sent to Azure Log Analytics. Are there any examples? Any help on how to accomplish this would be appreciated. I also came across this blog, but it is Splunk-specific: https://www.splunk.com/blog/2017/02/03/how-to-easily-stream-aws-cloudwatch-logs-to-splunk.html
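For context, the Data Collector API does not demand any particular schema: the request body is just a JSON array of flat records, and each field becomes a column in the resulting custom Log Analytics table. A minimal sketch (the field names here are made up for illustration):

```python
import json

# Hypothetical log records for illustration; any flat field names work,
# and each field becomes a column in the custom Log Analytics table.
records = [
    {"functionName": "my-lambda", "level": "ERROR", "message": "Task timed out"},
    {"functionName": "my-lambda", "level": "INFO", "message": "Invocation complete"},
]
body = json.dumps(records)  # this JSON string is what gets POSTed to the API
```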

1 answer:

Answer 0 (score: 0)

Hey, never mind, I was able to dig deeper, and I found that in AWS I can stream logs from one Lambda to another Lambda function via a subscription. Once that was set up, all I had to do was consume the data, build the JSON on the fly, and send it to Azure Log Analytics. If you or anyone else is interested, here is the code:

import json
import hashlib
import hmac
import base64
import gzip

# Note: botocore.vendored.requests was removed from newer botocore releases;
# on current Lambda runtimes, bundle the requests package with your deployment instead.
from botocore.vendored import requests
from datetime import datetime

# Update the customer ID to your Log Analytics workspace ID
customer_id = "XXXXXXXYYYYYYYYYYYYZZZZZZZZZZ"

# For the shared key, use either the primary or the secondary Connected Sources client authentication key
shared_key = "XXXXXXXXXXXXXXXXXXXXXXXXXX"

# The log type is the name of the event that is being submitted
log_type = 'AWSLambdafuncLogReal'

# Sample payload from the Microsoft docs; the handler below sends the
# CloudWatch log data instead, so this is for reference only.
json_data = [{
    "slot_ID": 12345,
    "ID": "5cdad72f-c848-4df0-8aaa-ffe033e75d57",
    "availability_Value": 100,
    "performance_Value": 6.954,
    "measurement_Name": "last_one_hour",
    "duration": 3600,
    "warning_Threshold": 0,
    "critical_Threshold": 0,
    "IsActive": "true"
},
{
    "slot_ID": 67890,
    "ID": "b6bee458-fb65-492e-996d-61c4d7fbb942",
    "availability_Value": 100,
    "performance_Value": 3.379,
    "measurement_Name": "last_one_hour",
    "duration": 3600,
    "warning_Threshold": 0,
    "critical_Threshold": 0,
    "IsActive": "false"
}]
#body = json.dumps(json_data)

#####################
######Functions######
#####################

# Build the API signature
def build_signature(customer_id, shared_key, date, content_length, method, content_type, resource):
    x_headers = 'x-ms-date:' + date
    string_to_hash = method + "\n" + str(content_length) + "\n" + content_type + "\n" + x_headers + "\n" + resource
    bytes_to_hash = bytes(string_to_hash, encoding="utf-8")
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, bytes_to_hash, digestmod=hashlib.sha256).digest()).decode()
    authorization = "SharedKey {}:{}".format(customer_id, encoded_hash)
    return authorization

# Build and send a request to the POST API
def post_data(customer_id, shared_key, body, log_type):
    method = 'POST'
    content_type = 'application/json'
    resource = '/api/logs'
    rfc1123date = datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
    content_length = len(body)
    signature = build_signature(customer_id, shared_key, rfc1123date, content_length, method, content_type, resource)
    uri = 'https://' + customer_id + '.ods.opinsights.azure.com' + resource + '?api-version=2016-04-01'

    headers = {
        'content-type': content_type,
        'Authorization': signature,
        'Log-Type': log_type,
        'x-ms-date': rfc1123date
    }
    response = requests.post(uri, data=body, headers=headers)
    if 200 <= response.status_code <= 299:
        print("Accepted")
    else:
        print("Response code: {}".format(response.status_code))
        print(response.text)

def lambda_handler(event, context):
    # CloudWatch Logs delivers the subscribed log data base64-encoded and gzipped
    cloudwatch_event = event["awslogs"]["data"]
    decode_base64 = base64.b64decode(cloudwatch_event)
    decompress_data = gzip.decompress(decode_base64)
    log_data = json.loads(decompress_data)
    print(log_data)
    awslogdata = json.dumps(log_data)
    post_data(customer_id, shared_key, awslogdata, log_type)
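For anyone wiring this up, the decoded `log_data` the handler forwards has the standard CloudWatch Logs subscription shape: top-level metadata plus a `logEvents` array. The sketch below builds a hypothetical event the way CloudWatch delivers it (gzip, then base64) and runs the same decode steps as `lambda_handler`; the log group, stream, and messages are made up for illustration.

```python
import base64
import gzip
import json

# Hypothetical subscription payload; real events carry these same top-level keys.
payload = {
    "messageType": "DATA_MESSAGE",
    "owner": "123456789012",
    "logGroup": "/aws/lambda/my-function",
    "logStream": "2019/09/29/[$LATEST]abcdef",
    "subscriptionFilters": ["azure-forwarder"],
    "logEvents": [
        {"id": "0", "timestamp": 1569716698000, "message": "START RequestId: test"},
    ],
}

# CloudWatch gzips the JSON and base64-encodes it under event["awslogs"]["data"]
event = {"awslogs": {"data": base64.b64encode(
    gzip.compress(json.dumps(payload).encode())).decode()}}

# Same decode sequence as lambda_handler above
log_data = json.loads(gzip.decompress(base64.b64decode(event["awslogs"]["data"])))
print(log_data["logGroup"])  # /aws/lambda/my-function
```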