Hi, I want to upload a Parquet file to an ADLS Gen2 blob. I am using the lines below to create the blob and upload the Parquet file into it:
import os
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(conn_str="Connection String", container_name="parquet", blob_name=outdir)
df.to_parquet('logs.parquet', compression='GZIP')  # df is a DataFrame
with open("./logs.parquet", "rb") as data:
    blob.upload_blob(data)
os.remove("logs.parquet")
I don't get any errors, and the file is written to the blob. However, I think I am doing something incorrectly, because ADX/Kusto queries cannot understand the file and no data shows up there.
Below are the steps I performed in Azure Data Explorer to fetch records from the Parquet file uploaded to ADLS Gen2.
Created the external table:
.create external table LogDataParquet(AppId_s:string,UserId_g:string,Email_s:string,RoleName_s:string,Operation_s:string,EntityId_s:string,EntityType_s:string,EntityName_s:string,TargetTitle_s:string,TimeGenerated:datetime)
kind=blob
dataformat=parquet
(
h@'https://streamoutalds2.blob.core.windows.net/stream-api-raw-testing;secret'
)
with
(
folder = "ExternalTables"
)
External table column mapping:
.create external table LogDataParquet parquet mapping "LogDataMapparquet" '[{ "column" : "AppId_s", "path" : "$.AppId_s"},{ "column" : "UserId_g", "path" : "$"},{ "column" : "Email_s", "path" : "$.Email_s"},{ "column" : "RoleName_s", "path" : "$.RoleName_s"},{ "column" : "Operation_s", "path" : "$.Operation_s"},{ "column" : "EntityId_s", "path" : "$.EntityId_s"}]'
The external table returns no records:
external_table('LogDataParquet')
No records.
external_table('LogDataParquet') | count
1 record: count is 0.
I used a similar scenario with Stream Analytics, where the incoming stream is received and saved to ADLS in Parquet format. In that case, the external table in ADX fetches records just fine. I suspect I am making a mistake when writing the Parquet file to the blob (the `with open("./logs.parquet", "rb") as data:` part).
Answer 0 (score: 1)
According to the logs, the external table is actually defined as follows:
.create external table LogDataParquet(AppId_s:string,UserId_g:string,Email_s:string,RoleName_s:string,Operation_s:string,EntityId_s:string,EntityType_s:string,EntityName_s:string,TargetTitle_s:string,TimeGenerated:datetime)
kind=blob
partition by
AppId_s,
bin(TimeGenerated,1d)
dataformat=parquet
(
'******'
)
with
(
folder = "ExternalTables"
)
The PARTITION BY clause tells ADX that the expected folder layout is:
<AppId_s>/<TimeGenerated, formatted as 'yyyy/MM/dd'>
For example:
https://streamoutalds2.blob.core.windows.net/stream-api-raw-testing;secret/SuperApp/2020/01/31
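If you want to keep the partitioned definition instead of removing it, the upload code needs to build blob names matching that layout. A small sketch (the helper name and default file name are illustrative, not from the original post):

```python
from datetime import datetime


def partitioned_blob_name(app_id: str, ts: datetime, file_name: str = "logs.parquet") -> str:
    """Build a blob path matching the <AppId_s>/<yyyy/MM/dd> partition layout."""
    return f"{app_id}/{ts:%Y/%m/%d}/{file_name}"
```

Passing the result as `blob_name` when creating the `BlobClient` would place the file where the partitioned external table expects to find it.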
You can find more details on how ADX locates files in external storage during queries in this section: https://docs.microsoft.com/en-us/azure/data-explorer/kusto/management/external-tables-azurestorage-azuredatalake#artifact-filtering-logic
To fix the external table definition to match your folder layout, use the .alter command:
.alter external table LogDataParquet(AppId_s:string,UserId_g:string,Email_s:string,RoleName_s:string,Operation_s:string,EntityId_s:string,EntityType_s:string,EntityName_s:string,TargetTitle_s:string,TimeGenerated:datetime)
kind=blob
dataformat=parquet
(
h@'https://streamoutalds2.blob.core.windows.net/stream-api-raw-testing;secret'
)
with
(
folder = "ExternalTables"
)
By the way, the Parquet format does not require a mapping if the mapping is trivial (i.e., the mapped column names match the column names in the data source).