I am trying to run a pipeline in Azure Data Factory V1 that performs an Azure ML Batch Execution on a file. I implemented it using blob storage as the input and output and it worked. However, I am now trying to change the input and output to folders in my Data Lake Store. When I try to deploy it, I get the following error:
Entity provisioning failed: AzureML Activity 'MLActivity' specifies 'DatalakeInput' in a property that requires an Azure Blob Dataset reference.
How can I use the Data Lake Store as the input and output instead of blob storage?
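For context, a blob-backed input dataset of the kind that worked looks roughly like this in Data Factory V1 (a minimal sketch; the dataset name, container path, and linked service name below are placeholders, not the actual values from my setup):
{
    "name": "BlobInput",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": "AzureStorageLinkedService",
        "typeProperties": {
            "folderPath": "mycontainer/RAW/",
            "fileName": "data.csv",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ","
            }
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        }
    }
}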
Pipeline:
{
    "name": "MLPipeline",
    "properties": {
        "description": "use AzureML model",
        "activities": [
            {
                "type": "AzureMLBatchExecution",
                "typeProperties": {
                    "webServiceInput": "DatalakeInput",
                    "webServiceOutputs": {
                        "output1": "DatalakeOutput"
                    },
                    "webServiceInputs": {},
                    "globalParameters": {}
                },
                "inputs": [
                    {
                        "name": "DatalakeInput"
                    }
                ],
                "outputs": [
                    {
                        "name": "DatalakeOutput"
                    }
                ],
                "policy": {
                    "timeout": "02:00:00",
                    "concurrency": 3,
                    "executionPriorityOrder": "NewestFirst",
                    "retry": 1
                },
                "scheduler": {
                    "frequency": "Hour",
                    "interval": 1
                },
                "name": "MLActivity",
                "description": "description",
                "linkedServiceName": "MyAzureMLLinkedService"
            }
        ],
        "start": "2016-02-08T00:00:00Z",
        "end": "2016-02-08T00:00:00Z",
        "isPaused": false,
        "hubName": "hubname",
        "pipelineMode": "Scheduled"
    }
}
Output dataset:
{
    "name": "DatalakeOutput",
    "properties": {
        "published": false,
        "type": "AzureDataLakeStore",
        "linkedServiceName": "AzureDataLakeStoreLinkedService",
        "typeProperties": {
            "folderPath": "/DATA_MANAGEMENT/"
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        }
    }
}
Input dataset:
{
    "name": "DatalakeInput",
    "properties": {
        "published": false,
        "type": "AzureDataLakeStore",
        "linkedServiceName": "AzureDataLakeStoreLinkedService",
        "typeProperties": {
            "fileName": "data.csv",
            "folderPath": "/RAW/",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ","
            }
        },
        "availability": {
            "frequency": "Hour",
            "interval": 1
        }
    }
}
AzureDataLakeStoreLinkedService:
{
    "name": "AzureDataLakeStoreLinkedService",
    "properties": {
        "description": "",
        "hubName": "xyzdatafactoryv1_hub",
        "type": "AzureDataLakeStore",
        "typeProperties": {
            "dataLakeStoreUri": "https://xyzdatastore.azuredatalakestore.net/webhdfs/v1",
            "authorization": "**********",
            "sessionId": "**********",
            "subscriptionId": "*****",
            "resourceGroupName": "xyzresourcegroup"
        }
    }
}
The linked service was set up following the tutorial for Data Factory V1.
Answer (score: 2)
I think there is some issue with your AzureDataLakeStoreLinkedService. Please verify it.
Depending on the authentication used to access the data store, your AzureDataLakeStoreLinkedService JSON should look like one of the following:
Using service principal authentication:
{
    "name": "AzureDataLakeStoreLinkedService",
    "properties": {
        "type": "AzureDataLakeStore",
        "typeProperties": {
            "dataLakeStoreUri": "https://<accountname>.azuredatalakestore.net/webhdfs/v1",
            "servicePrincipalId": "<service principal id>",
            "servicePrincipalKey": {
                "type": "SecureString",
                "value": "<service principal key>"
            },
            "tenant": "<tenant info, e.g. microsoft.onmicrosoft.com>",
            "subscriptionId": "<subscription of ADLS>",
            "resourceGroupName": "<resource group of ADLS>"
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
Using managed service identity authentication:
{
    "name": "AzureDataLakeStoreLinkedService",
    "properties": {
        "type": "AzureDataLakeStore",
        "typeProperties": {
            "dataLakeStoreUri": "https://<accountname>.azuredatalakestore.net/webhdfs/v1",
            "tenant": "<tenant info, e.g. microsoft.onmicrosoft.com>",
            "subscriptionId": "<subscription of ADLS>",
            "resourceGroupName": "<resource group of ADLS>"
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
Here is the Microsoft documentation for reference: Copy data to or from Azure Data Lake Store by using Azure Data Factory.
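Note that the connectVia and SecureString elements in the samples above come from the Data Factory V2 flavor of that connector documentation. Since this pipeline targets Data Factory V1, the service principal variant of the linked service may need to follow the V1 schema instead, roughly as in this sketch (all values are placeholders):
{
    "name": "AzureDataLakeStoreLinkedService",
    "properties": {
        "type": "AzureDataLakeStore",
        "typeProperties": {
            "dataLakeStoreUri": "https://<accountname>.azuredatalakestore.net/webhdfs/v1",
            "servicePrincipalId": "<service principal id>",
            "servicePrincipalKey": "<service principal key>",
            "tenant": "<tenant info, e.g. microsoft.onmicrosoft.com>",
            "subscriptionId": "<subscription of ADLS>",
            "resourceGroupName": "<resource group of ADLS>"
        }
    }
}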