I am currently trying to build this pipeline on Azure Data Factory V2 (as you can see in the attached picture). In summary, the ERP system exports this report (a CSV file containing both actual and forecast data) once a month and saves it in a blob container. As soon as the CSV file lands there, an event trigger should start the pipeline with a stored procedure activity, which purges all of the actual data from my fact table in Azure SQL.
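For reference, the event trigger is defined roughly along these lines (a simplified sketch: the subscription, resource group, and storage account in "scope" are placeholders, and I am assuming the trigger's folder path and file name are what should feed the pipeline parameters):

    {
        "name": "BlobCreatedTrigger",
        "properties": {
            "type": "BlobEventsTrigger",
            "typeProperties": {
                "blobPathBeginsWith": "/source-csv/blobs/",
                "blobPathEndsWith": ".csv",
                "events": [
                    "Microsoft.Storage.BlobCreated"
                ],
                "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
            },
            "pipelines": [
                {
                    "pipelineReference": {
                        "referenceName": "TM1_pipeline",
                        "type": "PipelineReference"
                    },
                    "parameters": {
                        "sourceFolder": "@triggerBody().folderPath",
                        "sourceFile": "@triggerBody().fileName"
                    }
                }
            ]
        }
    }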
Once the actual data has been deleted, the pipeline continues with a copy activity that copies the CSV report (actuals + forecast) into the same fact table in Azure SQL. When the copy activity completes, an HTTP Logic App deletes the new CSV file from the blob container. This workflow is a recurring event that will run once every month.
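The Logic App itself is simple: an HTTP request trigger followed by a delete-blob action. Its request trigger schema is roughly the following sketch, expecting just the blob name in the POST body (which is what the Web activity sends via the BlobName parameter):

    {
        "type": "object",
        "properties": {
            "blobname": {
                "type": "string"
            }
        }
    }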
So far I have been able to run each of these three activities on its own. However, when I chain them together in the same pipeline, I get parameter errors when I try to "Publish All". So I am not sure: does every activity in the pipeline need to use the same parameters?
The JSON code of my pipeline is below:
{
"name": "TM1_pipeline",
"properties": {
"activities": [
{
"name": "Copy Data1",
"type": "Copy",
"dependsOn": [
{
"activity": "Stored Procedure1",
"dependencyConditions": [
"Succeeded"
]
}
],
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false
},
"typeProperties": {
"source": {
"type": "BlobSource",
"recursive": false
},
"sink": {
"type": "SqlSink",
"writeBatchSize": 10000
},
"enableStaging": false,
"dataIntegrationUnits": 0
},
"inputs": [
{
"referenceName": "SourceDataset_e7y",
"type": "DatasetReference",
"parameters": {
"copyFolder": {
"value": "@pipeline().parameters.sourceFolder",
"type": "Expression"
},
"copyFile": {
"value": "@pipeline().parameters.sourceFile",
"type": "Expression"
}
}
}
],
"outputs": [
{
"referenceName": "DestinationDataset_e7y",
"type": "DatasetReference"
}
]
},
{
"name": "Stored Procedure1",
"type": "SqlServerStoredProcedure",
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"typeProperties": {
"storedProcedureName": "[dbo].[test_sp]"
},
"linkedServiceName": {
"referenceName": "AzureSqlDatabase",
"type": "LinkedServiceReference"
}
},
{
"name": "Web1",
"type": "WebActivity",
"dependsOn": [
{
"activity": "Copy Data1",
"dependencyConditions": [
"Succeeded"
]
}
],
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"typeProperties": {
"url": "...",
"method": "POST",
"body": {
"value": "@pipeline().parameters.BlobName",
"type": "Expression"
}
}
}
],
"parameters": {
"sourceFolder": {
"type": "String",
"defaultValue": "@pipeline().parameters.sourceFolder"
},
"sourceFile": {
"type": "String",
"defaultValue": "@pipeline().parameters.sourceFile"
},
"BlobName": {
"type": "String",
"defaultValue": {
"blobname": "source-csv/test.csv"
}
}
}
},
"type": "Microsoft.DataFactory/factories/pipelines"
}
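For completeness, the source dataset declares its own copyFolder / copyFile parameters and maps them into the blob path. Simplified, it looks roughly like this (the linked service name here is a placeholder, and I have omitted the format details):

    {
        "name": "SourceDataset_e7y",
        "properties": {
            "linkedServiceName": {
                "referenceName": "AzureBlobStorage",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "copyFolder": { "type": "String" },
                "copyFile": { "type": "String" }
            },
            "type": "AzureBlob",
            "typeProperties": {
                "folderPath": {
                    "value": "@dataset().copyFolder",
                    "type": "Expression"
                },
                "fileName": {
                    "value": "@dataset().copyFile",
                    "type": "Expression"
                }
            }
        }
    }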