Data Factory: activities scheduled daily in a pipeline are never triggered

Time: 2017-03-12 17:52:39

Tags: azure-data-factory

After running a number of tests with changes to the pipeline below, I'm posting here to ask for expert help. The basic idea of the pipeline is that Activity-1 does some computation by invoking a U-SQL script, which writes its results to Azure Data Lake Store. Activity-2 then takes the data produced by Activity-1 and copies it to Azure SQL. Both activities are scheduled to run once per day, but I never see the pipeline being triggered. If I schedule it to run every 15 minutes instead, it runs fine. What am I doing wrong?

{
        "name": "IncrementalLoad_Pipeline",
        "properties": {
            "description": "This is a pipeline to pick files from Data Lake as per the slice start date time.",
            "activities": [
                {
                    "type": "DataLakeAnalyticsU-SQL",
                    "typeProperties": {
                        "scriptPath": "andeblobcontainer\\script.usql",
                        "scriptLinkedService": "AzureStorageLinkedService",
                        "degreeOfParallelism": 3,
                        "priority": 100,
                        "parameters": {
                            "in": "$$Text.Format('/Input/SyncToCentralDataLog_{0:dd_MM_yyyy}.txt', Date.AddDays(SliceStart,-7))",
                            "out": "$$Text.Format('/Output/incremental_load/StcAnalytics_{0:dd_MM_yyyy}.tsv', Date.AddDays(SliceStart,-7))"
                        }
                    },
                    "inputs": [
                        {
                            "name": "IncrementalLoad_Input"
                        }
                    ],
                    "outputs": [
                        {
                            "name": "IncrementalLoad_Output"
                        }
                    ],
                    "scheduler": {
                        "frequency": "Day",
                        "interval": 1
                    },
                    "name": "IncrementalLoad",
                    "linkedServiceName": "AzureDataLakeAnalyticsLinkedService"
                },
                {
                    "type": "Copy",
                    "typeProperties": {
                        "source": {
                            "type": "AzureDataLakeStoreSource",
                            "recursive": false
                        },
                        "sink": {
                            "type": "SqlSink",
                            "writeBatchSize": 0,
                            "writeBatchTimeout": "00:00:00"
                        }
                    },
                    "inputs": [
                        {
                            "name": "IncrementalLoad_Input2"
                        },
                        {
                            "name": "IncrementalLoad_Output"
                        }
                    ],
                    "outputs": [
                        {
                            "name": "AzureSQLDatasetOutput"
                        }
                    ],
                    "scheduler": {
                        "frequency": "Day",
                        "interval": 1
                    },
                    "name": "CopyToAzureSql"
                }
            ],
            "start": "2016-09-12T23:45:00Z",
            "end": "2016-09-13T01:00:00Z",
            "isPaused": false,
            "hubName": "vijaytest-datafactory_hub",
            "pipelineMode": "Scheduled"
        }
    }

1 Answer:

Answer 0 (score: 1)

With the JSON you've provided above, the gap between the start and end times isn't big enough. ADF cannot provision a set of daily time slices within an active period of less than one day: your window (2016-09-12T23:45:00Z to 2016-09-13T01:00:00Z) spans only 1 hour 15 minutes, so no complete daily slice ever falls inside it.

Try widening the start and end to cover a week. For example:

        "start": "2016-09-12",
        "end": "2016-09-18",
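
If you prefer explicit ISO-8601 timestamps, as used elsewhere in the original pipeline JSON, the widened active period might look like the sketch below. The dates are illustrative; the key point is simply that the window spans several whole days:

```json
{
    "name": "IncrementalLoad_Pipeline",
    "properties": {
        "start": "2016-09-12T00:00:00Z",
        "end": "2016-09-19T00:00:00Z",
        "isPaused": false,
        "pipelineMode": "Scheduled"
    }
}
```

With a window like this, each daily slice (midnight to midnight UTC) fits entirely inside the active period, so the daily schedule has slices to execute.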

You should be able to extend the end date later without having to drop and redeploy the pipeline.

Hope this helps.