Azure Data Factory - Timeout on sink side

Time: 2020-08-10 05:50:04

Tags: azure timeout azure-data-factory

I am trying to copy large tables to Azure SQL Server. The small tables complete, but the large ones fail with a timeout on the sink side (error attached below). The SQL Server side does not specify any timeout, yet the copy still fails.

The SQL database is 800 DTU.

If that is the problem, how do I increase the timeout on the sink side?

Also, isn't Data Factory supposed to hold the connection and retry after a failure?
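For context, my understanding is that retries for a copy activity are controlled by the activity's policy block, roughly like the sketch below (the activity name and all values here are illustrative, not my actual pipeline):

{
    "name": "CopyLargeTable",
    "type": "Copy",
    "policy": {
        "timeout": "7.00:00:00",
        "retry": 2,
        "retryIntervalInSeconds": 60
    },
    "typeProperties": {
        "source": { "type": "SqlSource" },
        "sink": { "type": "AzureSqlSink" }
    }
}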

errors:
{
    "dataRead": 1372864152,
    "dataWritten": 1372864152,
    "sourcePeakConnections": 1,
    "sinkPeakConnections": 2,
    "rowsRead": 2205634,
    "rowsCopied": 2205634,
    "copyDuration": 8010,
    "throughput": 167.377,
    "errors": [
        {
            "Code": 11000,
            "Message": "Failure happened on 'Sink' side. 'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Timeouts in SQL write operation.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=Execution Timeout Expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.,Source=.Net SqlClient Data Provider,SqlErrorNumber=-2,Class=11,ErrorCode=-2146232060,State=0,Errors=[{Class=11,Number=-2,State=0,Message=Execution Timeout Expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.,},],''Type=System.ComponentModel.Win32Exception,Message=The wait operation timed out,Source=,'",
            "EventType": 0,
            "Category": 5,
            "Data": {
                "FailureInitiator": "Sink"
            },
            "MsgId": null,
            "ExceptionType": null,
            "Source": null,
            "StackTrace": null,
            "InnerEventInfos": []
        }
    ],
    "effectiveIntegrationRuntime": "XXX",
    "billingReference": {
        "activityType": "DataMovement",
        "billableDuration": [
            {
                "meterType": "SelfhostedIR",
                "duration": 2.0166666666666666,
                "unit": "Hours"
            }
        ]
    },
    "usedParallelCopies": 1,
    "executionDetails": [
        {
            "source": {
                "type": "SqlServer"
            },
            "sink": {
                "type": "SqlServer"
            },
            "status": "Failed",
            "start": "2020-08-03T17:16:58.8388528Z",
            "duration": 8010,
            "usedParallelCopies": 1,
            "profile": {
                "queue": {
                    "status": "Completed",
                    "duration": 810
                },
                "preCopyScript": {
                    "status": "Completed",
                    "duration": 0
                },
                "transfer": {
                    "status": "Completed",
                    "duration": 7200,
                    "details": {
                        "readingFromSource": {
                            "type": "SqlServer",
                            "workingDuration": 7156,
                            "timeToFirstByte": 0
                        },
                        "writingToSink": {
                            "type": "SqlServer"
                        }
                    }
                }
            },
            "detailedDurations": {
                "queuingDuration": 810,
                "preCopyScriptDuration": 0,
                "timeToFirstByte": 0,
                "transferDuration": 7200
            }
        }
    ],
    "dataConsistencyVerification": {
        "VerificationResult": "NotVerified"
    },
    "durationInQueue": {
        "integrationRuntimeQueue": 810
    }
}

2 Answers:

Answer 0 (score: 0):

Please try setting the write batch timeout on the sink side:

  1. writeBatchTimeout: the wait time for the batch insert operation to complete before it times out. Allowed values are timespan, for example "00:30:00" (30 minutes). A sketch of the sink settings follows.
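A minimal sketch of the sink section of the copy activity with the batch timeout raised (property names follow the documented copy activity sink options for Azure SQL Database; the values are only examples):

"sink": {
    "type": "AzureSqlSink",
    "writeBatchSize": 10000,
    "writeBatchTimeout": "00:30:00"
}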


Reference: Azure SQL Database as the sink

Answer 1 (score: 0):

I'd like to add my case: same error as the OP, but the exception was thrown while executing a Data Flow.

Following the accepted answer's guidance, setting the batch size in the Data Flow sink to a reasonable limit helped resolve my problem.