I want to use Stream Analytics to stream data from IoT Hub to both a Cosmos DB database and a storage table. The storage table works fine. However, Stream Analytics gives me the following data conversion error in the activity log:
"The output record does not contain the column 'deviceId' (case-sensitive) to be used as the partition key property. By default, Azure Stream Analytics retries writing the event indefinitely until the write succeeds. Consider choosing the 'Drop' output error policy to skip such errors, so the errors do not block job progress. Error encountered after writing [0] batches."
deviceId is the partition key in my Cosmos DB. I can see the data arriving in Stream Analytics correctly. Here is a sample of the input:
[{"deviceId":1,"dateStamp":"2019-03-27T18:55:43.3546682Z","temperature":6.510664596692969,"EventProcessedUtcTime":"2019-03-27T18:58:41.6172586Z","PartitionId":1,"EventEnqueuedUtcTime":"2019-03-27T18:55:43.3450000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"Simulator","ConnectionDeviceGenerationId":"636891216524279053","EnqueuedTime":"2019-03-27T18:55:43.3370000Z","StreamId":null}},
{"deviceId":1,"dateStamp":"2019-03-27T18:56:43.3809346Z","temperature":5.5680961758215428,"EventProcessedUtcTime":"2019-03-27T18:58:41.6172586Z","PartitionId":1,"EventEnqueuedUtcTime":"2019-03-27T18:56:43.3640000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"Simulator","ConnectionDeviceGenerationId":"636891216524279053","EnqueuedTime":"2019-03-27T18:56:43.3690000Z","StreamId":null}},
{"deviceId":1,"dateStamp":"2019-03-27T18:57:43.4122929Z","temperature":5.07182001605249,"EventProcessedUtcTime":"2019-03-27T18:58:41.6172586Z","PartitionId":1,"EventEnqueuedUtcTime":"2019-03-27T18:57:43.4050000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"Simulator","ConnectionDeviceGenerationId":"636891216524279053","EnqueuedTime":"2019-03-27T18:57:43.4010000Z","StreamId":null}}]
Below is my SQL API query, where ColdStorageSmartFridge is the storage table and HotStorageSmartFridge is the Cosmos DB:
-- Cold path: Table storage output
SELECT
    deviceId,
    dateStamp AS time,
    temperature
INTO
    [ColdStorageSmartFridge]
FROM
    [IoTHubSmartFridge]

-- Hot path: Cosmos DB output
SELECT
    deviceId,
    dateStamp,
    temperature
INTO
    [HotStorageSmartFridge]
FROM
    [IoTHubSmartFridge]
I have been working on this all afternoon and cannot get it to work. What am I missing?
Answer 0 (score: 2)
It looks like this is related to the case sensitivity of your partition key. Previously, Azure Stream Analytics lowercased field names. Although this was not the intended behavior, we did not want to introduce a breaking change in the service, so the fix was released under compatibility level 1.1.
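To make the mismatch concrete, here is an illustrative sketch (not actual captured output) of roughly what your hot-path record looks like under compatibility level 1.0, where the field names come out lowercased:

{
  // illustrative sketch under compatibility level 1.0, not actual output
  "deviceid": 1,
  "datestamp": "2019-03-27T18:56:43.3809346Z",
  "temperature": 5.5680961758215428
}

Since the container's partition key path is /deviceId (case-sensitive), the writer finds no 'deviceId' column in such a record and keeps retrying the write, which is exactly the error you are seeing.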
Could you try changing the job's compatibility level to 1.1 and let me know whether it resolves your issue?
We will be changing the default compatibility level in the near future.
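If you would rather pin the setting in code than flip it in the portal, the compatibility level is a property of the streaming job resource itself. A minimal ARM template sketch follows; the job name is a placeholder, the API version should be checked against the current schema, and outputErrorPolicy is included only as an optional companion to the error-policy hint in the message above. None of these values come from the original post.

{
  "type": "Microsoft.StreamAnalytics/streamingjobs",
  "apiVersion": "2016-03-01",
  "name": "SmartFridgeJob",
  "location": "[resourceGroup().location]",
  "properties": {
    // Preserve field-name casing so 'deviceId' matches the Cosmos DB partition key
    "compatibilityLevel": "1.1",
    // Optional: drop (instead of retrying forever) records that fail conversion
    "outputErrorPolicy": "Drop"
  }
}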