I'm using Logstash version 5.5.3 with the google_bigquery output plugin v3.2.1.
I'm trying to load data from a Kafka topic into BigQuery (running at debug log level).
In the log I see lines like this:
BQ: Uploading object. {:filename=>"/tmp/logstash-bq-5e1bba825d869e2118db8107f3019b2694a52505ef3b5973596f78ef5cfe/logstash_bq_barak-agg-tms-1.c.rnd-tms.internal_2018-12-05T13:00.part000.log", :table_id=>"logstash_2018_12_05T13_00"}
and I can see the data being written to temporary files on the machine.
However, Logstash fails to load the data into BigQuery:
[2018-12-05T13:19:02,302][ERROR][logstash.outputs.googlebigquery] BQ: failed to upload file. Retrying. {:exception=>#
}
My input is flat JSON, and I configure the schema with json_schema:
json_schema => {
  fields => [
    { name => "sourceId"       type => "STRING" },
    { name => "targetId"       type => "STRING" },
    { name => "tmsTimestamp"   type => "TIMESTAMP" },
    { name => "latency"        type => "FLOAT" },
    { name => "targetType"     type => "STRING" },
    { name => "type"           type => "STRING" },
    { name => "network"        type => "STRING" },
    { name => "targetIp"       type => "STRING" },
    { name => "linkId"         type => "STRING" },
    { name => "sourceIp"       type => "STRING" },
    { name => "targetHostname" type => "STRING" },
    { name => "targetTMAPort"  type => "INTEGER" },
    { name => "timestamp"      type => "TIMESTAMP" }
  ]
}
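For context, a json_schema like this sits inside the google_bigquery output block. A minimal sketch follows; the project, dataset, service account, and key path are placeholders (not from the original question), and option names should be checked against the docs for your exact 3.x plugin version:

```
output {
  google_bigquery {
    project_id      => "my-gcp-project"                                      # placeholder
    dataset         => "my_dataset"                                          # placeholder
    service_account => "logstash@my-gcp-project.iam.gserviceaccount.com"     # placeholder
    key_path        => "/path/to/key.p12"                                    # placeholder
    json_schema     => { fields => [ { name => "sourceId" type => "STRING" } ] }  # abbreviated
  }
}
```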
Answer (score: 0):
It turned out I had a number of configuration and authorization problems, but that specific version of the plugin (3.2.1) was hiding them from me.
I downgraded to version 3.0.1, which let me see the specific nature of the problems, and I was then able to fix them.
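For reference, the downgrade itself can be done with Logstash's plugin manager, run from the Logstash home directory (commands shown for the version numbers in this answer):

```shell
# Remove the installed 3.2.1 plugin, then pin the older release
bin/logstash-plugin remove logstash-output-google_bigquery
bin/logstash-plugin install --version 3.0.1 logstash-output-google_bigquery
```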
This was helpful: https://github.com/logstash-plugins/logstash-codec-cloudtrail/issues/15