I run Spark jobs via the Spark REST API. How can I get the stdout and stderr of each Spark job?
I have read the Monitoring and Instrumentation documentation. The `applications` endpoint works fine, and responds like this:
[ {
"id" : "app-20190308150153-0039",
"name" : "Spark shell",
"attempts" : [ {
"startTime" : "2019-03-08T15:01:52.759GMT",
"endTime" : "1969-12-31T23:59:59.999GMT",
"lastUpdated" : "2019-03-08T15:01:52.759GMT",
"duration" : 0,
"sparkUser" : "root",
"completed" : false,
"appSparkVersion" : "2.3.1",
"lastUpdatedEpoch" : 1552057312759,
"startTimeEpoch" : 1552057312759,
"endTimeEpoch" : -1
} ]
} ]
So I then tried the `/applications/[base-app-id]/logs` API like this:

wget localhost:4040/api/v1/applications/app-20190308150153-0039/logs

but got a confusing response. The response body is:
PK
PK
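(For what it's worth, "PK" happens to be the two-byte signature that ZIP archives start with, so the body may be binary rather than text. A minimal Python sketch of that check, using only the standard library; the `stderr` entry name here is purely illustrative, not taken from any actual Spark response:)

```python
import io
import zipfile

def looks_like_zip(data: bytes) -> bool:
    # ZIP archives begin with the bytes "PK" (0x50 0x4B),
    # the initials of the format's author, Phil Katz.
    return data[:2] == b"PK"

# Illustration: build a small ZIP in memory and check its signature.
# The entry name "stderr" is a made-up example, not a real API field.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("stderr", "example log line\n")

print(looks_like_zip(buf.getvalue()))   # → True
print(looks_like_zip(b"plain text"))    # → False
```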
What is PK, and how do I get the actual logs?