I want to stream a log file using Java and Spark. My code is quite simple:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.streaming.Seconds;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

String base = "c:/test";
SparkConf conf = new SparkConf().setAppName("First_App").setMaster("local[2]");
JavaStreamingContext ssc = new JavaStreamingContext(conf, Seconds.apply(1));

// watch the directory; each 1-second batch becomes a stream of text lines
JavaDStream<String> line = ssc.textFileStream(base);

// note: the mapped stream returned here is discarded; line.print() below is the only output operation
line.map(new Function<String, Integer>()
{
    @Override
    public Integer call(String v1) throws Exception
    {
        System.out.println(v1);
        return v1.length();
    }
});

line.print(); // print the first elements of each batch to the console
ssc.start();
ssc.awaitTermination();
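(As an aside: if the line lengths were actually needed, the mapped stream would have to be kept and given its own output operation. A minimal sketch, using a Java 8 lambda in place of the anonymous class:

JavaDStream<Integer> lengths = line.map(v1 -> v1.length()); // keep the transformed stream
lengths.print(); // register an output operation on it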
In c:/test there is a log file generated with Logback. Its contents are:
INFO:Data=Do Save Entity
INFO:Data=Do Delete Entity
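For context, here is a minimal sketch of the kind of code that produces such lines; the class name and the assumed Logback pattern "%level:%msg%n" are illustrative, not my exact configuration:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class EntityService
{
    // hypothetical logger; a Logback file appender writing to c:/test with pattern "%level:%msg%n" is assumed
    private static final Logger LOG = LoggerFactory.getLogger(EntityService.class);

    public void save()
    {
        LOG.info("Data=Do Save Entity");
    }

    public void delete()
    {
        LOG.info("Data=Do Delete Entity");
    }
}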
But when I run the application, the console shows:
18/02/18 19:55:30 INFO JobScheduler: Added jobs for time 1518971130000 ms
18/02/18 19:55:30 INFO JobScheduler: Starting job streaming job 1518971130000 ms.0 from job set of time 1518971130000 ms
18/02/18 19:55:30 INFO JobScheduler: Finished job streaming job 1518971130000 ms.0 from job set of time 1518971130000 ms
18/02/18 19:55:30 INFO JobScheduler: Total delay: 0.291 s for time 1518971130000 ms (execution: 0.002 s)
-------------------------------------------
Time: 1518971130000 ms
-------------------------------------------
18/02/18 19:55:30 INFO FileInputDStream: Cleared 0 old files that were older than 1518971070000 ms:
18/02/18 19:55:30 INFO ReceivedBlockTracker: Deleting batches:
18/02/18 19:55:30 INFO InputInfoTracker: remove old batch metadata:
18/02/18 19:55:31 INFO FileInputDStream: Finding new files took 16 ms
18/02/18 19:55:31 INFO FileInputDStream: New files at time 1518971131000 ms:
-------------------------------------------
Time: 1518971131000 ms
-------------------------------------------
and it keeps printing empty batches like this. My goal is simple: stream the log file and print its contents to the console. That is only temporary, of course; in the end I want to save the data in a database.
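For that last step, I expect to use the usual foreachRDD pattern. A rough sketch, where the JDBC URL and table name are placeholders rather than my real setup:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

line.foreachRDD(rdd -> rdd.foreachPartition(records ->
{
    // placeholder JDBC URL and table; open one connection per partition
    try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:logs");
         PreparedStatement ps = conn.prepareStatement("INSERT INTO log_lines(line) VALUES (?)"))
    {
        while (records.hasNext())
        {
            ps.setString(1, records.next());
            ps.addBatch();
        }
        ps.executeBatch();
    }
}));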