Perl: using a log file

Date: 2015-07-31 12:25:07

Tags: perl

Please help me: how can my loop sleep for 15 seconds after every 100 lines, and then reopen the log after the specified time?


3 Answers:

Answer 0 (score: 2)

Use File::Tail:

use strict;
use warnings;
use File::Tail;

my $name = shift;   # log file path, taken from the command line
# maxinterval caps the polling sleep; adjustafter tunes how fast it adapts.
my $file = File::Tail->new(name => $name, maxinterval => 300, adjustafter => 7);
while (defined(my $line = $file->read)) {
    print $line;
}

If that does not fit your needs, see tell and seek.
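
For illustration, here is a minimal hand-rolled sketch of that tell/seek approach, which also covers the question's "reopen the log" requirement. The path in $logfile is a hypothetical placeholder:

use strict;
use warnings;

my $logfile = '/var/log/app.log';   # hypothetical path, adjust as needed
my $pos = 0;                        # byte offset of the last line read

while (1) {
    open my $fh, '<', $logfile or die "Cannot open $logfile: $!";
    seek $fh, $pos, 0;              # resume where the previous pass ended
    for (1 .. 100) {
        my $line = <$fh>;
        last unless defined $line;  # nothing more to read yet
        print $line;
    }
    $pos = tell $fh;                # remember how far we got
    close $fh;
    sleep 15;                       # pause before reopening
}

Reopening and seeking on each pass keeps the reader working even if the file is replaced; $pos should be reset to 0 if the file shrinks, as happens after log rotation.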

Answer 1 (score: 1)

Just use two loops:

open my $logfh, '<', $logfile or die "Cannot open $logfile: $!";

while (1) {    # Loop forever, same as for (;;).
    for (1 .. 100) {
        if (defined(my $logline = <$logfh>)) {
            # process $logline here
        }
    }
    sleep 15;             # pause after every 100 lines
    seek $logfh, 0, 1;    # clear the EOF flag so new lines are seen
}
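
The seek with a zero offset from the current position (whence 1) does not move the file pointer; it only resets the filehandle's end-of-file flag, so the next read picks up lines appended since the last pass. This is the standard tail -f idiom described in perlfaq5.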

Answer 2 (score: 0)

Like this:
