I am using Spark in my application, but I am getting unnecessary logs. How do I disable logging in a Spark Java application?

Time: 2017-05-06 04:49:29

Tags: apache-spark apache-spark-sql apache-spark-mllib

Below are the logs I am getting in the console.

.spark.executor.Executor       : Finished task 185.0 in stage 189.0 (TID 4477). 11508 bytes result sent to driver
2017-05-06 10:00:18.767  INFO 3336 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 188.0 in stage 189.0 (TID 4480, localhost, executor driver, partition 188, ANY, 6317 bytes)
2017-05-06 10:00:18.769  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.769  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.769  INFO 3336 --- [result-getter-1] o.apache.spark.scheduler.TaskSetManager  : Finished task 185.0 in stage 189.0 (TID 4477) in 75 ms on localhost (executor driver) (185/200)
2017-05-06 10:00:18.769  INFO 3336 --- [launch worker-5] org.apache.spark.executor.Executor       : Running task 188.0 in stage 189.0 (TID 4480)
2017-05-06 10:00:18.770  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.770  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.771  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 2 non-empty blocks out of 201 blocks
2017-05-06 10:00:18.771  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 3 non-empty blocks out of 401 blocks
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.774  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 1 ms
2017-05-06 10:00:18.775  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 2 non-empty blocks out of 201 blocks
2017-05-06 10:00:18.775  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.777  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 3 non-empty blocks out of 401 blocks
2017-05-06 10:00:18.777  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.786  INFO 3336 --- [launch worker-6] org.apache.spark.executor.Executor       : Finished task 182.0 in stage 189.0 (TID 4474). 11508 bytes result sent to driver
2017-05-06 10:00:18.786  INFO 3336 --- [er-event-loop-1] o.apache.spark.scheduler.TaskSetManager  : Starting task 189.0 in stage 189.0 (TID 4481, localhost, executor driver, partition 189, ANY, 6317 bytes)
2017-05-06 10:00:18.787  INFO 3336 --- [result-getter-2] o.apache.spark.scheduler.TaskSetManager  : Finished task 182.0 in stage 189.0 (TID 4474) in 132 ms on localhost (executor driver) (186/200)
2017-05-06 10:00:18.787  INFO 3336 --- [launch worker-6] org.apache.spark.executor.Executor       : Running task 189.0 in stage 189.0 (TID 4481)
2017-05-06 10:00:18.790  INFO 3336 --- [launch worker-5] org.apache.spark.executor.Executor       : Finished task 188.0 in stage 189.0 (TID 4480). 11356 bytes result sent to driver
2017-05-06 10:00:18.790  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.790  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.791  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.791  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.792  INFO 3336 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 190.0 in stage 189.0 (TID 4482, localhost, executor driver, partition 190, ANY, 6317 bytes)
2017-05-06 10:00:18.792  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 2 non-empty blocks out of 201 blocks
2017-05-06 10:00:18.792  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.794  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 2 non-empty blocks out of 401 blocks
2017-05-06 10:00:18.794  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.796  INFO 3336 --- [launch worker-4] org.apache.spark.executor.Executor       : Finished task 187.0 in stage 189.0 (TID 4479). 11356 bytes result sent to driver
2017-05-06 10:00:18.798  INFO 3336 --- [er-event-loop-0] o.apache.spark.scheduler.TaskSetManager  : Starting task 191.0 in stage 189.0 (TID 4483, localhost, executor driver, partition 191, ANY, 6317 bytes)
2017-05-06 10:00:18.798  INFO 3336 --- [launch worker-5] org.apache.spark.executor.Executor       : Running task 190.0 in stage 189.0 (TID 4482)
2017-05-06 10:00:18.798  INFO 3336 --- [result-getter-3] o.apache.spark.scheduler.TaskSetManager  : Finished task 188.0 in stage 189.0 (TID 4480) in 31 ms on localhost (executor driver) (187/200)
2017-05-06 10:00:18.798  INFO 3336 --- [result-getter-3] o.apache.spark.scheduler.TaskSetManager  : Finished task 187.0 in stage 189.0 (TID 4479) in 35 ms on localhost (executor driver) (188/200)
2017-05-06 10:00:18.800  INFO 3336 --- [launch worker-4] org.apache.spark.executor.Executor       : Running task 191.0 in stage 189.0 (TID 4483)
2017-05-06 10:00:18.801  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.801  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.802  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.802  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.803  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.803  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.803  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 201 blocks
2017-05-06 10:00:18.803  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.804  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.804  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.804  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 401 blocks

Below is my POM file.

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-rest</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web-services</artifactId>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
    </dependency>
    <dependency>
        <groupId>info.debatty</groupId>
        <artifactId>java-string-similarity</artifactId>
        <version>RELEASE</version>
    </dependency>
    <dependency>
        <groupId>com.univocity</groupId>
        <artifactId>univocity-parsers</artifactId>
        <version>2.3.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.codehaus.janino</groupId>
        <artifactId>commons-compiler</artifactId>
        <version>2.6.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>com.oracle</groupId>
        <artifactId>ojdbc6</artifactId>
        <version>11.2.0.3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-network-common_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.codehaus.janino</groupId>
        <artifactId>commons-compiler</artifactId>
        <version>2.7.5</version>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
</dependencies>

3 Answers:

Answer 0 (score: 2)

I think you can change the log level with:

sparkContext.setLogLevel("WARN")

You can choose the log level from:

ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN

If the logs are printed in the spark-shell, you can change the log level in the configuration file at conf/log4j.properties (rename conf/log4j.properties.template to conf/log4j.properties), then set the root logger to the level you want, for example:

log4j.rootCategory=WARN, console

Reopen the shell and you will see far less output.
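
Since the question is about a Java application rather than the shell, the same call can be made on the SparkContext obtained from the SparkSession. A minimal sketch (the app name and master here are placeholders):

import org.apache.spark.sql.SparkSession;

public class Example {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("my-app")    // placeholder
                .master("local[*]")   // placeholder
                .getOrCreate();

        // Equivalent of the Scala sparkContext.setLogLevel("WARN") above;
        // it overrides the configured log level for Spark's own loggers.
        spark.sparkContext().setLogLevel("WARN");
    }
}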

Answer 1 (score: 0)

Set spark.history.fs.cleaner.enabled to true in spark-defaults.conf. This helps clean the event logs from HDFS after 7 days (the default), which can be changed by setting spark.history.fs.cleaner.maxAge.
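
For reference, a sketch of the corresponding spark-defaults.conf entries (7d is the default retention and is shown only as an example):

spark.history.fs.cleaner.enabled   true
spark.history.fs.cleaner.maxAge    7d

Note that this controls cleanup of history-server event logs on HDFS, not the console logging shown in the question.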

Answer 2 (score: 0)

I use the code below in Scala:

Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.WARN)

You can try something similar in Java.
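
A direct Java translation of that Scala snippet, assuming the log4j 1.x API that Spark 2.1 depends on (the class name is hypothetical):

import org.apache.log4j.Level;
import org.apache.log4j.Logger;

public class LogSilencer {
    public static void silenceSparkLogs() {
        // Turn off everything under the "org" package (Spark, Hadoop, ...).
        Logger.getLogger("org").setLevel(Level.OFF);
        // Keep only warnings from Akka, which Spark uses internally.
        Logger.getLogger("akka").setLevel(Level.WARN);
    }
}

Call this before creating the SparkSession so the startup messages are suppressed as well. Also note that the POM pulls in Spring Boot, whose default logging backend is Logback; if Spark's output is routed through Logback (the timestamp format in the question suggests it is), these log4j calls may have no effect, and setting logging.level.org.apache.spark=WARN in application.properties would be the Spring Boot way instead.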