Excluding ch.qos.logback causes a StackOverflowError

Asked: 2017-09-11 14:57:43

Tags: logging gradle build.gradle slf4j

I am using:

compile (libs.spring_boot_starter_logging)
testCompile 'org.apache.qpid:qpid-broker:0.28'

Both of these bring an SLF4J binding onto the classpath, so I get this error:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/maciej.glowala.ext/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-classic/1.1.11/ccedfbacef4a6515d2983e3f89ed753d5d4fb665/logback-classic-1.1.11.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/maciej.glowala.ext/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-log4j12/1.6.4/6b4973e0320e220ec6534478d60233fd1cc51c9b/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]

I can exclude the 1.6.4 binding from qpid-broker, but that produces errors because the broker cannot find some methods that only exist in the newer version. And when I try to exclude version 1.1.11 instead:

compile (libs.spring_boot_starter_logging){
            exclude group: 'ch.qos.logback', module: 'logback-classic'
        }

I get a stack overflow error:

java.lang.StackOverflowError
    at org.apache.log4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:39)
    at org.apache.log4j.LogManager.getLogger(LogManager.java:45)
    at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:64)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:358)
    at org.apache.log4j.Category.<init>(Category.java:57)
    at org.apache.log4j.Logger.<init>(Logger.java:37)
    at org.apache.log4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:43)
    at org.apache.log4j.LogManager.getLogger(LogManager.java:45)
    at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:64)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:358)
    at org.apache.log4j.Category.<init>(Category.java:57)
    at org.apache.log4j.Logger.<init>(Logger.java:37)
    .
    .
    .

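For reference, the first variant mentioned above, excluding the 1.6.4 binding from the qpid-broker side, would look roughly like this (a sketch only; the module name slf4j-log4j12 is read off the SLF4J warning above, and the question reports that this variant breaks the broker at runtime):

```groovy
testCompile('org.apache.qpid:qpid-broker:0.28') {
    // Drops the slf4j-log4j12 1.6.4 binding that qpid-broker pulls in.
    // Per the question, the broker then fails at runtime because it
    // misses methods that only exist in newer versions.
    exclude group: 'org.slf4j', module: 'slf4j-log4j12'
}
```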
What is more, I also tried excluding logback only from the test classpath:

configurations {
    testCompile.exclude group: 'ch.qos.logback'
}

But either way I still get the same overflow error. Can anyone help me set up build.gradle so that the tests use the newer version?

1 Answer:

Answer 0 (score: 0)

In case anyone needs this in the future: I solved it with an additional exclusion. The StackOverflowError happens because slf4j-log4j12 routes SLF4J calls to log4j, while log4j-over-slf4j routes log4j calls back to SLF4J, so with both on the test classpath the two logger factories delegate to each other indefinitely, exactly as the stack trace shows:

testCompile.exclude group:'org.slf4j', module:'log4j-over-slf4j'
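Putting it together, the dependency setup implied by the question plus this fix could be sketched as follows (an outline only, not a verified build file; libs.spring_boot_starter_logging is the question's own dependency alias):

```groovy
dependencies {
    // Main classpath keeps Spring Boot's logback binding
    compile libs.spring_boot_starter_logging
    testCompile 'org.apache.qpid:qpid-broker:0.28'
}

configurations {
    // Tests: drop logback and the log4j-over-slf4j bridge so that only
    // qpid-broker's slf4j-log4j12 binding remains, breaking the
    // log4j <-> slf4j delegation loop
    testCompile.exclude group: 'ch.qos.logback'
    testCompile.exclude group: 'org.slf4j', module: 'log4j-over-slf4j'
}
```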