Certificate has expired, cannot be used with verify=True; requests.exceptions.SSLError: certificate verify failed

Time: 2018-10-19 16:02:38

Tags: python web-scraping beautifulsoup python-requests

I am a real Python beginner and have learned basically everything from the internet, so please forgive me if I have not grasped every concept correctly.

My problem is that I am trying to program a web scraper with requests and BeautifulSoup. For the past two days I have been getting a certificate-expired error; the same thing happens when I open this website directly, and I cannot even add an exception for it in the browser.

Here is my code:

import requests
from bs4 import BeautifulSoup

def project_spider(max_pages):
    global page
    page = 1
    while page <= max_pages:
        page += 1
        url = 'https://hubbub.org/projects/?page=' + str(page)
        # Collect the list of project urls from this listing page
        try:
            source_code = requests.get(url, allow_redirects=False, timeout=15, verify=False)
        except (requests.exceptions.RequestException, AttributeError, ConnectionError, IOError):
            print('Failed to open url.')
            continue
        # Response body, encoded to bytes for BeautifulSoup
        plain_text = source_code.text.encode('utf-8')
        # Parse the whole page
        soup = BeautifulSoup(plain_text, 'html.parser')
        # Every div whose class attribute matches the 'col...' grid classes
        data = soup.findAll('div', attrs={'class': 'col-xs-12 col-sm-6 col-md-4 col-lg-3'})
        # For every div found
        for div in data:
            # Search each div for links (a tags)
            links = div.findAll('a', href=True)
            global names
            names = div.find('h4').contents[0]
            print(names)
            for a in links:
                global links2
                links2 = a['href']
                print(links2)
                get_single_item_data(links2)  # defined elsewhere in the script

An expert would probably program this differently. In any case, I tried to fix it with verify=False and with session() (a rough sketch of that attempt follows the traceback below), but it does not work. I also tried to skip the page where it fails (5), but I could not skip it. I am really desperate now, because all I get is this error:

https://rabbitraisers.org/p/fantasticfloats/
Traceback (most recent call last):
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\connectionpool.py", line 600, in urlopen
    chunked=chunked)
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\connectionpool.py", line 343, in _make_request
    self._validate_conn(conn)
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\connectionpool.py", line 849, in _validate_conn
    conn.connect()
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\connection.py", line 356, in connect
    ssl_context=context)
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\util\ssl_.py", line 359, in ssl_wrap_socket
    return context.wrap_socket(sock, server_hostname=server_hostname)
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\ssl.py", line 412, in wrap_socket
    session=session
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\ssl.py", line 850, in _create
    self.do_handshake()
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\ssl.py", line 1108, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1045)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\requests\adapters.py", line 445, in send
    timeout=timeout
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\connectionpool.py", line 638, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "C:\Users\stockisa\AppData\Local\Programs\Python\Python37\lib\site-packages\urllib3\util\retry.py", line 398, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='rabbitraisers.org', port=443): Max retries exceeded with url: /p/fantasticfloats/ (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1045)')))
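
For reference, this is roughly what the session() attempt looked like (a sketch only; the get_single_item_data shown here is just an illustration, not the exact code I run):

import requests

# Setting verify on the Session is supposed to disable certificate
# checks for every request made through it.
session = requests.Session()
session.verify = False

def get_single_item_data(item_url):
    # Hypothetical per-project request; my real function may differ.
    response = session.get(item_url, allow_redirects=False, timeout=15)
    print(response.status_code)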

1 Answer:

Answer 0 (score: 0):

Add this import at the top of your source code:

from requests.packages.urllib3.exceptions import InsecureRequestWarning

Then put this as the first line of your project_spider function:

requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
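
Put together, a minimal sketch might look like the following (the fetch helper is only for illustration and not part of the original code). Note that verify=False presumably also has to be passed on the request made inside get_single_item_data, since that is the request the traceback points at:

import requests
from requests.packages.urllib3.exceptions import InsecureRequestWarning

def fetch(url):
    # Suppress the InsecureRequestWarning that urllib3 emits whenever
    # verify=False is used.
    requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
    # verify=False skips certificate verification, so the expired
    # certificate no longer aborts the request.
    return requests.get(url, timeout=15, verify=False)

print(fetch('https://hubbub.org/projects/?page=1').status_code)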