dryscrape: "No route found for ..."

Time: 2017-05-07 15:05:10

Tags: javascript python python-2.7 dryscrape

Context

I am trying to write my own money aggregator, because most of the tools available on the market do not yet cover every financial website. I am using Python 2.7.9 on a Raspberry Pi.

Thanks to the requests library, I managed to connect to two of my accounts (a lending website and my pension). The third website I want to aggregate, https://www.amundi-ee.com, has been giving me trouble for two weeks.
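
The requests-based logins for those two sites follow the usual session-plus-POST pattern; the sketch below is only illustrative, with a placeholder URL and form field names rather than the real endpoints.

    # Illustrative only: the URL and the form field names are placeholders,
    # not the real endpoints of the two sites that already work.
    import requests

    session = requests.Session()
    payload = {'username': 'XXXXXXXX', 'password': 'YYYYYYY'}
    response = session.post('https://lender.example.com/login', data=payload)
    print(response.status_code)

    # The session keeps the authentication cookies, so later requests on the
    # same session stay logged in.
    account_page = session.get('https://lender.example.com/account')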

I found out that the site actually relies on JavaScript, and after much research I ended up using dryscrape (I cannot use Selenium because ARM is no longer supported).

Problem

When running this code:

    import dryscrape

    url='https://www.amundi-ee.com'
    extensionInit='/psf/#login'
    extensionConnect='/psf/authenticate'
    extensionResult='/psf/#'
    urlInit = url + extensionInit
    urlConnect = url + extensionConnect
    urlResult = url + extensionResult

    s = dryscrape.Session()
    s.visit(urlInit)
    print s.body()
    login = s.at_xpath('//*[@id="identifiant"]')
    login.set("XXXXXXXX")
    pwd = s.at_xpath('//*[@name="password"]')
    pwd.set("YYYYYYY")
    # Push the button
    login.form().submit()
    s.visit(urlConnect)
    print s.body()
    s.visit(urlResult)
The problem occurs when the code visits urlConnect at line 21; the body print at line 22 returns the following:

{"code":405,"message":"No route found for \u0022GET \/authenticate\u0022: Method Not Allowed (Allow: POST)","errors":[]}

Question

Why do I get this error message, and how can I log in to the website properly to retrieve the data I am looking for?

PS: my code is inspired by this question: Python dryscrape scrape page with cookies

1 answer:

Answer 0 (score: 0)

OK, after more than a month of struggling to solve this problem, I am happy to say that I finally got what I wanted.

What was the problem?

Basically there were two big things (maybe more, but I may have forgotten some):

  1. The password has to be entered by pushing on-screen buttons, and these are randomly generated, so a new digit-to-button mapping is needed on every visit.
  2. login.form().submit() was messing up access to the page with the wanted data; simply clicking the validate button is good enough.
(This also explains the original 405: s.visit() issues a GET, while /psf/authenticate only accepts the POST that the site's own JavaScript presumably sends when the validate button is clicked.)

Here is the final code. If you spot any bad usage, please don't hesitate to say so, as I am a Python beginner and an occasional coder.

    import dryscrape
    from bs4 import BeautifulSoup
    from lxml import html
    from time import sleep
    from webkit_server import InvalidResponseError
    from decimal import Decimal
    import re
    import sys 
    
    
    def getAmundi(seconds=0):
    
        url = 'https://www.amundi-ee.com/psf'
        extensionInit='/#login'
        urlInit = url + extensionInit
        urlResult = url + '/#'
        timeoutRetry=1
    
        if 'linux' in sys.platform:
            # start xvfb in case no X is running. Make sure xvfb 
            # is installed, otherwise this won't work!
            dryscrape.start_xvfb()
    
        print "connecting to " + url + " with " + str(seconds) + "s of loading wait..." 
        s = dryscrape.Session()
        s.visit(urlInit)
        sleep(seconds)
        s.set_attribute('auto_load_images', False)
        s.set_header('User-agent', 'Google Chrome')
        while True:
            try:
                q = s.at_xpath('//*[@id="identifiant"]')
                q.set("XXXXXXXX")
            except Exception as ex:
                seconds+=timeoutRetry
                print "Failed, retrying to get the loggin field in " + str(seconds) + "s"
                sleep(seconds)
                continue
            break 
    
        # Get the password button mapping: the virtual keypad shuffles the
        # digits, so map each digit to its button position on every visit
        print "logging in ..."
        soup = BeautifulSoup(s.body())
        button_number = range(10)
        for x in range(0, 10):
            button_number[int(soup.findAll('button')[x].text.strip())] = x
    
        # Buttons needed for the (masked) password digits; +1 because the
        # XPath button[] index is 1-based while the mapping is 0-based
        button_1 = button_number[1] + 1
        button_2 = button_number[2] + 1
        button_3 = button_number[3] + 1
        button_5 = button_number[5] + 1
    
        #push buttons for password
        button = s.at_xpath('//*[@id="num-pad"]/button[' + str(button_2) +']')
        button.click()
        button = s.at_xpath('//*[@id="num-pad"]/button[' + str(button_1) +']')
        button.click()
        ..............
    
        # Push the validate button
        button = s.at_xpath('//*[@id="content"]/router-view/div/form/div[3]/input')
        button.click()
        print "accessing ..."
        sleep(seconds)
    
        while True:
            try:
                soup = BeautifulSoup(s.body())
                total_lended = soup.findAll('span')[8].text.strip()
                # Normalise the French-formatted amount: drop non-ASCII
                # characters, turn the decimal comma into a point, strip spaces
                total_lended = Decimal(total_lended.encode('ascii','ignore').replace(',','.').replace(' ',''))
                print total_lended
    
            except Exception as ex:
                seconds+=1
                print "Failed, retrying to get the data in " + str(seconds) + "s"
                sleep(seconds)
                continue
            break 
    
        s.reset()
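
A minimal usage sketch (an assumption about how the function is meant to be called; the 5-second argument is just an arbitrary initial wait for the JavaScript front end to render):

    if __name__ == '__main__':
        # Give the page a few seconds to render before the first attempt;
        # the retry loops above extend the wait if that is not enough.
        getAmundi(5)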