XPath not scraping correctly

Time: 2018-01-03 13:34:18

Tags: selenium xpath web-scraping

I am trying to use the following XPath expressions on this page, but they are not loading correctly.

groups = ".//*[contains(@class, 'sl-CouponParticipantWithBookCloses_Name ')]"

xp_bp1 = ".//following::div[contains(@class,'sl-MarketCouponValuesExplicit33')][./div[contains(@class,'gl-MarketColumnHeader')][.='1']]//span[@class='gl-ParticipantOddsOnly_Odds']"

The current output is:

[['3.00'], ['3.00'], ['3.00'], ...]

Desired output:

[['3.00'], ['1.30'], ['1.25'], ...]

That is the data I am after.

Script:

import csv
import time

from selenium import webdriver
from selenium.common.exceptions import TimeoutException, NoSuchElementException

driver = webdriver.Chrome()
driver.set_window_size(1024, 600)
driver.maximize_window()

# load the coupon page
driver.get('https://www.bet365.com.au/#/AC/B1/C1/D13/E108/F16/S1/')
driver.get('https://www.bet365.com.au/#/AC/B1/C1/D13/E108/F16/S1/')
time.sleep(10)  # give the coupon time to render




groups = ".//*[contains(@class, 'sl-CouponParticipantWithBookCloses_Name ')]"
#//div[contains(@class, 'gl-ParticipantOddsOnlyDarker gl-ParticipantOddsOnly gl-Participant_General sl-MarketCouponAdvancedBase_LastChild ')]
xp_bp1 = ".//following::div[contains(@class,'sl-MarketCouponValuesExplicit33')][./div[contains(@class,'gl-MarketColumnHeader')][.='1']]//span[@class='gl-ParticipantOddsOnly_Odds']"

while True:
    try:

        time.sleep(2)  # brief pause between scraping passes

        data = []

        # collect the odds value for each group/participant name
        for elem in driver.find_elements_by_xpath(groups):
            try:
                bp1 = elem.find_element_by_xpath(xp_bp1).text
            except NoSuchElementException:
                bp1 = None

            data.append([bp1])
        print(data)

        url1 = driver.current_url
        # write each row, together with the page URL, to the CSV
        with open('test.csv', 'a', newline='', encoding="utf-8") as outfile:
            writer = csv.writer(outfile)
            for row in data:
                writer.writerow(row + [url1])

    except TimeoutException as ex:
        pass  # ignore timeouts and keep looping
    except NoSuchElementException as ex:
        print(ex)
        break  # stop once the elements can no longer be found
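
As a side note, the fixed time.sleep(10) after loading the page could be replaced with an explicit wait for the group names to appear. The snippet below is only a minimal sketch of that idea, using the standard WebDriverWait / expected_conditions helpers and the same groups XPath; it is not part of the original script.

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# wait up to 20 seconds for at least one group name element to be present
group_elems = WebDriverWait(driver, 20).until(
    EC.presence_of_all_elements_located((By.XPATH, groups))
)
print('groups found after wait:', len(group_elems))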

0 Answers:

No answers.