I want to download HTML pages from www.geocaching.com to extract some information. However, the pages I want to download render in two different ways depending on whether the user is logged in, and the information I want to scrape is only available when logged in.
In the past I used download.file() with mapply() to download the HTML files from a list of URLs (geocache_link_list) and name them using another list (geocache_name_list), like this:

mapply(function(x, y) download.file(x, y), geocache_link_list, geocache_name_list)

But this downloads the logged-out version of each page.
I also tried RCurl, but it too downloaded the logged-out page, so I never tried to fold it into the mapply call:

library(RCurl)
baseurl <- geocache_link_list[1]
un <- readline("Type the username:")
pw <- readline("Type the password:")
upw <- paste(un, pw, sep = ":")
# upw was meant for RCurl's userpwd option, e.g. getURL(baseurl, userpwd = upw),
# but that is HTTP Basic auth, and the site uses a login form instead
Is it possible to use something like RSelenium or RCurl to run a browser from R, enter the login details, get redirected to the desired page, and download it?
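For the RSelenium route mentioned in the question, a minimal sketch looks like the following. It is untested against the live site: the element ids "Username" and "Password" and the submit-button selector are assumptions about geocaching.com's login form and may need adjusting, and geocache_link_list / geocache_name_list are the lists from the question.

```r
library(RSelenium)

driver <- rsDriver(browser = "firefox")   # starts a Selenium server plus a browser
remDr  <- driver$client

# Log in through the real browser so the session gets the auth cookies
remDr$navigate("https://www.geocaching.com/account/login")
remDr$findElement(using = "id", "Username")$sendKeysToElement(list("my_user"))
remDr$findElement(using = "id", "Password")$sendKeysToElement(list("my_password"))
remDr$findElement(using = "css selector", "button[type='submit']")$clickElement()

# The browser session keeps the cookies, so each page can be saved in turn
mapply(function(x, y) {
  remDr$navigate(x)
  writeLines(remDr$getPageSource()[[1]], y)
}, geocache_link_list, geocache_name_list)

remDr$close()
driver$server$stop()
```

This is slower than a pure RCurl solution because a full browser renders every page, but it sidesteps form tokens entirely.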
Answer 0 (score: 1)
It's quite easy!
library(RCurl)
library(xml2)
library(magrittr)  # needed for the %>% pipe used below

# Collect the name/value pairs of every <input> in a form
# (used later to grab the hidden verification token from the login page)
html_inputs <- function(p, xpath = "//form/input") {
  xml_find_all(p, xpath) %>%
    {setNames(as.list(xml_attr(., "value")), xml_attr(., "name"))}
}
get_header <- function(){
  ## RCurl setup: send browser-like headers and persist cookies,
  ## so the handle stays logged in across requests
  myHttpheader <- c(
    "User-Agent" = "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.71",
    # "Accept" = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
    "Accept-Language" = "zh-CN,zh;q=0.8,en-US;q=0.5,en;q=0.3",
    # "Accept-Encoding" = "gzip, deflate",
    "Connection" = "keep-alive",
    "DNT" = 1,
    "Upgrade-Insecure-Requests" = 1,
    "Host" = "www.geocaching.com")
  file_cookie <- "cookies.txt"
  ch <- getCurlHandle(# cainfo = "pem/cacert.pem",
    # ssl.verifyhost = FALSE, ssl.verifypeer = FALSE,
    followlocation = TRUE,
    verbose = TRUE,
    cookiejar = file_cookie, cookiefile = file_cookie,
    httpheader = myHttpheader)  # the handle now carries everything we need
  tmp <- curlSetOpt(curl = ch)
  return(ch)
}
ch <- get_header()
h  <- basicHeaderGatherer()

# Input your username and password here
user <- "kongdd"
pwd  <- "****"

p <- getURL("https://www.geocaching.com/account/login", curl = ch)
# First form input on the login page: the hidden verification token
token <- html_inputs(read_html(p))[1]

params <- list(Username = user,
               Password = pwd) %>% c(., token)
p2 <- postForm("https://www.geocaching.com/account/login", curl = ch,
               .params = params)
grep("kongdd", p2)  # If 1 is returned, you have logged in successfully.
After logging in successfully, you can access the data by passing the same handle via the curl argument.
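Tying this back to the original question: once the handle ch is authenticated, the mapply-based batch download can be reproduced with it. A sketch, assuming geocache_link_list and geocache_name_list are the lists from the question:

```r
# Download one page through the authenticated handle and save it to disk;
# the handle's cookie jar makes every request a logged-in request
save_page <- function(url, filename, curl_handle) {
  html <- getURL(url, curl = curl_handle)
  writeLines(html, filename)
}

mapply(save_page, geocache_link_list, geocache_name_list,
       MoreArgs = list(curl_handle = ch))
```

Reusing one handle for all requests is deliberate: it keeps the session cookies and reuses the connection, which is both faster and required for staying logged in.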