I want to programmatically export the records available at this website. To do it manually, I would navigate to the page, click "Export", and then choose CSV.
I tried copying the link from the export button, which only works as long as I have a cookie (I believe). As a result, a wget or httr request returns the HTML page instead of the file.
I found some help from an issue on the rvest github repo, but ultimately I couldn't figure out how the issue's author saved the cookie in an object and used it in a request.
Here is where I am:
library(httr)
library(rvest)
apoc <- html_session("https://aws.state.ak.us/ApocReports/Registration/CandidateRegistration/CRForms.aspx")
headers <- headers(apoc)
GET(url = "https://aws.state.ak.us/ApocReports/Registration/CandidateRegistration/CRForms.aspx?exportAll=False&exportFormat=CSV&isExport=True",
add_headers(headers)) # how can I take the output from headers in httr and use it as an argument in GET from httr?
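(For reference on that inline question: add_headers() accepts a named character vector through its .headers argument, so the headers object could be spliced into the request as sketched below. Note that headers() returns the response headers, so resending them is unlikely to authenticate the export on its own.)
# sketch: unlist() turns the named list from headers() into the named
# character vector that add_headers(.headers=) expects
GET(url = "https://aws.state.ak.us/ApocReports/Registration/CandidateRegistration/CRForms.aspx?exportAll=False&exportFormat=CSV&isExport=True",
    add_headers(.headers = unlist(headers)))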
I checked robots.txt and this is allowed.
Answer 0 (score: 1)
When you GET https://aws.state.ak.us/ApocReports/Registration/CandidateRegistration/CRForms.aspx, you can grab __VIEWSTATE and __VIEWSTATEGENERATOR from the page's hidden form inputs, then reuse them in the subsequent POST query and in the GET that downloads the CSV.
options(stringsAsFactors=FALSE)
library(httr)
library(curl)
library(xml2)
url <- 'https://aws.state.ak.us/ApocReports/Registration/CandidateRegistration/CRForms.aspx'
#get session headers
req <- GET(url)
req_html <- read_html(rawToChar(req$content))
fields <- c("__VIEWSTATE","__VIEWSTATEGENERATOR")
viewheaders <- lapply(fields, function(x) {
  xml_attr(xml_find_first(req_html, paste0(".//input[@id='", x, "']")), "value")
})
names(viewheaders) <- fields
#post request. you can get the list of form fields using tools like Fiddler
params <- c(viewheaders,
            list(
              "M$ctl19"="M$UpdatePanel|M$C$csfFilter$btnExport",
              "M$C$csfFilter$ddlNameType"="Any",
              "M$C$csfFilter$ddlField"="Elections",
              "M$C$csfFilter$ddlReportYear"="2017",
              "M$C$csfFilter$ddlStatus"="Default",
              "M$C$csfFilter$ddlValue"=-1,
              "M$C$csfFilter$btnExport"="Export"))
resp <- POST(url, body=params, encode="form")
print(resp$status_code)
resptext <- rawToChar(resp$content)
#writeLines(resptext, "apoc.html")
#get response i.e. download csv; the ASP.NET session cookie set during the POST above
#is reused automatically, since httr keeps one handle (and its cookies) per host
url <- "https://aws.state.ak.us/ApocReports/Registration/CandidateRegistration/CRForms.aspx?exportAll=True&exportFormat=CSV&isExport=True"
req <- GET(url)
read.csv(text=rawToChar(req$content))
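If you want the export on disk rather than just in memory, a small follow-up along these lines should work (the file name is only an example):
dat <- read.csv(text=rawToChar(req$content))
dim(dat)  # quick sanity check on rows/columns
write.csv(dat, "CRForms_export.csv", row.names=FALSE)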
You may need to play with the inputs/code to get exactly what you need.
Here is another similar solution using RCurl: how-to-login-and-then-download-a-file-from-aspx-web-pages-with-r