Error: 'NA' does not exist in current working directory (Webscraping)
Question:
I am trying to scrape data from https://university.careers360.com/colleges/list-of-degree-colleges-in-India. I want to follow each college name's link and collect specific data for each college.
First, I gathered all the college URLs into a vector:
#Loading the packages:
library(xml2)
library(rvest)
library(stringr)
library(dplyr)
#Specifying the url of the website to be scraped
baseurl <- "https://university.careers360.com/colleges/list-of-degree-colleges-in-India"
#Reading the html content of the listing page
basewebpage <- read_html(baseurl)
#Extracting college name and its url
scraplinks <- function(url){
  #Create an html document from the url
  webpage <- xml2::read_html(url)
  #Extract the URLs
  url_ <- webpage %>%
    rvest::html_nodes(".title a") %>%
    rvest::html_attr("href")
  #Extract the link text
  link_ <- webpage %>%
    rvest::html_nodes(".title a") %>%
    rvest::html_text()
  return(data_frame(link = link_, url = url_))
}
#College names and URLs
allcollegeurls <- scraplinks(baseurl)
This works fine so far, but when I use `read_html` on each url, it throws an error.
#Reading each url
for (i in allcollegeurls$url) {
  clgwebpage <- read_html(allcollegeurls$url[i])
}
Error: 'NA' does not exist in current working directory ('C:/Users/User/Documents').
I even added a `break` statement, but I still get the same error:
#Reading each url
for (i in allcollegeurls$url) {
  clgwebpage <- read_html(allcollegeurls$url[i])
  if(is.na(allcollegeurls$url[i])) break
}
Please help.
As requested, here is the str of allcollegeurls:
> str(allcollegeurls)
Classes ‘tbl_df’, ‘tbl’ and 'data.frame': 30 obs. of 2 variables:
 $ link: chr "Netaji Subhas Institute of Technology, Delhi" "Hansraj College, Delhi" "School of Business, University of Petroleum and Energy Studies, D.." "Hindu College, Delhi" ...
 $ url : chr "https://www.careers360.com/university/netaji-subhas-university-of-technology-new-delhi" "https://www.careers360.com/colleges/hansraj-college-delhi" "https://www.careers360.com/colleges/school-of-business-university-of-petroleum-and-energy-studies-dehradun" "https://www.careers360.com/colleges/hindu-college-delhi" ...
Answer
This works:
purrr::map(allcollegeurls$url, read_html)
The map functions transform their input by applying a function to each element and returning a vector the same length as the input. I prefer to avoid `for` loops in R.
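For completeness, the error in the original loop comes from how it indexes: `for (i in allcollegeurls$url)` iterates over the URL strings themselves, so `allcollegeurls$url[i]` indexes a character vector by a URL, which yields `NA`, and `read_html(NA)` then looks for a file named "NA" in the working directory. A minimal sketch of two working alternatives (assuming every URL in `allcollegeurls$url` is reachable):

```r
#Index by position, not by value
pages <- vector("list", nrow(allcollegeurls))
for (i in seq_along(allcollegeurls$url)) {
  pages[[i]] <- read_html(allcollegeurls$url[i])
}

#Or, with purrr: possibly() returns NULL for any URL that fails to
#load instead of aborting the whole run
library(purrr)
pages <- map(allcollegeurls$url, possibly(read_html, otherwise = NULL))
```

Either way, the pages end up in a list you can then pass to `html_nodes`/`html_text` to pull out the per-college fields.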