I'm using PyGithub to scrape some repositories, but I'm running into errors while iterating over the search pages.
def scrape_interval(self, interval):
    for repo_number, repo in self.search(interval):
        code...

def search(self, interval):
    try:
        iterator = enumerate(self.github.search_repositories(query="Laravel created:" + interval))
    except:
        print("Going to sleep for 1 hour. The search API hit the limit")
        time.sleep(3600)
        iterator = self.search(interval)
    return iterator
As you can see, I try to catch the error where the iterator is created inside def search. But the error is thrown on the line for repo_number, repo in self.search(interval):, so it happens at some point while the iterator is fetching the next item?
What can I do to make these errors catchable? I would rather not wrap the entire for loop in a try clause, but instead handle this within the iteration itself.
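For reference, the workaround I would rather avoid looks roughly like this, restarting the whole interval whenever the connection drops (socket.gaierror is taken from the traceback below; the body of the loop is elided as in my code above):

def scrape_interval(self, interval):
    while True:
        try:
            for repo_number, repo in self.search(interval):
                ...  # same per-repo processing as above
            break  # got through the whole interval without an error
        except socket.gaierror:
            print("Connection failed. Sleeping for 1 hour before redoing the whole interval")
            time.sleep(3600)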
For reference, here is the error itself:
File "/Users/olofjondelius/Documents/Code/laravel-ai/src/examples/migration-analysis/../../GithubScraper.py", line 47, in scrape_interval
for repo_number, repo in self.search(interval):
File "/anaconda3/envs/laravel-ai/lib/python3.7/site-packages/github/PaginatedList.py", line 58, in _iter_
newElements = self._grow()
File "/anaconda3/envs/laravel-ai/lib/python3.7/site-packages/github/PaginatedList.py", line 70, in _grow
newElements = self._fetchNextPage()
File "/anaconda3/envs/laravel-ai/lib/python3.7/site-packages/github/PaginatedList.py", line 172, in _fetchNextPage
headers=self.__headers
File "/anaconda3/envs/laravel-ai/lib/python3.7/site-packages/github/Requester.py", line 185, in requestJsonAndCheck
return self.__check(*self.requestJson(verb, url, parameters, headers, input, cnx))
File "/anaconda3/envs/laravel-ai/lib/python3.7/site-packages/github/Requester.py", line 231, in requestJson
return self.__requestEncode(cnx, verb, url, parameters, headers, input, encode)
File "/anaconda3/envs/laravel-ai/lib/python3.7/site-packages/github/Requester.py", line 284, in __requestEncode
status, responseHeaders, output = self.__requestRaw(cnx, verb, url, requestHeaders, encoded_input)
File "/anaconda3/envs/laravel-ai/lib/python3.7/site-packages/github/Requester.py", line 309, in __requestRaw
requestHeaders
File "/anaconda3/envs/laravel-ai/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/envs/laravel-ai/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/envs/laravel-ai/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/envs/laravel-ai/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/envs/laravel-ai/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/envs/laravel-ai/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/envs/laravel-ai/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/envs/laravel-ai/lib/python3.7/socket.py", line 707, in create_connection
for res in getaddrinfo(host, port, 0, SOCK_STREAM):
File "/anaconda3/envs/laravel-ai/lib/python3.7/socket.py", line 748, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno 8] nodename nor servname provided, or not known
Answer 0 (score: 3)
It sounds like the exception is raised while you iterate, not when the iterator is created. Your current try and except block only catches exceptions raised immediately by the call to self.github.search_repositories, not anything raised while the results are being consumed.
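To see why, here is a minimal, PyGithub-free sketch (the names lazy_results and the ConnectionError are just for illustration): creating and enumerating the generator raises nothing, and the failure only surfaces inside the for loop when the next item is requested.

def lazy_results():
    yield "first page"
    raise ConnectionError("failed while fetching the next page")

try:
    it = enumerate(lazy_results())  # nothing runs yet, so nothing can be caught here
except ConnectionError:
    print("never reached")

for number, value in it:  # the error surfaces here, in the middle of iteration
    print(number, value)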
To get around this, you can make the search function a generator. That lets you yield values as you get them, while still catching exceptions and retrying as needed.
Try something like this:
def search(self, interval):
    while True:
        try:
            it = enumerate(self.github.search_repositories(query="Laravel created:" + interval))
            yield from it
            return  # if we completed the yield from without an exception, we're done!
        except:  # you should probably limit this to catching specific exception types
            print("Going to sleep for 1 hour. The search API hit the limit")
            time.sleep(3600)
As I noted in the comments, you should change the bare except clause to except socket.gaierror or something similar, so that you don't suppress all exceptions, only the ones you expect the delay to fix. Anything truly unexpected should still be allowed to stop the program (since it may reflect a bug elsewhere in your code).
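A narrowed version of the generator above might look like the sketch below; it assumes socket.gaierror (the DNS failure from your traceback) is the only error worth sleeping and retrying on, and it needs import socket and import time at the top of the module.

def search(self, interval):
    while True:
        try:
            it = enumerate(self.github.search_repositories(query="Laravel created:" + interval))
            yield from it
            return  # finished the whole result set without an error
        except socket.gaierror:  # only retry the connection/DNS failure from the traceback
            print("Name resolution failed. Going to sleep for 1 hour before retrying")
            time.sleep(3600)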