Change a "while True" Python script to run only once

Time: 2019-02-26 03:44:48

Tags: python facebook-graph-api

I am new to Python. I want this code to run only once and then stop, rather than repeating every 30 seconds.

The reason is that I want to launch several copies of this script from the command line, each with a different access token, about 5 seconds apart. When I try this code it never gets to the second one, because it sits in a while loop:

import requests
import time

api_url = "https://graph.facebook.com/v2.9/"
access_token = "access token"
graph_url = "site url"
post_data = { 'id':graph_url, 'scrape':True, 'access_token':access_token }
# Beware of rate limiting if trying to increase frequency.
refresh_rate = 30 # refresh rate in seconds

while True:
    try:
        resp = requests.post(api_url, data = post_data)
        if resp.status_code == 200:
            contents = resp.json()
            print(contents['title'])
        else:
            error = "Warning: Status Code {}\n{}\n".format(
                resp.status_code, resp.content)
            print(error)
            raise RuntimeWarning(error)
    except Exception as e:
        f = open ("open_graph_refresher.log", "a")
        f.write("{} : {}".format(type(e), e))
        f.close()
        print(e)
    time.sleep(refresh_rate)

1 Answer:

Answer 0 (Score: 0):

As I understand it, you are trying to run this code for several access tokens. To simplify things, put all of the access tokens in a list and use the code below; it assumes you know all of the access tokens in advance.

import requests
import time

def scrape_facebook(api_url, access_token, graph_url):
    """ Scrapes the given access token"""
    post_data = { 'id':graph_url, 'scrape':True, 'access_token':access_token }

    try:
        resp = requests.post(api_url, data = post_data)
        if resp.status_code == 200:
            contents = resp.json()
            print(contents['title'])
        else:
            error = "Warning: Status Code {}\n{}\n".format(
                resp.status_code, resp.content)
            print(error)
            raise RuntimeWarning(error)
    except Exception as e:
        # Log the failure to a per-token file so errors for different tokens stay separate.
        with open(access_token + "_open_graph_refresher.log", "a") as f:
            f.write("{} : {}\n".format(type(e), e))
        print(e)

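# Placeholder lists: one access token for each page URL, in matching order.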
access_token = ['a','b','c']
graph_url = ['sss','xxx','ppp']
api_url = "https://graph.facebook.com/v2.9/"

for token, url in zip(access_token, graph_url):
    scrape_facebook(api_url, token, url)
    time.sleep(5)
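
If you would rather keep the approach described in the question itself (one access token per invocation, launched from the command line every few seconds by a shell script or scheduler), the while loop and the sleep can simply be dropped so the script runs once and exits. The following is a minimal sketch under that assumption; the file name run_once.py and the command-line arguments are illustrative, not part of the original answer:

import sys
import requests

API_URL = "https://graph.facebook.com/v2.9/"

def scrape_once(access_token, graph_url):
    """Send one scrape request, log any failure, and return."""
    post_data = {'id': graph_url, 'scrape': True, 'access_token': access_token}
    try:
        resp = requests.post(API_URL, data=post_data)
        if resp.status_code == 200:
            print(resp.json().get('title'))
        else:
            raise RuntimeWarning("Status Code {}\n{}".format(
                resp.status_code, resp.content))
    except Exception as e:
        # Same per-token log file naming as in the answer above.
        with open(access_token + "_open_graph_refresher.log", "a") as f:
            f.write("{} : {}\n".format(type(e), e))
        print(e)

if __name__ == "__main__":
    # Hypothetical usage: python run_once.py <access_token> <graph_url>
    scrape_once(sys.argv[1], sys.argv[2])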