How to keep a JSON file intact when the Python program ends abruptly

Asked: 2016-02-23 16:45:13

Tags: python json

I'm running multiple threads in one script that spawns 10 children. Each child script queries an API and logs some data to a JSON file. I also record every JSON file that gets saved in another JSON file, but the code sometimes terminates abruptly and breaks the JSON's integrity, so the file can't be opened again, for example:

["2016_02_21_18_46_41", 1], ["2016_02_21_18_46_42", 1], ["2016_02_21_18_46_4

How can I make it write the data only when the JSON is complete? Or how else can I get around this problem?
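(The closest thing I've found to "write only when complete" is writing to a temporary file and then renaming it over the real one, since the rename either fully happens or doesn't. Here's a sketch of what I think that would look like; atomic_dump is just my own placeholder name:)

    import json
    import os
    import tempfile

    def atomic_dump(obj, path):
        # Dump to a temp file in the same directory, then rename it over the
        # target, so the target is always either the old or the new complete file.
        fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(path) or '.')
        try:
            with os.fdopen(fd, 'w') as f:
                json.dump(obj, f)
                f.flush()
                os.fsync(f.fileno())  # push the bytes to disk before the rename
            # os.rename is atomic on POSIX; on Windows/Python 2 the destination
            # must not exist, so remove it first (Python 3.3+ has os.replace).
            if os.name == 'nt' and os.path.exists(path):
                os.remove(path)
            os.rename(tmp_path, path)
        except:
            os.remove(tmp_path)
            raise

Each json.dump(...) call in the child script below could then become e.g. atomic_dump(data, 'D:/python/loanrates/'+name+'/'+date+'.json').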

For completeness, here is my parent script:

from threading import Thread
import sys

sys.path.append('/python/loanrates/master')

names = ['BTS', 'ETH', 'CLAM', 'DOGE', 'FCT', 'MAID', 'STR', 'XMR', 'XRP', 'BTC']
threads = []
for name in names:
    sys.path.append('/python/loanrates/'+name)

import Master

# one thread per currency, each running the child script's main()
for name in names:
    T = Thread(target=Master.main, args=(name,))
    print T
    threads.append(T)

for thread_ in threads:
    thread_.start()

for thread_ in threads:
    thread_.join()

And this is my child script:

import arrow
import json
import os
import sys
import time
import urllib

def main(name):

    date = arrow.utcnow().format('YYYY_MM_DD_HH_mm_ss')
    previous_date = "2016_02_18_09_02_52"
    previous_second = int(date[-2:])
    count = 0

    print name, 'has started'

    while True:
        date = arrow.utcnow().format('YYYY_MM_DD_HH_mm_ss')
        second = int(date[-2:])
        url = 'https://poloniex.com/public?command=returnLoanOrders&currency='+name
        try:
            response = urllib.urlopen(url)
            data = json.loads(response.read())
        except:
            print 'error with', name, 'has occurred, probably been blocked by polo'
            time.sleep(10)
            continue  # skip this iteration rather than comparing/saving bad data

        # open previous data
        with open('D:/python/loanrates/'+name+'/'+previous_date+'.json', 'r') as f:
            previous_data = json.load(f)

        # open date log
        with open('D:/python/loanrates/'+name+'/date_store.json', 'r') as f:
            date_store = json.load(f)

        # compare new to old data:
        # if new != old, the new data is saved and that date receives a 1 in
        # 'date_store', signifying there is new data for that date
        if previous_data != data and previous_second != second and second % 10 == 0:

            date_store.append((date, 1))

            with open('D:/python/loanrates/'+name+'/'+date+'.json', 'w') as f:
                json.dump(data, f)

            with open('D:/python/loanrates/'+name+'/date_store.json', 'w') as f:
                json.dump(date_store, f)

            previous_date = date
            previous_second = second

            count += 1
            if count == 1000: print name, 'has logged 1000'

        # if new == old, the data hasn't changed and isn't saved; that date
        # receives a 0 in 'date_store', signifying that the previous date of
        # value 1 can be substituted for this date
        elif previous_second != second and second % 10 == 0:
            date_store.append((date, 0))

            with open('D:/python/loanrates/'+name+'/date_store.json', 'w') as f:
                json.dump(date_store, f)

            previous_second = second

if __name__ == '__main__':
    try:
        main('BTC')  # example currency; 'name' was undefined when run standalone
    except KeyboardInterrupt:
        print 'Interrupted'
        try:
            sys.exit(0)
        except SystemExit:
            os._exit(0)
    except:
        pass

(You can see there is only one child script, but it is started once per currency in the for loop.)

1 Answer:

Answer 0 (score: 0)

A try/except should do the trick: whenever the file loads successfully, i.e. while the data is not corrupted, save a backup copy of it. Then, when loading the JSON raises an exception, load the backup file instead.
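A minimal sketch of that idea (safe_load and the .bak suffix are just placeholder names for illustration):

    import json
    import shutil

    def safe_load(path):
        # Try the real file first; on success, refresh the backup so it
        # always holds the last known-good version.
        try:
            with open(path, 'r') as f:
                data = json.load(f)
            shutil.copyfile(path, path + '.bak')
            return data
        except (IOError, ValueError):
            # json.load raises ValueError on truncated/corrupt JSON in
            # Python 2; fall back to the last good copy.
            with open(path + '.bak', 'r') as f:
                return json.load(f)

In the child script, previous_data = json.load(f) would then become previous_data = safe_load('D:/python/loanrates/'+name+'/'+previous_date+'.json'), and likewise for date_store.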