If there is a huge dictionary in Python, how can I reduce memory use and speed it up?

Asked: 2018-04-20 07:01:10

Tags: python macos list dictionary memory

I want to search a dictionary for values (strings) that contain particular words, together with their keys. The dictionary has been pickle-dumped into multiple files (each of them is several hundred megabytes, so their total is on the order of gigabytes). The program is very slow; is there a better way to speed it up? My code is as follows:

teachertag = ['K', 'curriculum', 'School', 'childhood']

import pickle
import re
import os
import time
start_time = time.time()


# `document` is the list of pickle file paths (each several hundred MB);
# every file's contents get concatenated into one big in-memory list
itemlist2 = []
for i in document:
    with open(i, 'rb') as fp:
        itemlist = pickle.load(fp)
        itemlist2 += itemlist



teacher = []
parentandstudent = []
parentandstudenttweet = []
teachertweet = []
allpeople = []
for user in itemlist2:
    if user.lang == 'en' and user.description != '':
        allpeople += [user.screen_name]
        # split the description into words, dropping punctuation
        wordList = re.sub(r"[^\w]", " ", user.description).split()
        for j in wordList:
            if j in teachertag:
                teacher += [user.screen_name]
                teachertweet += [user.description]
                break
        else:
            # for/else: runs only if the loop ended without `break`,
            # i.e. no teacher tag was found in the description
            parentandstudent += [user.screen_name]
            parentandstudenttweet += [user.description]

0 Answers:

No answers yet.
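Although the question has no answers, the two standard fixes it is asking for can be sketched: stream one pickle file at a time instead of concatenating everything into `itemlist2` (bounds memory to one file's worth of users), and test tag membership against a set instead of scanning a list. The sketch below is a rough illustration, not a tested answer against the original data: the `User` namedtuple is a hypothetical stand-in for the pickled objects (only the `lang`, `description`, and `screen_name` attributes used in the question are assumed), and `classify_users` is an invented helper name.

```python
import pickle
import re
from collections import namedtuple

# Hypothetical stand-in for the objects stored in the pickle files;
# only the three attributes used in the question are modeled.
User = namedtuple("User", ["lang", "description", "screen_name"])

# A set gives O(1) membership tests instead of O(n) scans of a list.
TEACHER_TAGS = {"K", "curriculum", "School", "childhood"}

# Compile the regex once instead of re-parsing the pattern per user.
WORD_RE = re.compile(r"[^\w]")

def classify_users(paths, tags=TEACHER_TAGS):
    """Stream one pickle file at a time so that only a single file's
    worth of user objects is ever resident in memory."""
    teacher, teachertweet = [], []
    parentandstudent, parentandstudenttweet = [], []
    allpeople = []
    for path in paths:
        with open(path, "rb") as fp:
            for user in pickle.load(fp):
                if user.lang != "en" or user.description == "":
                    continue
                allpeople.append(user.screen_name)
                words = WORD_RE.sub(" ", user.description).split()
                # set intersection replaces the inner word-by-word loop
                if tags.intersection(words):
                    teacher.append(user.screen_name)
                    teachertweet.append(user.description)
                else:
                    parentandstudent.append(user.screen_name)
                    parentandstudenttweet.append(user.description)
    return teacher, teachertweet, parentandstudent, parentandstudenttweet, allpeople
```

If even one file is too large to hold comfortably alongside the result lists, a further step would be appending only the matching users per file to disk rather than accumulating every screen name in memory.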