Reading files recursively with loadtxt

Time: 2014-07-15 11:25:48

Tags: python numpy iteration

I have a large number of .asc files containing the (x, y) coordinates of two given satellites. There are roughly 3,000 individual files per satellite (e.g., Satellite1 = [file1, file2, ..., file3000] and Satellite2 = [file1, file2, ..., file3000]).

I am trying to write some code in Python (version 2.7.8 | Anaconda 2.0) that finds the multiple points on the Earth's surface where the orbits of the two satellites cross. I have already written some basic code that uses loadtxt to take two files as input (i.e., one from Sat1 and one from Sat2). In short, the code looks like this:

import numpy as np

sat1_in = np.loadtxt("sat1_file1.asc", usecols=(1, 2), comments="#")
sat2_in = np.loadtxt("sat2_file1.asc", usecols=(1, 2), comments="#")

def main():
    xover_search() # Returns True or False, depending on whether a crossover is found.
    xover_final()  # Returns the (x, y) coordinates of the crossover.
    write_output() # Appends these coordinates to a txt file for later display.

if __name__ == "__main__":
    main()
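
For reference, here is a minimal, self-contained sketch of what that loadtxt call does. The .asc column layout is an assumption (the question never shows it); the sketch simply uses whitespace-separated columns in which columns 1 and 2 hold x and y:

import numpy as np

# Hypothetical file layout: column 0 is a record index, columns 1-2 are x/y.
with open("demo.asc", "w") as f:
    f.write("# idx x y\n")
    f.write("0 10.5 -3.2\n")
    f.write("1 11.0 -3.0\n")

coords = np.loadtxt("demo.asc", usecols=(1, 2), comments="#")
print(coords.shape)  # (2, 2): one (x, y) row per data line; the "#" line is skipped.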

I would like to extend this code to the entire data set, producing the outputs "sat1_in" and "sat2_in" for every possible combination of files between Satellite 1 and Satellite 2. These are my ideas so far:

import fnmatch
import itertools
import os

#Create two empty lists to store all the files to process for Sat1 and Sat2:
sat1_files = []
sat2_files = []

#Use os.walk to fill each list with the respective file paths:
for root, dirs, filenames in os.walk('.'):
    for filename in fnmatch.filter(filenames, 'sat1*.asc'):
        sat1_files.append(os.path.join(root, filename))

for root, dirs, filenames in os.walk('.'):
    for filename in fnmatch.filter(filenames, 'sat2*.asc'):
        sat2_files.append(os.path.join(root, filename))  

#Calculate all possible combinations between both lists using itertools.product:
iter_file = list(itertools.product(sat1_files, sat2_files)) 

#Extract two lists of files for sat1 and sat2 to be compared each iteration:
sat1_ordered = [seq[0] for seq in iter_file] 
sat2_ordered = [seq[1] for seq in iter_file]
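
A side note on the snippet above: list(itertools.product(...)) materializes every Sat1/Sat2 pairing up front, which with ~3,000 files per satellite is roughly 9,000,000 tuples. Since itertools.product already yields the pairs lazily, a sketch that skips the intermediate list and the two *_ordered copies could look like:

import itertools

# Each pair is produced on demand; only the current pair is in memory.
for sat1_path, sat2_path in itertools.product(sat1_files, sat2_files):
    pass  # load and compare this one pair of files here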

This is where I am stuck. How can I iterate over "sat1_ordered" and "sat2_ordered", using loadtxt to extract a list of coordinates from each file? The only thing I have tried is:

for path in sat1_ordered:
    sat1_in = np.loadtxt(path, usecols=(1, 2), comments="#")

But this would just build one huge list containing all of Satellite 1's measurements.

Could someone give me some ideas on how to approach this problem?

1 Answer:

Answer 0 (score: 1)

Perhaps you are looking for something like this:

for file1, file2 in iter_file:
    sat1_in = np.loadtxt(file1, usecols=(1, 2), comments="#")
    sat2_in = np.loadtxt(file2, usecols=(1, 2), comments="#")
    ....
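
Expanding that into a fuller sketch of the whole pipeline (the xover_* routines are the asker's placeholders; the signatures used here are assumed):

import fnmatch
import itertools
import os

import numpy as np

def collect(pattern):
    # Walk the current directory tree and gather every matching file path.
    matches = []
    for root, dirs, filenames in os.walk('.'):
        for filename in fnmatch.filter(filenames, pattern):
            matches.append(os.path.join(root, filename))
    return matches

sat1_files = collect('sat1*.asc')
sat2_files = collect('sat2*.asc')

# itertools.product is lazy: each of the ~9,000,000 file pairs is loaded,
# processed, and discarded in turn, never all at once.
for file1, file2 in itertools.product(sat1_files, sat2_files):
    sat1_in = np.loadtxt(file1, usecols=(1, 2), comments="#")
    sat2_in = np.loadtxt(file2, usecols=(1, 2), comments="#")
    if xover_search(sat1_in, sat2_in):            # placeholder from the question
        write_output(xover_final(sat1_in, sat2_in))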