I have a very large number (>1000) of files, each about 20 MB, representing continuous time-series data saved in a simple binary format, such that concatenating them all directly recovers the full time series.
I would like to do this virtually in Python, using np.memmap to address each file and then concatenating them on the fly into one big memmap-backed array.
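For concreteness, here is a minimal sketch of the setup; the dtype (float64) and the file layout are placeholders, not my actual format:

```python
import numpy as np
from glob import glob

# Hypothetical file layout; the float64 dtype is just an assumption.
filenames = sorted(glob("data/chunk_*.bin"))

# Each file becomes its own read-only memmap; nothing is loaded into RAM,
# and the shape is inferred from the file size, so it need not be known
# up front.
chunks = [np.memmap(f, dtype=np.float64, mode="r") for f in filenames]

# This is the step I want to avoid: np.concatenate would materialize
# the full multi-GB result in memory.
# full = np.concatenate(chunks)
```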
Searching around SO suggests that np.concatenate will load them all into memory, which I can't afford to do. The question here seems to answer it in part, but the answer there assumes that I know how big my files are before concatenation, which is not necessarily true.
So, is there a general way to concatenate memmaps without knowing their sizes beforehand? Something with roughly the interface sketched below would do.
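To make the goal concrete: I want something that indexes like one long array but only ever touches the underlying memmaps. A toy sketch of that interface, supporting non-negative integer indexing only (`VirtualConcat` is a name I made up, not an existing API):

```python
import numpy as np

class VirtualConcat:
    """Toy read-only 'concatenation' over a list of 1-D memmaps.
    Only non-negative integer indexing is handled; slices, fancy
    indexing, and ufuncs would all need more work."""

    def __init__(self, chunks):
        self.chunks = chunks
        # Cumulative end offset of each chunk, computed at runtime
        # from the memmaps themselves, so file sizes never need to
        # be known in advance.
        self.offsets = np.cumsum([len(c) for c in chunks])

    def __len__(self):
        return int(self.offsets[-1])

    def __getitem__(self, i):
        # Find which chunk holds global index i, then index locally.
        k = int(np.searchsorted(self.offsets, i, side="right"))
        start = 0 if k == 0 else int(self.offsets[k - 1])
        return self.chunks[k][i - start]
```

With this, `ts = VirtualConcat(chunks); ts[10**7]` would read a single element from whichever file contains it, without ever building the concatenated array.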
EDIT: it was pointed out that the answer to the linked question actually creates the concatenated file on disk, which is not something I want.