Architecture for multivariate time series networks where some variables are shared across units

Time: 2019-03-19 14:38:10

Tags: machine-learning keras deep-learning time-series lstm

I've got a time series that has the following shape:

arr.shape
Out[9]: (2864, 98, 34)

So, 2864 units, 98 time steps, and 34 variables. The 98 time steps are annual.

I've also got a "global" variable that is also monthly. It's global in the sense that it applies to each of the N units.

pdat.shape
Out[10]: (1176, 1)
1176/12
Out[11]: 98.0

I'm trying to build a model that will jointly predict the 34 per-unit variables (my interest really centers on a few of them) as well as the monthly global variable.

It would be straightforward to reshape the monthly data such that it is

(1, 98, 12)

then repeat it so that it is

(2864, 98, 12)

I could then concatenate it to arr along the last (feature) axis, giving a combined array of shape

(2864, 98, 46)
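
Concretely, that reshape-and-repeat approach would look roughly like this (a sketch only, assuming arr and pdat hold the arrays shown above and that pdat is ordered month-within-year):

import numpy as np

# Sketch of the reshape-and-repeat approach described above.
monthly = pdat.reshape(1, 98, 12)                          # one row of 12 months per year
monthly_tiled = np.repeat(monthly, arr.shape[0], axis=0)   # (2864, 98, 12)
combined = np.concatenate([arr, monthly_tiled], axis=-1)   # (2864, 98, 46)

(np.broadcast_to would give a read-only view instead of a copy, but the concatenate still materializes the repeated values, so the memory cost remains.)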

Using the functional API in Keras, I could then define a multi-output network, where one of the outputs is reshaped back into a monthly time series.
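
Roughly, I have in mind something like the sketch below, where the input is the concatenated (2864, 98, 46) array; the layer sizes and the shared LSTM encoder are just placeholders:

from keras.layers import Input, LSTM, Dense, TimeDistributed, Reshape
from keras.models import Model

# Shared encoder over the concatenated annual sequences.
inp = Input(shape=(98, 46))
h = LSTM(64, return_sequences=True)(inp)

# Head 1: the 34 per-unit variables at each annual step.
unit_out = TimeDistributed(Dense(34), name="unit_vars")(h)

# Head 2: 12 monthly values per year, reshaped back into a flat monthly series.
monthly_out = Reshape((1176, 1), name="monthly_var")(TimeDistributed(Dense(12))(h))

model = Model(inputs=inp, outputs=[unit_out, monthly_out])
model.compile(optimizer="adam", loss="mse")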

But repeating the (1, 98, 12) array N times seems wasteful of memory, duplicative, and inelegant. Is there a better way? On the other hand, I don't know how I'd specify a batch size common to a single model when the individual inputs have different numbers of samples (different N).

I'd appreciate suggestions for ways to define networks with this sort of multi-scale architecture, and specifically how to train them given that the batch size for one input won't make sense for another.

In particular, I think it'd be great to use the functional API to define a model with multiple inputs of different shapes. But I don't know whether that's possible, or whether it's the best way forward.
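
For concreteness, the kind of two-input model I'm imagining would look roughly like this (again only a sketch; as far as I can tell, the monthly input would still have to be tiled to 2864 copies at fit time so that the two inputs share a batch dimension, which is exactly the duplication I'd like to avoid):

from keras.layers import Input, LSTM, Dense, TimeDistributed, Concatenate
from keras.models import Model

unit_in = Input(shape=(98, 34), name="unit_input")      # per-unit annual variables
global_in = Input(shape=(98, 12), name="global_input")  # 12 monthly values per year

h = LSTM(64, return_sequences=True)(Concatenate(axis=-1)([unit_in, global_in]))
unit_out = TimeDistributed(Dense(34), name="unit_vars")(h)
global_out = TimeDistributed(Dense(12), name="monthly_var")(h)

model = Model(inputs=[unit_in, global_in], outputs=[unit_out, global_out])
model.compile(optimizer="adam", loss="mse")

# Fitting still seems to require repeating the monthly series per unit
# (arr_targets and monthly_targets are placeholders for the target arrays):
# model.fit([arr, np.repeat(pdat.reshape(1, 98, 12), arr.shape[0], axis=0)],
#           [arr_targets, monthly_targets], batch_size=32)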

0 Answers:

No answers yet.