I want to build a convolutional autoencoder on time series data, and I would like to use TimeseriesGenerator to window the input sequence.
The structure of the autoencoder looks like this:
Layer (type) Output Shape Param #
=================================================================
input_42 (InputLayer) [(None, 128, 2)] 0
_________________________________________________________________
.....
_________________________________________________________________
reshape_21 (Reshape) (None, 128, 2) 0
=================================================================
Now I need training data of shape (7697, 128, 2), where the raw sequence has shape (7697+128, 2).
My idea was to use TimeseriesGenerator like this:
generator = TimeseriesGenerator(raw_x, raw_x, length=128)
where raw_x is the raw time series.
But since the generator seems to be designed for forecasting, I get this message:
ValueError: A target array with shape (128, 2) was passed for an output of shape (None, 128, 2) while using as loss `mean_squared_error`. This loss expects targets to have the same shape as the output.
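The mismatch makes sense once you look at what the generator yields: with `data=raw_x, targets=raw_x, length=128`, each sample pairs a 128-step input window with the single step that follows it, so the target batch has shape (batch, 2) rather than (batch, 128, 2). A minimal numpy-only sketch of that indexing (shapes taken from the question, the zero-filled series is just a stand-in):

```python
import numpy as np

raw_x = np.zeros((7697 + 128, 2))  # stand-in for the raw series
length = 128

# TimeseriesGenerator-style sample i (defaults: stride=1, sampling_rate=1):
# the window data[i:i+length] is paired with the single row targets[i+length].
x0 = raw_x[0:length]   # input window        -> (128, 2)
y0 = raw_x[length]     # "next step" target  -> (2,)

print(x0.shape)  # (128, 2): matches the model input
print(y0.shape)  # (2,): per-sample target, not a (128, 2) window
```

So the targets the generator hands to fit can never match the (None, 128, 2) output of the autoencoder.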
A minimal working version of what I am trying to do looks like this:
import numpy as np

def strided_axis0(a, L):
    # INPUTS:
    # a : input array
    # L : length of each subarray cut along axis=0
    # Length of the 3D output array along its axis=0
    nd0 = a.shape[0] - L + 1
    # Store shape and strides info
    m, n = a.shape
    s0, s1 = a.strides
    # Finally use strides to get the 3D array view
    return np.lib.stride_tricks.as_strided(a, shape=(nd0, L, n), strides=(s0, s0, s1))
dta = strided_axis0(df[["latitude", "longitude"]].values, 128)
print(dta.shape)
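As a sanity check on the strided view, here is what strided_axis0 produces for a tiny toy series (the function is repeated so the snippet is self-contained):

```python
import numpy as np

def strided_axis0(a, L):
    # Overlapping windows of length L along axis 0, returned as a view (no copy).
    nd0 = a.shape[0] - L + 1
    m, n = a.shape
    s0, s1 = a.strides
    return np.lib.stride_tricks.as_strided(a, shape=(nd0, L, n), strides=(s0, s0, s1))

a = np.arange(10).reshape(5, 2)  # toy (5, 2) series
w = strided_axis0(a, 3)

print(w.shape)  # (3, 3, 2): 5 - 3 + 1 = 3 overlapping windows
print(w[0])     # first window: rows 0..2 of a
```

Each window here is a full (L, 2) slice, which is exactly the target shape the autoencoder expects.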
import tensorflow as tf

X = np.dstack([range(0, 256), range(0, 256)])[0]

input_sig = tf.keras.layers.Input(batch_shape=(None, 16, 2))
flat = tf.keras.layers.Flatten()(input_sig)
bottleneck = tf.keras.layers.Dense(8, activation='relu')(flat)
expanded = tf.keras.layers.Dense(32, activation='relu')(bottleneck)
decoded = tf.keras.layers.Reshape((16, 2))(expanded)
autoencoder = tf.keras.Model(input_sig, decoded)
autoencoder.compile(optimizer='adam', loss='mse', metrics=['accuracy'])

dta = strided_axis0(X, 16)
autoencoder.summary()
print(dta.shape)
autoencoder.fit(dta, dta)
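For completeness, one way to get generator-style batching while keeping autoencoder targets is to build the (window, window) batches yourself instead of relying on TimeseriesGenerator; a minimal numpy-only sketch (the function name, batch size, and zero-filled series are my own, not from any Keras API):

```python
import numpy as np

def window_batches(series, length=128, batch_size=32):
    # Yield (batch, batch) pairs of overlapping windows, so the target of
    # each window is the window itself (autoencoder-style training pairs).
    n_windows = series.shape[0] - length + 1
    for start in range(0, n_windows, batch_size):
        stop = min(start + batch_size, n_windows)
        batch = np.stack([series[i:i + length] for i in range(start, stop)])
        yield batch, batch

raw_x = np.zeros((7697 + 128, 2))  # stand-in for the raw series
first_x, first_y = next(window_batches(raw_x))
print(first_x.shape)  # (32, 128, 2)
```

Both elements of each yielded pair already have the (batch, 128, 2) shape the model's output expects, which is what TimeseriesGenerator's next-step targets fail to provide.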