RNN LSTM estimating sine wave frequency and phase

Posted: 2016-11-26 00:40:02

Tags: python machine-learning keras lstm

To better understand RNNs and LSTMs, I am trying to estimate the frequency and phase of a sine wave with a simple LSTM. It turns out to be very hard to get this to converge: the MSE stays very high (in the thousands). The only setup that works even a little is when I generate sine waves that all share the same phase (all starting at the same time) and pass each training sample to the RNN as one vector rather than one sample per timestep. Meanwhile, the code below does not converge, even though I have removed the per-frequency phase variation. Any ideas about what is going wrong here?

I have looked at Keras : How should I prepare input data for RNN? and tried to modify my input accordingly, but no luck.
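To make the distinction concrete, here is a minimal sketch of the two input layouts I mean (shapes for illustration only, using a single example wave):

import numpy as np

numSampleInEach = 200  # length of each sinewave
wave = np.sin(2*3.142*5*np.arange(numSampleInEach)/10000)  # one example wave, freq = 5

#Layout A: one scalar per timestep -> shape (1, 200, 1); the LSTM steps through 200 samples
x_seq = wave.reshape(1, numSampleInEach, 1)

#Layout B: the whole wave as a single timestep -> shape (1, 1, 200);
#this is the layout that partially works for me when all phases are the same
x_vec = wave.reshape(1, 1, numSampleInEach)

print(x_seq.shape, x_vec.shape)  # (1, 200, 1) (1, 1, 200)

Here is the full code: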

from keras.models import Sequential
from keras.layers.core import Activation, Dropout, Dense
from keras.layers.recurrent import GRU, LSTM
import numpy as np
from sklearn.model_selection import train_test_split

np.random.seed(0)  # For reproducibility
TrainingNums = 12000 #Number of Trials
numSampleInEach = 200 #Length of each sinewave
numPhaseExamples = 1 #for each freq, so many different phases

X = np.zeros((TrainingNums,numSampleInEach))
Y = np.zeros((TrainingNums,2))

#create sinewaves here
for iii in range(0, TrainingNums//numPhaseExamples):
    freq = np.round(np.random.randn()*100)
    for kkk in range(0, numPhaseExamples):
        #set timeOffset below to 0 if you want the same phase every run
        timeOffset = 0  # 0 for now, else np.random.randint(0, 90)
        t = np.linspace(0+timeOffset, numSampleInEach-1+timeOffset, numSampleInEach)
        X[iii*numPhaseExamples+kkk, :] = np.sin(2*3.142*freq*t/10000)
        Y[iii*numPhaseExamples+kkk, 0] = freq
        Y[iii*numPhaseExamples+kkk, 1] = timeOffset

#shape (samples, timesteps, features): one scalar sample per timestep
X = np.reshape(X, (TrainingNums, numSampleInEach, 1))
#The variant below (whole wave as one timestep) works when there is no phase variation
#X = np.reshape(X, (TrainingNums, 1, numSampleInEach))

X_train, X_test, y_train, y_test = train_test_split(X, Y, test_size=0.33)

#Now create the RNN
model = Sequential()
#input_shape = (timeStep, dataDimension); the batch size is left implicit
model.add(LSTM(128, input_shape=(numSampleInEach, 1), return_sequences=True))

#For frequency estimation alone, only the following change helps:
#model.add(LSTM(128, input_shape=(1, numSampleInEach), return_sequences=True))
model.add(Dropout(0.2))
model.add(Activation("relu"))

#second layer of RNN
model.add(LSTM(128, return_sequences=False))
model.add(Dropout(0.2))
model.add(Activation("relu"))

model.add(Dense(2, activation="linear"))
model.compile(loss="mean_squared_error", optimizer="Nadam")
print(model.summary())

print("Model compiled.")
model.fit(X_train, y_train, batch_size=16, nb_epoch=150,
          validation_split=0.1)
result = model.evaluate(X_test, y_test, verbose=0)
print('mse:', result)

So the questions are:

  1. Is it reasonable at all to expect an RNN to estimate frequency and phase?
  2. I have tried several architectures (stacked LSTMs, a single layer with more units, and so on) without success; one such variant is sketched below.
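For reference, a minimal sketch of one of those variants (illustrative only; the unit count is an example, not the exact value from my runs): a single, wider LSTM layer instead of two stacked 128-unit layers.

from keras.models import Sequential
from keras.layers.core import Dropout, Dense
from keras.layers.recurrent import LSTM

numSampleInEach = 200  # as above

variant = Sequential()
#one wider LSTM layer instead of two stacked ones
variant.add(LSTM(256, input_shape=(numSampleInEach, 1), return_sequences=False))
variant.add(Dropout(0.2))
variant.add(Dense(2, activation="linear"))
variant.compile(loss="mean_squared_error", optimizer="Nadam")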

1 answer:

Answer 0 (score: 1):

Removing the activation after the LSTM layers is the correct answer.
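A minimal sketch of that change applied to the model in the question (my own illustration of the suggested fix, not code from the original answer): the Dropout layers stay, but the separate Activation("relu") layers after each LSTM are removed, so the LSTMs' built-in output activations feed the next layer directly.

from keras.models import Sequential
from keras.layers.core import Dropout, Dense
from keras.layers.recurrent import LSTM

numSampleInEach = 200  # as in the question

model = Sequential()
model.add(LSTM(128, input_shape=(numSampleInEach, 1), return_sequences=True))
model.add(Dropout(0.2))
#no Activation("relu") here

model.add(LSTM(128, return_sequences=False))
model.add(Dropout(0.2))
#no Activation("relu") here either

model.add(Dense(2, activation="linear"))
model.compile(loss="mean_squared_error", optimizer="Nadam")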