I am trying to train an LSTM NN with Keras on my own custom evaluation_metric, which I want to use as the loss function. The structure of my neural network is:
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_2 (InputLayer)         (None, 15, 1)             0
_________________________________________________________________
lstm_2 (LSTM)                (None, 20)                1760
_________________________________________________________________
dense_3 (Dense)              (None, 10)                210
_________________________________________________________________
dense_4 (Dense)              (None, 1)                 11
=================================================================
Let me give you some context: my NN produces an array of numeric values. These values have been scaled with certain mathematical functions. To compare them with the actual values, I first need to undo the scaling and convert them back to their original context. For this I created the function decode_output_values:
import numpy as np
[....]
def decode_output_values(pred_scaled, Y_train):
    # Decodes the output values back to the original context
    Y_min = np.nanmin(Y_train)
    Y_max = np.nanmax(Y_train)
    Y_pred = np.exp(pred_scaled*(np.log(Y_max)-np.log(Y_min))+np.log(Y_min))
    return Y_pred
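In other words, decode_output_values inverts a min-max scaling applied in log space, i.e. pred_scaled = (np.log(Y) - np.log(Y_min)) / (np.log(Y_max) - np.log(Y_min)).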
Now that the output values are decoded, I make one more change so that I can compare them with the actual output values in the known test set. These actual output values in the test set contain a large number of NA values and only have numeric values in certain rows. So I only look at the corresponding indices of the rows with non-NA values and compute the RMSE between those values, using a function I created, evaluation_metric:
from sklearn.metrics import mean_squared_error
from math import sqrt
[....]
def evaluation_metric(y_true, y_pred_scaled):
    # Convert predictions back to the original scale
    y_pred = decode_output_values(y_pred_scaled, Y_train)
    # Get all non-NA values of the true values and the corresponding predictions
    mask = ~np.isnan(y_true)
    y_true = y_true[mask]
    y_pred = y_pred[mask]
    error = sqrt(mean_squared_error(y_true, y_pred))
    return error
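For illustration, called on plain NumPy arrays the metric behaves as expected (the values below are made up just to show the call pattern, not my real data):

# Made-up example values, only to illustrate how evaluation_metric is called
Y_train = np.array([1.0, 10.0, 100.0])           # training targets used inside decode_output_values
y_true = np.array([2.0, np.nan, 50.0])           # test targets; the NaN entry is dropped by the mask
y_pred_scaled = np.array([0.2, 0.5, 0.8])        # scaled predictions in [0, 1]
print(evaluation_metric(y_true, y_pred_scaled))  # plain float RMSE (roughly 7.2 for these values)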
But when I try to compile the model in Keras with the following code:
import keras
[....]
visible = Input(shape=(np.size(X_train_scaled,1),1))
hidden1 = LSTM(20)(visible)
hidden2 = Dense(10, activation='relu')(hidden1)
output = Dense(1, activation='linear')(hidden2)
initial_model = Model(inputs=visible, outputs=output)
initial_model.compile(loss=evaluation_metric, optimizer='rmsprop', metrics=[evaluation_metric])
I get the following error:
AttributeError: 'Tensor' object has no attribute 'exp'
Full traceback:
Traceback (most recent call last):
  File "<ipython-input-108-9c67c532405a>", line 1, in <module>
    initial_model.compile(loss=evaluation_metric, optimizer='rmsprop', metrics=[evaluation_metric])
  File "/Users/XX/anaconda3/envs/Research_Paper/lib/python3.6/site-packages/keras/engine/training.py", line 860, in compile
    sample_weight, mask)
  File "/Users/XX/anaconda3/envs/Research_Paper/lib/python3.6/site-packages/keras/engine/training.py", line 459, in weighted
    score_array = fn(y_true, y_pred)
  File "<ipython-input-82-086ae61141e0>", line 3, in evaluation_metric
    y_pred = decode_output_values(y_pred_scaled, Y_train)
  File "<ipython-input-82-086ae61141e0>", line 25, in decode_output_values
    Y_pred = np.exp(pred_scaled*(np.log(Y_max)-np.log(Y_min))+np.log(Y_min))
AttributeError: 'Tensor' object has no attribute 'exp'
I am using Python 3.6 and Spyder 3.2.6 on macOS. All packages in use are updated to the latest version.
Can someone help me fix this error?
Answer 0 (score: 0):
Simply change numpy to keras.backend:
import keras.backend as K
def decode_output_values(pred_scaled, Y_train):
    # Decodes the output values back to the original context
    Y_min = np.nanmin(Y_train)
    Y_max = np.nanmax(Y_train)
    Y_pred = K.exp(pred_scaled*(K.log(Y_max)-K.log(Y_min)) + K.log(Y_min))
    return Y_pred
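Note that once the exp call is fixed, the remaining NumPy/sklearn/math calls inside evaluation_metric (np.isnan, mean_squared_error, sqrt) will fail for the same reason: Keras passes symbolic tensors into the loss function, not NumPy arrays. A minimal sketch of a fully backend-based loss, assuming the TensorFlow backend and that Y_train is available as in the question (on newer TensorFlow versions tf.is_nan is tf.math.is_nan), could look like this:

import numpy as np
import tensorflow as tf
import keras.backend as K

def evaluation_metric(y_true, y_pred_scaled):
    # Decode the scaled predictions with backend ops; Y_min/Y_max come from
    # Y_train and are plain NumPy scalars, so np.log is fine for them
    log_min = np.log(np.nanmin(Y_train))
    log_max = np.log(np.nanmax(Y_train))
    y_pred = K.exp(y_pred_scaled * (log_max - log_min) + log_min)
    # Keep only the entries where y_true is not NaN
    mask = tf.logical_not(tf.is_nan(y_true))
    y_true_valid = tf.boolean_mask(y_true, mask)
    y_pred_valid = tf.boolean_mask(y_pred, mask)
    # RMSE over the non-NaN entries
    return K.sqrt(K.mean(K.square(y_true_valid - y_pred_valid)))

This is only a sketch and not tested against the exact Keras/TensorFlow versions in the question; the point is that everything applied to y_true and y_pred inside the loss has to stay a tensor operation, while quantities derived from Y_train can be computed with NumPy beforehand.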