While trying to follow the Keras docs for Adam, I copied this line from the documentation:
keras.optimizers.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0, amsgrad=False)
and got this error:
Unexpected keyword argument passed to optimizer: amsgrad
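That error usually means the installed Keras predates the `amsgrad` argument, so the constructor rejects it. One defensive pattern (a sketch, not part of the Keras API; `OldAdam` below is a hypothetical stand-in for an older `keras.optimizers.Adam`) is to filter keyword arguments against the installed signature before calling:

```python
import inspect

def filter_supported_kwargs(func, **kwargs):
    """Keep only the keyword arguments that func actually accepts."""
    supported = inspect.signature(func).parameters
    return {k: v for k, v in kwargs.items() if k in supported}

def OldAdam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-8, decay=0.0):
    # Hypothetical stand-in for a keras.optimizers.Adam that predates `amsgrad`.
    return dict(lr=lr, beta_1=beta_1, beta_2=beta_2,
                epsilon=epsilon, decay=decay)

wanted = dict(lr=0.001, beta_1=0.9, beta_2=0.999,
              epsilon=1e-8, decay=0.0, amsgrad=False)

# `amsgrad` is silently dropped because OldAdam does not accept it:
opt = OldAdam(**filter_supported_kwargs(OldAdam, **wanted))
```

The same helper works with the real optimizer class on any Keras version, since it only inspects whatever signature is actually installed.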
Edit 1
Omitting the amsgrad argument lets this line run:
keras.optimizers.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=None, decay=0.0)
but when I try to train the model with
happyModel.fit(x=X_train, y=Y_train, epochs=50, batch_size=600)
the following error appears:
None values not supported.
Full error:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
 in ()
      1 ### START CODE HERE ### (1 line)
----> 2 happyModel.fit(x=X_train, y=Y_train, epochs=50, batch_size=100)
      3 ### END CODE HERE ###

/opt/conda/lib/python3.6/site-packages/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, **kwargs)
   1574         else:
   1575             ins = x + y + sample_weights
-> 1576         self._make_train_function()
   1577         f = self.train_function
   1578

/opt/conda/lib/python3.6/site-packages/keras/engine/training.py in _make_train_function(self)
    958                 training_updates = self.optimizer.get_updates(
    959                     ...
--> 960                     loss=self.total_loss)
    961             updates = self.updates + training_updates
    962             # Gets loss and metrics. Updates weights at each call.

/opt/conda/lib/python3.6/site-packages/keras/legacy/interfaces.py in wrapper(*args, **kwargs)
     85                 warnings.warn('Update your `' + object_name +
     86                               '` call to the Keras 2 API: ' + signature, stacklevel=2)
---> 87             return func(*args, **kwargs)
     88         wrapper._original_function = func
     89         return wrapper

/opt/conda/lib/python3.6/site-packages/keras/optimizers.py in get_updates(self, loss, params)
    432             m_t = (self.beta_1 * m) + (1. - self.beta_1) * g
    433             v_t = (self.beta_2 * v) + (1. - self.beta_2) * K.square(g)
--> 434             p_t = p - lr_t * m_t / (K.sqrt(v_t) + self.epsilon)
    435
    436             self.updates.append(K.update(m, m_t))

/opt/conda/lib/python3.6/site-packages/tensorflow/python/ops/math_ops.py in binary_op_wrapper(x, y)
    827     if not isinstance(y, sparse_tensor.SparseTensor):
    828       try:
--> 829         y = ops.convert_to_tensor(y, dtype=x.dtype.base_dtype, name="y")
    830       except TypeError:
    831         # If the RHS is not a tensor, it might be a tensor aware object

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/ops.py in convert_to_tensor(value, dtype, name, preferred_dtype)
    674                                         name=name,
    675                                         preferred_dtype=preferred_dtype,
--> 676                                         as_ref=False)
    677
    678

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/ops.py in internal_convert_to_tensor(value, dtype, name, as_ref, preferred_dtype)
    739
    740     if ret is None:
--> 741       ret = conversion_func(value, dtype=dtype, name=name, as_ref=as_ref)
    742
    743     if ret is NotImplemented:

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/constant_op.py in _constant_tensor_conversion_function(v, dtype, name, as_ref)
    111                                          as_ref=False):
    112   _ = as_ref
--> 113   return constant(v, dtype=dtype, name=name)
    114
    115

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/constant_op.py in constant(value, dtype, shape, name, verify_shape)
    100   tensor_value = attr_value_pb2.AttrValue()
    101   tensor_value.tensor.CopyFrom(
--> 102       tensor_util.make_tensor_proto(value, dtype=dtype, shape=shape, verify_shape=verify_shape))
    103   dtype_value = attr_value_pb2.AttrValue(type=tensor_value.tensor.dtype)
    104   const_tensor = g.create_op(

/opt/conda/lib/python3.6/site-packages/tensorflow/python/framework/tensor_util.py in make_tensor_proto(values, dtype, shape, verify_shape)
    362
    363   if values is None:
--> 364     raise ValueError("None values not supported.")
    365   # if dtype is provided, forces numpy array to be the type
    366   # provided if possible.

ValueError: None values not supported.
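The traceback pinpoints the failure at the Adam update `p_t = p - lr_t * m_t / (K.sqrt(v_t) + self.epsilon)`: when `self.epsilon` is `None`, TensorFlow tries to convert `None` into a tensor for the addition and raises. The same class of failure can be reproduced with plain NumPy (an illustration only, not the Keras code; the values are made up):

```python
import numpy as np

v_t = np.array([0.04, 0.09])  # stand-in for the second-moment estimate

# With a numeric epsilon the Adam denominator is well defined:
denom = np.sqrt(v_t) + 1e-8

# With epsilon=None the addition is undefined; NumPy raises TypeError,
# while TensorFlow fails earlier with "None values not supported."
# when it tries to convert None to a tensor.
try:
    np.sqrt(v_t) + None
    print("unexpectedly succeeded")
except TypeError:
    print("adding None failed, as expected")
```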
So simply omitting the argument does not fix the problem.
How can I get the Adam optimizer to work?
Thanks
Answer 0 (score: 0)
Dropping the amsgrad argument was the right fix for the first error, but the "None values not supported." error comes from the epsilon=None argument: epsilon needs an actual numeric value, for example the commonly used default
keras.optimizers.Adam(lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-8, decay=0.0)
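To see why epsilon must be numeric, here is a minimal single-parameter Adam step in NumPy, following the update formula visible in the traceback (an illustration, not the Keras implementation; all names are local to this sketch):

```python
import numpy as np

def adam_step(p, g, m, v, t, lr=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-8):
    """One Adam update for scalar parameter p given gradient g at step t."""
    m = beta_1 * m + (1.0 - beta_1) * g        # first-moment estimate
    v = beta_2 * v + (1.0 - beta_2) * g ** 2   # second-moment estimate
    # Bias-corrected learning rate, as in Keras' get_updates:
    lr_t = lr * np.sqrt(1.0 - beta_2 ** t) / (1.0 - beta_1 ** t)
    # epsilon must be a number here; with None this addition fails:
    p = p - lr_t * m / (np.sqrt(v) + epsilon)
    return p, m, v

p, m, v = 1.0, 0.0, 0.0
p, m, v = adam_step(p, g=0.5, m=m, v=v, t=1)
```

epsilon only guards against division by zero when v is tiny, so any small positive number works; None does not, because it cannot take part in arithmetic.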