Python 3 SyntaxError: invalid syntax (machine learning)

Posted: 2017-09-20 04:02:00

Tags: python python-3.x numpy machine-learning deep-learning

I am writing the "Planar data classification with one hidden layer" exercise from Coursera.

 # GRADED FUNCTION: backward_propagation

def backward_propagation(parameters, cache, X, Y):
    """
    Implement the backward propagation using the instructions above.

    Arguments:
    parameters -- python dictionary containing our parameters
    cache -- a dictionary containing "Z1", "A1", "Z2" and "A2".
    X -- input data of shape (2, number of examples)
    Y -- "true" labels vector of shape (1, number of examples)

    Returns:
    grads -- python dictionary containing your gradients with respect to different parameters
    """
    m = X.shape[1]

    # First, retrieve W1 and W2 from the dictionary "parameters".
    ### START CODE HERE ### (≈ 2 lines of code)
    W1 = parameters["W1"]
    W2 = parameters["W2"]
    ### END CODE HERE ###

    # Retrieve also A1 and A2 from dictionary "cache".
    ### START CODE HERE ### (≈ 2 lines of code)
    A1 = cache["A1"]
    A2 = cache["A1"]
    ### END CODE HERE ###

    # Backward propagation: calculate dW1, db1, dW2, db2.
    ### START CODE HERE ### (≈ 6 lines of code, corresponding to 6 equations on slide above)
    dZ2 = A2 - Y
    dW2 = (1/m)*np.dot(dZ2, A1.T)
    db2 = (1/m)*np.sum(dZ2, axis=1, keepdims=True)
    dZ1 = np.multiply(np.dot(W2.T, dZ2), 1 - np.power(A1, 2)
    dW1 = (1 / m) * np.dot(dZ1, X.T)
    db1 = (1/m)*np.sum(dZ1, axis1, keepdims=True)
    ### END CODE HERE ###

    grads = {"dW1": dW1,
             "db1": db1,
             "dW2": dW2,
             "db2": db2}

    return grads

When I run this code I get:

    File "", line 36
        dW1 = (1 / m) * np.dot(dZ1, X.T)
            ^
    SyntaxError: invalid syntax

1 answer:

Answer 0: (score: 0)

As mentioned in one of the comments, you are missing a closing parenthesis on the dZ1 line, and in the db1 line "axis1" should be written as axis=1. Because the parenthesis on the dZ1 line is never closed, Python keeps reading the expression onto the next line and therefore reports the SyntaxError on the dW1 line rather than on the line that actually contains the mistake.
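
For reference, below is a sketch of how the corrected function could look once those two fixes are applied. It also retrieves A2 from cache["A2"] instead of cache["A1"], which looks like another typo in the posted code; the gradient formulas assume the assignment's usual setup of a tanh hidden layer and a sigmoid output unit.

import numpy as np

def backward_propagation(parameters, cache, X, Y):
    """Backward propagation for a one-hidden-layer network (tanh hidden, sigmoid output)."""
    m = X.shape[1]

    # Retrieve the weight matrices from "parameters".
    W1 = parameters["W1"]
    W2 = parameters["W2"]

    # Retrieve the activations from "cache".
    A1 = cache["A1"]
    A2 = cache["A2"]   # the posted code reads cache["A1"] here as well

    # Backward propagation: calculate dW1, db1, dW2, db2.
    dZ2 = A2 - Y
    dW2 = (1 / m) * np.dot(dZ2, A1.T)
    db2 = (1 / m) * np.sum(dZ2, axis=1, keepdims=True)
    dZ1 = np.multiply(np.dot(W2.T, dZ2), 1 - np.power(A1, 2))   # closing parenthesis added
    dW1 = (1 / m) * np.dot(dZ1, X.T)
    db1 = (1 / m) * np.sum(dZ1, axis=1, keepdims=True)           # axis1 -> axis=1

    grads = {"dW1": dW1,
             "db1": db1,
             "dW2": dW2,
             "db2": db2}

    return grads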