np.sum along the row axis in NumPy

Asked: 2016-03-26 22:29:44

Tags: python numpy matrix machine-learning softmax

I wrote a softmax regression function def softmax_1(x), which basically takes an m x n matrix, exponentiates it, and then sums the exponentials down each column.

x = np.arange(-2.0, 6.0, 0.1)
scores = np.vstack([x, np.ones_like(x), 0.2 * np.ones_like(x)])
#scores shape is (3, 80)

def softmax_1(x):
    """Compute softmax values for each set of scores in x."""
    return np.exp(x) / np.sum(np.exp(x), axis=0)
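
As a quick sanity check (a minimal sketch that assumes the scores array and softmax_1 defined above), each column of the output should sum to 1:

probs = softmax_1(scores)                    # shape (3, 80): one column of probabilities per x value
print(probs.shape)                           # (3, 80)
print(np.allclose(probs.sum(axis=0), 1.0))   # True: every column sums to 1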

To convert this to a DataFrame I have to transpose:

DF_activation_1 = pd.DataFrame(softmax_1(scores).T,index=x,columns=["x","1.0","0.2"])

So I wanted to try making a version of the softmax function that takes the transposed array and computes the softmax along the other axis:

scores_T = scores.T
#scores_T shape is (80,3)

def softmax_2(y):
    return(np.exp(y/np.sum(np.exp(y),axis=1)))

DF_activation_2 = pd.DataFrame(softmax_2(scores_T),index=x,columns=["x","1.0","0.2"])

Then I get this error:

Traceback (most recent call last):
  File "softmax.py", line 22, in <module>
    DF_activation_2 = pd.DataFrame(softmax_2(scores_T),index=x,columns=["x","1.0","0.2"])
  File "softmax.py", line 18, in softmax_2
    return(np.exp(y/np.sum(np.exp(y),axis=1)))
ValueError: operands could not be broadcast together with shapes (80,3) (80,) 

Why doesn't this work when I transpose the array and switch the axis in the np.sum call?

1 Answer:

Answer 0 (score: 4)

Change

np.exp(y/np.sum(np.exp(y),axis=1))

to

np.exp(y)/np.sum(np.exp(y),axis=1, keepdims=True)

This means np.sum will return an array of shape (80, 1) instead of (80,), which broadcasts correctly for the division. Also note the correction of where the parentheses close.
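
Putting it together, a corrected row-wise version might look like this (a minimal sketch; softmax_2_fixed is just an illustrative name, and it assumes the scores_T, scores, and x arrays from the question):

def softmax_2_fixed(y):
    """Softmax over the rows of y (each row is one set of scores)."""
    exps = np.exp(y)
    return exps / np.sum(exps, axis=1, keepdims=True)   # (80, 3) / (80, 1) broadcasts row-wise

DF_activation_2 = pd.DataFrame(softmax_2_fixed(scores_T), index=x, columns=["x", "1.0", "0.2"])
# Each row sums to 1, and the result matches the column-wise version transposed:
# np.allclose(softmax_2_fixed(scores_T), softmax_1(scores).T)  -> True

With keepdims=True the summed axis is kept as a length-1 dimension, so NumPy broadcasts the (80, 1) denominator across the three columns of each row instead of failing on an incompatible (80,) shape.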