Can ReLU replace the sigmoid activation function in a neural network?

Posted: 2018-11-12 10:22:26

Tags: python neural-network sigmoid relu

I'm a beginner, and I'm trying to replace the sigmoid activation function with ReLU in the simple NN below. Is that possible? I tried swapping out the sigmoid function, but it didn't work. The output should behave like an AND gate (e.g. input (0, 0) -> output 0).

import numpy as np

# sigmoid function (returns the derivative when deriv=True;
# x is then assumed to already be a sigmoid output)
def nonlin(x, deriv=False):
    if deriv:
        return x * (1 - x)
    return 1 / (1 + np.exp(-x))

# input dataset
X = np.array([[0, 0],
              [0, 1],
              [1, 0],
              [1, 1]])

# output dataset
y = np.array([[0, 0, 0, 1]]).T

# seed random numbers to make calculation
# deterministic (just a good practice)
np.random.seed(1)

# initialize weights randomly with mean 0
syn0 = 2*np.random.random((2, 1)) - 1

for _ in range(10000):

    # forward propagation
    l0 = X
    l1 = nonlin(np.dot(l0,syn0))

    # how much did we miss?
    l1_error = y - l1

    # multiply the error by the slope of the sigmoid at l1
    l1_delta = l1_error * nonlin(l1, True)

    # update weights
    syn0 += np.dot(l0.T, l1_delta)

0 Answers:

There are no answers yet.
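For reference, the sigmoid-to-ReLU swap the asker describes can be sketched as below. Two details are assumptions not present in the original code: a smaller learning rate `lr` (with the original step size of 1, ReLU's constant slope of 1 tends to make the updates overshoot), and a 0.5 threshold to read the output, since ReLU is unbounded rather than squashed into (0, 1):

```python
import numpy as np

# ReLU and its derivative; for deriv=True, x is the ReLU output,
# which is > 0 exactly where the pre-activation was > 0
def relu(x, deriv=False):
    if deriv:
        return (x > 0).astype(float)
    return np.maximum(0, x)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0, 0, 0, 1]]).T

np.random.seed(1)
syn0 = 2 * np.random.random((2, 1)) - 1

lr = 0.01  # assumed smaller step; the original implicit step of 1 is too large for ReLU here
for _ in range(10000):
    l1 = relu(np.dot(X, syn0))
    l1_error = y - l1
    l1_delta = l1_error * relu(l1, deriv=True)
    syn0 += lr * np.dot(X.T, l1_delta)

# ReLU output is unbounded, so threshold at 0.5 to read the gate
print((relu(np.dot(X, syn0)) > 0.5).astype(int).ravel())  # prints [0 0 0 1]
```

One caveat: with no bias term, a unit whose pre-activation is negative gets zero gradient from ReLU ("dead" unit); here training still recovers because the (1, 1) sample keeps pushing both weights up. For a clean 0/1 output, a common choice is ReLU in hidden layers with a sigmoid on the final layer.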