Different results from the TensorFlow relu op

Date: 2017-01-04 03:01:05

Tags: python numpy tensorflow

The example code is as follows:

import numpy as np
import tensorflow as tf

# Deterministic input: evenly spaced values in [-2, 2).
a = np.arange(-2, 2, 0.2)
with tf.Session() as s:
    print('         a=', a)
    print('relu of a =', s.run(tf.nn.relu(a)))

    # Random op: samples a truncated normal with mean 1, stddev 10.
    a1 = tf.truncated_normal([5], 1, 10)
    print('        a1=', s.run(a1))
    print('relu of a1=', s.run(tf.nn.relu(a1)))

Running it once produces the following output:

     a= [ -2.00000000e+00  -1.80000000e+00  -1.60000000e+00  -1.40000000e+00  -1.20000000e+00  -1.00000000e+00  -8.00000000e-01  -6.00000000e-01  -4.00000000e-01  -2.00000000e-01  -4.44089210e-16   2.00000000e-01   4.00000000e-01   6.00000000e-01   8.00000000e-01   1.00000000e+00   1.20000000e+00   1.40000000e+00   1.60000000e+00   1.80000000e+00]
relu of a = [ 0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.   0.2  0.4  0.6  0.8  1.   1.2  1.4  1.6  1.8]
        a1= [ 13.94806099  -6.40441322 -15.23906326  13.16400337 -12.26687431]
relu of a1= [  0.           7.76932907   0.           1.53322148  11.72072506]
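
For the deterministic input the relu semantics are easy to verify outside TensorFlow. As a quick sanity check (my addition, a minimal NumPy sketch, not part of the original post), the first result can be reproduced with an elementwise maximum:

import numpy as np

# relu(x) is elementwise max(x, 0); the values match the
# 'relu of a' line above (formatting aside).
a = np.arange(-2, 2, 0.2)
print(np.maximum(a, 0))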

Question: I know that nn.relu is equivalent to max(features, 0), and the output for variable a confirms this; but for variable a1 it appears to be violated:

     1) a1[0]=13.94, but relu(a1[0])=0; 
     2) a1[1]=-6.40, but relu(a1[1])=7.76;  

What is going wrong? Any help is appreciated.
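
For comparison: in TF 1.x, a random op such as tf.truncated_normal draws a fresh sample on every Session.run call, so fetching a1 and relu(a1) in two separate run calls compares two different samples. Below is a minimal sketch (my addition, not from the original post) that fetches both tensors in a single run, so they share one sample:

import tensorflow as tf

a1 = tf.truncated_normal([5], 1, 10)
r1 = tf.nn.relu(a1)

with tf.Session() as s:
    # A single run() evaluates the graph once, so both fetched
    # values come from the same draw of a1.
    a1_val, r1_val = s.run([a1, r1])
    print('        a1=', a1_val)
    print('relu of a1=', r1_val)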

0 Answers:

No answers yet.