Tanh activation function gives higher error and worse output than sigmoid

Time: 2014-01-26 03:28:13

Tags: backpropagation neural-network

I implemented the tanh function as my activation function, but the results are somehow worse than with the sigmoid activation function. Also, when checking the error, it repeatedly shows the error rising and then falling again.
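For reference, the standard pairing of each activation with its derivative (written in terms of the activation's output, as it is typically used in backpropagation) looks roughly like the sketch below. The function names are illustrative and are not taken from the question's code; in the backward pass, the derivative must match the activation used in the forward pass.

import numpy as np

def tanh(x):
    # forward pass: hyperbolic tangent, output in (-1, 1)
    return np.tanh(x)

def tanh_derivative(y):
    # derivative expressed in terms of the activation output y = tanh(x)
    return 1.0 - y ** 2

def sigmoid(x):
    # forward pass: logistic sigmoid, output in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(y):
    # derivative expressed in terms of the activation output y = sigmoid(x)
    return y * (1.0 - y)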

Here is an example of the error when using tanh:

epoch:1823
current error:0.1383121756710299

epoch:1824
current error:0.13831547638188654

epoch:1825
current error:0.13831887880040633

epoch:1826
current error:0.13832238240031222

epoch:1827
current error:0.13832598673034413

epoch:1828
current error:0.1383296914155071

epoch:1829
current error:0.13833349615821078

epoch:1830
current error:0.13833740073931955

epoch:1831
current error:0.1383414050191362

epoch:1832
current error:0.13834550893832906

epoch:1833
current error:0.1383497125188264

epoch:1834
current error:0.13835401586468998

epoch:1835
current error:0.1383584191629793

epoch:1836
current error:0.13836292268462677

epoch:1837
current error:0.1383675267853283

epoch:1838
current error:0.13837223190646183

epoch:1839
current error:0.13837703857605013

epoch:1840
current error:0.138381947409767

epoch:1841
current error:0.1383869591120036

epoch:1842
current error:0.13839207447699467

epoch:1843
current error:0.13839729439001985

epoch:1844
current error:0.1384026198286766

epoch:1845
current error:0.1384080518642369

Meanwhile, the sigmoid activation function does not show this behavior. Does anyone know why this happens?

0 Answers:

No answers