![Deep Learning with Keras](https://wfqqreader-1252317822.image.myqcloud.com/cover/291/36701291/b_36701291.jpg)
Activation function — sigmoid
The sigmoid function is defined as follows:
$\sigma(x) = \frac{1}{1 + e^{-x}}$
Its output varies smoothly in (0, 1) as the input varies over $(-\infty, \infty)$, and the function is continuous. A typical sigmoid function is represented in the following graph:
![A typical sigmoid curve](https://epubservercos.yuewen.com/C78C0C/19470410508972506/epubprivate/OEBPS/Images/B06258_01_05.jpg?sign=1739296756-fxEWfdafTA8PCGl6vLSXaiTjehF9ahyw-0-eef9db970371786c6dc0a71aaff933c4)
A neuron can use the sigmoid for computing the nonlinear function $\sigma(z)$, where $z = wx + b$. Note that if $z$ is very large and positive, then $e^{-z} \to 0$, so $\sigma(z) \to 1$, while if $z$ is very large and negative, $e^{-z} \to \infty$, so $\sigma(z) \to 0$. In other words, a neuron with sigmoid activation has a behavior similar to the perceptron, but the changes are gradual and output values such as 0.5539 or 0.123191 are perfectly legitimate. In this sense, a sigmoid neuron can answer maybe.