Update 5-13-BN.md #6

Open · wants to merge 1 commit into base: master
6 changes: 3 additions & 3 deletions _tutorials/machine-learning/tensorflow/5-13-BN.md
@@ -273,7 +273,7 @@ The plot of the relu activation function is shown here:

 {% include tut-image.html image-name="5_13_05.png" %}

-Because in the network without NB, most of the neurons have died, so even the error curve has disappeared.
+Because in the network without BN, most of the neurons have died, so even the error curve has disappeared.

 What happens if we use a different `ACTIVATION`? Let's swap `relu` for `tanh`:

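The "most of the neurons have died" claim in the hunk above is easy to reproduce outside the tutorial. Here is a minimal standalone NumPy sketch (illustrative only; the layer sizes and the -5.0 bias shift are assumptions, not values from the tutorial's network): once the pre-activations `Wx + b` drift entirely negative, `relu` outputs zero for every unit, its gradient is zero, and the units can never recover; re-centering over the batch, which is the core of what BN does, brings most of them back.

```python
import numpy as np

relu = lambda z: np.maximum(z, 0.0)

rng = np.random.default_rng(0)
x = rng.normal(size=(64, 100))         # one batch of inputs
W = 0.1 * rng.normal(size=(100, 100))  # small random weights
b = -5.0 * np.ones(100)                # hypothetical large negative drift

z = x @ W + b                          # pre-activations, almost all < 0
print((relu(z) == 0).mean())           # ~1.0: every relu output is dead

# Re-center and rescale over the batch (the essence of BN):
z_bn = (z - z.mean(axis=0)) / (z.std(axis=0) + 1e-3)
print((relu(z_bn) == 0).mean())        # ~0.5: half the units fire again
```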
@@ -283,8 +283,8 @@ ACTIVATION = tf.nn.tanh

 {% include tut-image.html image-name="5_13_06.gif" %}

-As you can see, without NB the values in every layer quickly saturate, all rushing off into the -1/1 saturation range;
-with NB, even if the previous layer becomes relatively saturated,
+As you can see, without BN the values in every layer quickly saturate, all rushing off into the -1/1 saturation range;
+with BN, even if the previous layer becomes relatively saturated,
 the values in the later layers are normalized back into the effective, non-saturated range for computation, which keeps the neural network alive.

 The plot of the tanh activation function is shown here:
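For reference, here is a minimal sketch of how BN slots into a layer like the ones this tutorial compares. This is a reconstruction under assumptions, not the file's actual code: the function name `add_layer`, the `use_bn` flag, the initializers, and `variance_epsilon=0.001` are all illustrative, and it uses TensorFlow 1.x-style ops (`tf.nn.moments`, `tf.nn.batch_normalization`) since the tutorial predates TF 2. The key point matches the diff text: the pre-activations are normalized over the batch before `ACTIVATION` is applied, so `tanh` keeps receiving inputs inside its non-saturated range.

```python
import tensorflow as tf  # TensorFlow 1.x assumed

ACTIVATION = tf.nn.tanh  # or tf.nn.relu, as in the hunks above

def add_layer(inputs, in_size, out_size, use_bn=True):
    # Plain fully connected layer: Wx + b
    W = tf.Variable(tf.random_normal([in_size, out_size]))
    b = tf.Variable(tf.zeros([1, out_size]) + 0.1)
    wx_plus_b = tf.matmul(inputs, W) + b

    if use_bn:
        # Normalize the pre-activations over the batch dimension, then
        # rescale with learnable gamma/beta so the layer can undo the
        # normalization if that turns out to help training.
        mean, var = tf.nn.moments(wx_plus_b, axes=[0])
        gamma = tf.Variable(tf.ones([out_size]))
        beta = tf.Variable(tf.zeros([out_size]))
        wx_plus_b = tf.nn.batch_normalization(
            wx_plus_b, mean, var, beta, gamma, variance_epsilon=0.001)

    # With use_bn=True the activation sees roughly zero-mean, unit-variance
    # inputs, i.e. the non-saturated region of tanh.
    return ACTIVATION(wx_plus_b)
```

Note that this sketch normalizes with the current batch's statistics only; production code (e.g. `tf.keras.layers.BatchNormalization`) additionally tracks moving averages so inference does not depend on batch composition.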