Modify COPY-FROM No.15 regularizer (#5961)
Ainavo authored Jul 5, 2023
1 parent 5f79237 commit 299f4f5
Showing 2 changed files with 4 additions and 72 deletions.
38 changes: 2 additions & 36 deletions docs/api/paddle/regularizer/L1Decay_cn.rst
@@ -28,44 +28,10 @@ L1Decay implements L1 weight-decay regularization, used in model training to make the weight matr…
Code Example 1
::::::::::::
.. code-block:: python

    # Example1: set Regularizer in optimizer
    import paddle
    from paddle.regularizer import L1Decay
    import numpy as np
    linear = paddle.nn.Linear(10, 10)
    inp = paddle.rand(shape=[10, 10], dtype="float32")
    out = linear(inp)
    loss = paddle.mean(out)
    beta1 = paddle.to_tensor([0.9], dtype="float32")
    beta2 = paddle.to_tensor([0.99], dtype="float32")
    momentum = paddle.optimizer.Momentum(
        learning_rate=0.1,
        parameters=linear.parameters(),
        weight_decay=L1Decay(0.0001))
    back = out.backward()
    momentum.step()
    momentum.clear_grad()
COPY-FROM: paddle.regularizer.L1Decay:code-example1
Code Example 2
::::::::::::
.. code-block:: python

    # Example2: set Regularizer in parameters
    # Set L1 regularization in parameters.
    # Global regularizer does not take effect on my_conv2d for this case.
    from paddle.nn import Conv2D
    from paddle import ParamAttr
    from paddle.regularizer import L2Decay
    my_conv2d = Conv2D(
        in_channels=10,
        out_channels=10,
        kernel_size=1,
        stride=1,
        padding=0,
        weight_attr=ParamAttr(regularizer=L2Decay(coeff=0.01)),
        bias_attr=False)
COPY-FROM: paddle.regularizer.L1Decay:code-example2
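
The removed Example 2 relies on a behaviour worth spelling out: a regularizer attached to a parameter through ParamAttr takes precedence over the optimizer-level weight_decay for that parameter. Below is a minimal illustrative sketch of that interaction, assuming a Paddle 2.x install; it is not part of the changed files in this commit.

.. code-block:: python

    # Illustrative sketch: per-parameter regularizer vs. optimizer weight_decay.
    import paddle
    from paddle import ParamAttr
    from paddle.regularizer import L1Decay, L2Decay

    # The weight of this layer carries its own regularizer ...
    linear = paddle.nn.Linear(
        10, 10,
        weight_attr=ParamAttr(regularizer=L2Decay(coeff=0.01)))

    # ... so the optimizer-level L1Decay below is not applied to that weight,
    # only to parameters that have no regularizer of their own.
    momentum = paddle.optimizer.Momentum(
        learning_rate=0.1,
        parameters=linear.parameters(),
        weight_decay=L1Decay(coeff=0.0001))

    loss = paddle.mean(linear(paddle.rand([10, 10], dtype="float32")))
    loss.backward()
    momentum.step()
    momentum.clear_grad()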
38 changes: 2 additions & 36 deletions docs/api/paddle/regularizer/L2Decay_cn.rst
@@ -28,44 +28,10 @@ L2Decay implements L2 weight-decay regularization, used in model training to help prevent…
Code Example 1
::::::::::::

.. code-block:: python

    # Example1: set Regularizer in optimizer
    import paddle
    from paddle.regularizer import L2Decay
    import numpy as np
    linear = paddle.nn.Linear(10, 10)
    inp = paddle.rand(shape=[10, 10], dtype="float32")
    out = linear(inp)
    loss = paddle.mean(out)
    beta1 = paddle.to_tensor([0.9], dtype="float32")
    beta2 = paddle.to_tensor([0.99], dtype="float32")
    momentum = paddle.optimizer.Momentum(
        learning_rate=0.1,
        parameters=linear.parameters(),
        weight_decay=L2Decay(0.0001))
    back = out.backward()
    momentum.step()
    momentum.clear_grad()
COPY-FROM: paddle.regularizer.L2Decay:code-example1


Code Example 2
::::::::::::

.. code-block:: python

    # Example2: set Regularizer in parameters
    # Set L2 regularization in parameters.
    # Global regularizer does not take effect on my_conv2d for this case.
    from paddle.nn import Conv2D
    from paddle import ParamAttr
    from paddle.regularizer import L2Decay
    my_conv2d = Conv2D(
        in_channels=10,
        out_channels=10,
        kernel_size=1,
        stride=1,
        padding=0,
        weight_attr=ParamAttr(regularizer=L2Decay(coeff=0.01)),
        bias_attr=False)
COPY-FROM: paddle.regularizer.L2Decay:code-example2
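
One related detail the removed L2Decay examples do not show: Paddle optimizers also accept a plain float for weight_decay, which is treated as an L2 regularization coefficient. The short sketch below illustrates that equivalence; it is illustrative only, not part of the changed files, and assumes a Paddle 2.x install.

.. code-block:: python

    # Illustrative sketch: a float weight_decay acts as an L2 coefficient.
    import paddle
    from paddle.regularizer import L2Decay

    linear = paddle.nn.Linear(10, 10)

    # These two optimizers apply the same L2 penalty to the parameters.
    opt_a = paddle.optimizer.Momentum(
        learning_rate=0.1,
        parameters=linear.parameters(),
        weight_decay=L2Decay(coeff=0.0001))

    opt_b = paddle.optimizer.Momentum(
        learning_rate=0.1,
        parameters=linear.parameters(),
        weight_decay=0.0001)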
