Update 4.5Optimization.md
KuangYu authored Nov 7, 2023
1 parent 4f52538 commit de7f928
Showing 1 changed file with 1 addition and 1 deletion.
docs/user_guide/4.5Optimization.md (2 changes: 1 addition & 1 deletion)
@@ -2,7 +2,7 @@

## 1. Theory

-Automatic differentiation is fundamental to DMFF and aids in neural network optimization. During training, it computes the derivatives from output to input using backpropagation, optimizing parameters through gradient descent. With its efficiency in optimizing high-dimensional parameters, this technique isn't limited to neural networks but suits any framework following the "input parameters → model computation → output" sequence, such as molecular dynamics (MD) simulations. Hence, using automatic differentiation and referencing experimental or ab initio data, we can optimize force field parameters by computing the output's derivative with respect to input parameters.
+Automatic differentiation is a crucial component of DMFF and plays a significant role in optimizing neural networks. This technique computes the derivatives of the output with respect to the input using backpropagation, so parameter optimization can be conducted using gradient descent algorithms. With its efficiency in optimizing high-dimensional parameters, this technique is not limited to training neural networks but is also suitable for any physical model optimization (i.e., the molecular force field in the case of DMFF). A typical optimization recipe needs two key ingredients: 1. gradient evaluation, which can be done easily using JAX; and 2. an optimizer that takes the gradient as input and updates parameters following a certain optimization algorithm. To help users build optimization workflows, DMFF provides a wrapper API for optimizers implemented in [Optax](https://github.com/google-deepmind/optax), which is introduced here.

## 2. Function module
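
The added paragraph describes a two-ingredient recipe: JAX evaluates the gradient, and an Optax optimizer consumes it to update the parameters. Below is a minimal sketch of that loop using JAX and Optax directly rather than the DMFF wrapper API; the quadratic loss and its `target` values are hypothetical placeholders standing in for a real force-field objective fitted against reference data.

```python
# Minimal sketch of the gradient-descent recipe: JAX supplies the gradient
# (ingredient 1), an Optax optimizer turns it into a parameter update
# (ingredient 2).
import jax
import jax.numpy as jnp
import optax

# Toy objective standing in for the mismatch between model output and
# experimental / ab initio reference data (hypothetical target values).
target = jnp.array([1.0, -2.0, 0.5])

def loss(params):
    return jnp.sum((params - target) ** 2)

params = jnp.zeros(3)
optimizer = optax.adam(learning_rate=0.1)  # any Optax optimizer works here
opt_state = optimizer.init(params)

for step in range(200):
    grads = jax.grad(loss)(params)                           # gradient via JAX
    updates, opt_state = optimizer.update(grads, opt_state)  # optimizer step
    params = optax.apply_updates(params, updates)

print(params)  # converges toward `target`
```

In a real DMFF workflow the loss would wrap a force-field energy or force evaluation, and `jax.value_and_grad` would typically replace the separate `jax.grad` call so the loss value can be monitored during training.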

