<div align="center">

# Grad-DFT: a software library for machine learning density functional theory

[![arXiv](http://img.shields.io/badge/arXiv-2101.10279-B31B1B.svg "Grad-DFT")](https://arxiv.org/abs/2101.10279)

</div>

Grad-DFT is a JAX-based library enabling the differentiable design of, and experimentation with, exchange-correlation functionals using machine learning techniques. The library supports a parametrization of exchange-correlation functionals based on energy densities and associated coefficient functions, with the latter typically constructed using neural networks:

$$
E_{xc}[\rho] = \int d\mathbf{r}\, \mathbf{c}_{\theta}[\rho](\mathbf{r}) \cdot \mathbf{e}[\rho](\mathbf{r}).
$$
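
For instance, the exchange-only LDA functional constructed in the usage example below corresponds to a constant coefficient function paired with the spin-polarized LDA exchange energy density:

$$
\mathbf{c}_{\theta}[\rho](\mathbf{r}) = 1, \qquad \mathbf{e}[\rho](\mathbf{r}) = -\frac{3}{2}\left(\frac{3}{4\pi}\right)^{1/3}\sum_{\sigma}\rho_{\sigma}^{4/3}(\mathbf{r}).
$$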

Grad-DFT provides significant functionality, including a fully differentiable and just-in-time compilable self-consistent-field loop, direct optimization of the molecular orbitals, and implementations of many of the known constraints of the exact functional in the form of loss functionals.

## Usage example

### Creating a molecule

The first step is to create a `Molecule` object.

```python
from grad_dft.interface import molecule_from_pyscf
from pyscf import gto, dft

# Define a PySCF mol object for the H2 molecule
mol = gto.M(atom = [['H', (0, 0, 0)], ['H', (0.74, 0, 0)]], basis = 'def2-tzvp', spin = 0)
# Create a PySCF mean-field object
mf = dft.UKS(mol)
mf.kernel()
# Create a Molecule from the mean-field object
molecule = molecule_from_pyscf(mf)
```
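
The resulting `Molecule` stores the converged electronic-structure data, together with auxiliary methods to compute quantities such as the electronic density and its derivatives. As a quick sketch, one can evaluate the density on the integration grid (the exact array layout is implementation-defined):

```python
# Evaluate the spin-resolved electronic density on the integration grid.
rho = molecule.density()
print(rho.shape)
```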

### Creating a simple functional

Then we can create a `Functional`.

```python
from jax import numpy as jnp
from grad_dft.functional import Functional

def energy_densities(molecule):
    # Spin-polarized, exchange-only LDA energy density.
    rho = molecule.density()
    lda_e = -3/2 * (3/(4*jnp.pi))**(1/3) * (rho**(4/3)).sum(axis=0, keepdims=True)
    return lda_e

# A trivial coefficient function: a single constant weight.
coefficients = lambda self, rho: jnp.array([[1.]])

LDA = Functional(coefficients, energy_densities)
```

We can use the functional to compute the predicted energy, where `params` stands for the parameters $\theta$ in the equation above.

```python
from flax.core import freeze

params = freeze({'params': {}})
predicted_energy = LDA.energy(params, molecule)
```
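
As a rough sanity check, this prediction can be compared against PySCF's self-consistent total energy. Note this is a sketch only: the functional above is exchange-only LDA evaluated on the converged PySCF density, so the two numbers are not expected to agree exactly.

```python
# Exchange-only LDA energy predicted by Grad-DFT on the PySCF density.
print(f"Grad-DFT energy: {float(predicted_energy):.6f} Ha")
# PySCF's own self-consistent total energy (default UKS functional).
print(f"PySCF energy:    {mf.e_tot:.6f} Ha")
```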

### A more complex neural functional

A more complex neural functional can be created as follows.

```python
from jax.nn import gelu
from flax import linen as nn
from grad_dft.functional import NeuralFunctional

def coefficient_inputs(molecule):
    # Clip densities away from zero for numerical stability.
    rho = jnp.clip(molecule.density(), a_min=1e-27)
    kinetic = jnp.clip(molecule.kinetic_density(), a_min=1e-27)
    return jnp.concatenate((rho, kinetic))

def coefficients(self, rhoinputs):
    # A small neural network producing the coefficient functions.
    x = nn.Dense(features=1)(rhoinputs)
    x = nn.LayerNorm()(x)
    return gelu(x)

# Reuses the energy_densities function defined above.
neuralfunctional = NeuralFunctional(coefficients, energy_densities, coefficient_inputs)
```

with the corresponding energy calculation

```python
from jax.random import PRNGKey

key = PRNGKey(42)
cinputs = coefficient_inputs(molecule)
params = neuralfunctional.init(key, cinputs)

predicted_energy = neuralfunctional.energy(params, molecule)
```
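
Because the entire pipeline is differentiable, gradients of a loss with respect to the functional parameters follow directly from `jax.grad`, which is the basis for training neural functionals. Below is a minimal training-step sketch; the squared-error loss and the illustrative H2 target energy are our assumptions, not part of the library API.

```python
import jax

def loss(params, molecule, target_energy):
    # Squared error between the predicted and a reference total energy.
    predicted = neuralfunctional.energy(params, molecule)
    return (predicted - target_energy) ** 2

# -1.17 Ha is an illustrative target for H2; substitute reference data.
grads = jax.grad(loss)(params, molecule, -1.17)
```

The resulting gradient pytree can then be fed to any standard JAX optimizer to update `params`.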

## Install

To install the additional dependencies used by the examples, run `pip install -e ".[examples]"`.

## BibTeX

```
@article{graddft,
title={Grad-DFT: a software library for machine learning density functional theory},
  author={Casares, Pablo Antonio Moreno and Baker, Jack and Medvidovi{\'c}, Matija and Dos Reis, Roberto and Arrazola, Juan Miguel},
journal={arXiv preprint [number]},
year={2023}
}
```
