Merge branch 'devel' into refactor_property
Chengqian-Zhang authored Dec 23, 2024
2 parents 2dc6486 + cfe17a3 commit 4ff2a23
Showing 12 changed files with 46 additions and 26 deletions.
32 changes: 16 additions & 16 deletions CITATIONS.bib
@@ -128,26 +128,26 @@ @article{Zhang_NpjComputMater_2024_v10_p94
doi = {10.1038/s41524-024-01278-7},
}

-@misc{Zhang_2023_DPA2,
+@article{Zhang_npjComputMater_2024_v10_p293,
annote = {DPA-2},
author = {
Duo Zhang and Xinzijian Liu and Xiangyu Zhang and Chengqian Zhang and Chun
-Cai and Hangrui Bi and Yiming Du and Xuejian Qin and Jiameng Huang and
-Bowen Li and Yifan Shan and Jinzhe Zeng and Yuzhi Zhang and Siyuan Liu and
-Yifan Li and Junhan Chang and Xinyan Wang and Shuo Zhou and Jianchuan Liu
-and Xiaoshan Luo and Zhenyu Wang and Wanrun Jiang and Jing Wu and Yudi Yang
-and Jiyuan Yang and Manyi Yang and Fu-Qiang Gong and Linshuang Zhang and
-Mengchao Shi and Fu-Zhi Dai and Darrin M. York and Shi Liu and Tong Zhu and
-Zhicheng Zhong and Jian Lv and Jun Cheng and Weile Jia and Mohan Chen and
-Guolin Ke and Weinan E and Linfeng Zhang and Han Wang
+Cai and Hangrui Bi and Yiming Du and Xuejian Qin and Anyang Peng and
+Jiameng Huang and Bowen Li and Yifan Shan and Jinzhe Zeng and Yuzhi Zhang
+and Siyuan Liu and Yifan Li and Junhan Chang and Xinyan Wang and Shuo Zhou
+and Jianchuan Liu and Xiaoshan Luo and Zhenyu Wang and Wanrun Jiang and
+Jing Wu and Yudi Yang and Jiyuan Yang and Manyi Yang and Fu-Qiang Gong and
+Linshuang Zhang and Mengchao Shi and Fu-Zhi Dai and Darrin M. York and Shi
+Liu and Tong Zhu and Zhicheng Zhong and Jian Lv and Jun Cheng and Weile Jia
+and Mohan Chen and Guolin Ke and Weinan E and Linfeng Zhang and Han Wang
},
-title = {
-{DPA-2: Towards a universal large atomic model for molecular and material
-simulation}
-},
-publisher = {arXiv},
-year = 2023,
-doi = {10.48550/arXiv.2312.15492},
+title = {{DPA-2: a large atomic model as a multi-task learner}},
+journal = {npj Comput. Mater},
+year = 2024,
+volume = 10,
+number = 1,
+pages = 293,
+doi = {10.1038/s41524-024-01493-2},
}

@article{Zhang_PhysPlasmas_2020_v27_p122704,
7 changes: 6 additions & 1 deletion deepmd/dpmodel/descriptor/dpa2.py
@@ -387,7 +387,7 @@ def __init__(
use_tebd_bias: bool = False,
type_map: Optional[list[str]] = None,
) -> None:
r"""The DPA-2 descriptor. see https://arxiv.org/abs/2312.15492.
r"""The DPA-2 descriptor[1]_.
Parameters
----------
@@ -434,6 +434,11 @@ def __init__(
sw: torch.Tensor
The switch function for decaying inverse distance.
+References
+----------
+.. [1] Zhang, D., Liu, X., Zhang, X. et al. DPA-2: a
+large atomic model as a multi-task learner. npj
+Comput Mater 10, 293 (2024). https://doi.org/10.1038/s41524-024-01493-2
"""

def init_subclass_params(sub_data, sub_class):
7 changes: 6 additions & 1 deletion deepmd/pt/model/descriptor/dpa2.py
@@ -100,7 +100,7 @@ def __init__(
use_tebd_bias: bool = False,
type_map: Optional[list[str]] = None,
) -> None:
r"""The DPA-2 descriptor. see https://arxiv.org/abs/2312.15492.
r"""The DPA-2 descriptor[1]_.
Parameters
----------
@@ -147,6 +147,11 @@ def __init__(
sw: torch.Tensor
The switch function for decaying inverse distance.
+References
+----------
+.. [1] Zhang, D., Liu, X., Zhang, X. et al. DPA-2: a
+large atomic model as a multi-task learner. npj
+Comput Mater 10, 293 (2024). https://doi.org/10.1038/s41524-024-01493-2
"""
super().__init__()

2 changes: 1 addition & 1 deletion doc/credits.rst
@@ -54,7 +54,7 @@ Cite DeePMD-kit and methods
.. bibliography::
:filter: False

-Zhang_2023_DPA2
+Zhang_npjComputMater_2024_v10_p293

- If frame-specific parameters (`fparam`, e.g. electronic temperature) is used,

2 changes: 1 addition & 1 deletion doc/development/create-a-model-pt.md
@@ -180,7 +180,7 @@ The arguments here should be consistent with the class arguments of your new com
## Package new codes

You may package new codes into a new Python package if you don't want to contribute it to the main DeePMD-kit repository.
-A good example is [DeePMD-GNN](https://github.com/njzjz/deepmd-gnn).
+A good example is [DeePMD-GNN](https://gitlab.com/RutgersLBSR/deepmd-gnn).
It's crucial to add your new component to `project.entry-points."deepmd.pt"` in `pyproject.toml`:

```toml
# (snippet truncated in this diff view)
```
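For orientation (the repository's own TOML snippet is truncated in this diff view), here is a minimal, hypothetical sketch of how components registered under the `deepmd.pt` entry-point group can be discovered with the Python standard library. The package and entry names (`my_deepmd_plugin`, `my_model`) are made up, and `entry_points(group=...)` assumes Python 3.10+; this illustrates the generic entry-point mechanism, not DeePMD-kit's exact loading code.

```python
from importlib.metadata import entry_points

# A plugin package would declare, in its own pyproject.toml (hypothetical names):
#   [project.entry-points."deepmd.pt"]
#   my_model = "my_deepmd_plugin.model:MyModel"
#
# Any consumer can then enumerate and import everything registered in the group:
for ep in entry_points(group="deepmd.pt"):
    component = ep.load()  # imports my_deepmd_plugin.model and returns MyModel
    print(f"found plugin component {ep.name!r}: {component!r}")
```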
2 changes: 1 addition & 1 deletion doc/install/install-from-c-library.md
@@ -1,4 +1,4 @@
-# Install from pre-compiled C library {{ tensorflow_icon }}, JAX {{ jax_icon }}
+# Install from pre-compiled C library {{ tensorflow_icon }} {{ jax_icon }}

:::{note}
**Supported backends**: TensorFlow {{ tensorflow_icon }}, JAX {{ jax_icon }}
2 changes: 1 addition & 1 deletion doc/model/dpa2.md
@@ -4,7 +4,7 @@
**Supported backends**: PyTorch {{ pytorch_icon }}, JAX {{ jax_icon }}, DP {{ dpmodel_icon }}
:::

-The DPA-2 model implementation. See https://arxiv.org/abs/2312.15492 for more details.
+The DPA-2 model implementation. See https://doi.org/10.1038/s41524-024-01493-2 for more details.

Training example: `examples/water/dpa2/input_torch_medium.json`, see [README](../../examples/water/dpa2/README.md) for inputs in different levels.

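As a small aside on the training example referenced above: a hedged sketch of peeking at that input file from Python, assuming it is read from a local checkout of the repository (the top-level section names in the comment are typical of DeePMD-kit inputs, not verified against this particular file).

```python
import json

# Path quoted from the documentation above; adjust to your checkout location.
with open("examples/water/dpa2/input_torch_medium.json") as f:
    config = json.load(f)

# DeePMD-kit training inputs are usually organized into sections such as
# "model", "learning_rate", "loss", and "training".
print(sorted(config))
```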
2 changes: 1 addition & 1 deletion doc/third-party/out-of-deepmd-kit.md
@@ -6,7 +6,7 @@ The codes of the following interfaces are not a part of the DeePMD-kit package a

### External GNN models (MACE/NequIP)

-[DeePMD-GNN](https://github.com/njzjz/deepmd-gnn) is DeePMD-kit plugin for various graph neural network (GNN) models.
+[DeePMD-GNN](https://gitlab.com/RutgersLBSR/deepmd-gnn) is DeePMD-kit plugin for various graph neural network (GNN) models.
It has interfaced with [MACE](https://github.com/ACEsuit/mace) (PyTorch version) and [NequIP](https://github.com/mir-group/nequip) (PyTorch version).
It is also the first example to the DeePMD-kit [plugin mechanism](../development/create-a-model-pt.md#package-new-codes).

2 changes: 1 addition & 1 deletion doc/train/finetuning.md
@@ -94,7 +94,7 @@ The model section will be overwritten (except the `type_map` subsection) by that

#### Fine-tuning from a multi-task pre-trained model

-Additionally, within the PyTorch implementation and leveraging the flexibility offered by the framework and the multi-task training process proposed in DPA2 [paper](https://arxiv.org/abs/2312.15492),
+Additionally, within the PyTorch implementation and leveraging the flexibility offered by the framework and the multi-task training process proposed in DPA2 [paper](https://doi.org/10.1038/s41524-024-01493-2),
we also support more general multitask pre-trained models, which includes multiple datasets for pre-training. These pre-training datasets share a common descriptor while maintaining their individual fitting nets,
as detailed in the paper above.

2 changes: 1 addition & 1 deletion doc/train/multi-task-training.md
@@ -26,7 +26,7 @@ and the Adam optimizer is executed to minimize $L^{(t)}$ for one step to update
In the case of multi-GPU parallel training, different GPUs will independently select their tasks.
In the DPA-2 model, this multi-task training framework is adopted.[^1]

-[^1]: Duo Zhang, Xinzijian Liu, Xiangyu Zhang, Chengqian Zhang, Chun Cai, Hangrui Bi, Yiming Du, Xuejian Qin, Jiameng Huang, Bowen Li, Yifan Shan, Jinzhe Zeng, Yuzhi Zhang, Siyuan Liu, Yifan Li, Junhan Chang, Xinyan Wang, Shuo Zhou, Jianchuan Liu, Xiaoshan Luo, Zhenyu Wang, Wanrun Jiang, Jing Wu, Yudi Yang, Jiyuan Yang, Manyi Yang, Fu-Qiang Gong, Linshuang Zhang, Mengchao Shi, Fu-Zhi Dai, Darrin M. York, Shi Liu, Tong Zhu, Zhicheng Zhong, Jian Lv, Jun Cheng, Weile Jia, Mohan Chen, Guolin Ke, Weinan E, Linfeng Zhang, Han Wang, [arXiv preprint arXiv:2312.15492 (2023)](https://arxiv.org/abs/2312.15492) licensed under a [Creative Commons Attribution (CC BY) license](http://creativecommons.org/licenses/by/4.0/).
+[^1]: Duo Zhang, Xinzijian Liu, Xiangyu Zhang, Chengqian Zhang, Chun Cai, Hangrui Bi, Yiming Du, Xuejian Qin, Anyang Peng, Jiameng Huang, Bowen Li, Yifan Shan, Jinzhe Zeng, Yuzhi Zhang, Siyuan Liu, Yifan Li, Junhan Chang, Xinyan Wang, Shuo Zhou, Jianchuan Liu, Xiaoshan Luo, Zhenyu Wang, Wanrun Jiang, Jing Wu, Yudi Yang, Jiyuan Yang, Manyi Yang, Fu-Qiang Gong, Linshuang Zhang, Mengchao Shi, Fu-Zhi Dai, Darrin M. York, Shi Liu, Tong Zhu, Zhicheng Zhong, Jian Lv, Jun Cheng, Weile Jia, Mohan Chen, Guolin Ke, Weinan E, Linfeng Zhang, Han Wang, DPA-2: a large atomic model as a multi-task learner. npj Comput Mater 10, 293 (2024). [DOI: 10.1038/s41524-024-01493-2](https://doi.org/10.1038/s41524-024-01493-2) licensed under a [Creative Commons Attribution (CC BY) license](http://creativecommons.org/licenses/by/4.0/).

Compared with the previous TensorFlow implementation, the new support in PyTorch is more flexible and efficient.
In particular, it makes multi-GPU parallel training and even tasks beyond DFT possible,
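To make the multi-task scheme quoted above a bit more concrete, a schematic form of the objective (hedged: illustrative notation, not the exact expression from the DPA-2 paper): with $n_t$ tasks, shared descriptor parameters $\theta_{\mathrm{d}}$, task-specific fitting-net parameters $\theta_{\mathrm{f}}^{(t)}$, and task-selection probabilities $p_t$ with $\sum_{t=1}^{n_t} p_t = 1$,

$$
L = \sum_{t=1}^{n_t} p_t \, L^{(t)}\!\left(\theta_{\mathrm{d}}, \theta_{\mathrm{f}}^{(t)}\right),
$$

so that at each training step a single task $t$ is drawn with probability $p_t$ and one Adam update is applied to $\theta_{\mathrm{d}}$ and $\theta_{\mathrm{f}}^{(t)}$ using that task's data.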
8 changes: 7 additions & 1 deletion source/api_cc/src/common.cc
@@ -390,7 +390,13 @@ static inline void _load_library_path(std::string dso_path) {
if (!dso_handle) {
throw deepmd::deepmd_exception(
dso_path +
" is not found! You can add the library directory to LD_LIBRARY_PATH");
" is not found or fails to load! You can add the library directory to "
"LD_LIBRARY_PATH."
#ifndef _WIN32
" Error message: " +
std::string(dlerror())
#endif
);
}
}

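The change above appends the `dlerror()` text so that a library which exists but fails to load (for example, because of an unresolved dependency) is reported differently from one that is simply missing. A hedged way to reproduce the same diagnostic from Python, outside of DeePMD-kit (the library path is hypothetical; on Linux, `ctypes` surfaces the `dlerror()` message in the raised `OSError`):

```python
import ctypes

try:
    # Hypothetical path; substitute the shared library named in the error.
    ctypes.CDLL("/path/to/libdeepmd_op.so")
except OSError as err:
    # On Linux this message is the dlerror() text, e.g. a missing dependency.
    print(f"load failed: {err}")
```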
4 changes: 4 additions & 0 deletions source/lib/src/gpu/cudart/cudart_stub.cc
@@ -25,6 +25,10 @@ void *DP_cudart_dlopen(char *libname) {
#endif
if (!dso_handle) {
std::cerr << "DeePMD-kit: Cannot find " << libname << std::endl;
+#ifndef _WIN32
+std::cerr << "DeePMD-kit: Error message: " << std::string(dlerror())
+<< std::endl;
+#endif
return nullptr;
}
std::cerr << "DeePMD-kit: Successfully load " << libname << std::endl;
