

DP-DISTILL: fine-tuning and distillation kit for pre-trained Deep Potential models

DP-distill automates the fine-tuning and distillation of the highly transferable, but computationally expensive, DPA-2 pre-trained models, enabling practical atomistic simulation. The project is currently built upon the DPGEN2 workflow.

Table of Contents

1. Overview
2. Installation
3. Quick start
4. User guide

1. Overview

Inspired by the DPGEN concurrent-learning scheme, DP-distill provides an automated workflow for efficient model fine-tuning and distillation, aimed at practical application of the DPA-2 pre-trained model in atomistic simulation. Fig. 1 shows the basic fine-tuning workflow. Given the initial structures of the training systems, the workflow generates randomly perturbed structures and executes a series of short ab initio molecular dynamics (AIMD) simulations on them. The pre-trained model is first fine-tuned on this AIMD dataset; DeePMD simulations with the fine-tuned model then explore new configurations, which are labeled by first-principles codes such as ABACUS. If the fine-tuned model cannot reproduce the labeled data with sufficient accuracy, the newly collected data are added to the fine-tuning set, and the train-explore-label cycle repeats until convergence is achieved.
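
The iterative scheme can be summarized with a short pseudocode sketch. The callables below (perturb, run_aimd, finetune, explore, label, error) are hypothetical placeholders standing in for the corresponding DPGEN2/DP-distill operators, not the actual API:

# Minimal sketch of the concurrent-learning fine-tuning loop described above.
# All callables are hypothetical placeholders supplied by the caller; they do
# not correspond to the real DP-distill/DPGEN2 interfaces.
def finetune_loop(initial_structures, pretrained_model,
                  perturb, run_aimd, finetune, explore, label, error,
                  tolerance=0.05, max_iter=10):
    # Perturb the input structures and label them with short AIMD runs.
    perturbed = perturb(initial_structures)
    dataset = run_aimd(perturbed)
    model = finetune(pretrained_model, dataset)

    for _ in range(max_iter):
        # Explore new configurations with the fine-tuned model (DeePMD MD),
        # then label them with a first-principles code such as ABACUS.
        candidates = explore(model, perturbed)
        labeled = label(candidates)

        # Converged once the model reproduces the labeled data well enough.
        if error(model, labeled) <= tolerance:
            break

        # Otherwise grow the training set and fine-tune again.
        dataset = dataset + labeled
        model = finetune(model, dataset)

    return model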

A lightweight DeePMD model can also be generated from a pre-trained or fine-tuned DPA-2 model through distillation, which enables much faster simulation of the given systems. Provided that a well-converged DPA-2 platform model is available, the distilled model can be produced with far fewer GPU resources and negligible CPU cost compared with the standard DP-GEN pipeline, which trains multiple randomly initialized models simultaneously. Figure 2 shows a schematic of the distillation workflow.
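
In general terms, distillation amounts to training a fresh lightweight model on data labeled by the teacher model itself. The sketch below illustrates that idea only; the callables are hypothetical placeholders and the actual workflow (Figure 2) is defined by DP-distill:

# Illustrative sketch of distilling a fine-tuned DPA-2 "teacher" into a
# lightweight DeePMD "student". Placeholder callables, not the DP-distill API.
def distill(teacher_model, seed_structures,
            explore, label_with_teacher, train_student, error,
            tolerance=0.05, max_iter=10):
    # Explore configurations and label them with the teacher's own
    # energy/force predictions, so no additional DFT labeling is needed.
    configs = explore(teacher_model, seed_structures)
    dataset = label_with_teacher(teacher_model, configs)

    # Train a single lightweight student instead of the several randomly
    # initialized models a standard DP-GEN run trains in parallel.
    student = train_student(dataset)

    for _ in range(max_iter):
        configs = explore(student, seed_structures)
        labeled = label_with_teacher(teacher_model, configs)
        if error(student, labeled) <= tolerance:
            break
        dataset = dataset + labeled
        student = train_student(dataset)

    return student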

2. Installation

DP-distill can be built and installed from source:

git clone https://github.com/ruoyuwang1995nya/dp-distill.git
cd dp-distill
pip install .

3. Quick start

DP-distill is accessed through a command-line interface. For instance, a fine-tuning workflow can be submitted with the following command:

dp-dist submit finetune.json -t finetune

The finetune.json file specifies the input parameters of the fine-tuning task; its details can be found in the examples directory of this repository.
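
As a purely hypothetical illustration of how such an input file might be assembled, the keys below are placeholders rather than the real schema; the authoritative inputs are those in the examples directory:

import json

# Hypothetical illustration only: these keys sketch the general shape of a
# workflow input file and are NOT the real DP-distill schema.
params = {
    "task": "finetune",                # placeholder: workflow type
    "pretrained_model": "dpa2.pt",     # placeholder: path to the DPA-2 model
    "init_structures": ["./confs/"],   # placeholder: initial training systems
    "dft": {"software": "abacus"},     # placeholder: labeling settings
}

with open("finetune.json", "w") as f:
    json.dump(params, f, indent=2)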

4. User guide

Examples of JSON input files for model fine-tuning and distillation can be found in the examples directory.