Welcome to DeepStruc, a Deep Generative Model (DGM) that learns the relationship between a PDF and its atomic structure and thereby solves a structure directly from a PDF!
- DeepStruc
- DeepStruc App
- Getting started (with Colab)
- Getting started (own computer)
- Author
- Cite
- Acknowledgments
- License
We here apply DeepStruc to the structural analysis of a model system of mono-metallic nanoparticles (MMNPs) with seven different structure types and demonstrate the method on both simulated and experimental PDFs. DeepStruc can reconstruct simulated data with an average mean absolute error (MAE) of the atomic xyz-coordinates of 0.093 ± 0.058 Å after fitting a contraction/extraction factor, an atomic displacement parameter (ADP) and a scale parameter. We demonstrate the generative capability of DeepStruc on a dataset of face-centered cubic (fcc), hexagonal close-packed (hcp) and stacking-faulted structures, where DeepStruc recognizes the stacking-faulted structures as an interpolation between fcc and hcp and constructs new structural models based on a PDF. The MAE in this example is 0.030 ± 0.019 Å.
The MMNPs are provided as a graph-based input to the encoder of DeepStruc. We compare DeepStruc with a similar DGM without the graph-based encoder. DeepStruc is able to reconstruct the structures using a latent space of smaller dimension and thus has better generative capability. We also compare DeepStruc with a brute-force modelling approach and a tree-based classification algorithm. The ML models are significantly faster than the brute-force approach, and DeepStruc furthermore creates a latent space from which synthetic structures can be sampled, which the tree-based method cannot! The baseline models can be found in other repositories: brute-force, MetalFinder and CVAE.
Using DeepStruc on your own PDFs is straightforward and does not require anything to be installed or downloaded to your computer. Follow the instructions on our App.
Follow the instructions in our Colab notebook and play around with it. The Colab offers a bit more flexibility than our App.
Follow these steps if you want to train DeepStruc and predict with DeepStruc locally on your own computer.
See the install folder.
See the data folder.
To train your own DeepStruc model simply run:
```
python train.py
```
A list of possible arguments is given in the table below; run with the '--help' argument for additional information. An example training command is shown after the table.
If you are interested in changing the architecture of the model, go to train.py and change the model_arch dictionary; a hypothetical sketch is shown below.
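As a rough illustration, a model_arch dictionary could look something like the sketch below. The keys and values here are hypothetical placeholders; the actual dictionary and its defaults are defined in train.py and may differ.

```python
# Hypothetical sketch of a model_arch dictionary -- the real keys and
# default values are defined in train.py and may differ from this example.
model_arch = {
    "encoder_hidden_dims": [256, 128, 64],  # hidden-layer sizes of the graph encoder
    "decoder_hidden_dims": [64, 128, 256],  # hidden-layer sizes of the decoder
    "activation": "relu",                   # activation used between layers
}
```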
| Arg | Description | Example |
| --- | --- | --- |
| `-h` or `--help` | Prints help message. | |
| `-d` or `--data_dir` | Directory containing graph training, validation and test data. *str* | `-d ./data/graphs` |
| `-s` or `--save_dir` | Directory where models will be saved. This is also used for loading a learner. *str* | `-s bst_model` |
| `-r` or `--resume_model` | If 'True', the save_dir model is loaded and training is continued. *bool* | `-r True` |
| `-e` or `--epochs` | Maximum number of epochs. *int* | `-e 100` |
| `-b` or `--batch_size` | Number of graphs in each batch. *int* | `-b 20` |
| `-l` or `--learning_rate` | Learning rate. *float* | `-l 1e-4` |
| `-B` or `--beta` | Initial beta value for scaling the KLD. *float* | `-B 0.1` |
| `-i` or `--beta_increase` | Increment of beta when the threshold is met. *float* | `-i 0.1` |
| `-x` or `--beta_max` | Highest value beta can increase to. *float* | `-x 5` |
| `-t` or `--reconstruction_th` | Reconstruction threshold required before beta is increased. *float* | `-t 0.001` |
| `-n` or `--num_files` | Total number of files loaded. Files will be split 60/20/20. If 'None', all files are loaded. *int* | `-n 500` |
| `-c` or `--compute` | Train the model on CPU or GPU. Choices: 'cpu', 'gpu16', 'gpu32' and 'gpu64'. *str* | `-c gpu32` |
| `-L` or `--latent_dim` | Number of latent space dimensions. *int* | `-L 3` |
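For example, a hypothetical training run using a few of the arguments above could look like this (paths and values are placeholders):

```
python train.py -d ./data/graphs -s bst_model -e 100 -b 20 -l 1e-4 -L 3
```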
To predict an MMNP from a PDF using DeepStruc or your own model, run:
```
python predict.py
```
A list of possible arguments is given in the table below; run with the '--help' argument for additional information. An example prediction command is shown after the table.
| Arg | Description | Example |
| --- | --- | --- |
| `-h` or `--help` | Prints help message. | |
| `-d` or `--data` | Path to a data file or data directory. If pointing to a data directory, all datasets must have the same format. *str* | `-d data/experimental_PDFs/JQ_S1.gr` |
| `-m` or `--model` | Path to model. If 'None', a GUI will open. *str* | `-m ./models/DeepStruc` |
| `-n` or `--num_samples` | Number of samples/structures generated for each unique PDF. *int* | `-n 10` |
| `-s` or `--sigma` | Sample up to '-s' sigma in the normal distribution. *float* | `-s 7` |
| `-p` or `--plot_sampling` | Plots sampled structures on top of the DeepStruc training data. The model must be DeepStruc. *bool* | `-p True` |
| `-g` or `--save_path` | Path to the directory where predictions will be saved. *str* | `-g ./best_preds` |
| `-i` or `--index_plot` | Highlights a specific reconstruction in the latent space. '--data' must point to a specific file (not a directory) and '--plot_sampling' must be 'True'. *int* | `-i 4` |
| `-P` or `--plot_data` | If 'True', the first loaded PDF is plotted and shown after normalization. *bool* | `-P True` |
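For example, a hypothetical prediction run on one of the experimental PDFs from the example column above could look like this (paths and values are placeholders):

```
python predict.py -d data/experimental_PDFs/JQ_S1.gr -m ./models/DeepStruc -n 10 -p True
```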
Andy S. Anker¹
Emil T. S. Kjær¹
Marcus N. Weng¹
Simon J. L. Billinge²,³
Raghavendra Selvan⁴,⁵
Kirsten M. Ø. Jensen¹
¹ Department of Chemistry and Nano-Science Center, University of Copenhagen, 2100 Copenhagen Ø, Denmark.
² Department of Applied Physics and Applied Mathematics, Columbia University, New York, NY 10027, USA.
³ Condensed Matter Physics and Materials Science Department, Brookhaven National Laboratory, Upton, NY 11973, USA.
⁴ Department of Computer Science, University of Copenhagen, 2100 Copenhagen Ø, Denmark.
⁵ Department of Neuroscience, University of Copenhagen, 2200 Copenhagen N, Denmark.
Should there be any questions, desired improvements or bugs, please contact us on GitHub or by email: [email protected] or [email protected].
If you use our code or our results, please consider citing our papers. Thanks in advance!
```
@article{D2DD00086E,
  title={DeepStruc: Towards structure solution from pair distribution function data using deep generative models},
  author={Emil T. S. Kjær and Andy S. Anker and Marcus N. Weng and Simon J. L. Billinge and Raghavendra Selvan and Kirsten M. Ø. Jensen},
  journal={Digital Discovery},
  year={2023}}
```
```
@article{anker2020characterising,
  title={Characterising the atomic structure of mono-metallic nanoparticles from x-ray scattering data using conditional generative models},
  author={Anker, Andy Sode and Kjær, Emil TS and Dam, Erik B and Billinge, Simon JL and Jensen, Kirsten MØ and Selvan, Raghavendra},
  journal={16th international workshop on mining and learning with graphs under KDD2020 conference},
  year={2020}}
```
Our code is developed based on the following publication:
```
@article{anker2020characterising,
  title={Characterising the atomic structure of mono-metallic nanoparticles from x-ray scattering data using conditional generative models},
  author={Anker, Andy Sode and Kjær, Emil TS and Dam, Erik B and Billinge, Simon JL and Jensen, Kirsten MØ and Selvan, Raghavendra},
  journal={16th international workshop on mining and learning with graphs under KDD2020 conference},
  year={2020}}
```
This project is licensed under the Apache License Version 2.0, January 2004 - see the LICENSE file for details.