Graph Neural Networks (GNNs), in particular Message Passing Neural Networks (MPNNs), have shown impressive performance on tasks ranging from molecule generation to computer graphics and physics-based simulation. On the other hand, operations on structural meshes, such as condensation, are well established in the FEM community. Yet the relationship between MPNNs and these structural operations remains largely unexplored. This thesis aims to formalize the equivalence between the message-passing scheme and operations on FEM matrices.
Keywords: graphs, geometric deep learning, structures, message passing
This thesis aims to formalize the equivalence between message-passing neural networks and operations on structural matrices. The proposed research will contribute to the understanding of the underlying mechanisms of MPNNs and help develop more efficient and interpretable MPNNs as alternatives to structural simulators.
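To make the targeted equivalence concrete, the toy sketch below (purely illustrative, not part of the thesis code; all names are made up) computes one Jacobi relaxation step on a small stiffness system K u = f twice: once as a matrix operation and once as a sum-aggregated message-passing update over the sparsity pattern of K.

```python
# Illustrative sketch only: one Jacobi relaxation step on K u = f written
# (1) as a matrix operation and (2) as a message-passing update whose edges
# are the nonzero entries of K. All names here are hypothetical.
import torch

n = 5
K = torch.zeros(n, n)
for e in range(n - 1):                       # assemble a toy 1D stiffness matrix
    K[e, e] += 1.0
    K[e + 1, e + 1] += 1.0
    K[e, e + 1] -= 1.0
    K[e + 1, e] -= 1.0
K += torch.eye(n)                            # keep the diagonal invertible
f = torch.ones(n)
u = torch.rand(n)
D_inv = 1.0 / torch.diagonal(K)

# (1) Matrix view: u' = u + D^{-1} (f - K u)
u_matrix = u + D_inv * (f - K @ u)

# (2) Message-passing view: edge (i, j) carries the message K_ij * u_j,
#     messages are sum-aggregated at node i, then the node state is updated.
rows, cols = torch.nonzero(K, as_tuple=True)
messages = K[rows, cols] * u[cols]
Ku = torch.zeros(n).index_add_(0, rows, messages)
u_message = u + D_inv * (f - Ku)

assert torch.allclose(u_matrix, u_message)
```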
Desired competencies: solid knowledge of Python and PyTorch; experience with and/or good knowledge of machine learning and structural mechanics (FEM).
To formalize this equivalence between message-passing neural networks and operations on structural matrices, the student will pursue the following objectives:
- Review the existing methods and literature.
- Become familiar with the PyTorch Geometric framework.
- Implement a first prototype, based on existing work from the literature.
- Develop a theoretical framework for formalizing the equivalence between MPNNs and structural operations.
- Formulate methods for sub-structuring/condensation schemes that exploit this framework (a toy condensation sketch follows this list).
- Demonstrate the method on a simple case study, exploring its advantages and limitations.
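As a reference point for the condensation objective, the following minimal sketch (not taken from the repository; the partitioning and names are hypothetical) performs static condensation by eliminating internal degrees of freedom through a Schur complement and checks the result against the full solve.

```python
# Toy static condensation (Guyan reduction via a Schur complement); purely
# illustrative and not taken from the repository. The split into internal
# and boundary degrees of freedom (DOFs) is arbitrary here.
import torch

torch.manual_seed(0)
n_total, n_internal = 6, 2
A = torch.rand(n_total, n_total, dtype=torch.float64)
K = A @ A.T + n_total * torch.eye(n_total, dtype=torch.float64)  # SPD "stiffness"
f = torch.rand(n_total, dtype=torch.float64)

i = torch.arange(n_internal)               # internal DOFs to condense out
b = torch.arange(n_internal, n_total)      # retained boundary DOFs
K_bb, K_bi = K[b][:, b], K[b][:, i]
K_ib, K_ii = K[i][:, b], K[i][:, i]

# Condensed system acting on the boundary DOFs only:
#   (K_bb - K_bi K_ii^{-1} K_ib) u_b = f_b - K_bi K_ii^{-1} f_i
K_cond = K_bb - K_bi @ torch.linalg.solve(K_ii, K_ib)
f_cond = f[b] - K_bi @ torch.linalg.solve(K_ii, f[i])

# The condensed solve reproduces the boundary part of the full solution.
u_full = torch.linalg.solve(K, f)
u_b = torch.linalg.solve(K_cond, f_cond)
assert torch.allclose(u_b, u_full[b])
```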
- README.md : the introduction to this project
- TODO.md : the todo list
- requirements.txt : the required packages for this project
- meeting : the weekly meeting material
- src : the source code directory
  - config : folder generated by gen_config.py
  - dataset
    - tetra : tetrahedral mesh generator
    - triangle : triangle mesh generator
    - truss : truss mesh generator
  - gnn
    - bistride.py : bi-stride downsampling for GraphUNet
    - model.py : the GNN models (a generic PyTorch Geometric sketch follows this listing)
    - trainer.py : trainers used by main.py
    - utils.py : utility functions
  - linear_elasiticity
    - tetra.py : linear elasticity solver for tetrahedral elements (not finished yet)
    - triangle.py : linear elasticity solver for triangle elements
    - truss.py : linear elasticity solver for truss elements
    - utils.py : utility functions
  - fast_asm_case.py
  - forward_v1.py : the forward problem (learn Galerkin), local version
  - forward_v2.py : the forward problem (learn Galerkin), global version
  - gen_config.py : generates the config files/directories
  - main.py : the main entry point for training and testing
  - plot.py : plotting utilities
  - vis_dataset.py : entry point for visualizing the dataset
  - vis_predict.py : entry point for visualizing the predictions
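For orientation before running the scripts below: the models in src/gnn/model.py are built with PyTorch Geometric, whose message-passing layers follow roughly the pattern sketched here. This is a generic, minimal example, not the repository's actual architecture.

```python
# Generic PyTorch Geometric message-passing layer (illustrative only; this is
# not the model defined in src/gnn/model.py).
import torch
from torch_geometric.nn import MessagePassing


class ToyConv(MessagePassing):
    """Each node averages a linear transform of its neighbours' features."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__(aggr="mean")            # choose the aggregation scheme
        self.lin = torch.nn.Linear(in_channels, out_channels)

    def forward(self, x, edge_index):
        # x: [num_nodes, in_channels], edge_index: [2, num_edges]
        return self.propagate(edge_index, x=self.lin(x))

    def message(self, x_j):
        # x_j: features of the source node of every edge
        return x_j


# Tiny usage example on a 3-node path graph.
x = torch.rand(3, 4)
edge_index = torch.tensor([[0, 1, 1, 2], [1, 0, 2, 1]])
out = ToyConv(4, 8)(x, edge_index)               # shape: [3, 8]
```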
Generate the config files. This creates a set of folders and TOML configuration files under config/:

```bash
cd src
python gen_config.py
```
Train a batch of models:

```bash
cd src
python main.py -f config/train_rectangle_strong_pinn_auto_weight
```

If this is not the first time you run this command, add the force option to overwrite the previous results:

```bash
cd src
python main.py -f config/train_rectangle_strong_pinn_auto_weight --force
```
Train only the SIGN model:

```bash
cd src
python main.py -c config/hyperparameter/sign.toml
```
Test on the frequency-variant dataset:

```bash
cd src
python main.py -f config/frequency_varaiant_test_rectangle_strong_pinn_auto_weight
```
Test on the boundary-variant dataset:

```bash
cd src
python main.py -f config/boundary_varaiant_test_rectangle_strong_pinn_auto_weight
```
Visualize the dataset:

```bash
cd src
python vis_dataset.py -d quadrilateral
```
Visualize the prediction results:

```bash
cd src
python vis_predict.py -c config/hyperparameter/sign.toml -s
```
| Person | Role | Organization |
|---|---|---|
| Duthé Gregory | Host | Structural Mechanics (Prof. Chatzi), ETHZ |
| Duthé Gregory | Host | ETH Competence Center - ETH AI Center, ETHZ |