add PLUMED-ISDB in LibTorch section
Massimiliano Bonomi committed Sep 29, 2023
1 parent 15c64d7 commit 018f9f0
Showing 1 changed file: user-doc/Installation.md (2 additions and 2 deletions).
@@ -306,7 +306,7 @@ Then, rebuild plumed.

\subsection installation-libtorch LibTorch

In order to use machine learning models optimized with PyTorch (as in the \ref PYTORCH module) one needs to link the LibTorch C++ library. To do so, one can follow these instructions to download the pre-compiled library and configure PLUMED to use it.
In order to use machine learning models optimized with PyTorch (as in the \ref PYTORCH module) or specific actions implemented in the \ref PLUMED-ISDB module, one needs to link the LibTorch C++ library. To do so, one can follow these instructions to download the pre-compiled library and configure PLUMED to use it.

\warning
LibTorch APIs are still in beta, so there might be breaking changes in newer versions. Currently, versions between 1.8.* and 2.0.0 have been tested. Please note that if you want to link a different version, it might be necessary to manually specify the required libraries via LIBS in configure.
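The instructions elided between the two diff hunks set environment variables pointing at the unpacked LibTorch tree so that configure can find its headers and libraries. A minimal sketch, assuming LibTorch was unpacked under a hypothetical `/opt/libtorch` prefix (substitute your actual download path):

```shell
# Sketch only: /opt/libtorch is an assumed example prefix, not a fixed path.
LIBTORCH=/opt/libtorch
# Make the LibTorch headers and libraries visible to the compiler and linker.
export CPATH="${LIBTORCH}/include:${CPATH}"
export INCLUDE="${LIBTORCH}/include:${INCLUDE}"
export LIBRARY_PATH="${LIBTORCH}/lib:${LIBRARY_PATH}"
# Needed at runtime as well, so that PLUMED can load the shared libraries.
export LD_LIBRARY_PATH="${LIBTORCH}/lib:${LD_LIBRARY_PATH}"
echo "LD_LIBRARY_PATH=${LD_LIBRARY_PATH}"
```

On an HPC cluster, these exports typically go into the job script or module file so the runtime paths survive into the batch environment.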
@@ -342,7 +342,7 @@ Once the environment variables are set, we can configure PLUMED with the `--enab

**Notes**
- To also activate the \ref PYTORCH module, one needs to add `--enable-modules=pytorch` or `--enable-modules=all`.
- `--enable-libtorch` will first try first to link the CUDA-enabled library and if it does not found it it will try to link the CPU-only version.
- `--enable-libtorch` will first try to link the CUDA-enabled library and, if that is not found, will fall back to the CPU-only version.
- To verify that the linking of LibTorch is successful, one should look at the output of the configure checks: `checking libtorch[cpu/cuda] [without extra libs/with -ltorch_cpu ... ]`. If any of these checks succeeds, it will report `... yes`. Otherwise, configure will display a warning (not an error!): `configure: WARNING: cannot enable __PLUMED_HAS_LIBTORCH`. In this case, it is recommended to examine the output of the above checks in the config.log file to understand the reason (e.g. it cannot find the required libraries).
- If you want to use the pre-cxx11 ABI LibTorch binaries (useful, for instance, when installing on an HPC cluster), you should download the corresponding version from the PyTorch website (e.g. <a href="https://download.pytorch.org/libtorch/cpu/libtorch-shared-with-deps-2.0.0%2Bcpu.zip"> `libtorch-shared-with-deps-2.0.0%2Bcpu.zip`</a>) and add the following option to configure: `CXXFLAGS="-D_GLIBCXX_USE_CXX11_ABI=0"`.
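Putting the notes above together, a sketch of a possible configure invocation; the flags merely restate options mentioned in this section, and the command is echoed rather than executed here:

```shell
# Sketch only: flags taken from the notes above; echoed, not executed.
FLAGS="--enable-libtorch --enable-modules=pytorch"
# Optional pre-cxx11 ABI flag (only when using the matching LibTorch download):
FLAGS="$FLAGS CXXFLAGS=-D_GLIBCXX_USE_CXX11_ABI=0"
echo "./configure $FLAGS"
```

After running configure for real, grepping config.log for `__PLUMED_HAS_LIBTORCH` is a quick way to confirm whether the check passed.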


1 comment on commit 018f9f0

@PlumedBot
Contributor

Found broken examples in automatic/a-masterclass-22-09.txt
Found broken examples in automatic/a-masterclass-22-11.txt
Found broken examples in automatic/a-masterclass-22-12.txt
Found broken examples in automatic/performance-optimization.txt
Found broken examples in automatic/a-trieste-6.txt
Found broken examples in automatic/munster.txt
Found broken examples in automatic/ANN.tmp
Found broken examples in automatic/EDS.tmp
Found broken examples in automatic/EMMI.tmp
Found broken examples in automatic/ENVIRONMENTSIMILARITY.tmp
Found broken examples in automatic/FOURIER_TRANSFORM.tmp
Found broken examples in automatic/FUNCPATHGENERAL.tmp
Found broken examples in automatic/FUNCPATHMSD.tmp
Found broken examples in automatic/FUNNEL.tmp
Found broken examples in automatic/FUNNEL_PS.tmp
Found broken examples in automatic/GHBFIX.tmp
Found broken examples in automatic/INCLUDE.tmp
Found broken examples in automatic/MAZE_MEMETIC_SAMPLING.tmp
Found broken examples in automatic/MAZE_OPTIMIZER_BIAS.tmp
Found broken examples in automatic/MAZE_RANDOM_ACCELERATION_MD.tmp
Found broken examples in automatic/MAZE_RANDOM_WALK.tmp
Found broken examples in automatic/MAZE_SIMULATED_ANNEALING.tmp
Found broken examples in automatic/MAZE_STEERED_MD.tmp
Found broken examples in automatic/PIV.tmp
Found broken examples in automatic/PLUMED.tmp
Found broken examples in MiscelaneousPP.md
