Please note: for now, the code is only valid for OpenFOAM v2.3.1. If you want to adapt it to newer versions, please start from the native timeVaryingMappedFixedValue boundary condition and add the new features of precursorHDF5 to it.
If you use this boundary condition in your work, please consider citing this paper:
Zhang, Teng, Jinghua Li, Yingwen Yan, and Yuxin Fan. “Influence of LES Inflow Conditions on Simulations of a Piloted Diffusion Flame.” International Journal of Computational Fluid Dynamics, July 4, 2024, 1–15. https://doi.org/10.1080/10618562.2024.2370802.
This boundary condition is adapted from timeVaryingMappedHDF5FixedValue. Its main features are:
- The turbulent inlet velocity library is stored in a single HDF5 file.
- No need to run the precursor simulation on the fly.
- The turbulent inlet velocity can be rescaled based on experimental data or user-defined profiles.
- Supports recycling (reuse) of the precursor library.
- Other features are similar to timeVaryingMappedFixedValue.
Here is a brief tutorial on how to install the HDF5 library:
# Install path
cd $HOME/software
mkdir hdf5
cd hdf5
# Get hdf5-1.8.3 binary release
wget https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-1.8/hdf5-1.8.3/bin/linux-x86_64/5-1.8.3-linux-x86_64-shared.tar.gz
# Unpack
tar xzf 5-1.8.3-linux-x86_64-shared.tar.gz
rm 5-1.8.3-linux-x86_64-shared.tar.gz
# Make a symbolic link "latest"
ln -s 5-1.8.3-linux-x86_64-shared latest
# Redeploy script
cd latest/bin
./h5redeploy
Set the HDF5 path in your .zshrc or .bashrc:
export HDF5_DIR=$HOME/software/hdf5/latest
export LD_LIBRARY_PATH=$HDF5_DIR/lib:$LD_LIBRARY_PATH
Clone this repo to your local drive and compile, for example:
git clone git@github.com:TimoLin/precursorHDF5.git $WM_PROJECT_USER_DIR/precursorHDF5
cd $WM_PROJECT_USER_DIR/precursorHDF5
wmake
To generate the HDF5 database, you can follow the Python script tVMHDF5FV.
The boundary condition expects the HDF5 file to contain three datasets (a minimal creation example is sketched after this list):
- One for the points, of shape N by 3, where N is the number of points in the sampling slice. The first column contains the x coordinates, the second the y coordinates, and the third the z coordinates.
- One for the time values that the data is provided for, of shape nTimeValues by 1, where nTimeValues is simply the number of time values that you are providing data for.
- One for the data itself, of shape nTimeValues by N by nComponents, where nComponents depends on the type of field one is dealing with. For a vector field nComponents is 3, for instance. The ordering of the data should agree with the ordering of the points, i.e. the first value is expected to correspond to the first point, etc.
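As a minimal sketch of how such a database can be created (assuming the h5py and numpy Python packages, which are not required by the boundary condition itself), the script below writes the three datasets with the names used in the example dictionary further down. The point coordinates and velocity values are placeholders only.
import h5py
import numpy as np

nPoints = 4   # number of points in the sampling slice (small hypothetical example)
nTimes = 3    # number of stored time instants

# Points dataset: N by 3 (columns are x, y, z)
points = np.zeros((nPoints, 3))
points[:, 1] = np.linspace(0.0, 0.1, nPoints)  # placeholder: points along y

# Times dataset: nTimeValues by 1
times = np.linspace(0.0, 0.02, nTimes).reshape(-1, 1)

# Values dataset: nTimeValues by N by nComponents (3 for a vector field),
# ordered consistently with the points dataset
velocity = np.zeros((nTimes, nPoints, 3))
velocity[:, :, 0] = 10.0  # placeholder: uniform 10 m/s in x

with h5py.File("dbTest.hdf5", "w") as db:
    db.create_dataset("points", data=points)
    db.create_dataset("times", data=times)
    db.create_dataset("velocity", data=velocity)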
One should provide the following parameters in the definition of the boundary type:
- hdf5FileName -- name of the HDF5 file.
- hdf5PointsDatasetName -- name of the dataset containing the points.
- hdf5SampleTimesDatasetName -- name of the dataset containing the sample times.
- hdf5FieldValuesDatasetName -- name of the dataset containing the values of the field.
For example, one could have the following in the 0/U file:
inlet
{
type precursorHDF5;
setAverage false;
offset (0 0 0);
perturb 0.0;
mapMethod nearest; //or planarInterpolation;
recycling true;
hdf5FileName "dbTest.hdf5";
hdf5PointsDatasetName "points";
hdf5SampleTimesDatasetName "times";
hdf5FieldValuesDatasetName "velocity";
}
Note: the file given by hdf5FileName should be placed in the case's root folder, i.e.:
├── 0
├── constant
├── system
└── dbTest.hdf5
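Before running the case, a quick check (again assuming h5py; this is not required by the boundary condition) can confirm that the dataset names and shapes match the entries in the dictionary:
import h5py

# Print the shape of each dataset the boundary condition will read
with h5py.File("dbTest.hdf5", "r") as db:
    for name in ("points", "times", "velocity"):
        print(name, db[name].shape)

# Expected shapes: points (N, 3), times (nTimeValues, 1), velocity (nTimeValues, N, 3)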
Add the following line to controlDict when using it:
libs ("libPrecursorHDF5.so");
Currently the boundary condition only works with vector fields!