This interface provides a mechanism for kdb+ users to interact with and create HDF5 datasets. The interface is a thin kdb+ wrapper around the HDF Group's C API for HDF5, outlined in full here.
This is part of the Fusion for kdb+ interface collection.
Kdb+ is the world's fastest time-series database, optimized for ingesting, analyzing and storing massive amounts of structured data. To get started with kdb+, please visit https://code.kx.com/q/learn/ for downloads and developer information. For general information, visit https://kx.com/
Hierarchical Data Format 5 (HDF5) is a file format designed specifically for the storage and organization of large amounts of data.
In many ways, HDF5 acts like a hierarchical file system similar to those used by Linux or Windows. This structure contains two major objects:
- Datasets - Multidimensional arrays of homogeneously-typed data, or compound data containing a mixture of types. Datasets are similar to files within a traditional POSIX file system.
- Groups - Container structures holding datasets or other groups. They function similarly to directories within a traditional POSIX file system.
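The two objects above map directly onto the interface's API. The following is a minimal sketch, assuming the interface is installed and that function names such as `.hdf5.createFile`, `.hdf5.createGroup`, `.hdf5.writeData` and `.hdf5.readData` match the interface's function reference; check the documentation for exact signatures.

```q
/ load the interface (installed to $QHOME)
\l hdf5.q

/ create a new HDF5 file
.hdf5.createFile["experiments.h5"]

/ groups act like directories within the file
.hdf5.createGroup["experiments.h5";"run1"]

/ datasets act like files: write a q list of floats under the group
.hdf5.writeData["experiments.h5";"run1/temperatures";10?100f]

/ read the dataset back into q
.hdf5.readData["experiments.h5";"run1/temperatures"]
```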
There are a number of secondary objects and structures which add complexity to the format but allow it to serve a wider range of use cases:
- Attributes: These allow metadata to be associated with a dataset or group, e.g. the date of data collection, or the temperature at which a set of results was collected.
- Linking functionality: As in a traditional POSIX file system, it is possible to create links between objects (hard/soft/external). These allow datasets or groups relevant to multiple experiments to be accessed via more intuitive routes.
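Attributes and links surface in the interface roughly as sketched below, assuming a file `experiments.h5` containing a group `run1` already exists, and that `.hdf5.writeAttr`, `.hdf5.readAttr` and `.hdf5.createSoft` are named as in the interface's function reference; check the documentation for exact signatures.

```q
\l hdf5.q

/ attach metadata to an existing group or dataset as an attribute
.hdf5.writeAttr["experiments.h5";"run1";"temperature";21.5]
.hdf5.readAttr["experiments.h5";"run1";"temperature"]

/ create a soft link giving a dataset a second, more intuitive path
.hdf5.createSoft["experiments.h5";"/run1/temperatures";"/latest"]
```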
If you have any HDF5-related questions, you can raise them on the HDF Forum.
- kdb+ ≥ 3.5 64-bit
- HDF5 C API ≥ 1.10
The following outlines instructions for installing the HDF Group's C API on supported architectures.
Linux
- Download a supported release of HDF5 and install it; instructions are provided here.
MacOS
- Run
brew install hdf5
It is recommended that a user install this interface through a release. Installation of the interface from a release is completed in the following steps:
- Ensure you have downloaded/installed the HDF5 C API following the instructions here.
- Download a release from here for your system architecture.
- Add the location of the `lib` directory of the HDF5 C API to `LD_LIBRARY_PATH` (Linux) or `DYLD_LIBRARY_PATH` (MacOS):

  ```sh
  ## Linux
  export LD_LIBRARY_PATH=/usr/local/hdf5-c-api/lib/:$LD_LIBRARY_PATH
  ## MacOS
  export DYLD_LIBRARY_PATH=/Users/bob/hdf5-c-api/lib/:$DYLD_LIBRARY_PATH
  ```

- Install the required q script `hdf5.q` and binary `lib/libhdf5.so` to `$QHOME` and `$QHOME/[ml]64` respectively by executing the following from the release directory:

  ```sh
  chmod +x install.sh
  ./install.sh
  ```
In order to successfully build and install this interface, the following environment variables must be set:

- `HDF5_HOME` = location of the HDF5 C API installation (directory containing the `include` and `lib` subdirectories)
- `QHOME` = q installation directory (directory containing `q.k`)
- Create a directory in which to execute the `cmake` command and move into it:

  ```sh
  mkdir build && cd build
  ```

- Execute the `cmake` instructions:

  ```sh
  cmake ..
  ```

- Generate the `libhdf5.so` binary:

  ```sh
  make
  ```

- Install the `libhdf5.so` binary into `$QHOME/[ml]64` and `hdf5.q` into `$QHOME`:

  ```sh
  make install
  ```
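Once installed, the build can be sanity-checked from a q session. This is a minimal sketch, assuming `.hdf5.version`, `.hdf5.createFile` and `.hdf5.ishdf5` are named as in the interface's function reference; check the documentation for exact signatures.

```q
/ load the interface installed to $QHOME
\l hdf5.q

/ report the version of the underlying HDF5 C API
.hdf5.version[]

/ create a file and confirm it is recognized as valid HDF5
.hdf5.createFile["test.h5"]
.hdf5.ishdf5["test.h5"]
```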
This interface is under active development and as such there are a number of use cases that are currently not supported:
- Use of this interface on Windows 64-bit systems
- Creation of compressed datasets
- Access to unlimited datasets
- Interaction with HDF5 images
If your use case requires any of the above functionality, please open an issue here. If you are able, please consider contributing to the project.
Documentation outlining the functionality available for this interface can be found here.
The HDF5 interface is provided here under an Apache 2.0 license.
If you find issues with the interface or have feature requests please consider raising an issue here.
If you wish to contribute to this project please follow the contributing guide here.