This repo contains various Python tools, based on PDAL, that are used to process LiDAR data in the LidarHD project at IGN (Institut National de l'Information Géographique et Forestière / French National Institute of Geographic and Forest Information).
We've decided to make them available because we think they may be useful to others, but this repo is NOT meant to be substantially modified from community input, and may be amended/completed depending on the functionalities that our team needs.
This library contains pdal-based tools to:
- colorize a point cloud using images from Geoplateforme / cartes.gouv.fr (a French government portal providing access to aerial imagery)
- stitch together LAS files using their location
- standardize LAS files
- unlock LAS files generated by TerraSolid
This library can be used in different ways:
- directly from sources: `make install` creates a mamba environment with the required dependencies, then installs the library with pip
- from PyPI: `pip install ign-pdal-tools`
- in a Docker container: see the Dockerfile documentation
- color.py: Colorize a point cloud from Geoplateforme data
Misc tools to get information on a LAS file, e.g. retrieve metadata, find the EPSG value, find bounds, get parameters to pass to a writer. They are intended to be used from the pdaltools module, for example:

```python
from pdaltools import las_infos

filename = ...
las_infos.las_info_metadata(filename)
```
Misc tools to get information on a point cloud (numpy array), e.g. get the expected origin of a point cloud based on a square tiling:

```python
from pdaltools import pcd_infos

points = ...
pcd_infos.get_pointcloud_origin_from_tile_width(points, tile_width=1000)
```
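The underlying idea can be sketched in pure Python (a hypothetical reimplementation, not the library's actual code): assuming tiles are squares aligned on multiples of `tile_width` and the origin is the upper-left corner of the tile, round the minimum x down and the maximum y up to the nearest multiple of `tile_width`:

```python
import math

def tile_origin(xs, ys, tile_width=1000):
    """Sketch: derive the upper-left origin of the tile containing the points,
    assuming the tile grid is aligned on multiples of tile_width."""
    origin_x = math.floor(min(xs) / tile_width) * tile_width
    origin_y = math.ceil(max(ys) / tile_width) * tile_width
    return origin_x, origin_y

print(tile_origin([770012.5, 770987.1], [6277010.0, 6277995.4]))
# -> (770000, 6278000)
```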
- las_clip.py: crop a LAS file using a 2D bounding box
- las_merge.py: merge a LAS file with its neighbors according to their filenames
- las_add_buffer.py: add points to a LAS file from a buffer (border) from its neighbors (using filenames to locate neighbors)
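For reference, cropping with a 2D bounding box in PDAL is typically done with a `filters.crop` stage. A minimal pipeline definition (illustrative only, not the exact pipeline used by las_clip.py; filenames are placeholders) looks like:

```python
import json

# Illustrative PDAL pipeline: read a LAS file, crop it to a 2D bounding box,
# write the result. Bounds use PDAL's "([xmin, xmax], [ymin, ymax])" syntax.
pipeline = [
    "input.las",
    {"type": "filters.crop", "bounds": "([770000, 771000], [6277000, 6278000])"},
    "output.las",
]
print(json.dumps(pipeline, indent=2))
```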
WARNING: In las_merge.py and las_add_buffer.py, filenames are used to get the LAS file extents and to find neighbors. The expected naming convention is {prefix1}_{prefix2}_{xcoord}_{ycoord}_{postfix} (e.g. Semis_2021_0770_6278_LA93_IGN69.laz). By default, xcoord and ycoord are given in kilometers and the tiles are 1 km * 1 km squares.
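As an illustration of this convention, the tile extent can be recovered from a filename with a few lines of standard Python (a hypothetical helper, not part of the library; it assumes coordinates in kilometers and that (xcoord, ycoord) is the upper-left corner of the tile):

```python
def tile_extent_from_filename(filename, tile_width_km=1):
    """Parse {prefix1}_{prefix2}_{xcoord}_{ycoord}_{postfix} and return the
    tile bounding box (xmin, ymin, xmax, ymax) in meters.
    Assumes xcoord/ycoord are in kilometers and that (xcoord, ycoord) is the
    upper-left corner of the tile."""
    parts = filename.split("_")
    xcoord_km, ycoord_km = int(parts[2]), int(parts[3])
    xmin = xcoord_km * 1000
    ymax = ycoord_km * 1000
    return (xmin, ymax - tile_width_km * 1000, xmin + tile_width_km * 1000, ymax)

print(tile_extent_from_filename("Semis_2021_0770_6278_LA93_IGN69.laz"))
# -> (770000, 6277000, 771000, 6278000)
```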
- standardize_format.py: re-write a LAS file in a standard format (see code for details)
- count_occurences: count occurrences of each value of a given attribute in a set of LAS files (initially used for classification)
- count_occurences_for_attribute.py: count occurrences in one or several files and save the result in a JSON file.
- merge_occurences_counts.py: merge counts from several results of count_occurences_for_attribute (JSON files) into a single JSON file (used for parallelization)
- replace_attribute_in_las.py: using a JSON file containing a correspondence map, replace each occurrence of a value in a LAS file with its corresponding value from the map.
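The counting and merging steps amount to building per-file histograms and summing them. A minimal sketch of the idea (hypothetical, using `collections.Counter` rather than the library's actual code):

```python
import json
from collections import Counter

def count_values(values):
    """Count occurrences of each attribute value (e.g. classification codes)."""
    return Counter(values)

def merge_counts(*counts):
    """Merge per-file counts into a single histogram (the parallelization step)."""
    total = Counter()
    for c in counts:
        total.update(c)
    return total

file1 = count_values([2, 2, 5, 6])  # e.g. classification codes of file 1
file2 = count_values([2, 6, 6])     # e.g. classification codes of file 2
merged = merge_counts(file1, file2)
print(json.dumps(dict(merged)))     # e.g. {"2": 3, "5": 1, "6": 3}
```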
- unlock_file.py: overwrite a LAS file in case PDAL raises the error:
readers.las: Global encoding WKT flag not set for point format 6 - 10.
which is due to malformed TerraSolid LAS output for LAS 1.4 files with point formats 6 to 10.
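For context, in the LAS 1.4 specification the Global Encoding field is a little-endian 16-bit integer at byte offset 6 of the header, and bit 4 is the WKT flag, which must be set for point formats 6-10. A sketch of such a fix on raw header bytes (illustrative only, not the library's implementation):

```python
import struct

def set_wkt_flag(header_bytes):
    """Return header bytes with bit 4 (WKT) of the Global Encoding field set.
    Global Encoding is a little-endian uint16 at byte offset 6 in a LAS header."""
    header = bytearray(header_bytes)
    (global_encoding,) = struct.unpack_from("<H", header, 6)
    struct.pack_into("<H", header, 6, global_encoding | 0b10000)
    return bytes(header)

# Minimal fake header: signature, file source id, global encoding = 0, padding
fake_header = b"LASF" + b"\x00\x00" + b"\x00\x00" + b"\x00" * 100
fixed = set_wkt_flag(fake_header)
print(struct.unpack_from("<H", fixed, 6)[0])  # -> 16
```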
- add_points_in_las.py: add points from a vector file (e.g. shp, geojson, ...) into a LAS file. New points get X, Y and Z coordinates; their other attributes are left at the null value of the LAS format (e.g. classification at 0). These attributes can be forced using the '--dimensions/-d' option on the command line (e.g. 'add_points_in_las.py -i myLas.las -g myPoints.json -d classification=64' sets the classification of the added points to 64). The dimension must already exist in the initial LAS file; adding a new dimension is not allowed.
Every time the code is changed, remember to update the version file: pdaltools/_version.py
Please log your changes in CHANGELOG.md
Before committing your changes, run the pre-commit hooks. They can be installed to run automatically with make install-precommit
Create the conda environment: make install
Run unit tests: make testing
To generate a pip package and deploy it on PyPI, use the Makefile at the root of the repo:
- make build: build the library
- make install: install the library in an editable way (pip -e)
- make deploy: deploy it on PyPI
To build a docker image with the library installed: make docker-build
To test the docker image: make docker-test