diff --git a/wrapping/python/plugins/NXDataAnalysisToolkit/docs/ReadPeregrineHDF5FileFilter.md b/wrapping/python/plugins/NXDataAnalysisToolkit/docs/ReadPeregrineHDF5FileFilter.md index 330ac3aa45..53ac8a44d2 100644 --- a/wrapping/python/plugins/NXDataAnalysisToolkit/docs/ReadPeregrineHDF5FileFilter.md +++ b/wrapping/python/plugins/NXDataAnalysisToolkit/docs/ReadPeregrineHDF5FileFilter.md @@ -10,40 +10,46 @@ The `ReadPeregrineHDF5File` filter is designed to read an HDF5 file that is in a ### Input Parameters - **`Input Peregrine HDF5 File`** (string): Specifies the complete file path to the target HDF5 file. + +### Slice Data Parameters +- **`Slice Data Geometry`**: Path to the Slice Data image geometry to be created. +- **`Slice Data Cell Attribute Matrix Name`**: Name of the Slice Data cell attribute matrix to be created. +- **`Read Segmentation Results`** (boolean): Toggles the reading of segmentation results stored in the HDF5 file. - **`Segmentation Results`** (string): Comma-delimited string that specifies which segmentation results datasets should be read. - **`Read Camera Data`** (boolean): Toggles the reading of camera data stored in the HDF5 file. +- **`Camera Data HDF5 Parent Path`**: The path to the parent group of the camera datasets in the HDF5 file. +- **`Camera Data Datasets`**: Comma-delimited string that specifies which camera data datasets should be read. - **`Read Part Ids`** (boolean): Determines whether part ids data should be read. +- **`Part Ids Array Name`**: Name of the part ids array to be created. - **`Read Sample Ids`** (boolean): Determines whether sample ids data should be read. -- **`Read Anomaly Detection`** (boolean): Determines whether anomaly detection data should be read. -- **`Read X-Ray CT`** (boolean): Determines whether X-ray computed tomography data should be read. -- **`Read Slices Subvolume`** (boolean): Enables or disables the reading of a specific subvolume of slices data within the HDF5 file. If set to True, additional parameters for specifying the subvolume bounds (`Slices Subvolume X Bounds`, `Slices Subvolume Y Bounds`, `Slices Subvolume Z Bounds`) are required. +- **`Sample Ids Array Name`**: Name of the sample ids array to be created. +- **`Enable Slices Subvolume`** (boolean): Enables or disables the reading of a specific subvolume of slices data within the HDF5 file. If set to True, additional parameters for specifying the subvolume bounds (`Slices Subvolume X Bounds`, `Slices Subvolume Y Bounds`, `Slices Subvolume Z Bounds`) are required. - **`Slices Subvolume X Bounds`** (tuple of integers): Defines the minimum and maximum x-coordinates of the slices subvolume to be read. - **`Slices Subvolume Y Bounds`** (tuple of integers): Defines the minimum and maximum y-coordinates of the slices subvolume to be read. - **`Slices Subvolume Z Bounds`** (tuple of integers): Defines the minimum and maximum z-coordinates of the slices subvolume to be read. -- **`Read Registered Data Subvolume`** (boolean): Enables or disables the reading of a specific subvolume of registered data within the HDF5 file. If set to True, additional parameters for specifying the subvolume bounds (`Registered Data Subvolume X Bounds`, `Registered Data Subvolume Y Bounds`, `Registered Data Subvolume Z Bounds`) are required. -- **`Registered Data Subvolume X Bounds`** (tuple of integers): Defines the minimum and maximum x-coordinates of the registered data subvolume to be read. 
-- **`Registered Data Subvolume Y Bounds`** (tuple of integers): Defines the minimum and maximum y-coordinates of the registered data subvolume to be read. -- **`Registered Data Subvolume Z Bounds`** (tuple of integers): Defines the minimum and maximum z-coordinates of the registered data subvolume to be read. -- **`Read Scan Data Subvolume`** (boolean): Enables or disables the reading of specific slices of scan data within the HDF5 file. If set to True, an additional parameter for specifying the slice bounds (`Scan Data Slice Bounds`) is required. -- **`Scan Data Slice Bounds`** (tuple of integers): Defines the minimum and maximum slices of the scan data to be read. -### Created Data Objects -- **`Slice Data Geometry`**: Path to the Slice Data image geometry to be created. -- **`Slice Data Cell Attribute Matrix Name`**: Name of the Slice Data cell attribute matrix to be created. -- **`Camera Data 0 Array Name`**: Name of the first camera data array to be created. -- **`Camera Data 1 Array Name`**: Name of the second camera data array to be created. -- **`Part Ids Array Name`**: Name of the part ids array to be created. -- **`Sample Ids Array Name`**: Name of the sample ids array to be created. +### Registered Data Parameters - **`Registered Data Geometry`**: Path to the Registered Data image geometry to be created. - **`Registered Data Cell Attribute Matrix Name`**: Name of the Registered Data cell attribute matrix to be created. +- **`Read Anomaly Detection`** (boolean): Determines whether anomaly detection data should be read. - **`Anomaly Detection Array Name`**: Name of the Anomaly Detection array to be created. +- **`Read X-Ray CT`** (boolean): Determines whether X-ray computed tomography data should be read. - **`X-Ray CT Array Name`**: Name of the X-Ray CT array to be created. +- **`Enable Registered Data Subvolume`** (boolean): Enables or disables the reading of a specific subvolume of registered data within the HDF5 file. If set to True, additional parameters for specifying the subvolume bounds (`Registered Data Subvolume X Bounds`, `Registered Data Subvolume Y Bounds`, `Registered Data Subvolume Z Bounds`) are required. +- **`Registered Data Subvolume X Bounds`** (tuple of integers): Defines the minimum and maximum x-coordinates of the registered data subvolume to be read. +- **`Registered Data Subvolume Y Bounds`** (tuple of integers): Defines the minimum and maximum y-coordinates of the registered data subvolume to be read. +- **`Registered Data Subvolume Z Bounds`** (tuple of integers): Defines the minimum and maximum z-coordinates of the registered data subvolume to be read. + +### Scan Data Parameters +- **`Read Scan Datasets`** (boolean): Toggles the reading of the scan datasets stored in the HDF5 file. - **`Scan Data Geometry`**: Path to the Scan Data edge geometry to be created. - **`Scan Data Edge Attribute Matrix Name`**: Name of the Scan Data edge attribute matrix to be created. - **`Scan Data Vertex Attribute Matrix Name`**: Name of the Scan Data vertex attribute matrix to be created. - **`Scan Data Vertex List Array Name`**: Name of the Scan Data vertex list array to be created. - **`Scan Data Edge List Array Name`**: Name of the Scan Data edge list array to be created. - **`Scan Data Time of Travel Array Name`**: Name of the Scan Data Time of Travel array to be created. +- **`Enable Scan Data Subvolume`** (boolean): Enables or disables the reading of specific slices of scan data within the HDF5 file. 
If set to True, an additional parameter for specifying the slice bounds (`Scan Data Slice Bounds`) is required. +- **`Scan Data Slice Bounds`** (tuple of integers): Defines the minimum and maximum slices of the scan data to be read. ## Example Pipelines diff --git a/wrapping/python/plugins/NXDataAnalysisToolkit/src/NXDataAnalysisToolkit/ReadPeregrineHDF5File.py b/wrapping/python/plugins/NXDataAnalysisToolkit/src/NXDataAnalysisToolkit/ReadPeregrineHDF5File.py index 13eb237afb..9e8d4cf655 100644 --- a/wrapping/python/plugins/NXDataAnalysisToolkit/src/NXDataAnalysisToolkit/ReadPeregrineHDF5File.py +++ b/wrapping/python/plugins/NXDataAnalysisToolkit/src/NXDataAnalysisToolkit/ReadPeregrineHDF5File.py @@ -10,44 +10,7 @@ class ReadPeregrineHDF5File: This section should contain the 'keys' that store each parameter. The value of the key should be snake_case. The name of the value should be ALL_CAPITOL_KEY """ - # Parameter Keys - INPUT_FILE_PATH_KEY = 'input_file_path' - READ_SLICES_SUBVOLUME_KEY = 'read_slices_subvolume' - SLICES_SUBVOLUME_MINMAX_X_KEY = 'slices_subvolume_minmax_x' - SLICES_SUBVOLUME_MINMAX_Y_KEY = 'slices_subvolume_minmax_y' - SLICES_SUBVOLUME_MINMAX_Z_KEY = 'slices_subvolume_minmax_z' - READ_CAMERA_DATA_KEY = 'read_camera_data' - READ_PART_IDS_KEY = 'read_part_ids' - READ_SAMPLE_IDS_KEY = 'read_sample_ids' - READ_ANOMALY_DETECTION_KEY = 'read_anomaly_detection' - READ_X_RAY_CT_KEY = 'read_x_ray_ct' - SEGMENTATION_RESULTS_KEY = 'segmentation_results' - SLICE_DATA_KEY = 'slice_data' - SLICE_DATA_CELL_ATTR_MAT_KEY = 'slice_data_cell_attr_mat' - CAMERA_DATA_0_ARRAY_NAME_KEY = 'camera_data_0_array_name' - CAMERA_DATA_1_ARRAY_NAME_KEY = 'camera_data_1_array_name' - PART_IDS_ARRAY_NAME_KEY = 'part_ids_array_name' - SAMPLE_IDS_ARRAY_NAME_KEY = 'sample_ids_array_name' - REGISTERED_DATA_KEY = 'registered_data' - REGISTERED_DATA_CELL_ATTR_MAT_KEY = 'registered_data_cell_attr_mat' - ANOMALY_DETECTION_ARRAY_NAME_KEY = 'anomaly_detection_array_name' - XRAY_CT_ARRAY_NAME_KEY = 'xray_ct_array_name' - READ_REGISTERED_DATA_SUBVOLUME_KEY = 'read_registered_data_subvolume' - REGISTERED_DATA_SUBVOLUME_MINMAX_X_KEY = 'registered_data_subvolume_minmax_x' - REGISTERED_DATA_SUBVOLUME_MINMAX_Y_KEY = 'registered_data_subvolume_minmax_y' - REGISTERED_DATA_SUBVOLUME_MINMAX_Z_KEY = 'registered_data_subvolume_minmax_z' - READ_SCAN_DATA_SUBVOLUME_KEY = 'read_scan_data_subvolume' - SCAN_DATA_SUBVOLUME_MINMAX_KEY = 'scan_data_subvolume_minmax' - SCAN_DATA_KEY = 'scan_data' - SCAN_DATA_CELL_ATTR_MAT_KEY = 'scan_data_cell_attr_mat' - SCAN_DATA_VERTEX_ATTR_MAT_KEY = 'scan_data_vertex_attr_mat' - SCAN_DATA_VERTEX_LIST_NAME_KEY = 'scan_data_vertex_list_name' - SCAN_DATA_EDGE_LIST_NAME_KEY = 'scan_data_edge_list_name' - TIME_OF_TRAVEL_ARRAY_NAME = 'time_of_travel_array_name' - # HDF5 Dataset Paths - CAMERA_DATA_0_H5_PATH = '/slices/camera_data/visible/0' - CAMERA_DATA_1_H5_PATH = '/slices/camera_data/visible/1' PART_IDS_H5_PATH = '/slices/part_ids' SAMPLE_IDS_H5_PATH = '/slices/sample_ids' SEGMENTATION_RESULTS_H5_PARENT_PATH = '/slices/segmentation_results' @@ -113,6 +76,43 @@ def default_tags(self) -> List[str]: """ return ['python', 'ReadPeregrineHDF5File', 'peregrine', 'hdf5', 'read', 'import'] + # Parameter Keys + INPUT_FILE_PATH_KEY = 'input_file_path' + ENABLE_SLICES_SUBVOLUME_KEY = 'enable_slices_subvolume' + SLICES_SUBVOLUME_MINMAX_X_KEY = 'slices_subvolume_minmax_x' + SLICES_SUBVOLUME_MINMAX_Y_KEY = 'slices_subvolume_minmax_y' + SLICES_SUBVOLUME_MINMAX_Z_KEY = 'slices_subvolume_minmax_z' + 
READ_SEGMENTATION_RESULTS_KEY = 'read_segmentation_results' + READ_CAMERA_DATA_KEY = 'read_camera_data' + READ_PART_IDS_KEY = 'read_part_ids' + READ_SAMPLE_IDS_KEY = 'read_sample_ids' + READ_ANOMALY_DETECTION_KEY = 'read_anomaly_detection' + READ_X_RAY_CT_KEY = 'read_x_ray_ct' + READ_SCAN_DATASETS_KEY = 'read_scan_datasets' + SEGMENTATION_RESULTS_VALUES_KEY = 'segmentation_results_values' + SLICE_DATA_KEY = 'slice_data' + SLICE_DATA_CELL_ATTR_MAT_KEY = 'slice_data_cell_attr_mat' + CAMERA_DATA_HDF5_PARENT_PATH_KEY = 'camera_data_hdf5_parent_path' + CAMERA_DATA_DATASETS_KEY = 'camera_data_datasets' + PART_IDS_ARRAY_NAME_KEY = 'part_ids_array_name' + SAMPLE_IDS_ARRAY_NAME_KEY = 'sample_ids_array_name' + REGISTERED_DATA_KEY = 'registered_data' + REGISTERED_DATA_CELL_ATTR_MAT_KEY = 'registered_data_cell_attr_mat' + ANOMALY_DETECTION_ARRAY_NAME_KEY = 'anomaly_detection_array_name' + XRAY_CT_ARRAY_NAME_KEY = 'xray_ct_array_name' + ENABLE_REGISTERED_DATA_SUBVOLUME_KEY = 'enable_registered_data_subvolume' + REGISTERED_DATA_SUBVOLUME_MINMAX_X_KEY = 'registered_data_subvolume_minmax_x' + REGISTERED_DATA_SUBVOLUME_MINMAX_Y_KEY = 'registered_data_subvolume_minmax_y' + REGISTERED_DATA_SUBVOLUME_MINMAX_Z_KEY = 'registered_data_subvolume_minmax_z' + ENABLE_SCAN_DATA_SUBVOLUME_KEY = 'enable_scan_data_subvolume' + SCAN_DATA_SUBVOLUME_MINMAX_KEY = 'scan_data_subvolume_minmax' + SCAN_DATA_KEY = 'scan_data' + SCAN_DATA_CELL_ATTR_MAT_KEY = 'scan_data_cell_attr_mat' + SCAN_DATA_VERTEX_ATTR_MAT_KEY = 'scan_data_vertex_attr_mat' + SCAN_DATA_VERTEX_LIST_NAME_KEY = 'scan_data_vertex_list_name' + SCAN_DATA_EDGE_LIST_NAME_KEY = 'scan_data_edge_list_name' + TIME_OF_TRAVEL_ARRAY_NAME = 'time_of_travel_array_name' + def parameters(self) -> nx.Parameters: """This function defines the parameters that are needed by the filter. Parameters collect the values from the user or through a pipeline file. 
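Every boolean added in the hunks below follows the same linkable-parameter idiom: the checkbox is inserted with `insert_linkable_parameter`, its dependent parameters are inserted normally, and `link_parameters` ties each dependent to the checkbox so it is only active when the box is checked. A minimal sketch of the idiom, assuming the same simplnx `nx` bindings this file already uses (the `params = nx.Parameters()` construction is standard boilerplate not shown in these hunks):

```python
import simplnx as nx

params = nx.Parameters()

# The gating checkbox; insert_linkable_parameter lets it show/hide other parameters.
params.insert_linkable_parameter(nx.BoolParameter(
    'read_camera_data', 'Read Camera Data',
    'Specifies whether or not to read the camera data from the input file.', False))

# A dependent parameter, inserted normally...
params.insert(nx.StringParameter(
    'camera_data_datasets', 'Camera Data Datasets (comma-delimited)',
    'The camera data datasets that will be read, separated by commas', '0,1,2'))

# ...then linked, so it is only shown (and used) when 'read_camera_data' is True.
params.link_parameters('read_camera_data', 'camera_data_datasets', True)
```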
@@ -125,15 +125,16 @@ def parameters(self) -> nx.Parameters: params.insert(nx.Parameters.Separator("Slice Data Parameters")) params.insert(nx.DataGroupCreationParameter(ReadPeregrineHDF5File.SLICE_DATA_KEY, 'Slice Data Geometry', 'The path to the newly created Slice Data image geometry', nx.DataPath(['Slice Data']))) params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.SLICE_DATA_CELL_ATTR_MAT_KEY, 'Slice Data Cell Attribute Matrix Name', 'The name of the Slice Data cell attribute matrix', 'Cell Data')) # ImageGeom::k_CellDataName - params.insert(nx.StringParameter(ReadPeregrineHDF5File.SEGMENTATION_RESULTS_KEY, 'Segmentation Results (comma-delimited)', 'The segmentation results numbers that will be read, separated by commas', '0,1,2,3,4,5,6,7,8,9,10,11')) - params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_CAMERA_DATA_KEY, 'Read Camera Data', 'Specifies whether or not to read the camera data from the input file.', True)) - params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.CAMERA_DATA_0_ARRAY_NAME_KEY, 'Camera Data 0 Array Name', 'The name of the camera data 0 array.', 'Camera Data 0')) - params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.CAMERA_DATA_1_ARRAY_NAME_KEY, 'Camera Data 1 Array Name', 'The name of the camera data 1 array.', 'Camera Data 1')) - params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_PART_IDS_KEY, 'Read Part Ids', 'Specifies whether or not to read the part ids from the input file.', True)) + params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_SEGMENTATION_RESULTS_KEY, 'Read Segmentation Results', 'Specifies whether or not to read the segmentation results from the input file.', False)) + params.insert(nx.StringParameter(ReadPeregrineHDF5File.SEGMENTATION_RESULTS_VALUES_KEY, 'Segmentation Results (comma-delimited)', 'The segmentation results numbers that will be read, separated by commas', '0,1,2,3,4,5,6,7,8,9,10,11')) + params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_CAMERA_DATA_KEY, 'Read Camera Data', 'Specifies whether or not to read the camera data from the input file.', False)) + params.insert(nx.StringParameter(ReadPeregrineHDF5File.CAMERA_DATA_HDF5_PARENT_PATH_KEY, 'Camera Data HDF5 Parent Path', 'The path to the HDF5 parent group that contains the camera data datasets.', 'slices/camera_data')) + params.insert(nx.StringParameter(ReadPeregrineHDF5File.CAMERA_DATA_DATASETS_KEY, 'Camera Data Datasets (comma-delimited)', 'The camera data datasets that will be read, separated by commas', '0,1,2')) + params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_PART_IDS_KEY, 'Read Part Ids', 'Specifies whether or not to read the part ids from the input file.', False)) params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.PART_IDS_ARRAY_NAME_KEY, 'Part Ids Array Name', 'The name of the part ids array.', 'Part Ids')) - params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_SAMPLE_IDS_KEY, 'Read Sample Ids', 'Specifies whether or not to read the sample ids from the input file.', True)) + params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_SAMPLE_IDS_KEY, 'Read Sample Ids', 'Specifies whether or not to read the sample ids from the input file.', False)) params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.SAMPLE_IDS_ARRAY_NAME_KEY, 'Sample Ids Array Name', 'The name of the sample ids array.', 'Sample Ids')) - 
params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_SLICES_SUBVOLUME_KEY, 'Read Slices Subvolume', 'Specifies whether or not to read a subvolume of the slices from the input file.', False)) + params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.ENABLE_SLICES_SUBVOLUME_KEY, 'Enable Slices Subvolume', 'Specifies whether or not to read a subvolume of the slices from the input file.', False)) params.insert(nx.VectorUInt64Parameter(ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_X_KEY, 'Slices Subvolume X Bounds', 'The min/max bounds (inclusive) of the X dimension for the Slices subvolume.', [0, 99], ['X Min', 'X Max'])) params.insert(nx.VectorUInt64Parameter(ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_Y_KEY, 'Slices Subvolume Y Bounds', 'The min/max bounds (inclusive) of the Y dimension for the Slices subvolume.', [0, 99], ['Y Min', 'Y Max'])) params.insert(nx.VectorUInt64Parameter(ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_Z_KEY, 'Slices Subvolume Z Bounds', 'The min/max bounds (inclusive) of the Z dimension for the Slices subvolume.', [0, 99], ['Z Min', 'Z Max'])) @@ -141,38 +142,46 @@ def parameters(self) -> nx.Parameters: params.insert(nx.Parameters.Separator("Registered Data Parameters")) params.insert(nx.DataGroupCreationParameter(ReadPeregrineHDF5File.REGISTERED_DATA_KEY, 'Registered Data Geometry', 'The path to the newly created Registered Data image geometry', nx.DataPath(['Registered Data']))) params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.REGISTERED_DATA_CELL_ATTR_MAT_KEY, 'Registered Data Cell Attribute Matrix Name', 'The name of the Registered Data cell attribute matrix', 'Cell Data')) # ImageGeom::k_CellDataName - params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_ANOMALY_DETECTION_KEY, 'Read Anomaly Detection', 'Specifies whether or not to read the anomaly detection (part of the registered data) from the input file.', True)) + params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_ANOMALY_DETECTION_KEY, 'Read Anomaly Detection', 'Specifies whether or not to read the anomaly detection (part of the registered data) from the input file.', False)) params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.ANOMALY_DETECTION_ARRAY_NAME_KEY, 'Anomaly Detection Array Name', 'The name of the Anomaly Detection array.', 'Anomaly Detection')) - params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_X_RAY_CT_KEY, 'Read X-Ray CT', 'Specifies whether or not to read the x-ray CT (part of the registered data) from the input file.', True)) + params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_X_RAY_CT_KEY, 'Read X-Ray CT', 'Specifies whether or not to read the x-ray CT (part of the registered data) from the input file.', False)) params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.XRAY_CT_ARRAY_NAME_KEY, 'X-Ray CT Array Name', 'The name of the X-Ray CT array.', 'X-Ray CT')) - params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_REGISTERED_DATA_SUBVOLUME_KEY, 'Read Registered Data Subvolume', 'Specifies whether or not to read a subvolume of the registered data from the input file.', False)) + params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.ENABLE_REGISTERED_DATA_SUBVOLUME_KEY, 'Enable Registered Data Subvolume', 'Specifies whether or not to read a subvolume of the registered data from the input file.', False)) 
params.insert(nx.VectorUInt64Parameter(ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_X_KEY, 'Registered Data Subvolume X Bounds', 'The min/max bounds (inclusive) of the X dimension for the Registered Data subvolume.', [0, 99], ['X Min', 'X Max'])) params.insert(nx.VectorUInt64Parameter(ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_Y_KEY, 'Registered Data Subvolume Y Bounds', 'The min/max bounds (inclusive) of the Y dimension for the Registered Data subvolume.', [0, 99], ['Y Min', 'Y Max'])) params.insert(nx.VectorUInt64Parameter(ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_Z_KEY, 'Registered Data Subvolume Z Bounds', 'The min/max bounds (inclusive) of the Z dimension for the Registered Data subvolume.', [0, 99], ['Z Min', 'Z Max'])) params.insert(nx.Parameters.Separator("Scan Data Parameters")) + params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_SCAN_DATASETS_KEY, 'Read Scan Datasets', 'Specifies whether or not to read the scan datasets from the input file.', False)) params.insert(nx.DataGroupCreationParameter(ReadPeregrineHDF5File.SCAN_DATA_KEY, 'Scan Data Geometry', 'The path to the newly created Scan Data edge geometry', nx.DataPath(['Scan Data']))) params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.SCAN_DATA_CELL_ATTR_MAT_KEY, 'Scan Data Edge Attribute Matrix Name', 'The name of the Scan Data edge attribute matrix', 'Edge Data')) # EdgeGeom::k_EdgeDataName params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.SCAN_DATA_VERTEX_ATTR_MAT_KEY, 'Scan Data Vertex Attribute Matrix Name', 'The name of the Scan Data vertex attribute matrix', 'Vertex Data')) # EdgeGeom::k_VertexDataName params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.SCAN_DATA_VERTEX_LIST_NAME_KEY, 'Scan Data Vertex List Array Name', 'The name of the Scan Data vertex list array.', 'Vertices')) params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.SCAN_DATA_EDGE_LIST_NAME_KEY, 'Scan Data Edge List Array Name', 'The name of the Scan Data edge list array.', 'Edges')) params.insert(nx.DataObjectNameParameter(ReadPeregrineHDF5File.TIME_OF_TRAVEL_ARRAY_NAME, 'Scan Data Time of Travel Array Name', 'The name of the Scan Data Time of Travel array.', 'Time of Travel')) - params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.READ_SCAN_DATA_SUBVOLUME_KEY, 'Read Scan Data Subvolume', 'Specifies whether or not to read a subvolume of the scan data from the input file.', False)) + params.insert_linkable_parameter(nx.BoolParameter(ReadPeregrineHDF5File.ENABLE_SCAN_DATA_SUBVOLUME_KEY, 'Enable Scan Data Subvolume', 'Specifies whether or not to read a subvolume of the scan data from the input file.', False)) params.insert(nx.VectorUInt64Parameter(ReadPeregrineHDF5File.SCAN_DATA_SUBVOLUME_MINMAX_KEY, 'Scan Data Slice Bounds', 'The min/max slice bounds (inclusive) for the Scan Data subvolume.', [0, 1], ['Min', 'Max'])) - params.link_parameters(ReadPeregrineHDF5File.READ_SLICES_SUBVOLUME_KEY, ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_X_KEY, True) - params.link_parameters(ReadPeregrineHDF5File.READ_SLICES_SUBVOLUME_KEY, ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_Y_KEY, True) - params.link_parameters(ReadPeregrineHDF5File.READ_SLICES_SUBVOLUME_KEY, ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_Z_KEY, True) - params.link_parameters(ReadPeregrineHDF5File.READ_REGISTERED_DATA_SUBVOLUME_KEY, ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_X_KEY, True) - 
params.link_parameters(ReadPeregrineHDF5File.READ_REGISTERED_DATA_SUBVOLUME_KEY, ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_Y_KEY, True) - params.link_parameters(ReadPeregrineHDF5File.READ_REGISTERED_DATA_SUBVOLUME_KEY, ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_Z_KEY, True) - params.link_parameters(ReadPeregrineHDF5File.READ_CAMERA_DATA_KEY, ReadPeregrineHDF5File.CAMERA_DATA_0_ARRAY_NAME_KEY, True) - params.link_parameters(ReadPeregrineHDF5File.READ_CAMERA_DATA_KEY, ReadPeregrineHDF5File.CAMERA_DATA_1_ARRAY_NAME_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.READ_SEGMENTATION_RESULTS_KEY, ReadPeregrineHDF5File.SEGMENTATION_RESULTS_VALUES_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.ENABLE_SLICES_SUBVOLUME_KEY, ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_X_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.ENABLE_SLICES_SUBVOLUME_KEY, ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_Y_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.ENABLE_SLICES_SUBVOLUME_KEY, ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_Z_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.ENABLE_REGISTERED_DATA_SUBVOLUME_KEY, ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_X_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.ENABLE_REGISTERED_DATA_SUBVOLUME_KEY, ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_Y_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.ENABLE_REGISTERED_DATA_SUBVOLUME_KEY, ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_Z_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.READ_CAMERA_DATA_KEY, ReadPeregrineHDF5File.CAMERA_DATA_HDF5_PARENT_PATH_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.READ_CAMERA_DATA_KEY, ReadPeregrineHDF5File.CAMERA_DATA_DATASETS_KEY, True) params.link_parameters(ReadPeregrineHDF5File.READ_PART_IDS_KEY, ReadPeregrineHDF5File.PART_IDS_ARRAY_NAME_KEY, True) params.link_parameters(ReadPeregrineHDF5File.READ_SAMPLE_IDS_KEY, ReadPeregrineHDF5File.SAMPLE_IDS_ARRAY_NAME_KEY, True) params.link_parameters(ReadPeregrineHDF5File.READ_ANOMALY_DETECTION_KEY, ReadPeregrineHDF5File.ANOMALY_DETECTION_ARRAY_NAME_KEY, True) params.link_parameters(ReadPeregrineHDF5File.READ_X_RAY_CT_KEY, ReadPeregrineHDF5File.XRAY_CT_ARRAY_NAME_KEY, True) - params.link_parameters(ReadPeregrineHDF5File.READ_SCAN_DATA_SUBVOLUME_KEY, ReadPeregrineHDF5File.SCAN_DATA_SUBVOLUME_MINMAX_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.READ_SCAN_DATASETS_KEY, ReadPeregrineHDF5File.SCAN_DATA_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.READ_SCAN_DATASETS_KEY, ReadPeregrineHDF5File.SCAN_DATA_CELL_ATTR_MAT_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.READ_SCAN_DATASETS_KEY, ReadPeregrineHDF5File.SCAN_DATA_VERTEX_ATTR_MAT_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.READ_SCAN_DATASETS_KEY, ReadPeregrineHDF5File.SCAN_DATA_EDGE_LIST_NAME_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.READ_SCAN_DATASETS_KEY, ReadPeregrineHDF5File.SCAN_DATA_VERTEX_LIST_NAME_KEY, True) + params.link_parameters(ReadPeregrineHDF5File.READ_SCAN_DATASETS_KEY, ReadPeregrineHDF5File.TIME_OF_TRAVEL_ARRAY_NAME, True) + params.link_parameters(ReadPeregrineHDF5File.ENABLE_SCAN_DATA_SUBVOLUME_KEY, ReadPeregrineHDF5File.SCAN_DATA_SUBVOLUME_MINMAX_KEY, True) return params @@ -184,6 +193,16 @@ def preflight_impl(self, data_structure: nx.DataStructure, args: dict, message_h :rtype: nx.IFilter.PreflightResult """ input_file_path = args[ReadPeregrineHDF5File.INPUT_FILE_PATH_KEY] 
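+    # Gather the top-level read/enable flags up front; preflight fails fast below if none of them are set.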
+    read_segmentation_results: bool = args[ReadPeregrineHDF5File.READ_SEGMENTATION_RESULTS_KEY] +    read_camera_data: bool = args[ReadPeregrineHDF5File.READ_CAMERA_DATA_KEY] +    read_part_ids: bool = args[ReadPeregrineHDF5File.READ_PART_IDS_KEY] +    read_sample_ids: bool = args[ReadPeregrineHDF5File.READ_SAMPLE_IDS_KEY] +    read_anomaly_detection: bool = args[ReadPeregrineHDF5File.READ_ANOMALY_DETECTION_KEY] +    read_x_ray_ct: bool = args[ReadPeregrineHDF5File.READ_X_RAY_CT_KEY] +    read_scan_datasets: bool = args[ReadPeregrineHDF5File.READ_SCAN_DATASETS_KEY] + +    if not any([read_segmentation_results, read_camera_data, read_part_ids, read_sample_ids, read_anomaly_detection, read_x_ray_ct, read_scan_datasets]): +      return nx.IFilter.PreflightResult(errors=[nx.Error(-2010, "No datasets selected to be read! Please select at least one dataset to read.")]) actions = nx.OutputActions() preflight_updated_values: List[nx.IFilter.PreflightValue] = [] @@ -243,35 +262,89 @@ def execute_impl(self, data_structure: nx.DataStructure, args: dict, message_han return nx.Result(errors=result.errors) return nx.Result() - + def _preflight_slice_datasets(self, h5_file_reader: h5py.File, origin: List[float], spacing: List[float], filter_args: dict, actions: nx.OutputActions, preflight_updated_values: List[nx.IFilter.PreflightValue]) -> Result: - segmentation_results_str: str = filter_args[ReadPeregrineHDF5File.SEGMENTATION_RESULTS_KEY] + read_segmentation_results: bool = filter_args[ReadPeregrineHDF5File.READ_SEGMENTATION_RESULTS_KEY] + segmentation_results_str: str = filter_args[ReadPeregrineHDF5File.SEGMENTATION_RESULTS_VALUES_KEY] read_camera_data: bool = filter_args[ReadPeregrineHDF5File.READ_CAMERA_DATA_KEY] read_part_ids: bool = filter_args[ReadPeregrineHDF5File.READ_PART_IDS_KEY] read_sample_ids: bool = filter_args[ReadPeregrineHDF5File.READ_SAMPLE_IDS_KEY] - read_slices_subvolume: bool = filter_args[ReadPeregrineHDF5File.READ_SLICES_SUBVOLUME_KEY] + read_slices_subvolume: bool = filter_args[ReadPeregrineHDF5File.ENABLE_SLICES_SUBVOLUME_KEY] slices_subvolume_minmax_x: list = filter_args[ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_X_KEY] slices_subvolume_minmax_y: list = filter_args[ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_Y_KEY] slices_subvolume_minmax_z: list = filter_args[ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_Z_KEY] slice_data_image_geom_path: nx.DataPath = filter_args[ReadPeregrineHDF5File.SLICE_DATA_KEY] slice_data_cell_attr_mat_name: str = filter_args[ReadPeregrineHDF5File.SLICE_DATA_CELL_ATTR_MAT_KEY] - camera_data_0_array_name: str = filter_args[ReadPeregrineHDF5File.CAMERA_DATA_0_ARRAY_NAME_KEY] - camera_data_1_array_name: str = filter_args[ReadPeregrineHDF5File.CAMERA_DATA_1_ARRAY_NAME_KEY] + camera_data_hdf5_parent_path: str = filter_args[ReadPeregrineHDF5File.CAMERA_DATA_HDF5_PARENT_PATH_KEY] + camera_data_datasets_str: str = filter_args[ReadPeregrineHDF5File.CAMERA_DATA_DATASETS_KEY] part_ids_array_name: str = filter_args[ReadPeregrineHDF5File.PART_IDS_ARRAY_NAME_KEY] sample_ids_array_name: str = filter_args[ReadPeregrineHDF5File.SAMPLE_IDS_ARRAY_NAME_KEY] - segmentation_results_str = segmentation_results_str.strip() - segmentation_results_list = segmentation_results_str.split(',') - if len(segmentation_results_list) == 0: - return Result(errors=[nx.Error(-3000, 'The segmentation results are empty. 
Please input the segmentation results dataset names that this filter should read from the input file, separated by commas.')]) + dims: List[int] = None - slice_dims_result: Result[List[int]] = self._read_slice_dimensions(h5_file_reader, segmentation_results_list) - if slice_dims_result.invalid(): - return slice_dims_result - - dims: List[int] = slice_dims_result.value + # Optionally read the segmentation results + if read_segmentation_results: + segmentation_results_str = segmentation_results_str.strip() + segmentation_results_list = segmentation_results_str.split(',') + if len(segmentation_results_list) == 0: + return Result(errors=[nx.Error(-3000, 'The segmentation results are empty. Please input the segmentation results dataset names that this filter should read from the input file, separated by commas.')]) + + slice_dims_result: Result[List[int]] = self._read_slice_dimensions(h5_file_reader, segmentation_results_list) + if slice_dims_result.invalid(): + return slice_dims_result + + dims = slice_dims_result.value + + # Optionally read the camera data + if read_camera_data: + camera_data_datasets_str = camera_data_datasets_str.strip() + camera_data_datasets = camera_data_datasets_str.split(',') + if len(camera_data_datasets) == 0: + return Result(errors=[nx.Error(-3001, 'The camera data datasets are empty. Please input the camera data dataset names that this filter should read from the input file, separated by commas.')]) + + for camera_data_dataset in camera_data_datasets: + camera_data_dataset_path = Path(camera_data_hdf5_parent_path) / camera_data_dataset + if dims is None: + dims_result: Result[List[int]] = self._read_dataset_dimensions(h5_file_reader, str(camera_data_dataset_path)) + if dims_result.invalid(): + return dims_result + dims = dims_result.value + else: + dims_result = self._validate_dataset_dimensions(h5_file_reader, str(camera_data_dataset_path), dims) + if dims_result.invalid(): + return Result(errors=dims_result.errors) + + # Optionally read the part ids dataset + if read_part_ids: + if dims is None: + dims_result: Result[List[int]] = self._read_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.PART_IDS_H5_PATH) + if dims_result.invalid(): + return dims_result + dims = dims_result.value + else: + validate_result = self._validate_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.PART_IDS_H5_PATH, dims) + if validate_result.invalid(): + return Result(errors=validate_result.errors) + + # Optionally read the sample ids dataset + if read_sample_ids: + if dims is None: + dims_result: Result[List[int]] = self._read_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.SAMPLE_IDS_H5_PATH) + if dims_result.invalid(): + return dims_result + dims = dims_result.value + else: + validate_result = self._validate_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.SAMPLE_IDS_H5_PATH, dims) + if validate_result.invalid(): + return Result(errors=validate_result.errors) - if read_slices_subvolume: + # Optionally get and validate subvolume dimensions + if dims is None: + preflight_value = nx.IFilter.PreflightValue() + preflight_value.name = "Original Slices Dimensions (in pixels)" + preflight_value.value = "No slice data has been selected to be read." 
+ preflight_updated_values.append(preflight_value) + elif read_slices_subvolume: slices_dims_str = ( f"Extents:\n" f"X Extent: 0 to {dims[2] - 1} (dimension: {dims[2]})\n" @@ -289,49 +362,38 @@ def _preflight_slice_datasets(self, h5_file_reader: h5py.File, origin: List[floa return Result(errors=result.errors) subvolume_dims = [slices_subvolume_minmax_z[1] - slices_subvolume_minmax_z[0] + 1, slices_subvolume_minmax_y[1] - slices_subvolume_minmax_y[0] + 1, slices_subvolume_minmax_x[1] - slices_subvolume_minmax_x[0] + 1] - actions.append_action(nx.CreateImageGeometryAction(slice_data_image_geom_path, subvolume_dims[::-1] if read_slices_subvolume else dims[::-1], origin, spacing, slice_data_cell_attr_mat_name)) + # Create the image geometry if there is data to import + if dims is not None: + actions.append_action(nx.CreateImageGeometryAction(slice_data_image_geom_path, subvolume_dims[::-1] if read_slices_subvolume else dims[::-1], origin, spacing, slice_data_cell_attr_mat_name)) - for segmentation_result in segmentation_results_list: - segmentation_result_path: nx.DataPath = slice_data_image_geom_path.create_child_path(slice_data_cell_attr_mat_name).create_child_path('Segmentation Result ' + segmentation_result) - actions.append_action(nx.CreateArrayAction(nx.DataType.uint8, subvolume_dims if read_slices_subvolume else dims, [1], segmentation_result_path)) + # Optionally create the segmentation results data arrays + if read_segmentation_results: + for segmentation_result in segmentation_results_list: + segmentation_result_path: nx.DataPath = slice_data_image_geom_path.create_child_path(slice_data_cell_attr_mat_name).create_child_path('Segmentation Result ' + segmentation_result) + actions.append_action(nx.CreateArrayAction(nx.DataType.uint8, subvolume_dims if read_slices_subvolume else dims, [1], segmentation_result_path)) + # Optionally create the camera data arrays if read_camera_data: - dims_result = self._validate_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.CAMERA_DATA_0_H5_PATH, dims) - if dims_result.invalid(): - return Result(errors=dims_result.errors) - - camera_data_0_path: nx.DataPath = slice_data_image_geom_path.create_child_path(slice_data_cell_attr_mat_name).create_child_path(camera_data_0_array_name) - actions.append_action(nx.CreateArrayAction(nx.DataType.float32, subvolume_dims if read_slices_subvolume else dims, [1], camera_data_0_path)) - - validate_result = self._validate_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.CAMERA_DATA_1_H5_PATH, dims) - if validate_result.invalid(): - return Result(errors=validate_result.errors) - - camera_data_1_path: nx.DataPath = slice_data_image_geom_path.create_child_path(slice_data_cell_attr_mat_name).create_child_path(camera_data_1_array_name) - actions.append_action(nx.CreateArrayAction(nx.DataType.float32, subvolume_dims if read_slices_subvolume else dims, [1], camera_data_1_path)) + for camera_data_dataset in camera_data_datasets: + camera_data_dataset_path: nx.DataPath = slice_data_image_geom_path.create_child_path(slice_data_cell_attr_mat_name).create_child_path(f"Camera Data {camera_data_dataset}") + actions.append_action(nx.CreateArrayAction(nx.DataType.float32, subvolume_dims if read_slices_subvolume else dims, [1], camera_data_dataset_path)) + # Optionally create the part ids data array if read_part_ids: - validate_result = self._validate_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.PART_IDS_H5_PATH, dims) - if validate_result.invalid(): - return Result(errors=validate_result.errors) - 
part_ids_path: nx.DataPath = slice_data_image_geom_path.create_child_path(slice_data_cell_attr_mat_name).create_child_path(part_ids_array_name) actions.append_action(nx.CreateArrayAction(nx.DataType.uint32, subvolume_dims if read_slices_subvolume else dims, [1], part_ids_path)) - + + # Optionally create the sample ids data array if read_sample_ids: - validate_result = self._validate_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.SAMPLE_IDS_H5_PATH, dims) - if validate_result.invalid(): - return Result(errors=validate_result.errors) - sample_ids_path: nx.DataPath = slice_data_image_geom_path.create_child_path(slice_data_cell_attr_mat_name).create_child_path(sample_ids_array_name) actions.append_action(nx.CreateArrayAction(nx.DataType.uint32, subvolume_dims if read_slices_subvolume else dims, [1], sample_ids_path)) return Result() - + def _preflight_registered_datasets(self, h5_file_reader: h5py.File, origin: List[float], spacing: List[float], filter_args: dict, actions: nx.OutputActions, preflight_updated_values: List[nx.IFilter.PreflightValue]) -> Result: registered_data_image_geom_path: nx.DataPath = filter_args[ReadPeregrineHDF5File.REGISTERED_DATA_KEY] registered_data_cell_attr_mat_name: str = filter_args[ReadPeregrineHDF5File.REGISTERED_DATA_CELL_ATTR_MAT_KEY] - read_registered_data_subvolume: bool = filter_args[ReadPeregrineHDF5File.READ_REGISTERED_DATA_SUBVOLUME_KEY] + read_registered_data_subvolume: bool = filter_args[ReadPeregrineHDF5File.ENABLE_REGISTERED_DATA_SUBVOLUME_KEY] registered_data_subvolume_minmax_x: list = filter_args[ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_X_KEY] registered_data_subvolume_minmax_y: list = filter_args[ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_Y_KEY] registered_data_subvolume_minmax_z: list = filter_args[ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_Z_KEY] @@ -340,16 +402,31 @@ def _preflight_registered_datasets(self, h5_file_reader: h5py.File, origin: List read_x_ray_ct: bool = filter_args[ReadPeregrineHDF5File.READ_X_RAY_CT_KEY] xray_ct_array_name: str = filter_args[ReadPeregrineHDF5File.XRAY_CT_ARRAY_NAME_KEY] - anomaly_detection_dims_result: Result[Tuple[int]] = self._read_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.REGISTERED_ANOMALY_DETECTION_H5_PATH) - if anomaly_detection_dims_result.invalid(): - return Result(errors=anomaly_detection_dims_result.errors) - registered_dims: List[int] = anomaly_detection_dims_result.value + registered_dims: List[int] = None - xray_ct_dims_validation_result: Result = self._validate_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.REGISTERED_XRAY_CT_H5_PATH, registered_dims) - if xray_ct_dims_validation_result.invalid(): - return Result(errors=xray_ct_dims_validation_result.errors) + if read_anomaly_detection: + registered_dims_result: Result[Tuple[int]] = self._read_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.REGISTERED_ANOMALY_DETECTION_H5_PATH) + if registered_dims_result.invalid(): + return Result(errors=registered_dims_result.errors) + registered_dims: List[int] = registered_dims_result.value - if read_registered_data_subvolume: + if read_x_ray_ct: + if registered_dims is None: + registered_dims_result: Result[Tuple[int]] = self._read_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.REGISTERED_XRAY_CT_H5_PATH) + if registered_dims_result.invalid(): + return Result(errors=registered_dims_result.errors) + registered_dims: List[int] = registered_dims_result.value + else: + xray_ct_dims_validation_result: Result = 
self._validate_dataset_dimensions(h5_file_reader, ReadPeregrineHDF5File.REGISTERED_XRAY_CT_H5_PATH, registered_dims) + if xray_ct_dims_validation_result.invalid(): + return Result(errors=xray_ct_dims_validation_result.errors) + + if registered_dims is None: + preflight_value = nx.IFilter.PreflightValue() + preflight_value.name = "Original Registered Data Dimensions (in pixels)" + preflight_value.value = "No registered data has been selected to be read." + preflight_updated_values.append(preflight_value) + elif read_registered_data_subvolume: registered_dims_str = ( f"Extents:\n" f"X Extent: 0 to {registered_dims[2] - 1} (dimension: {registered_dims[2]})\n" @@ -369,7 +446,8 @@ def _preflight_registered_datasets(self, h5_file_reader: h5py.File, origin: List registered_dims = [registered_data_subvolume_minmax_z[1] - registered_data_subvolume_minmax_z[0] + 1, registered_data_subvolume_minmax_y[1] - registered_data_subvolume_minmax_y[0] + 1, registered_data_subvolume_minmax_x[1] - registered_data_subvolume_minmax_x[0] + 1] - actions.append_action(nx.CreateImageGeometryAction(registered_data_image_geom_path, registered_dims[::-1], origin, spacing, registered_data_cell_attr_mat_name)) + if registered_dims is not None: + actions.append_action(nx.CreateImageGeometryAction(registered_data_image_geom_path, registered_dims[::-1], origin, spacing, registered_data_cell_attr_mat_name)) if read_anomaly_detection: anomaly_detection_path: nx.DataPath = registered_data_image_geom_path.create_child_path(registered_data_cell_attr_mat_name).create_child_path(anomaly_detection_array_name) @@ -382,15 +460,23 @@ def _preflight_registered_datasets(self, h5_file_reader: h5py.File, origin: List return Result() def _preflight_scan_datasets(self, h5_file_reader: h5py.File, filter_args: dict, actions: nx.OutputActions, preflight_updated_values: List[nx.IFilter.PreflightValue]) -> Result: + read_scan_datasets: bool = filter_args[ReadPeregrineHDF5File.READ_SCAN_DATASETS_KEY] scan_data_edge_geom_path: nx.DataPath = filter_args[ReadPeregrineHDF5File.SCAN_DATA_KEY] scan_data_vertex_attr_mat_name: str = filter_args[ReadPeregrineHDF5File.SCAN_DATA_VERTEX_ATTR_MAT_KEY] scan_data_edge_attr_mat_name: str = filter_args[ReadPeregrineHDF5File.SCAN_DATA_CELL_ATTR_MAT_KEY] vertex_list_array_name: str = filter_args[ReadPeregrineHDF5File.SCAN_DATA_VERTEX_LIST_NAME_KEY] edge_list_array_name: str = filter_args[ReadPeregrineHDF5File.SCAN_DATA_EDGE_LIST_NAME_KEY] time_of_travel_array_name: str = filter_args[ReadPeregrineHDF5File.TIME_OF_TRAVEL_ARRAY_NAME] - read_scan_data_subvolume: bool = filter_args[ReadPeregrineHDF5File.READ_SCAN_DATA_SUBVOLUME_KEY] + read_scan_data_subvolume: bool = filter_args[ReadPeregrineHDF5File.ENABLE_SCAN_DATA_SUBVOLUME_KEY] scan_data_subvolume_minmax: list = filter_args[ReadPeregrineHDF5File.SCAN_DATA_SUBVOLUME_MINMAX_KEY] + if not read_scan_datasets: + preflight_value = nx.IFilter.PreflightValue() + preflight_value.name = "Available Scans" + preflight_value.value = "The scan datasets have not been selected to be read." 
+ preflight_updated_values.append(preflight_value) + return Result() + result: Result[h5py.Group] = self._open_hdf5_data_object(h5_file_reader, ReadPeregrineHDF5File.SCANS_GROUP_H5_PATH) if result.invalid(): return Result(errors=result.errors) @@ -399,7 +485,7 @@ def _preflight_scan_datasets(self, h5_file_reader: h5py.File, filter_args: dict, try: num_scans: int = sum(1 for item in group_reader.values() if isinstance(item, h5py.Dataset)) except Exception as e: - return make_error_result(code=-4032, message=f"Error counting objects at path '{ReadPeregrineHDF5File.SCANS_GROUP_H5_PATH}' in HDF5 file '{h5_file_reader.name}': {e}") + return make_error_result(code=-4032, message=f"Error counting objects at path '{ReadPeregrineHDF5File.SCANS_GROUP_H5_PATH}' in HDF5 file '{h5_file_reader.filename}': {e}") if scan_data_subvolume_minmax[1] > num_scans - 1: return make_error_result(code=-4033, message=f"The scan data subvolume maximum value ({scan_data_subvolume_minmax[1]}) cannot be larger than the largest scan number ({num_scans - 1}).") @@ -428,7 +514,7 @@ def _preflight_scan_datasets(self, h5_file_reader: h5py.File, filter_args: dict, return Result(errors=scan_dims_result.errors) scan_dims: List[int] = scan_dims_result.value if len(scan_dims) != 2: - return make_error_result(code=-4035, message=f"Scan dataset at path '{ReadPeregrineHDF5File.SCANS_GROUP_H5_PATH}' in HDF5 file '{h5_file_reader.name}' MUST have 2 dimensions, but instead it has {len(scan_dims)} dimensions.") + return make_error_result(code=-4035, message=f"Scan dataset at path '{ReadPeregrineHDF5File.SCANS_GROUP_H5_PATH}' in HDF5 file '{h5_file_reader.filename}' MUST have 2 dimensions, but instead it has {len(scan_dims)} dimensions.") num_edges += scan_dims[0] actions.append_action(nx.CreateEdgeGeometryAction(scan_data_edge_geom_path, num_edges, 0, scan_data_vertex_attr_mat_name, scan_data_edge_attr_mat_name, vertex_list_array_name, edge_list_array_name)) @@ -440,14 +526,14 @@ def _preflight_scan_datasets(self, h5_file_reader: h5py.File, filter_args: dict, def _open_hdf5_data_object(self, h5_file_reader: h5py.File, h5_dataset_path: str) -> Result: if h5_dataset_path not in h5_file_reader: - return make_error_result(code=-4001, message=f"Error opening object at path '{h5_dataset_path}' in HDF5 file '{h5_file_reader.name}': Object does not exist!") + return make_error_result(code=-4001, message=f"Error opening object at path '{h5_dataset_path}' in HDF5 file '{h5_file_reader.filename}': Object does not exist!") try: dataset = h5_file_reader[h5_dataset_path] except KeyError as e: - return make_error_result(code=-4002, message=f"Error opening object at path '{h5_dataset_path}' in HDF5 file '{h5_file_reader.name}': {e}") + return make_error_result(code=-4002, message=f"Error opening object at path '{h5_dataset_path}' in HDF5 file '{h5_file_reader.filename}': {e}") except Exception as e: - return make_error_result(code=-4003, message=f"Error opening object at path '{h5_dataset_path}' in HDF5 file '{h5_file_reader.name}': {e}") + return make_error_result(code=-4003, message=f"Error opening object at path '{h5_dataset_path}' in HDF5 file '{h5_file_reader.filename}': {e}") return Result(value=dataset) @@ -462,7 +548,7 @@ def _read_dataset_dimensions(self, h5_file_reader: h5py.File, h5_dataset_path: s def _validate_dataset_dimensions(self, h5_file_reader: h5py.File, h5_dataset_path: str, sliceDims: List[int]) -> Result: dims_result = self._read_dataset_dimensions(h5_file_reader, h5_dataset_path) if dims_result.invalid(): - return 
nx.IFilter.PreflightResult(nx.OutputActions(), dims_result.errors) + return Result(errors=dims_result.errors) dims = dims_result.value if dims != sliceDims: @@ -555,78 +641,68 @@ def _calculate_spacing(self, h5_file_reader: h5py.File) -> Result[List[float]]: return Result(value=spacing) def _read_slice_datasets(self, h5_file_reader: h5py.File, data_structure: nx.DataStructure, filter_args: dict, message_handler: nx.IFilter.MessageHandler, should_cancel: nx.AtomicBoolProxy) -> Result: - segmentation_results_str: str = filter_args[ReadPeregrineHDF5File.SEGMENTATION_RESULTS_KEY] + read_segmentation_results: bool = filter_args[ReadPeregrineHDF5File.READ_SEGMENTATION_RESULTS_KEY] + segmentation_results_str: str = filter_args[ReadPeregrineHDF5File.SEGMENTATION_RESULTS_VALUES_KEY] read_camera_data: bool = filter_args[ReadPeregrineHDF5File.READ_CAMERA_DATA_KEY] read_part_ids: bool = filter_args[ReadPeregrineHDF5File.READ_PART_IDS_KEY] read_sample_ids: bool = filter_args[ReadPeregrineHDF5File.READ_SAMPLE_IDS_KEY] - read_slices_subvolume: bool = filter_args[ReadPeregrineHDF5File.READ_SLICES_SUBVOLUME_KEY] + read_slices_subvolume: bool = filter_args[ReadPeregrineHDF5File.ENABLE_SLICES_SUBVOLUME_KEY] slices_subvolume_minmax_x: list = filter_args[ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_X_KEY] slices_subvolume_minmax_y: list = filter_args[ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_Y_KEY] slices_subvolume_minmax_z: list = filter_args[ReadPeregrineHDF5File.SLICES_SUBVOLUME_MINMAX_Z_KEY] slice_data_image_geom_path: nx.DataPath = filter_args[ReadPeregrineHDF5File.SLICE_DATA_KEY] slice_data_cell_attr_mat_name: str = filter_args[ReadPeregrineHDF5File.SLICE_DATA_CELL_ATTR_MAT_KEY] - camera_data_0_array_name: str = filter_args[ReadPeregrineHDF5File.CAMERA_DATA_0_ARRAY_NAME_KEY] - camera_data_1_array_name: str = filter_args[ReadPeregrineHDF5File.CAMERA_DATA_1_ARRAY_NAME_KEY] + camera_data_hdf5_parent_path: str = filter_args[ReadPeregrineHDF5File.CAMERA_DATA_HDF5_PARENT_PATH_KEY] + camera_data_datasets_str: str = filter_args[ReadPeregrineHDF5File.CAMERA_DATA_DATASETS_KEY] part_ids_array_name: str = filter_args[ReadPeregrineHDF5File.PART_IDS_ARRAY_NAME_KEY] sample_ids_array_name: str = filter_args[ReadPeregrineHDF5File.SAMPLE_IDS_ARRAY_NAME_KEY] # Read the segmentation results - segmentation_results_list: list[str] = segmentation_results_str.split(',') - for i in range(len(segmentation_results_list)): - if should_cancel: - return Result() - - segmentation_result = segmentation_results_list[i] - message_handler(nx.IFilter.Message(nx.IFilter.Message.Type.Info, f"Reading Segmentation Result '{segmentation_result}' ({i + 1}/{len(segmentation_results_list)})...")) - segmentation_result_nx_path = slice_data_image_geom_path.create_child_path(slice_data_cell_attr_mat_name).create_child_path('Segmentation Result ' + segmentation_result) - segmentation_result_nx = data_structure[segmentation_result_nx_path].npview() - segmentation_result_nx = np.squeeze(segmentation_result_nx) - segmentation_result_h5_path = Path(ReadPeregrineHDF5File.SEGMENTATION_RESULTS_H5_PARENT_PATH) / segmentation_result - segmentation_result_h5_result: Result[h5py.Dataset] = self._open_hdf5_data_object(h5_file_reader, str(segmentation_result_h5_path)) - if segmentation_result_h5_result.invalid(): - return segmentation_result_h5_result - segmentation_result_h5 = segmentation_result_h5_result.value - - if read_slices_subvolume: - segmentation_result_nx[:] = segmentation_result_h5[slices_subvolume_minmax_z[0]:slices_subvolume_minmax_z[1] + 1, 
slices_subvolume_minmax_y[0]:slices_subvolume_minmax_y[1] + 1, slices_subvolume_minmax_x[0]:slices_subvolume_minmax_x[1] + 1] - else: - segmentation_result_nx[:] = segmentation_result_h5 + if read_segmentation_results: + segmentation_results_list: list[str] = segmentation_results_str.split(',') + for i in range(len(segmentation_results_list)): + if should_cancel: + return Result() + + segmentation_result = segmentation_results_list[i] + message_handler(nx.IFilter.Message(nx.IFilter.Message.Type.Info, f"Reading Segmentation Result '{segmentation_result}' ({i + 1}/{len(segmentation_results_list)})...")) + segmentation_result_nx_path = slice_data_image_geom_path.create_child_path(slice_data_cell_attr_mat_name).create_child_path('Segmentation Result ' + segmentation_result) + segmentation_result_nx = data_structure[segmentation_result_nx_path].npview() + segmentation_result_nx = np.squeeze(segmentation_result_nx) + segmentation_result_h5_path = Path(ReadPeregrineHDF5File.SEGMENTATION_RESULTS_H5_PARENT_PATH) / segmentation_result + segmentation_result_h5_result: Result[h5py.Dataset] = self._open_hdf5_data_object(h5_file_reader, str(segmentation_result_h5_path)) + if segmentation_result_h5_result.invalid(): + return segmentation_result_h5_result + segmentation_result_h5 = segmentation_result_h5_result.value + + if read_slices_subvolume: + segmentation_result_nx[:] = segmentation_result_h5[slices_subvolume_minmax_z[0]:slices_subvolume_minmax_z[1] + 1, slices_subvolume_minmax_y[0]:slices_subvolume_minmax_y[1] + 1, slices_subvolume_minmax_x[0]:slices_subvolume_minmax_x[1] + 1] + else: + segmentation_result_nx[:] = segmentation_result_h5 # Read the camera data if read_camera_data: if should_cancel: return Result() - message_handler(nx.IFilter.Message(nx.IFilter.Message.Type.Info, 'Reading Camera Dataset 0...')) - camera_data_0_nx_path: nx.DataPath = slice_data_image_geom_path.create_child_path(slice_data_cell_attr_mat_name).create_child_path(camera_data_0_array_name) - camera_data_0_h5_result: Result[h5py.Dataset] = self._open_hdf5_data_object(h5_file_reader, ReadPeregrineHDF5File.CAMERA_DATA_0_H5_PATH) - if camera_data_0_h5_result.invalid(): - return Result(errors=camera_data_0_h5_result.errors) - camera_data_0_h5 = camera_data_0_h5_result.value - camera_data_0_nx: np.array = data_structure[camera_data_0_nx_path].npview() - camera_data_0_nx = np.squeeze(camera_data_0_nx) - - if read_slices_subvolume: - camera_data_0_nx[:] = camera_data_0_h5[slices_subvolume_minmax_z[0]:slices_subvolume_minmax_z[1]+1, slices_subvolume_minmax_y[0]:slices_subvolume_minmax_y[1]+1, slices_subvolume_minmax_x[0]:slices_subvolume_minmax_x[1]+1] - else: - camera_data_0_nx[:] = camera_data_0_h5 - - if should_cancel: - return Result() - - message_handler(nx.IFilter.Message(nx.IFilter.Message.Type.Info, 'Reading Camera Dataset 1...')) - camera_data_1_nx_path: nx.DataPath = slice_data_image_geom_path.create_child_path(slice_data_cell_attr_mat_name).create_child_path(camera_data_1_array_name) - camera_data_1_h5_result: Result[h5py.Dataset] = self._open_hdf5_data_object(h5_file_reader, ReadPeregrineHDF5File.CAMERA_DATA_1_H5_PATH) - if camera_data_1_h5_result.invalid(): - return Result(errors=camera_data_1_h5_result.errors) - camera_data_1_h5 = camera_data_1_h5_result.value - camera_data_1_nx: np.array = data_structure[camera_data_1_nx_path].npview() - camera_data_1_nx = np.squeeze(camera_data_1_nx) - - if read_slices_subvolume: - camera_data_1_nx[:] = camera_data_1_h5[slices_subvolume_minmax_z[0]:slices_subvolume_minmax_z[1]+1, 
slices_subvolume_minmax_y[0]:slices_subvolume_minmax_y[1]+1, slices_subvolume_minmax_x[0]:slices_subvolume_minmax_x[1]+1] - else: - camera_data_1_nx[:] = camera_data_1_h5 + for camera_data_dataset in camera_data_datasets_str.split(','): + if should_cancel: + return Result() + + message_handler(nx.IFilter.Message(nx.IFilter.Message.Type.Info, f'Reading Camera Dataset "{camera_data_dataset}"...')) + camera_data_nx_path: nx.DataPath = slice_data_image_geom_path.create_child_path(slice_data_cell_attr_mat_name).create_child_path(f"Camera Data {camera_data_dataset}") + camera_data_h5_path: Path = Path(camera_data_hdf5_parent_path) / camera_data_dataset + camera_data_h5_result: Result[h5py.Dataset] = self._open_hdf5_data_object(h5_file_reader, str(camera_data_h5_path)) + if camera_data_h5_result.invalid(): + return Result(errors=camera_data_h5_result.errors) + camera_data_h5 = camera_data_h5_result.value + camera_data_nx: np.array = data_structure[camera_data_nx_path].npview() + camera_data_nx = np.squeeze(camera_data_nx) + + if read_slices_subvolume: + camera_data_nx[:] = camera_data_h5[slices_subvolume_minmax_z[0]:slices_subvolume_minmax_z[1]+1, slices_subvolume_minmax_y[0]:slices_subvolume_minmax_y[1]+1, slices_subvolume_minmax_x[0]:slices_subvolume_minmax_x[1]+1] + else: + camera_data_nx[:] = camera_data_h5 if read_part_ids: message_handler(nx.IFilter.Message(nx.IFilter.Message.Type.Info, 'Reading Part Ids...')) @@ -663,7 +739,7 @@ def _read_slice_datasets(self, h5_file_reader: h5py.File, data_structure: nx.Dat def _read_registered_datasets(self, h5_file_reader: h5py.File, data_structure: nx.DataStructure, filter_args: dict, message_handler: nx.IFilter.MessageHandler, should_cancel: nx.AtomicBoolProxy) -> Result: registered_data_image_geom_path: nx.DataPath = filter_args[ReadPeregrineHDF5File.REGISTERED_DATA_KEY] registered_data_cell_attr_mat_name: str = filter_args[ReadPeregrineHDF5File.REGISTERED_DATA_CELL_ATTR_MAT_KEY] - read_registered_data_subvolume: bool = filter_args[ReadPeregrineHDF5File.READ_REGISTERED_DATA_SUBVOLUME_KEY] + read_registered_data_subvolume: bool = filter_args[ReadPeregrineHDF5File.ENABLE_REGISTERED_DATA_SUBVOLUME_KEY] registered_data_subvolume_minmax_x: list = filter_args[ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_X_KEY] registered_data_subvolume_minmax_y: list = filter_args[ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_Y_KEY] registered_data_subvolume_minmax_z: list = filter_args[ReadPeregrineHDF5File.REGISTERED_DATA_SUBVOLUME_MINMAX_Z_KEY] @@ -743,13 +819,14 @@ def _read_scan_data(self, h5_file_reader: h5py.File, scan_path: str, z_offset: i return Result(value=(vertices,edges,tot)) def _read_scan_datasets(self, h5_file_reader: h5py.File, data_structure: nx.DataStructure, filter_args: dict, message_handler: nx.IFilter.MessageHandler, should_cancel: nx.AtomicBoolProxy) -> Result: + read_scan_datasets: bool = filter_args[ReadPeregrineHDF5File.READ_SCAN_DATASETS_KEY] scan_data_edge_geom_path: nx.DataPath = filter_args[ReadPeregrineHDF5File.SCAN_DATA_KEY] scan_data_vertex_attr_mat_name: str = filter_args[ReadPeregrineHDF5File.SCAN_DATA_VERTEX_ATTR_MAT_KEY] scan_data_edge_attr_mat_name: str = filter_args[ReadPeregrineHDF5File.SCAN_DATA_CELL_ATTR_MAT_KEY] vertex_list_array_name: str = filter_args[ReadPeregrineHDF5File.SCAN_DATA_VERTEX_LIST_NAME_KEY] edge_list_array_name: str = filter_args[ReadPeregrineHDF5File.SCAN_DATA_EDGE_LIST_NAME_KEY] time_of_travel_array_name: str = filter_args[ReadPeregrineHDF5File.TIME_OF_TRAVEL_ARRAY_NAME] - read_scan_data_subvolume: bool 
= filter_args[ReadPeregrineHDF5File.READ_SCAN_DATA_SUBVOLUME_KEY] + read_scan_data_subvolume: bool = filter_args[ReadPeregrineHDF5File.ENABLE_SCAN_DATA_SUBVOLUME_KEY] scan_data_subvolume_minmax: list = filter_args[ReadPeregrineHDF5File.SCAN_DATA_SUBVOLUME_MINMAX_KEY] vertex_attr_mat_path: nx.DataPath = scan_data_edge_geom_path.create_child_path(scan_data_vertex_attr_mat_name) @@ -763,75 +840,77 @@ def _read_scan_datasets(self, h5_file_reader: h5py.File, data_structure: nx.Data time_of_travel_path: nx.DataPath = scan_data_edge_geom_path.create_child_path(scan_data_edge_attr_mat_name).create_child_path(time_of_travel_array_name) time_of_travel_array: nx.Float32Array = data_structure[time_of_travel_path] - # Resize the vertex attribute matrix and vertex list to the estimated size - number_of_tuples: int = 1 - for tdim in edge_list.tdims: - number_of_tuples *= tdim - vertex_attr_mat.resize_tuples([number_of_tuples * 2]) - vertex_list.resize_tuples([number_of_tuples * 2]) - - # Read scan datasets - result: Result[h5py.Group] = self._open_hdf5_data_object(h5_file_reader, ReadPeregrineHDF5File.SCANS_GROUP_H5_PATH) - if result.invalid(): - return Result(errors=result.errors) - scan_group_reader: h5py.Group = result.value - - # Read the Z thickness value - if ReadPeregrineHDF5File.LAYER_THICKNESS_PATH not in h5_file_reader.attrs: - return make_error_result(code=-3007, message=f"Attribute at path '{ReadPeregrineHDF5File.LAYER_THICKNESS_PATH}' does not exist in HDF5 file '{h5_file_reader.name}', so the scan datasets cannot be read!") - try: - z_thickness: float = h5_file_reader.attrs[ReadPeregrineHDF5File.LAYER_THICKNESS_PATH] - except Exception as e: - return make_error_result(code=-3008, message=f"Attribute at path '{ReadPeregrineHDF5File.LAYER_THICKNESS_PATH}' cannot be accessed in HDF5 file '{h5_file_reader.name}', so the scan datasets cannot be read!\n\n{e}") - - # Calculate the start and end values for the scans - z_start: int = 0 - - try: - z_end: int = sum(1 for item in scan_group_reader.values() if isinstance(item, h5py.Dataset)) - except Exception as e: - return make_error_result(code=-4032, message=f"Error counting objects at path '{ReadPeregrineHDF5File.SCANS_GROUP_H5_PATH}' in HDF5 file '{h5_file_reader.name}': {e}") - - if read_scan_data_subvolume: - z_start = scan_data_subvolume_minmax[0] - z_end = scan_data_subvolume_minmax[1] - - # This section loops over each scan, reads the scan data as vertices and edges, eliminates any duplicate vertices, and then copies the data into the edge geometry. 
-    edge_tuple_offset = 0
-    vertex_tuple_offset = 0
-    vertex_list_view = np.squeeze(vertex_list.npview())
-    edge_list_view = np.squeeze(edge_list.npview())
-    time_of_travel_array_view = np.squeeze(time_of_travel_array.npview())
-    for z in range(z_start, z_end + 1):
-      if should_cancel:
-        return Result()
+    # Read the scan datasets
+    if read_scan_datasets:
+      # Resize the vertex attribute matrix and vertex list to the estimated size
+      number_of_tuples: int = 1
+      for tdim in edge_list.tdims:
+        number_of_tuples *= tdim
+      vertex_attr_mat.resize_tuples([number_of_tuples * 2])
+      vertex_list.resize_tuples([number_of_tuples * 2])
+
+      # Read scan datasets
+      result: Result[h5py.Group] = self._open_hdf5_data_object(h5_file_reader, ReadPeregrineHDF5File.SCANS_GROUP_H5_PATH)
+      if result.invalid():
+        return Result(errors=result.errors)
+      scan_group_reader: h5py.Group = result.value
-      # Read the scan data into memory as vertices and edges
-      scan_path = Path(ReadPeregrineHDF5File.SCANS_GROUP_H5_PATH) / str(z)
-      message_handler(nx.IFilter.Message(nx.IFilter.Message.Type.Info, f"Reading Scan Dataset '{str(scan_path)}' ({z - z_start + 1}/{z_end - z_start + 1})..."))
-      scan_data_result: Result[Tuple[np.array, np.array, np.array]] = self._read_scan_data(h5_file_reader, str(scan_path), z * z_thickness)
-      if scan_data_result.invalid():
-        return scan_data_result
-      vertices, edges, tot = scan_data_result.value
-
-      # Copy the vertices into the edge geometry
-      v_end = vertex_tuple_offset + vertices.shape[0]
-      vertex_list_view[vertex_tuple_offset:v_end, :] = vertices
-
-      # Update edges values to match the actual vertices indices
-      edges += vertex_tuple_offset
-
-      # Copy the edges and time of travel into the edge geometry
-      e_end = edge_tuple_offset + edges.shape[0]
-      edge_list_view[edge_tuple_offset:e_end, :] = edges
-      time_of_travel_array_view[edge_tuple_offset:e_end] = tot
-
-      edge_tuple_offset += edges.shape[0]
-      vertex_tuple_offset += vertices.shape[0]
-
-    # Resize the vertex attribute matrix and vertex list to the actual size.
-    # This needs to be done because duplicate vertices may have been removed.
-    vertex_attr_mat.resize_tuples([vertex_tuple_offset])
-    vertex_list.resize_tuples([vertex_tuple_offset])
+      # Read the Z thickness value
+      if ReadPeregrineHDF5File.LAYER_THICKNESS_PATH not in h5_file_reader.attrs:
+        return make_error_result(code=-3007, message=f"Attribute at path '{ReadPeregrineHDF5File.LAYER_THICKNESS_PATH}' does not exist in HDF5 file '{h5_file_reader.filename}', so the scan datasets cannot be read!")
+      try:
+        z_thickness: float = h5_file_reader.attrs[ReadPeregrineHDF5File.LAYER_THICKNESS_PATH]
+      except Exception as e:
+        return make_error_result(code=-3008, message=f"Attribute at path '{ReadPeregrineHDF5File.LAYER_THICKNESS_PATH}' cannot be accessed in HDF5 file '{h5_file_reader.filename}', so the scan datasets cannot be read!\n\n{e}")
+
+      # Calculate the start and end values for the scans
+      z_start: int = 0
+
+      try:
+        z_end: int = sum(1 for item in scan_group_reader.values() if isinstance(item, h5py.Dataset))
+      except Exception as e:
+        return make_error_result(code=-4032, message=f"Error counting objects at path '{ReadPeregrineHDF5File.SCANS_GROUP_H5_PATH}' in HDF5 file '{h5_file_reader.filename}': {e}")
+
+      if read_scan_data_subvolume:
+        z_start = scan_data_subvolume_minmax[0]
+        z_end = scan_data_subvolume_minmax[1]
+
+      # This section loops over each scan, reads the scan data as vertices and edges, eliminates any duplicate vertices, and then copies the data into the edge geometry.
+      edge_tuple_offset = 0
+      vertex_tuple_offset = 0
+      vertex_list_view = np.squeeze(vertex_list.npview())
+      edge_list_view = np.squeeze(edge_list.npview())
+      time_of_travel_array_view = np.squeeze(time_of_travel_array.npview())
+      for z in range(z_start, z_end + 1):
+        if should_cancel:
+          return Result()
+
+        # Read the scan data into memory as vertices and edges
+        scan_path = Path(ReadPeregrineHDF5File.SCANS_GROUP_H5_PATH) / str(z)
+        message_handler(nx.IFilter.Message(nx.IFilter.Message.Type.Info, f"Reading Scan Dataset '{str(scan_path)}' ({z - z_start + 1}/{z_end - z_start + 1})..."))
+        scan_data_result: Result[Tuple[np.array, np.array, np.array]] = self._read_scan_data(h5_file_reader, str(scan_path), z * z_thickness)
+        if scan_data_result.invalid():
+          return scan_data_result
+        vertices, edges, tot = scan_data_result.value
+
+        # Copy the vertices into the edge geometry
+        v_end = vertex_tuple_offset + vertices.shape[0]
+        vertex_list_view[vertex_tuple_offset:v_end, :] = vertices
+
+        # Update edge values to match the actual vertex indices
+        edges += vertex_tuple_offset
+
+        # Copy the edges and time of travel into the edge geometry
+        e_end = edge_tuple_offset + edges.shape[0]
+        edge_list_view[edge_tuple_offset:e_end, :] = edges
+        time_of_travel_array_view[edge_tuple_offset:e_end] = tot
+
+        edge_tuple_offset += edges.shape[0]
+        vertex_tuple_offset += vertices.shape[0]
+
+      # Resize the vertex attribute matrix and vertex list to the actual size.
+      # This needs to be done because duplicate vertices may have been removed.
+      vertex_attr_mat.resize_tuples([vertex_tuple_offset])
+      vertex_list.resize_tuples([vertex_tuple_offset])
 
     return Result()
\ No newline at end of file
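
Reviewer note on the bounds convention used throughout this patch: the subvolume parameters (`Slices Subvolume X/Y/Z Bounds`, the registered data bounds, and `Scan Data Slice Bounds`) hold inclusive `[min, max]` index pairs, while NumPy/h5py slicing is exclusive at the upper end, hence the recurring `+1` in expressions like `camera_data_h5[minmax_z[0]:minmax_z[1]+1, ...]`. A minimal, self-contained sketch of that pattern follows; the file name, dataset path, and bounds values are hypothetical and not taken from the Peregrine layout:

```python
import h5py
import numpy as np

# Build a small stand-in file; "slices/camera_data/0" is a hypothetical path.
with h5py.File("example.h5", "w") as f:
    f.create_dataset("slices/camera_data/0", data=np.arange(4 * 5 * 6).reshape(4, 5, 6))

# Inclusive [min, max] bounds, mirroring the filter's subvolume arguments.
minmax_z, minmax_y, minmax_x = (1, 2), (0, 3), (2, 5)

with h5py.File("example.h5", "r") as f:
    camera_data_h5 = f["slices/camera_data/0"]
    # The +1 turns each inclusive max into Python's exclusive slice end,
    # matching the slicing expressions in the patch above.
    subvolume = camera_data_h5[minmax_z[0]:minmax_z[1] + 1,
                               minmax_y[0]:minmax_y[1] + 1,
                               minmax_x[0]:minmax_x[1] + 1]
    assert subvolume.shape == (2, 4, 4)  # (z, y, x) extents: max - min + 1
```

The same convention explains `range(z_start, z_end + 1)` in the scan loop when a subvolume is enabled, since `scan_data_subvolume_minmax` stores the last slice index inclusively.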