Use hdf5 or nexus file in XRD #113
Conversation
@hampusnasstrom @aalbino2 I merged the implementation of the `HDF5Handler` and support for … The Plotly plots are removed in favor of the plots from H5Web. @budschi's current viewpoint is that Plotly plots have better visualizations, and it might be a good idea to preserve them for 1D scans. This can be a point of discussion when we review this PR after the vacations. @RubelMozumder will soon merge his implementations from #147, which will allow to use …
@RubelMozumder I have combined the common functionality from …
TODO
Have you checked what the root cause of the issue is?
@TLCFEM I wasn't able to investigate it yet. But this will be among the first things I do in the new year, and I will reach out to you with my findings. Happy Holidays!
If that is not the case, then all the discussions above are no longer valid.
Let me explain the situation to lay bare the scenario. Issue: in the second attempt at reprocessing the entire upload (… Temporary solution: …
@RubelMozumder what prevents you from checking for the existence of the .nxs file and creating a new one only in the case it doesn't exist yet?
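A minimal sketch of that suggestion, assuming a plain filesystem path for the auxiliary file (the helper name and path handling are hypothetical, not the PR's actual API):

```python
import os

import h5py


def get_or_create_nexus_file(path: str) -> h5py.File:
    """Hypothetical helper: reuse an existing .nxs file instead of
    regenerating it on every reprocessing pass."""
    if os.path.exists(path):
        # file left by a previous normalization; open for read/write
        return h5py.File(path, 'r+')
    # no file yet; create a fresh one
    return h5py.File(path, 'w')
```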
* Implement write nexus section based on the populated nomad archive
* app def missing.
* mapping nomad_measurement.
* All concepts are connected, creates nexus file and subsection.
* adding links in hdf5 file.
* Remove the nxs file.
* back to the previous design.
* Include pynxtools plugins in nomad.yaml and extend dependencies, including pynxtools and pynxtools-xrd.
* PR review correction.
* Remove the entry_type overwritten.
* Remove comments.
* Replace __str__ function.
* RUFF
* Update pyproject.toml (Co-authored-by: Sarthak Kapoor <[email protected]>)
* Update src/nomad_measurements/xrd/schema.py (Co-authored-by: Sarthak Kapoor <[email protected]>)
* Update src/nomad_measurements/xrd/nx.py
* Replace Try-block.

---------

Co-authored-by: Sarthak Kapoor <[email protected]>
Co-authored-by: Sarthak Kapoor <[email protected]>
* updated plugin structure
* added pynxtools dependency
* Apply suggestions from code review (Co-authored-by: Sarthak Kapoor <[email protected]>, Hampus Näsström <[email protected]>)
* Add sections for RSM and 1D which use HDF5 references
* Abstract out data interaction using setter and getter; allows using the same methods for classes with hdf5 refs
* Use arrays, not references, in the `archive.results` section
* Lock the state for using nexus file and corresponding references
* Populate results without references
* Make a general reader for raw files
* Remove nexus flags
* Add quantity for auxiliary file
* Fix rebase
* Make integration_time an hdf5reference
* Reset results (refactor)
* Add backward compatibility
* Refactor reader
* add missing imports
* AttrDict class
* Make concept map global
* Add function to remove nexus annotations in concept map
* Move try block inside walk_through_object
* Fix imports
* Add methods for generating hdf5 file
* Rename auxiliary file
* Expect aux file to be .nxs in the beginning
* Add attributes for hdf5: data_dict, dataset_paths
* Method for adding a quantity to hdf5_data_dict
* Abstract out methods for creating files based on hdf5_data_dict
* Add dataset_paths for nexus
* Some reverting back
* Minor fixes
* Refactor populate_hdf5_data_dict: store a reference to be made later
* Handle shift from nxs to hdf5
* Set hdf5 references after aux file is created
* Cleaning
* Fixing
* Redefine result sections instead of extending
* Remove plotly plots from ELN
* Read util for hdf5 ref
* Fixing
* Move hdf5 handling into a util class
* Refactor instance variables
* Reset data dicts and reference after each writing
* Fixing
* Overwrite dataset if it already exists
* Refactor add_dataset
* Reorganize and docstrings
* Rename variable
* Add read_dataset method
* Cleaning
* Adapting schema with hdf5 handler
* Comments, minor refactoring
* Fixing; add `hdf5_handler` as an attribute for archive
* Reorganization
* Fixing
* Refactoring
* Cleaning
* Try block for using hdf5 handler: don't fail early, as later normalization steps will have the handler!
* Extract units from dataset attrs when reading
* Fixing
* Linting
* Make archive_path optional in add_dataset
* Rename class
* attrs for add_dataset; use it for units
* Add add_attribute method
* Refactor add_attribute
* Add plot attributes: 1D
* Refactor hdf5 states
* Add back plotly figures
* rename auxiliary file name if changed by handler
* Add referenced plots
* Allow hard link using internal reference
* Add sections for plots
* Comment out validation
* Add archive paths for the plot subsections
* Add back validation with flag
* Use nexus flag
* Add interpolated intensity data into h5 for qspace plots
* Use prefix to reduce len of string
* Store regularized linspace of q vectors; revise descriptions
* Remove plotly plots
* Bring plots to overview
* Fix tests
* Linting; remove attr arg from add_dataset
* Review: move none check into method
* Review: use 'with' for opening h5 file
* Review: make internal states private vars
* Add pydantic basemodel for dataset
* Use data from variables if available for reading
* Review: remove lazy arg
* Move DatasetModel outside Handler class
* Remove None from get, as it is already a default
* Merge if conditions

---------

Co-authored-by: Andrea Albino <[email protected]>
Co-authored-by: Andrea Albino <[email protected]>
Co-authored-by: Hampus Näsström <[email protected]>
* Remove the Nexus file before regenerating it.
* Reference to the NeXus entry.
* PR review comments.
After discussing with @TLCFEM, we found the following things:
Some directions for resolving this:
Currently, the handler exposes the `write_file` method, which can be used at any point, multiple times, during the normalization. We should limit this so that the resource contention problems are more tractable: one file write per normalization. This also allows the NeXus entry to contain the latest changes to the NeXus file.
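A minimal sketch of that one-write-per-normalization idea (the class and method names here are hypothetical, not the PR's actual `HDF5Handler` API):

```python
import h5py
import numpy as np


class BufferedHDF5Handler:
    """Hypothetical handler that buffers datasets in memory and
    flushes them to the auxiliary file exactly once per normalization."""

    def __init__(self, filepath: str):
        self.filepath = filepath
        self._pending: dict[str, np.ndarray] = {}

    def add_dataset(self, path: str, data: np.ndarray):
        # no file I/O here; just record what should be written
        self._pending[path] = data

    def write_file(self):
        # single write at the end of normalize(), so there is at most
        # one writer touching the file per processing run
        with h5py.File(self.filepath, 'a') as file:
            for path, data in self._pending.items():
                if path in file:
                    del file[path]  # overwrite stale dataset
                file.create_dataset(path, data=data)
        self._pending.clear()
```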
It may resolve the race condition between the reading and writing functions on the same file. There is another issue that, I think, needs to be fixed by area-D: deleting a (corrupted) entry and its related file from the single process thread running the normalizer. PR #157 can help; there you can see that the test fails completely.
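The PR itself does not do this, but a generic mitigation for read/write races on a shared file is an inter-process lock, e.g. with the third-party `filelock` package (a sketch under that assumption; the function and paths are hypothetical):

```python
import h5py
from filelock import FileLock  # pip install filelock


def locked_write(filepath: str, dataset_path: str, data):
    # serialize access across processes: whoever holds the lock is the
    # only one touching the .h5/.nxs file at that moment
    with FileLock(filepath + '.lock'):
        with h5py.File(filepath, 'a') as file:
            if dataset_path in file:
                del file[dataset_path]
            file.create_dataset(dataset_path, data=data)
```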
@lauri-codes, is there any functionality that deletes an entry, its associated mainfile, and the residue (if there is something, e.g. ES data) of that deleted entry? This deletion must happen inside the ELN normalization process. Just a quick overview of the implementation:
Currently, … You may want to take a quick look at the code in the function …
I have created a small function to delete the mainfile, entry, and ES data (here: …
Could you please suggest any functionality that is available in NOMAD?
@RubelMozumder: There is no such functionality, and I doubt there ever will be. Deleting entries during processing is not something we can really endorse in any way: there are too many ways to screw this up. (What happens if the entry is deleted and then an exception happens before the new data is stored? What happens when some other processed entry tries to read the deleted entry simultaneously? What happens if the file is opened by another process and there is a lock on it when someone tries to delete it?) I would instead want to try and understand what the goal is that you are trying to achieve with this normalizer. It is reasonable to create temporary files during normalization, and also reasonable to create new entries at the end of normalization (assuming there are no circular processing steps or parallel processes that might cause issues).
When array data from XRD measurements is added to the archives, the loading time increases as the archives become heavier (especially in the case of RSM, which stores multiple 2D arrays). One solution is to use an auxiliary file to offload the heavy data and only save references to the auxiliary file in the archives.
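A minimal sketch of the offloading idea with `h5py` (the file name, dataset paths, and the reference-string format are illustrative assumptions, not the PR's exact layout); the `signal`/`axes` attributes follow the NeXus NXdata convention that H5Web uses to render plots:

```python
import h5py
import numpy as np

# illustrative data and file name
two_theta = np.linspace(10, 90, 1000)
intensity = np.random.rand(1000)
aux_file = 'xrd_measurement.h5'

with h5py.File(aux_file, 'w') as file:
    data = file.create_group('entry/data')
    # NXdata-style attributes so H5Web knows what to plot
    data.attrs['NX_class'] = 'NXdata'
    data.attrs['signal'] = 'intensity'
    data.attrs['axes'] = ['two_theta']
    data.create_dataset('intensity', data=intensity)
    data.create_dataset('two_theta', data=two_theta)

# the archive then stores only a reference string such as
# 'xrd_measurement.h5#/entry/data/intensity' instead of the array
```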
To implement, we can use `.h5` files to store the data and make references to the offloaded datasets using `HDF5Reference`. Additionally, we can also generate a NeXus `.nxs` file instead of an `.h5` file. A NeXus file uses `.h5` as the base file type and validates the data with the data models built by the NeXus community.

The current plots are generated using Plotly. The `.json` files containing the plot data are also being stored in the archive. This also needs to be offloaded to make the archives lighter. Using `H5WebAnnotation`s of NOMAD, we can leverage H5Web to generate plots from the `.h5` or `.nxs` files.

To this end, the following steps are needed:
- Use `HDF5Reference` as the type of the Quantity for array data: intensity, two_theta, q_parallel, q_perpendicular, q_norm, omega, phi, chi (a schema sketch follows after this list).
- Use `HDF5Handler` or functions to create auxiliary files from the normalizers of the schema.
- Use `.h5` to store the data and save references to its datasets in `HDF5Reference` quantities.
- Generate a `.nxs` file based on the archive. This happens in the `HDF5Handler` and uses `pynxtools`.
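For the first step, a quantity typed as an HDF5 reference could look roughly like this (a sketch; the section name is made up, and the `HDF5Reference` import path is assumed from recent NOMAD versions):

```python
from nomad.datamodel.data import ArchiveSection
from nomad.datamodel.hdf5 import HDF5Reference  # assumed import path
from nomad.metainfo import Quantity


class XRDResultSketch(ArchiveSection):
    """Hypothetical result section: stores references into the
    auxiliary .h5/.nxs file instead of the heavy arrays themselves."""

    intensity = Quantity(
        type=HDF5Reference,
        description='Reference to the intensity dataset in the auxiliary file.',
    )
    two_theta = Quantity(
        type=HDF5Reference,
        description='Reference to the two_theta dataset in the auxiliary file.',
    )
```

Reading such a quantity then resolves the reference string into the dataset inside the auxiliary file, instead of loading heavy arrays from the archive itself.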