Describe the bug
When Neo is used to load a proprietary format and export it to .nwb, the source electrophysiological recording is loaded into memory in full, converted, and finally written in the output format.
This is particularly problematic when converting large files, where the process is interrupted once the available memory is exhausted.
Unlike Neo, the neuroconv package converts the original signal sequentially in smaller chunks, avoiding memory-related issues.
Wouldn't it be appropriate to modify the NWBIO class so that it follows the same approach as neuroconv?
These considerations hold for any output format, since reading the whole electrophysiological recording into memory is often unfeasible.
To Reproduce
The issue arises when reading any sufficiently large recording and then writing it to .nwb. In my case, I tried to export a .pl2 Plexon file of about 50 GB. Roughly, the conversion path that triggers the problem looks like the sketch below (file names are placeholders; `neo.io.get_io` resolves the appropriate reader for .pl2).
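```python
import neo

# Placeholder paths for the ~50 GB recording described above.
reader = neo.io.get_io("recording.pl2")      # resolved to the Plexon 2 reader
blocks = reader.read(lazy=False)             # the entire signal is materialised in memory here

writer = neo.io.NWBIO("recording.nwb", mode="w")
writer.write_all_blocks(blocks)              # for large files, memory is already saturated before this point
```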
Expected behaviour
The desirable behaviour, as stated above, would be a chunk-by-chunk conversion that never loads the whole recording into memory, avoiding memory saturation and the resulting process interruption. One possible shape for such a chunked export is sketched below: read lazily with Neo and hand a data-chunk iterator to pynwb, so that samples are pulled from the source file only while the NWB file is being written. Everything here is illustrative: `AnalogSignalProxyIterator` is a hypothetical helper, and the sketch assumes hdmf's `GenericDataChunkIterator` and Neo's `AnalogSignalProxy.load(time_slice=...)` behave as documented.
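```python
import neo
import numpy as np
import quantities as pq
from datetime import datetime, timezone
from hdmf.data_utils import GenericDataChunkIterator
from pynwb import NWBFile, NWBHDF5IO, TimeSeries


class AnalogSignalProxyIterator(GenericDataChunkIterator):
    """Serves chunks of a lazy neo AnalogSignalProxy on demand instead of the full array."""

    def __init__(self, proxy, **kwargs):
        self._proxy = proxy
        self._fs = float(proxy.sampling_rate.rescale("Hz").magnitude)
        super().__init__(**kwargs)

    def _get_dtype(self):
        return np.dtype("float32")

    def _get_maxshape(self):
        return tuple(self._proxy.shape)  # (n_samples, n_channels)

    def _get_data(self, selection):
        time_sel, channel_sel = selection
        t0 = self._proxy.t_start + (time_sel.start / self._fs) * pq.s
        t1 = self._proxy.t_start + (time_sel.stop / self._fs) * pq.s
        # Only this time slice is read from the source file.
        chunk = self._proxy.load(time_slice=(t0, t1)).magnitude
        return chunk[:, channel_sel].astype("float32")


reader = neo.io.get_io("recording.pl2")            # placeholder path
block = reader.read(lazy=True)[0]                  # proxies only, no samples in memory yet
proxy = block.segments[0].analogsignals[0]

nwbfile = NWBFile(
    session_description="chunk-by-chunk export",
    identifier="demo",
    session_start_time=datetime.now(timezone.utc),
)
nwbfile.add_acquisition(
    TimeSeries(
        name=proxy.name or "signal",
        data=AnalogSignalProxyIterator(proxy, buffer_gb=1.0),
        unit=str(proxy.units.dimensionality),
        rate=float(proxy.sampling_rate.rescale("Hz").magnitude),
        starting_time=float(proxy.t_start.rescale("s").magnitude),
    )
)
with NWBHDF5IO("recording.nwb", mode="w") as io:
    io.write(nwbfile)                              # data is pulled chunk by chunk during the write
```

With this pattern the peak memory footprint is bounded by the iterator's buffer size rather than by the recording length, which is essentially what neuroconv does internally.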
Environment:
OS: Ubuntu (in a Windows-based Docker container)
Python version: 3.12
Neo version: 0.14.0
As a maintainer of neuroconv, I am curious whether neuroconv is missing any feature that would make you use Neo instead of neuroconv.
Implementing buffered reading could be a major effort on the Neo side, effort that might be better spent on other cases that the community has not yet solved.