This repository has been archived by the owner on Nov 23, 2021. It is now read-only.
This is a matter of testing it. Internally we use a gem called roo to load the spreadsheet data, and I believe this gem loads the entire spreadsheet into memory, so it could be a problem if there isn't enough RAM available to load a very large file.
But I'd guess that a spreadsheet with even a few tens of thousands of records (say 50k) should not be too big, given the RAM available on any decent server hosting today. In any case, if you plan for even more (hundreds of thousands or even millions of rows), I'd recommend not importing everything from a single file, but splitting your data across several files.
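The "split your data across several files" advice could look something like the sketch below. This is a minimal illustration, not part of the gem: `split_csv` and `write_chunk` are hypothetical helper names, and it works on CSV via Ruby's stdlib rather than .xlsx, purely to show the chunking idea.

```ruby
require "csv"

# Hypothetical helper: split one large CSV export into smaller
# chunk files so that each individual import stays well within RAM.
def split_csv(path, rows_per_file:)
  header = nil
  chunk = []
  files = []
  CSV.foreach(path) do |row|
    if header.nil?
      header = row # first row is the header; repeat it in every chunk
      next
    end
    chunk << row
    if chunk.size == rows_per_file
      files << write_chunk(path, header, chunk, files.size)
      chunk = []
    end
  end
  files << write_chunk(path, header, chunk, files.size) unless chunk.empty?
  files
end

# Writes one chunk to e.g. data_part1.csv, data_part2.csv, ...
def write_chunk(path, header, rows, index)
  out = path.sub(/\.csv\z/, "_part#{index + 1}.csv")
  CSV.open(out, "w") do |csv|
    csv << header
    rows.each { |row| csv << row }
  end
  out
end
```

Each resulting part file carries the header row, so it can be imported on its own.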
Again, this is a matter of testing with genuinely large files and seeing how it goes. And don't hesitate to let us know how it went, or to suggest improvements in this area.
@dkrusenstrahle I did some quick research and found an open issue on the roo gem about streaming the reading of really large .xlsx files. I haven't gone deep into it, but it is certainly related to this question of yours.
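To illustrate why streaming matters, Ruby's stdlib CSV shows the same contrast the linked roo issue is about for .xlsx: `CSV.read` loads the whole file into memory at once, while `CSV.foreach` yields one row at a time with a roughly constant memory footprint. The helper name below is hypothetical, and CSV is only a stand-in here, since streaming .xlsx support in roo was still an open issue.

```ruby
require "csv"

# Hypothetical helper: count data rows without ever holding the
# whole file in memory. CSV.foreach streams one row at a time;
# CSV.read(path) would instead materialize every row in an array.
def count_rows_streaming(path)
  count = 0
  CSV.foreach(path, headers: true) { count += 1 }
  count
end
```

For a one-off import this difference is invisible; for files in the hundreds of thousands of rows it is the difference between a bounded and an unbounded memory footprint.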
Hello,
Will this gem be able to handle importing hundreds or thousands of records/lines?