
Handle big amounts of data well? #5

Closed
dkrusenstrahle opened this issue Dec 20, 2013 · 3 comments

Comments

@dkrusenstrahle

Hello,

Will this gem be able to handle importing hundreds or thousands of records/lines?

@gnapse
Contributor

gnapse commented Dec 20, 2013

This is a matter of testing it. Internally we use a gem called roo to load the spreadsheet data, and I believe it loads all of the data into memory, so it could be a problem if there is not enough RAM available for a very large file.

But I would guess that a spreadsheet with even a few tens of thousands of records (say, 50k) should not be too big for the amount of RAM available on any decent hosting server these days. In any case, if you plan on even more (hundreds of thousands or even millions of rows), I'd recommend not importing everything from a single file, but splitting your data across several files.
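
Something along these lines could work for the splitting, as a rough, untested sketch (the file names and chunk size here are just placeholders, not part of this gem):

```ruby
require "csv"

CHUNK_SIZE = 10_000  # illustrative; tune to what fits comfortably in memory

# Read the big CSV row by row and write it back out in smaller chunks,
# repeating the header row in each chunk so every file imports on its own.
CSV.foreach("people.csv", headers: true).each_slice(CHUNK_SIZE).with_index(1) do |rows, i|
  CSV.open("people_part_#{i}.csv", "w") do |out|
    out << rows.first.headers
    rows.each { |row| out << row.fields }
  end
end
```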

Again, this is a matter of testing with actually big files and seeing how it goes. Do not hesitate to let us know how it went, or to share any suggestions for improvement on this matter.

gnapse closed this as completed Dec 20, 2013
@gnapse
Contributor

gnapse commented Dec 20, 2013

@dkrusenstrahle I did some quick research and found an open issue on the roo gem about the possibility of streaming the reading of really large .xlsx files. I did not dig deep into it, but it is certainly related to this question of yours.
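
For reference, a minimal sketch of what that could look like, assuming a newer roo version that ships Roo::Excelx#each_row_streaming (the feature was only an open issue at the time of this thread; the file name is a placeholder):

```ruby
require "roo"

# Assumes a roo version that provides Roo::Excelx#each_row_streaming.
xlsx = Roo::Excelx.new("big_file.xlsx")

xlsx.each_row_streaming(pad_cells: true) do |row|
  values = row.map(&:value)  # each row is an array of Roo::Excelx::Cell objects
  # process `values` one row at a time instead of loading the whole sheet into memory
end
```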

@dkrusenstrahle
Author

Sounds cool! I think most imports will contain hundreds of lines, not millions, so I should be alright, I guess :)

Thanks!
