Lazy does not stop after the tenth line; it sweeps through the whole file. I've tried with a fairly large file (20,000 lines). If I handle the stream events myself, splitting the lines myself, I get the expected behavior.
I've also asked this on StackOverflow, but since I got no answer there I thought I'd go straight to the source. :)
Initially, I thought the same thing on a project I was working on. However, after digging into it, it appears that Lazy is working properly.
The stream you are passing into Lazy is indeed being paused; however, you need to consider the stream's block size. In prior versions of Node you could pass blockSize into fs.createReadStream, but that option is now deprecated and ignored. As a result, you may not realize that the stream is reading a large amount of data with each call (the amount is OS-dependent).
While Lazy handles all of this behind the scenes, you need to realize that the filesystem reads come in large chunks, and pausing the stream only stops further reads; it does not pause Lazy's processing of the buffer already returned by the read.
In my case, a single fs.read would return thousands of lines. If I paused the stream inside one of Lazy's lines.forEach() callbacks, the stream itself would pause, but the thousands of lines already in Lazy's buffer would still be processed before the pause took effect.