Memory allocation issue when running on Windows #457
Comments
Just an update: running against the same repository from a Linux client completes successfully.
Can you try running dust with more memory, e.g.:
C:\DUST>C:\DUST\dust.exe -S 1073741824 -D -p -j -r -f -n 100 -d 7 -z 200000 "\\srv\c$\folder"
I'm not sure I can do anything here. If Windows is failing to assign enough memory to run dust, there isn't much I can do on the dust side. I'd recommend repeatedly halving the number passed to -S, then repeatedly doubling it, and seeing if you can get a good run.
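That trial-and-error approach can be scripted. The sketch below is illustrative only: the target path, the starting size, and the extra flags (borrowed from the command earlier in this thread) are placeholders, not values from the report. It doubles the -S value until a run finishes with a zero exit status:

```bash
#!/usr/bin/env bash
# Illustrative only: retry dust with progressively larger -S values
# until a run completes without aborting. TARGET and the starting
# size are placeholders, not taken from this issue.
TARGET="/mnt/bigshare"
size=1048576                          # start at 1 MiB
while [ "$size" -le 1073741824 ]; do  # give up after 1 GiB
    echo "Trying dust with -S $size"
    if dust -S "$size" -d 7 -n 100 "$TARGET"; then
        echo "Succeeded with -S $size"
        break
    fi
    size=$((size * 2))
done
```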
I see the same on Linux on file systems with many millions of files. I will try to play with -S, but as far as I can see it is a general scalability issue. BTW, did you try it on file systems with 20-30 million files or more?
The same on Linux? OK, let me try and recreate on Linux. Using these 2 scripts (an illustrative version of this kind of script is sketched after this comment) I made a large number of files on my ext4 filesystem:
Gives:
I think by the time you are getting up to tracking a few tens of millions of files you are pushing the memory limits of your average system. htop certainly wasn't very happy when I ran the above.
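As a rough illustration of that kind of test setup, and not the original scripts, a bash loop like the one below creates a few million empty files spread across subdirectories on an ext4 mount; the path and counts are made-up placeholders:

```bash
#!/usr/bin/env bash
# Illustrative only: create a large number of empty files to stress-test
# a disk-usage tool. The base path and counts below are placeholders.
BASE="/mnt/ext4test/manyfiles"
DIRS=1000           # number of subdirectories
FILES_PER_DIR=5000  # 1000 * 5000 = 5 million files in total

for d in $(seq 1 "$DIRS"); do
    dir="$BASE/dir$d"
    mkdir -p "$dir"
    # seq generates the file names; xargs batches them so touch is
    # invoked a handful of times per directory rather than once per file.
    seq -f "$dir/file%g" 1 "$FILES_PER_DIR" | xargs touch
done
```

Pointing dust at the resulting directory with the same sort of flags used earlier in this thread then exercises the multi-million-file case.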
I ran an identical command to what you did and it worked.
Anyway, the servers I use have 32 GB of RAM and are doing nothing else. Thanks!
I'm not sure I can offer much more. Adding '-d' doesn't make it use less memory. I can only suggest cd-ing into a subdirectory so it has less data to trawl through.
Thanks! I will learn some Rust and run some debugging myself. I will let you know if something pops up.
Hi.
I'm trying to count files on a 30M-file dataset over SMB.
Can anything be done to overcome this, or have I reached the maximum scale of dust?
Thanks!
.\dust -F -j -r -d 4 -n 100 -s 400000 -f \\server\share$\Groups
Indexing: \\server\share$\Groups 9949070 files, 9.5M ... /memory allocation of 262144 bytes failed