Replies: 2 comments 7 replies
-
And another question 😄: would you appreciate the possibility of storing the analysis in some fast embedded key-value store instead of memory, e.g. BadgerDB? The analysis would of course be a lot slower, but it would use very little memory and the result could be reopened again very quickly.
-
I have implemented the first step as a start :) Gdu will now measure how much memory is used and compare it with the total free memory on the host. When more memory is used than is free, gdu will enable the garbage collection. The GC will be called more often as more memory is used (and less is free).
-
Hi all,
are you satisfied with the memory consumption and performance on huge file systems when using the `-g` flag? I am considering whether some garbage collector tuning should be applied, e.g. something like https://eng.uber.com/how-we-saved-70k-cores-across-30-mission-critical-services/
It would introduce, for example, a `--max-memory` flag telling gdu the limit of memory usage it should not cross. Gdu would then be able to run garbage collection less often at the beginning of the analysis, when less memory is used (gaining performance), and more often as memory consumption approaches the limit.
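One possible shape for this, sketched under assumptions: map the fraction of the `--max-memory` budget already consumed to a GOGC value, relaxed far below the limit and aggressive near it. The linear interpolation and the 10–200 bounds are made up for illustration.

```go
package main

import (
	"fmt"
	"runtime"
)

// gcPercentFor maps current heap usage against a user-supplied
// --max-memory limit (in bytes) to a suggested GOGC value:
// generous while usage is far below the limit (fewer
// collections, more speed), stricter as usage approaches it.
func gcPercentFor(maxMemory uint64) int {
	var ms runtime.MemStats
	runtime.ReadMemStats(&ms)

	used := ms.HeapAlloc
	if used >= maxMemory {
		return 10 // at or over the limit: collect as often as possible
	}
	// Fraction of the budget consumed, in [0, 1).
	frac := float64(used) / float64(maxMemory)
	// Interpolate from 200 (relaxed) down to 10 (aggressive).
	return int(200 - frac*190)
}

func main() {
	// The returned value would be fed to debug.SetGCPercent.
	fmt.Println("suggested GOGC:", gcPercentFor(256<<20))
}
```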
What do you think?
//cc @sss123next @daniejstriata