What problem does your feature solve?
Currently there is no easy way to find out the impact on memory usage when making changes to the in-memory store, such as updating the retention window or making performance improvements.
What would you like to see?
A memory dump that can be analyzed further with go tool pprof, plus memory stats such as in-use heap and allocations.
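As a rough, non-authoritative sketch of what this could look like: Go's standard net/http/pprof and runtime packages already expose both pieces, so the feature might amount to wiring them into an admin port on the daemon. The /debug/memstats handler and the localhost:6060 address below are assumptions for illustration, not existing endpoints.

```go
package main

import (
	"fmt"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof/* handlers on http.DefaultServeMux
	"runtime"
)

func main() {
	// Hypothetical endpoint reporting the stats mentioned above
	// (in-use heap and allocations) in plain text.
	http.HandleFunc("/debug/memstats", func(w http.ResponseWriter, r *http.Request) {
		var m runtime.MemStats
		runtime.ReadMemStats(&m)
		fmt.Fprintf(w, "heap in-use:  %d bytes\n", m.HeapInuse)
		fmt.Fprintf(w, "heap alloc:   %d bytes\n", m.HeapAlloc)
		fmt.Fprintf(w, "total alloc:  %d bytes\n", m.TotalAlloc)
	})

	// The admin address is an assumption; any free port would do.
	http.ListenAndServe("localhost:6060", nil)
}
```

With something like this in place, `go tool pprof http://localhost:6060/debug/pprof/heap` would pull the in-use heap profile for offline analysis.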
What alternatives are there?
Run the RPC on Docker and analyze memory using the Linux htop command. (Still time-consuming when increasing the retention window to >= 24 hours.)
Memory usage for txn retention windows of 2 hours and 24 hours:
I ran the memory profiling code by downloading ledger files with LedgerExporter and using them for ingestion.
A. 2 hour window: 479 MB
B. 24 hour window: 907 MB
Note: The files from LedgerExporter do not seem to include events, so the event store's memory is not contributing heavily to the numbers above, but they are still useful for understanding the uptick when the txn retention window is increased from 2 hours to 24 hours.
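For context, a minimal sketch of how before/after numbers like these could be collected around an ingestion run; ingestLedgers and the ./ledgers path are placeholders standing in for the actual ingestion code and the LedgerExporter output, not the real API.

```go
package main

import (
	"fmt"
	"os"
	"runtime"
	"runtime/pprof"
)

// ingestLedgers is a placeholder for the code that loads the exported
// ledger files into the in-memory store; it is not the real ingestion API.
func ingestLedgers(path string) {}

func main() {
	runtime.GC() // settle the heap so the baseline is comparable
	var before runtime.MemStats
	runtime.ReadMemStats(&before)

	ingestLedgers("./ledgers") // hypothetical path to LedgerExporter output

	runtime.GC()
	var after runtime.MemStats
	runtime.ReadMemStats(&after)

	fmt.Printf("heap in-use grew by %d MB\n",
		(int64(after.HeapInuse)-int64(before.HeapInuse))/(1024*1024))

	// Also write a heap profile that `go tool pprof` can inspect offline.
	if f, err := os.Create("heap.pprof"); err == nil {
		defer f.Close()
		pprof.WriteHeapProfile(f)
	}
}
```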