Feature: benchmark memory usage with jemalloc #1114
@dpc, would it be possible to share the commits used in https://twitter.com/dpc_pw/status/1857090601042198921?
I'm still working on it and investigating, but I'm currently testing fedimint/fedimint#6354.
When running with #1115, memory usage only drops by about 10%.
10% seems far off from using only half the memory. Sure, electrs itself also uses memory, but does #1115 also change the allocator used by rocksdb? Setting the global allocator in Rust makes Rust's heap-allocated data structures (Vec, HashMap, BTreeMap, etc.) use it, but I am unsure whether it also makes rocksdb use it. (See facebook/rocksdb#12364 and rust-rocksdb/rust-rocksdb#863.)
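For reference, a minimal sketch of what swapping Rust's global allocator to jemalloc looks like (the tikv-jemallocator crate is used here as an assumption; #1115 may do this differently):

```rust
// Sketch: route all Rust-side heap allocations through jemalloc.
// Crate choice (tikv-jemallocator) is an assumption; #1115 may differ.
use tikv_jemallocator::Jemalloc;

#[global_allocator]
static GLOBAL: Jemalloc = Jemalloc;

fn main() {
    // Vec, HashMap, BTreeMap, etc. now allocate via jemalloc.
    // RocksDB's internal C++ allocations are NOT affected unless
    // librocksdb itself is built against jemalloc.
    let v: Vec<u8> = Vec::with_capacity(1024);
    drop(v);
}
```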
@antonilol I think you're right: actually enabling jemalloc for rocksdb requires some extra work. In Fedimint we seem to be hitting some weird allocation patterns around iterator usage on the Rust side where jemalloc helps significantly. I need to investigate enabling jemalloc for rocksdb itself; I'll share what I find in this thread for anyone else trying the same thing.
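One possible route (a sketch based on the rust-rocksdb/rust-rocksdb#863 discussion; whether the feature is available depends on the rocksdb crate version) is to enable the bindings' jemalloc feature so the bundled librocksdb is also built against jemalloc:

```toml
# Cargo.toml sketch; version numbers are illustrative, not confirmed here.
[dependencies]
# The "jemalloc" feature builds the bundled RocksDB with jemalloc support,
# so rocksdb's internal allocations also go through jemalloc.
rocksdb = { version = "0.22", features = ["jemalloc"] }
tikv-jemallocator = "0.5"
```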
Runtime performance should also be measured: if memory usage drops by 50% but electrs gets significantly slower (at indexing or lookups), people should be able to make the decision about this tradeoff themselves (at compile time is fine, for me at least).
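A compile-time opt-in could look like this (a sketch; the Cargo feature name "jemalloc" is hypothetical, not an existing electrs feature):

```rust
// Sketch: make the allocator swap a compile-time choice behind a Cargo
// feature, so users who prefer the default allocator can opt out.
// The feature name "jemalloc" is hypothetical.
#[cfg(feature = "jemalloc")]
#[global_allocator]
static GLOBAL: tikv_jemallocator::Jemalloc = tikv_jemallocator::Jemalloc;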
BTW, I have enabled jemalloc for rocksdb itself in Fedimint and we're seeing much better memory usage now. It's hard to quantify, since it's a long-running process that was accumulating memory asymptotically over weeks, but the difference is clearly visible.
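One way to quantify this over time (a sketch assuming the tikv-jemalloc-ctl crate and a jemalloc global allocator; not what Fedimint necessarily does) is to periodically sample jemalloc's own counters instead of eyeballing RSS:

```rust
use tikv_jemalloc_ctl::{epoch, stats};

fn log_jemalloc_stats() -> Result<(), tikv_jemalloc_ctl::Error> {
    // Most jemalloc statistics are cached; advancing the epoch
    // refreshes them before reading.
    epoch::advance()?;
    let allocated = stats::allocated::read()?; // bytes the app requested
    let resident = stats::resident::read()?;   // bytes physically resident
    eprintln!("jemalloc: allocated={allocated} resident={resident}");
    Ok(())
}
```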
Following https://twitter.com/dpc_pw/status/1856972589140152700