[WIP]Feat[BMQC]: batched object pool #789
Build 2781 of commit b5613d7 has completed with FAILURE
Build 2784 of commit db18498 has completed with FAILURE
Build 2808 of commit a56df7a has completed with FAILURE
Build 2913 of commit f8f5e51 has completed with FAILURE
Build 2917 of commit ad082bb has completed with FAILURE
Problem
Multi-threaded access to object pools can become slow due to the thread-safety mechanisms used in the pool implementation: every single acquire/release has to synchronize with all other threads.
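For reference, a conventional thread-safe pool takes the shared lock once per object, so every acquisition contends with every other thread. A minimal sketch of that cost model (a simplified mutex-guarded free list, not the actual `bdlcc::ObjectPool` implementation, which is more elaborate):

```cpp
#include <mutex>
#include <vector>

// Simplified thread-safe pool: every getObject()/releaseObject() call
// takes the shared mutex, which is the contention point under
// multi-threaded load.  This is only an illustration of the per-object
// synchronization cost, not how bdlcc::ObjectPool is implemented.
template <class TYPE>
class SimplePool {
    std::mutex         d_mutex;  // guards 'd_free'
    std::vector<TYPE*> d_free;   // free list of pooled objects

  public:
    ~SimplePool()
    {
        for (TYPE* p : d_free) delete p;
    }

    TYPE* getObject()
    {
        std::lock_guard<std::mutex> guard(d_mutex);  // one lock per object
        if (d_free.empty()) {
            return new TYPE();  // grow on demand
        }
        TYPE* p = d_free.back();
        d_free.pop_back();
        return p;
    }

    void releaseObject(TYPE* object)
    {
        std::lock_guard<std::mutex> guard(d_mutex);  // one lock per object
        d_free.push_back(object);
    }
};
```

With N threads each doing M acquisitions, the lock is contended N*M times; batching (below in this PR) amortizes that to roughly N*M/batchSize.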
Batching
One way of dealing with this problem is to acquire/release more objects from the pool at a time (batches of objects). This PR introduces the `bmqc::BatchedObjectPool` class, which operates a `bdlcc::ObjectPool` under the hood but gets objects from it in batches. The batches are hidden from library users, who can operate on individual objects as before. As a result, batching significantly reduces thread contention while still benefiting from the thread-safety mechanisms of `bdlcc::ObjectPool`.
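The batching idea can be sketched as follows. This is a hypothetical, self-contained illustration (the names and the mutex-guarded free list are stand-ins; the actual `bmqc::BatchedObjectPool` wraps `bdlcc::ObjectPool` and its API may differ): each thread keeps a local batch of objects and touches the shared pool only once per `batchSize` operations.

```cpp
#include <cstddef>
#include <mutex>
#include <vector>

// Hypothetical sketch of batched pooling: the shared, mutex-protected
// base pool is accessed only once per 'batchSize' objects; callers keep
// a (typically thread-local) batch and refill/flush it in bulk, so lock
// contention drops by roughly a factor of 'batchSize'.
template <class TYPE>
class BatchedPoolSketch {
    std::mutex         d_mutex;      // guards 'd_free' (the base pool)
    std::vector<TYPE*> d_free;       // shared free list
    std::size_t        d_batchSize;  // objects moved per lock acquisition

  public:
    explicit BatchedPoolSketch(std::size_t batchSize)
    : d_batchSize(batchSize)
    {
    }

    ~BatchedPoolSketch()
    {
        // Note: objects still held in caller batches are not tracked here;
        // a real implementation would reclaim them.
        for (TYPE* p : d_free) delete p;
    }

    // Refill a caller-owned batch up to 'd_batchSize' objects, taking the
    // shared lock only once for the whole batch.
    void acquireBatch(std::vector<TYPE*>* batch)
    {
        std::lock_guard<std::mutex> guard(d_mutex);
        while (batch->size() < d_batchSize) {
            if (d_free.empty()) {
                batch->push_back(new TYPE());  // grow on demand
            }
            else {
                batch->push_back(d_free.back());
                d_free.pop_back();
            }
        }
    }

    // Return a whole batch to the shared pool, again with a single lock.
    void releaseBatch(std::vector<TYPE*>* batch)
    {
        std::lock_guard<std::mutex> guard(d_mutex);
        d_free.insert(d_free.end(), batch->begin(), batch->end());
        batch->clear();
    }

    std::size_t freeSize()
    {
        std::lock_guard<std::mutex> guard(d_mutex);
        return d_free.size();
    }
};
```

A per-object `getObject()`/`releaseObject()` facade on top of such batches is what keeps the batching invisible to library users.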
Benchmarks
amd64
Mac M2 Darwin
Conclusions
- `bmqc::BatchedPool threads=1 batch=1` works 2x slower than the original object pool, as expected: we pay the introduced batching overhead but get no benefit from the batch size (a batch of 1 is equivalent to the original object pool).
- `bmqc::BatchedPool threads=1 batch=32` works 1.5x faster than the original object pool: we still pay the batching overhead, but we no longer suffer as much from the thread-safety mechanisms used in the original object pool.
- `bmqc::BatchedPool threads=4 batch=32` works 30x faster than the corresponding benchmark of the original object pool. The same holds for `batch=128`.
- `bmqc::BatchedPool threads=10 batch=128` works 90x faster than the corresponding benchmark of the original object pool.

Note that the CPU cannot really run 64, 128 or 256 threads at the same time in the last benchmarks, so the actual work time increases linearly with the number of threads. It also means that even under these conditions the batched pool does not suffer from the thread contention problem nearly as much as the original object pool does.