Lock-Less Concurrent Programming in Raku

By Vadim Belman

Elevator Pitch

Anybody writing concurrent code knows how important it is to protect data integrity with locks. And anybody doing heavy concurrency knows how costly locks can be for performance. Can they be avoided? Yes, and here is how…

Description

This talk shares the techniques used in the implementation of the Concurrent::PChannel module. As the name suggests, the module implements a channel object for passing data between different components of the same application, including concurrently running threads. What makes the module different from the core Channel is its support for prioritizing the items sent over it. In other words, if two items are sent simultaneously, the one with the higher priority will most likely be pulled out first, even if it was actually submitted second.
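
For illustration, here is roughly how the module is used. This is a sketch assuming the send/receive interface as published in the module's documentation:

    use Concurrent::PChannel;

    # Create a channel expecting up to 10 priority levels.
    my Concurrent::PChannel:D $pchannel .= new(:priorities(10));

    # send takes the item and its priority; higher values win.
    $pchannel.send("low priority",  0);
    $pchannel.send("high priority", 1);

    say $pchannel.receive;   # "high priority" comes out first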

I faced two challenges in implementing the module. First, it had to depend as little as possible on the number of priorities used by an application. Second, it had to be fast; ideally, as fast as the core Channel. The first challenge wasn't that hard to meet. But the initial implementation used lock protection around the most performance-critical parts of the code, resulting in noticeable degradation under heavily concurrent item submission. And so I went on the hunt for a solution that wouldn't use locking.
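
The building blocks for such a solution are Raku's core atomic operations, most notably the cas (compare-and-swap) routine. As a minimal sketch of the general technique, not of Concurrent::PChannel's actual internals, here is a lock-free stack push built on cas:

    # Hypothetical node type, for illustration only.
    my class Node {
        has $.value;
        has $.next;
    }

    my $head;   # shared top-of-stack reference

    sub push-item($value) {
        loop {
            my $old-head = $head;
            my $new-node = Node.new(:$value, :next($old-head));
            # cas replaces $head only if it still holds $old-head,
            # returning the value it saw; on a lost race we retry.
            last if cas($head, $old-head, $new-node) === $old-head;
        }
    }

A real lock-free structure also has to deal with subtleties such as the ABA problem, but the retry loop above is the essential pattern.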

The result? My benchmarks show a median performance loss of 20-30% compared to the core Channel. Is that much? I'd say it definitely isn't, considering that the benchmark runs over 1000 different priorities! Think of it as managing 1000 individual queues compared to the single one maintained by Channel. Moreover, it doesn't really matter whether there are 10, 100, or 1000 priorities: the time it takes to send the same number of items stays the same.
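
A benchmark along these lines can be sketched as follows; the item count, priority spread, and timing approach here are illustrative, not the author's actual benchmark code:

    use Concurrent::PChannel;

    my $pchannel = Concurrent::PChannel.new(:priorities(1000));
    my $count    = 100_000;

    my $start = now;
    # Spread the items across all 1000 priority levels.
    $pchannel.send($_, $_ % 1000) for ^$count;
    $pchannel.receive for ^$count;
    say "$count items in {now - $start} seconds";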

Notes

This talk is proposed to fill the gap caused by the cancellation of the ‘Creating a distributable gui application in raku’ talk. I hope to know whether it is accepted as soon as possible, because nothing is done yet: I'll be making it from scratch.