Meetup notes: Back to the future with C++ and Seastar
ScyllaDB founder Avi Kivity presented “Back to the future with C++ and Seastar” at the recent Sayeret Lambda Meetup group, helping to revise the audience’s impressions of the C++ language. C++ is often thought of as a legacy imperative language with roots in 1970s C. But in the past few years it has been thoroughly modernized, and now offers streamlined support for modern paradigms such as lambdas, metaprogramming, and functional programming, while retaining no-compromise performance.
Photo: Tzach Livyatan for Cloudius Systems
Seastar is a modern, open source server application framework written in C++ that presents a future/promise based API to the user while delivering top-of-the-line performance: more than five times that of its nearest competitor, with 7 million requests per second served on a single machine.
The Meetup group had a good attendance of about 35 people. Some of the Seastar questions included:
- Can you run Seastar on a subset of the cores? (answer: yes)
- How do you pin memory? (answer: Each thread is preallocated with a large piece of memory. By default, the machine’s entire memory except a small reservation left for the OS (defaulting to 512 MB) is pre-allocated for the application. Pages are NUMA bound to the local node with mbind.)
Here are some additional questions that were asked, and you’re welcome to use the mailing list to get answers:
- On the Memcache results: did you try running Memcache on DPDK without Seastar?
- Did you test with and without HyperThreads?
- How do you debug/trace the execution task?
- How do you use external libs from Seastar?
- Can you use it from Python?
- What happens when the system/queues are overloaded?
- Can you simulate Boost.Asio?
- Are you production ready? When will 1.0 be available?
- Who Framed Roger Rabbit?
Seastar is designed to make it possible to write code that is both scalable to large numbers of CPU cores and also straightforward to work with.