
Bryan Cantrill on What’s Next for Infrastructure, Open Source & Rust

“As technologists, we live partially in the future: We are always making implicit bets based on our predictions of what the future will bring. To better understand our predictions of the future, it can be helpful to understand the past – and especially our past predictions of the future.”

– Bryan Cantrill at ScyllaDB Summit 2022

If you know Bryan Cantrill, you know that his mind works in mysterious ways to dare mighty things. So it shouldn’t surprise you that Cantrill’s take on the age-old practice of New Year’s predictions is a bit unexpected…and yields predictably perspicacious results.

For nearly a decade at Sun Microsystems (which his 10-year-old daughter suspected was a microbrewery), Cantrill and a dozen or so fellow infrastructure technologists made it a habit to cast their one-, three-, and six-year predictions. What did they get wildly wrong? Uncannily correct? And what does it all mean for the future? Let’s take a look.

Cantrill crafted this talk for ScyllaDB Summit, a virtual conference for exploring what’s needed to power instantaneous experiences with massive distributed datasets. You can watch his complete session below. Also, register now (free + virtual) to join us live for ScyllaDB Summit 2023 featuring experts from Discord, Hulu, Strava, ShareChat, Percona, ScyllaDB and more, plus industry leaders on the latest in WebAssembly, Rust, NoSQL, SQL, and event streaming trends.

REGISTER NOW FOR SCYLLADB SUMMIT 2023


Looking Back to Future Technology Predictions from 2000-2007

Here are some of the more notable one-, three- and six-year predictions that Cantrill & Co made during the early 2000s – exactly as they recorded them:

  • Six-year, 2000: “Most CPUs have four or more cores.”
  • Three-year, 2003: “Apple develops new ‘must-have’ gadget: iPhone. Digital camera/MP3 player/cell phone.”
  • Six-year, 2003: “Internet bandwidth grows to the point that TV broadcasters become largely irrelevant; former TV networks begin internet broadcasts.”
  • One-year, 2005: “Spam turns corner, less of a problem than year before.”
  • One-year, 2006: “Google embarrassed by revelation of unauthorized U.S. government spying at Gmail.”
  • Six-year, 2006: “Volume CPUs still less than 5 GHz.”

Many of these predictions nailed the trend, but were a bit off on the timing. Let’s review each in turn.

Six-Year, 2000: ‘Most CPUs Have Four or More Cores’

From the perspective of 2022, where any laptop has six or eight cores, this seems like a no-brainer. But it wasn’t yet a reality by 2006. File under “Right trend, wrong timing.”

Three-Year, 2003: ‘Apple Develops New ‘Must-Have’ Gadget: iPhone. Digital Camera/MP3 Player/Cell Phone’

This prescient prediction was Cantrill’s own (and yes, he did actually predict the name “iPhone”). But he was a bit off in one not-so-minor respect. He admits:

“I was almost making fun of myself for making this prediction because I thought this thing would be ridiculous and that nobody would want it. So this prediction was correct, but it was also really, really, deeply wrong.”

Six-Year, 2003: ‘Internet Bandwidth Grows to the Point That TV Broadcasters Become Largely Irrelevant; Former TV Networks Begin Internet Broadcasts’

Cantrill remembers his disbelief when his colleague shared this one. It’s now hard to believe that we once lived in a world where you had to sit in front of a television to learn about breaking news. Nevertheless, in 2003, the whole concept of watching broadcast news over the internet felt like an impossible future.

One-Year, 2005: ‘Spam Turns Corner, Less of a Problem Than Year Before’

Difficult as it may be to believe, Cantrill assures us that the spam problem was previously much worse than it is today. Around 2005, it felt hopeless. Yet, it did turn the corner right around 2006 – exactly as predicted. It’s probably worth noting that this precise short-term prediction came from a technologist who worked on a mail server, and was thus intimately involved with the spam problem.

One-Year, 2006: ‘Google Embarrassed by Revelation of Unauthorized U.S. Government Spying at Gmail’

We have witnessed a variety of scandals involving the government having unauthorized access to large services, so this prediction did capture that general zeitgeist. However, the specific details were off.

Six-Year, 2006: ‘Volume CPUs Still Less Than 5 GHz’

Before you dismiss this one as obvious, realize that in 2006 it wasn’t yet clear when Dennard scaling (the observation that as transistors shrink, their power density stays constant, allowing them to be clocked faster) was going to end. But, as Cantrill’s colleague predicted, it did end – perhaps sooner than anticipated (around 2006-2007). And we did top out at less than 5 GHz: more like 4, or even 3.
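
For readers who want the arithmetic behind that parenthetical, here is the classic first-order sketch of Dennard scaling (a standard textbook formulation, not from the talk):

```latex
% Shrink feature sizes by a factor k > 1. To first order, capacitance C
% and voltage V scale down by k while frequency f scales up by k:
\[
  P \propto C V^2 f \;\longrightarrow\;
  \frac{C}{k}\cdot\frac{V^2}{k^2}\cdot kf \;=\; \frac{P}{k^2}
\]
% Transistor density rises by k^2, so power density stays constant and
% faster clocks come "for free." Once voltage stopped scaling (~2006),
% power density rose with frequency, and volume CPU clocks hit a wall.
```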

So What? And What About Missing that Whole ‘Cloud Computing’ Thing?

As we’re on the cusp of 2023, why are we looking back at technology predictions from the early 2000s? Cantrill’s response: “The thing that is so interesting about them is that they tell us so much about what we were thinking at the time. I think predictions tell us much more about the present than they do about the future, and that’s a bit of a paradox.”

In retrospect, the types of predictions that came true proved more intriguing than the fate of the individual predictions. The group’s longer-term predictions were often more accurate than their one-year ones: even though the one-year horizon is right in front of your eyes, so much can change in a year that it’s difficult to call.

Even more interesting: megatrends that this group of infrastructure technologists overlooked. Cantrill explains, “Yes, we predicted the end of Dennard scaling… Yes, we predicted, albeit mockingly, the iPhone. But, we did not predict cloud computing or Software as a Service at all anywhere over that period of time.”

Then, the epiphany: “The things that we missed were the ramifications of the broadening of things that we already knew about at the time.” The list of their megatrend misses is populated by technologies that were right under their noses in the early 2000s – just not (yet!) at the scope that tapped their potential and made them truly transformational. For instance, they underestimated the impact of:

  • The internet
  • Distributed version control
  • Moore’s Law
  • Open source

A little more color on this, taking the example of open source: The technologists making the predictions were users of open source. They had open sourced their own software. They were ardent supporters of open source. However, according to Cantrill, they underestimated its power to truly transform the industry because they “just didn’t fully understand what it meant for everything to be open source.”

Back to the Future

So how does all this analysis of past predictions inform Cantrill’s expectations for the future?

He’s focusing less on what new things will be created and more on evolutions that tap the power of things that already exist today. Some specifics…

Compute is Becoming Ubiquitous

Cantrill’s first prediction is that powerful compute will become even more broadly available – not just with respect to moving computers into new places (à la IoT), but also putting CPUs where we once saw fixed-function components. For example, open 32-bit CPUs are replacing hidden, closed 8-bit microcontrollers. We’re already seeing CPUs on the NIC (SmartNIC), CPUs next to flash (open-channel SSD) and also on the spindle (WD’s SweRV). Cantrill is confident that this compute proliferation will bring new opportunities for hardware/software co-design. (See “Bryan Cantrill on Rust and the Future of Low-Latency Systems” for more details on this thread.)

Open FPGAs/HDLs are Real

Field programmable gate arrays (FPGAs) are integrated circuits that can be programmed, post-manufacture, to do arbitrary things. To change an FPGA’s functionality, you reconfigure it with a bitstream synthesized from Hardware Description Language (HDL) designs.

Historically, these bitstream formats were proprietary, so anyone programming an FPGA was dependent on the vendor’s closed toolchain. Claire Wolf changed this. Wolf’s terrific work reverse engineering the Lattice iCE40 bitstream (and others) opened the door to truly open FPGAs: FPGAs where you can synthesize the bitstream you’re going to program onto the part with 100% open source tools.

Cantrill believes this will be a game changer. Just as having open source development tools has democratized software development, the same will happen with FPGAs. With the toolchains opening up, many more people can actually synthesize bitstreams. In Cantrill’s words, “This stuff is amazing. It’s not the solution to all problems, for certain. But if you have a problem that’s amenable to special-purpose compute, FPGA can provide you a quick and easy way there.”

Likewise, HDLs are also opening up, and Cantrill believes this too will be transformative.

HDLs have traditionally been dominated by Verilog and (later) SystemVerilog. Their compilers have been historically proprietary, and the languages themselves are error-prone. But the past few years have yielded an explosion of new, open HDLs; for example, Chisel, nMigen, Bluespec, SpinalHDL, Mamba (PyMTL 3) and HardCaml.

Of these, Bluespec is the most interesting to the team at Oxide Computer, where Cantrill is co-founder and CTO. He explains, “The way one of our engineers describes it, ‘Bluespec is to SystemVerilog what Rust is to Assembly. It is a much higher-level way of thinking about the system, using types in the compiler to actually generate a reliable system, a verifiable system.’”
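
The Rust side of that analogy is easy to illustrate. Here’s a hedged sketch (our example, not Oxide’s code) of the type-driven style both languages encourage: encode a state machine in the types so that illegal operations fail at compile time.

```rust
use std::marker::PhantomData;

// Type-level states for a hypothetical UART peripheral.
struct Disabled;
struct Enabled;

struct Uart<State> {
    base: usize, // hypothetical MMIO base address
    _state: PhantomData<State>,
}

impl Uart<Disabled> {
    fn new(base: usize) -> Self {
        Uart { base, _state: PhantomData }
    }
    // Enabling consumes the disabled handle and yields an enabled one.
    fn enable(self) -> Uart<Enabled> {
        Uart { base: self.base, _state: PhantomData }
    }
}

impl Uart<Enabled> {
    // write() doesn't exist on Uart<Disabled>, so "write before enable"
    // is a compile-time type error, not a bug discovered on the board.
    fn write(&mut self, _byte: u8) { /* MMIO write elided */ }
}

fn main() {
    let mut uart = Uart::new(0x4000_8000).enable();
    uart.write(b'x');
}
```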

Open Source EDA is Becoming Real

Proprietary software has historically dominated electronic design automation (EDA), but open source options are coming to light in this domain as well.

Open source alternatives have existed for years, but one in particular, KiCad, has enjoyed sufficiently broad sponsorship to close the gap with professional-grade software. The maturity of KiCad (especially KiCad 6), coupled with the rise of quick-turn printed circuit board (PCB) manufacturing and assembly, has allowed for astonishing speed. It’s now feasible to go from conception to manufacture in hours, then from manufacture to a shipped board in a matter of days.

Oxide has been using KiCad for its smaller boards (prototype boards), but envisions a future in which it can use KiCad for its bigger boards – and move off of proprietary EDA software. Cantrill explains, “This proprietary EDA software has all of the problems that proprietary software has. Like many shops, we have lost time because a license server crashed or a license server needed to be restarted…No one should be blocked from their work because a license server is down. The quality that we’re getting at KiCad now is really professional grade, which allows us to iterate so much faster on hardware, to go from that initial conception to a manufacturer in just hours. When that manufacturer can ship a board to you in days, and you can go from something that existed in your head to a PCB in your hand in a week, it’s remarkable. It’s a whole new era.”

Open Source Firmware is (Finally!) Happening

The Oxide team is just as bullish about open source firmware as they are about KiCad.

Cantrill laments, “The open source revolution has been so important all the way through the stack. There’s open source databases, with ScyllaDB and many others, open source system software, and open source operating systems. But the firmware itself has been resistant to it.”

The result? Firmware suffers all the same problems that tend to plague other proprietary software: it takes a long time to develop, it ships with bugs and it has security problems. Cantrill continues, “We know that open source software is the way to deliver economic software, reliable software, secure software. We need to get that all the way into firmware.”

He believes that we’re finally getting there, though. The software that runs closest to the hardware is increasingly open, with drivers almost always open. The firmware in the unseen parts of the system is increasingly becoming open as well (see the Open Source Firmware Conference). This trend is slower in the 7nm SoCs, but it is indeed happening. The only real straggler is the boot ROMs: even in putatively open architectures, the boot ROMs remain proprietary. This is a problem, but Cantrill is confident that we’ll get beyond it soon.

Rust is Revolutionary for Deeply Embedded Systems

Last but not least, Rust. Rust has proven to be a revolution for systems software, thanks to how its rich type system, algebraic types and ownership model allow for fast, correct code. Rust’s somewhat unanticipated ability to get small – coupled with its lack of a runtime – means it can fit practically everywhere. Cantrill believes that with its safety and expressive power, Rust represents a quantum leap over C – and without losing performance or sacrificing size. And embedded Rust is a prime example of the potential for hardware-software co-design.
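
To make that concrete, here’s a minimal hedged sketch (our example, not from the talk) of what “algebraic types and ownership” buy in practice: the compiler rejects unhandled cases and use-after-move outright.

```rust
// An enum is an algebraic (sum) type: a value is exactly one variant,
// and the compiler insists that every variant be handled.
enum Command {
    Read { addr: u32 },
    Write { addr: u32, value: u32 },
    Reset,
}

fn describe(cmd: &Command) -> String {
    // Omitting a variant here is a compile error, not a latent bug.
    match cmd {
        Command::Read { addr } => format!("read {addr:#x}"),
        Command::Write { addr, value } => format!("write {value:#x} to {addr:#x}"),
        Command::Reset => "reset".to_string(),
    }
}

fn main() {
    let buf = vec![1u8, 2, 3];
    let moved = buf; // ownership moves: `buf` can no longer be used
    // println!("{:?}", buf); // compile error -- no use-after-free here
    println!("{:?}", moved);
    println!("{}", describe(&Command::Write { addr: 0x1000, value: 42 }));
}
```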

The Oxide team members are big believers in Rust. They don’t use Rust by fiat, but they have found that Rust is the right tool for many of their needs.

Cantrill’s personal take on Rust: “Speaking personally as someone who was a C programmer for 2+ decades, Rust is emphatically the biggest revolution in system software since C. It is a very, very, very big deal. It is hard to overstate how important Rust is for system software. I’m shocked – and delighted – that a programming language is so revolutionary for us in system software. For so long, all we had really was C and then this offshoot in terms of C++ that … well … we’ll leave C++ alone. Suffice it to say that I was in a bad relationship with C++.

“But Rust solves so many of those problems and especially for this embedded use case. Where we talked about that ubiquitous compute earlier, Rust allows us to get into those really tiny spaces. At Oxide, we’ve developed a new Rust-embedded operating system called Hubris. The debugger, appropriately enough, is called Humility, and I definitely encourage folks to check that out.”
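
To make “really tiny spaces” concrete, here’s a minimal hedged no_std sketch (illustrative only, not Hubris code; the register address is made up, and it needs a bare-metal target to build) of a Rust program with no runtime and no heap:

```rust
#![no_std]  // no standard library: only `core`, which needs no OS
#![no_main] // no conventional main: we supply the entry point

use core::panic::PanicInfo;

// Hypothetical memory-mapped LED register on some small MCU.
const LED_REG: *mut u32 = 0x4000_0000 as *mut u32;

#[no_mangle]
pub extern "C" fn _start() -> ! {
    loop {
        // MMIO is one of the few places `unsafe` is required; everything
        // else still gets Rust's full compile-time guarantees.
        unsafe { core::ptr::write_volatile(LED_REG, 1) };
    }
}

// With no runtime, we must define what a panic does ourselves.
#[panic_handler]
fn panic(_info: &PanicInfo) -> ! {
    loop {}
}
```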

Evenly Distributing Our Present into the Future

The technologies featured in this latest batch of predictions are not new. In some cases, they’ve actually been around for decades. But, Cantrill believes they’ve all reached an inflection point where they are ready to take off and become (with a nod to the famous quote attributed to William Gibson) much more “evenly distributed.”

Cantrill concludes, “We believe that the future is one in which hardware and software are co-designed, and again, we are seeing that very concretely. And the fact that all of these technologies are open assures that they will survive. So we can quibble with the timing, but these technologies will endure. It may take time for them to broaden, but their trajectory seems likely, and we very much look forward to evenly distributing our present into the future.”

About Cynthia Dunlop

Cynthia is Senior Director of Content Strategy at ScyllaDB. She has been writing about software development and quality engineering for 20+ years.