By Nicola Phillips, Contributing Writer
Data centers trace their beginnings to the computer rooms of the 1940s. But the data center as we know it was invented in the early aughts by a team at Google led by the Brazilian engineer Luiz André Barroso.
Barroso’s team fundamentally changed how data centers were built and operated. Rather than treating the data center as a housing station for individual computing, storage and networking components, Barroso conceived of the entire facility as a single, warehouse-scale computer. In Barroso’s conception, all parts of the facility were designed to work in concert, woven together to operate as one highly efficient machine.
Barroso’s innovation ushered in the age of hyperscale data centers, the likes of which are now staples of tech powerhouses Google (where Barroso spent the vast majority of his career), Microsoft, Amazon and IBM. These massive facilities (typically at least 10,000 square feet, and often far larger) can hold hundreds of thousands of servers and millions of disk drives. Barroso’s architecture, which has become commonplace for hyperscale data centers, was what Larry Page and Sergey Brin considered Google’s primary innovation and proprietary advantage — not, as is commonly understood, its search algorithm. These facilities, and the range of applications they make possible, underpin much of modern life.
Another Barroso innovation was to replace traditional servers with low-end ones specially designed for internet services. Low-end servers are cheaper and, when optimized for a specific application, more efficient than their traditional counterparts. Barroso’s team also introduced a modular approach to hardware, which simplified building and maintenance, brought down costs, and made systems more adaptable to changing workloads.
We’re on the threshold of a new era of data centers: that of the specialized AI-focused facility. Companies like CoreWeave have begun to establish this niche. Whereas traditional data centers are outfitted with general-purpose central processing units (CPUs) that can perform a wide range of tasks, these specialized data centers are outfitted with graphics processing units (GPUs) tuned for specific workloads. GPUs can run many computations in parallel, making them ideal for artificial intelligence applications that demand tremendous computing power.
This reimagining of the data center as we know it recalls Barroso’s own contributions two decades ago. In twenty years’ time, we might look back on this period as being similarly transformative. Data centers are already being reinvented for the growth of artificial intelligence applications; how else can they adapt to the changing times?
What if we started from scratch with these specialized data centers and built facilities designed from the ground up to work in concert with the environment? AI presents opportunities for data centers to look and run differently, and data companies have the chance to build environmental considerations into their new designs. Data centers don’t have to guzzle fossil fuels or consume millions of gallons of water. They don’t even have to be massive facilities that take up thousands of acres of space.
Barroso’s team proved the value of a modular approach. Instead of one massive building, the data centers of tomorrow could be designed as a series of small facilities built around a central power source. Modular facilities are quicker to build and more cost-efficient to maintain. They don’t require the same enormous environmental footprint. The power source in question can be renewable. This is the future that Soluna envisions.
Luiz André Barroso was only 59 when he died unexpectedly last month. His legacy, at Google and in the broader data and technology community, outlives him. What the next generation of data centers will look like, and how Barroso’s vision will be adapted and built upon, remains to be seen.