Outside Tulsa, Oklahoma, among a series of nondescript warehouses off Route 69, lies a Google campus. Beyond its peculiar location — about 2,000 miles from the company's Mountain View, California headquarters — the site has all the standard hallmarks. There are polka-dotted employee badges, free breakfast and lunch buffets, and the company’s bright logo colors splashed over prominent slabs of wall space. It is, however, unlike most other satellite offices owned by the search giant.
That’s because it’s the location of Google’s largest US data center, one of 13 around the world. Construction began there in 2007, when Google repurposed an existing warehouse to begin handling the flood of data flowing between its fast-growing consumer products. Now, nine years later, Google operates seven different services with more than 1 billion users each: Gmail, search, Google Maps, Android, Chrome, YouTube, and the Play Store. Each of those products relies on Google’s vast network of servers and other cloud infrastructure, all of which must be supported and maintained by sprawling physical operations in places like Oklahoma, Georgia, and Oregon.
This exponential influx of users has had many positive effects. With a market cap of more than $531 billion, Google parent company Alphabet is now the second most valuable corporation on the planet, behind only Apple. Over the years, the Google name has grown from being synonymous with information and knowledge on the web to becoming the software backbone for a majority of the world’s smartphones. Google products now help us trawl the internet with ease, communicate in an instant, watch and upload 4K videos with the press of a button, and maneuver the physical world with a mobile app.
But there are inherent downsides to Google’s growth. As the company has expanded, so too have its energy needs. Nowhere is that more apparent than at places like its Oklahoma data center. Truck-sized diesel generators sit idle, waiting for a power outage, while cavernous halls of Halloween-colored power cables snake from substations just to provide electricity for one floor of each building on the multi-unit campus. In 2015 alone, Alphabet consumed 5.2 terawatt-hours of electricity, almost as much as the entire city of San Francisco.
Google knows it can’t eliminate the problem — the company, as far as we know, is not working on cracking cold fusion. But it can ease the pain, at least in the short term, with renewable energy. Beginning next year, Google says, its vast network of global operations will purchase as much renewable energy as it uses across all 13 data centers and all of its office complexes.
The linchpin of this milestone is the 20 renewable energy contracts Google has accumulated over the years, which provide enough solar and wind power to account for every unit of electricity the company consumes. Through those contracts, which have a total capacity of 2.6 gigawatts, Google says it’s helping spur development of clean energy projects, and in turn encouraging other companies in and outside the tech industry to follow suit.
Beyond its obvious environmental benefits, renewable energy also offers Google a competitive advantage. According to Neha Palmer, the head of strategy for Google’s global infrastructure division, investing in green energy — which in turn leads to more infrastructure investment — helps bring down the cost of wind and solar over time. Focusing on renewable energy also gives Google the opportunity to set fixed-price contracts for power in areas of high volatility.
To be clear, Google is not saying it’s currently powering all of its facilities with clean energy. The company stresses that, due to a number of factors, it’s difficult to source every unit of electricity from a renewable source, especially when drawing power from the grid, where electrons are neither clean nor dirty. Those factors include not being able to build its own wind and solar farms to power facilities every hour of the day, and having to deal with local utility monopolies in certain parts of the world. Instead, the company purchases excess electricity in places where it can more easily obtain wind or solar energy, helping those industries grow, while unused electricity remains in the grid for anyone to consume.
The company also purchases the corresponding commodities, known as Renewable Energy Certificates (RECs), from the green energy providers it sources power from. Think of RECs as a measure of how green the energy you’re purchasing is, something like the opposite of a carbon tax; each certificate corresponds to one megawatt-hour of renewable generation. Google then takes those RECs off the market, or “retires” them in utility parlance. That signals that the corresponding electricity is being used by its owner, and that the REC itself is not being swapped or sold on the commodities market.
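The purchase-and-retire mechanism described above can be sketched as a toy ledger. This illustrates the general REC convention (one certificate per megawatt-hour of renewable generation), not Google’s actual accounting system; the function and field names are hypothetical.

```python
# Toy sketch of REC match-and-retire accounting. One REC conventionally
# represents 1 MWh of renewable generation; everything else here
# (names, structure) is illustrative, not Google's real system.

def match_and_retire(consumed_mwh: float, recs_held: float) -> dict:
    """Retire one REC per MWh consumed, up to the RECs on hand."""
    retired = min(consumed_mwh, recs_held)
    return {
        "consumed_mwh": consumed_mwh,
        "recs_retired": retired,            # taken off the market for good
        "recs_remaining": recs_held - retired,
        "fully_matched": retired >= consumed_mwh,
    }

# A year where purchases outpace consumption: fully matched,
# with certificates left over.
result = match_and_retire(consumed_mwh=1_000.0, recs_held=1_200.0)
print(result)
```

The key property is that a retired certificate can never be resold, which is what lets a buyer claim the matched megawatt-hours without double-counting.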
In the long-term, Google plans to power its operations every day of every year with clean, zero-carbon energy. Doing that, says Palmer, means “working toward buying more renewable energy in every area that we operate.”
That investment begins at places like Minco-II, a wind farm located about 50 miles outside of Oklahoma City. The site is operated by NextEra Energy Resources, the largest operator of wind and solar generation in the US. Google now purchases the output of all 64 turbines at Minco-II, a 100-megawatt wind farm that generates 438,000 megawatt-hours of electricity a year. That’s enough to power 30,000 homes.
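The arithmetic behind those figures can be checked with a quick back-of-the-envelope sketch. All inputs come from the numbers above; the derived values simply follow from them.

```python
# Sanity check of the Minco-II figures: 100 MW of capacity producing
# 438,000 MWh in a year, enough for 30,000 homes. All inputs are from
# the article; the derived numbers just follow from the arithmetic.

CAPACITY_MW = 100
ANNUAL_OUTPUT_MWH = 438_000
HOURS_PER_YEAR = 8_760
HOMES_POWERED = 30_000  # the article's comparison figure

# Capacity factor: actual output relative to running flat-out all year.
capacity_factor = ANNUAL_OUTPUT_MWH / (CAPACITY_MW * HOURS_PER_YEAR)

# Implied per-home consumption behind the "30,000 homes" comparison.
mwh_per_home = ANNUAL_OUTPUT_MWH / HOMES_POWERED

print(f"Capacity factor: {capacity_factor:.0%}")               # → 50%
print(f"Implied usage per home: {mwh_per_home:.1f} MWh/year")  # → 14.6
```

A 50 percent capacity factor is toward the high end for onshore wind, but plausible for a site in Oklahoma’s wind corridor.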
There is another part of the equation, one Google has more direct control over: efficiency. By improving how its data centers make use of the massive amount of power they consume, Google can both lower its energy consumption and, in some cases, squeeze more out of the same amount of power. Compared to five years ago, the company says it now delivers 3.5 times more computing power with the same amount of electricity in certain portions of its data centers.
This is an area where Google’s artificial intelligence prowess is being put to use. In collaboration with its DeepMind subsidiary, Google now uses machine learning algorithms to optimize its data center performance. In effect, the very same algorithms that are trained with mass troves of data and run across Google’s vast network of data centers are now being deployed to make those same data centers more efficient.
After tinkering with the AI software, Google found that it could reduce the amount of energy used for cooling by up to 40 percent. “In principle, the machine learning algorithms are sucking in all of this data and being able to project with very good accuracy what’s coming next,” says Chris Malone, a distinguished engineer at Google who oversees data center efficiency.
Because a data center contains so much equipment, each piece with specific cooling needs, there are billions of possible settings combinations to ensure everything is kept at an optimal temperature. Humans could never figure out what works best on any given day due to changes in wind speed and humidity and dozens of other factors, Malone says. But a machine can.
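The scale of that search space is easy to illustrate. The knob counts below are invented for illustration, not figures from Google’s plants; the point is only that discrete settings multiply into an intractable number of combinations, which is why a model that predicts outcomes beats brute-force trial and error.

```python
# Why exhaustive search over cooling settings is hopeless: the
# combinatorics alone. Knob counts and granularity are illustrative,
# not figures from Google's actual facilities.

n_knobs = 10           # e.g., chiller setpoints, pump speeds, fan speeds
settings_per_knob = 8  # discrete positions per knob

combinations = settings_per_knob ** n_knobs
print(f"{combinations:,} possible configurations")  # → 1,073,741,824

# Even at a microsecond per evaluation, brute force takes ~18 minutes
# for a single snapshot of conditions -- and conditions (wind, humidity,
# load) change continuously.
minutes = combinations * 1e-6 / 60
print(f"~{minutes:.0f} minutes at 1 microsecond per evaluation")
```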
This kind of insight is something Google thinks other companies could make use of. “I do know that a lot of learnings can be applied to other industries,” says Joe Kava, the head of Google’s data center division. “If you’re a large industrial plant, whether it’s petrochemical, oil and gas, or just large metal fabrication, it’s a lot of the same things. Power going in, which means heat, and getting that heat out.”
Kava says the company is in talks to provide its machine learning algorithms as part of the Google Cloud platform. The ultimate goal, of course, is not necessarily to reap profits — Google makes more with its ad business than it could ever generate through licensing data center software.
Instead, the company stresses that what’s good for the planet is simply good for business, and it thinks other industries will see the light. “It goes beyond this sector. You have chemical companies and healthcare companies seeing that it could be done in a cost-effective manner,” Palmer says. “More customers will demand this.”