Just how far will Google go to hide its custom-built data center hardware from the rest of the world?
In one Silicon Valley data center, the company is apparently so paranoid about competitors catching a glimpse of its gear, it’s been known to keep its server cages in complete darkness, outfitting its technical staff like miners and sending them spelunking into the cages with lights on their heads.

“Many [companies] try to keep things covered up. There’s a lot of valuable intellectual property in here,” says Chris Sharp, general manager of content and cloud at Equinix, as he walks through the company’s data center. “But we were always amazed by Google and the helmets.”

Google is one of many big-name Web outfits that lease data center space from Equinix—a company whose massive computing facilities serve as hubs for the world’s biggest Internet providers. All the big Web names set up shop in these data centers so that they too can plug into the hub. The irony is that they must also share space with their biggest rivals, and this can cause some unease among companies that see their hardware as a competitive advantage best hidden from others.

About two years ago, Sharp says, Google unscrewed all the light bulbs inside the hardware cages it occupied at that Equinix data center. “They had us turn off all overhead lights too, and their guys put on those helmets with lights you see miners wear,” he tells Wired. “Presumably, they were bringing up custom-built gear they didn’t want anyone else to see.”

Google declined to comment on Sharp’s anecdote. But the tale is not surprising. Google designs its own servers and its own networking gear, and though it still leases space in third-party data centers such as the Equinix facility, it’s now designing and building its own data centers as well. These designs are meant to improve the performance of the company’s Web services while also saving power and money. More so than any other outfit, Google views its data center work as an important advantage over competitors.
That said, Google has actually loosened up in recent years. In 2009, the company opened a window into the first custom-built data center it had constructed five years earlier, and it has discussed parts of its newer facilities. But many of its operations remain a mystery.

Some believe this should change. Facebook now designs its own data centers and servers, and in a direct response to Google’s approach, the social-networking outfit has “open sourced” its designs, hoping to encourage collaboration across the industry. This, Facebook says, will allow the rest of the world to save power in much the same way Google has done and ultimately, well, save the planet. Several companies have already embraced the effort, including Netflix, the Texas-based cloud provider Rackspace, and the Japanese tech giant NTT Data.

But others still prefer to keep their secret hardware secret. Amazon, for instance, takes a Google-like approach. The company says very little about the facilities it runs or the hardware inside them. Apparently, the company is working with server sellers such as ZT Technologies to customize its servers, and it has followed Google’s lead in constructing its data centers from modular shipping containers. But it’s unclear just how far the company has gone toward designing and building its own hardware. This week, the Internet is rife with speculation about just how many machines back the company’s Elastic Compute Cloud service.

At Google, employees sign strict non-disclosure agreements that bar them from discussing what goes on inside the company’s data centers—and apparently, this agreement is open-ended. That alone puts a lid on Google’s deepest secrets. We’ve seen the NDA in action—many times. But for Google and others, there’s an added problem when they set up shop in a “co-location” facility like the data centers run by Equinix. The nature of the beast is that you’re sharing space with competitors.
Equinix trumpets its data centers as places where the giants of the Web can improve performance by plugging their gear straight into the world’s biggest Internet carriers—and into each other. The company began life offering a service—the Internet Core Exchange—that connected all the major Internet service providers, and it now lets other outfits plug into this carrier hub. According to Sharp, more than 70 carriers use the company’s main data center in San Jose, California. “We were a place for network operators to efficiently hand off traffic, and that’s the legacy that created Equinix,” Sharp says. “Not only are networks leveraging that to talk to each other, but [websites] are too.”

Security is high in the company’s facilities. Hand geometry readers—biometric scanners that identify the shape of your entire hand, not just your fingerprints—guard access to the data center floor. There’s a security camera watching you every time you turn around. And each company can house its gear in its own cage, protected by still more hand readers. But if you’re on the floor, you can peer into the cages. For cooling purposes, they’re not walled off.

While some companies proudly display their logos on the sides of their machines, the Googles of the world do their best to hide themselves. To keep competitors from eyeing their gear, Sharp says, many companies keep the lights off inside their cages when no one’s working in them. But others go even further.