
By Dan Mitchell, contributor
The data center business, as it currently exists, is a little too much like “Fight Club” (“The first rule of servers: you do not talk about our servers”), says Facebook’s Jonathan Heiliger, vice president of technical operations.
For that reason, Facebook will share the designs for its new, innovative data center in Oregon with any company that wants to use them, or to improve upon them. Given the companies it worked with, including Dell, AMD, Intel (INTC) and others, it looks as though data centers are going open source.
The company on Thursday unveiled its Open Compute Project during an event at its Palo Alto, Calif., headquarters. The center and the servers it houses were designed in-house by Facebook to save energy and costs. As impressive as the designs are, only a company like Facebook could pull off putting on such a big show to unveil a data center. Hundreds of reporters and others were on hand for the announcement. A video included heart-tugging, almost maudlin music played over sweeping shots of motherboards and server racks, as if they were breathtaking mountain vistas. The company had shrouded the event in mystery, perhaps knowing that data center talks don’t normally set (non-geeky) hearts pounding.

Assuming that all its assertions are accurate, Facebook has a right to be proud, and even to go a little overboard. Data centers are notoriously hard on the environment, largely because of all the energy it takes to cool them. The new server farm in Prineville, Ore., will use no air conditioning and contains no ductwork. The building was designed to be cooled with outside air, misted with water to increase humidity. In winter, hot air removed from the data center area is used to heat the office area. In all, the facility delivers a 38% increase in energy efficiency, Facebook says.
Several other companies were on hand for the announcement, including Hewlett-Packard (HPQ), Zynga, Dell (DELL), AMD (AMD), and Rackspace (RAX). Facebook is making the specs and design documents available to any company that wants to use them.
The power grid for the data center, also designed in-house, delivers 277 volts directly to each server, which is more efficient than the industry-standard 208 volts.
The servers themselves are a leap ahead, from the chassis down to the motherboards. “It’s a beautiful chassis,” said Amir Michael, head of Facebook’s hardware design team — “functionally beautiful” in that it is “vanity free,” he said. The boxes — 90 of them housed in each three-section rack — use about 22% less material than a typical server and contain no plastic, no paint, no front panel and almost no screws. They contain fewer components than standard servers, and weigh six pounds less. The company says that if a “typical” data center were to use the servers, it would eliminate the need to manufacture, transport, and eventually dispose of about 120 tons of various materials.
Aside from the energy-efficiency considerations, Facebook will benefit from running its own data center, said CEO Mark Zuckerberg. Leasing server space, as most companies do, creates “bottlenecks,” he said. That should happen a lot less with Facebook in charge of its own servers.
Jacking in to leased server farms is highly inefficient from a data perspective as well as from an energy-use perspective, Heiliger said.
With big-name partners on board, it seems safe to say that lots of companies will follow Facebook’s lead and use — and improve upon — its technology. “We’re trying to foster this ecosystem where developers can easily build startups, and by sharing this we think it’s going to make this ecosystem more efficiently grow,” Zuckerberg said.
There is a bit of a data center race on these days, with Apple (AAPL), Amazon (AMZN), Dell and scores of others all building new data centers nearly as fast as they can. With many centers still in planning stages, it will be interesting to see who actually adopts Open Compute’s plans and who plays “wait and see.” The announcement was surely heard up the road at Google’s (GOOG) Mountain View headquarters; Google jealously guards its technology, not only its software and search algorithms but its server and data center designs as well.
“We didn’t just do it for ourselves,” Michael said. “I feel like we did it for the community at large.”