As President of 7x24 Exchange International, an educational forum focused on mission-critical issues, and as a past executive with Digital Realty Trust and Goldman Sachs, David Schirmacher can bear witness to the proliferation of data centers for storage, processing, and distribution of digital information worldwide.
There are more than six million data centers in the U.S., a number that’s expected to grow to 8.6 million by 2017, according to forecasts by market research firm IDC. The firm predicts that, over the next several years, the majority of businesses and organizations will stop managing their own data center infrastructure and turn to co-location centers (colos) and cloud service providers.
“Cloud services, digital content, and new data sovereignty laws are setting the data center market on fire,” says Bo Bond, Managing Director and Co-leader of JLL’s Data Center Solutions group. JLL projects that multi-tenant co-location centers will increase at a 12.1% compound annual rate from 2015 through 2018, and the cloud-managed services market will double, to $76.7 billion, by 2021.
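JLL’s projections above compound annually. A quick sketch shows how those figures scale; the baseline values here are hypothetical, since the article gives only growth rates and endpoints:

```python
# Illustrative arithmetic only; baseline values are hypothetical.
def compound(base, rate, years):
    """Grow a base value at a compound annual rate for a number of years."""
    return base * (1 + rate) ** years

# A 12.1% compound annual rate over the three growth years from 2015
# through 2018 multiplies the 2015 colo baseline by roughly 1.41x.
colo_growth = compound(1.0, 0.121, 3)

# A cloud-managed services market doubling to $76.7 billion implies a
# starting point of about $38.4 billion.
implied_base_billions = 76.7 / 2
```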
By the Department of Energy’s reckoning, the majority of existing data centers—some dating back to the dotcom boom era—are outmoded and inefficient. Newer facilities are dramatically reducing their energy and water consumption, thanks to advances in cooling and server technologies, a growing reliance on renewable energy, and higher indoor operating temperature tolerances. JLL notes that while the number of data center servers installed is expected to increase by 40% from 2010 to 2020, the industry is on pace to reduce its energy use by another 10% to 40% during that period.
What Schirmacher can’t fathom is why companies still venture outside their core competencies to build and manage data centers when, he’s convinced, co-location facilities could accommodate most businesses’ IT and data processing and distribution needs. Schirmacher recounts a recent conversation with an executive of a financial institution that’s planning to build a data center at a cost equivalent to $30,000 per kilowatt of capacity. By comparison, colos are being built at $9,500 per kW.
“There’s no reason to spend more than that in most cases,” says Schirmacher. “And I can tell you the additional value [of the pricier facility] is zero.” But, he concedes, “some people want the red car with the loud engine.”
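The gap between those per-kW figures compounds quickly at scale. A minimal sketch of the math; only the two per-kW rates come from the article, and the 2 MW facility size is a hypothetical example:

```python
# Per-kW build costs cited in the article.
ENTERPRISE_COST_PER_KW = 30_000  # enterprise build
COLO_COST_PER_KW = 9_500         # co-location build

def build_cost(capacity_kw, cost_per_kw):
    """Total capital cost to build out a given critical-power capacity."""
    return capacity_kw * cost_per_kw

# For a hypothetical 2 MW (2,000 kW) facility:
capacity_kw = 2_000
premium = (build_cost(capacity_kw, ENTERPRISE_COST_PER_KW)
           - build_cost(capacity_kw, COLO_COST_PER_KW))
# The enterprise build costs $41 million more for the same 2 MW.
```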
Emerson’s Liebert DSE pumped-refrigerant economization system is the basis of the design for a data center belonging to a confidential telecom client of Vanderweil Engineers. The Building Team reduced this slab-on-grade facility’s energy consumption by adding a perimeter-cooled raised floor and removing two of its six cooling units. Photo courtesy of Vanderweil Engineers.
Enterprise data centers operated by individual companies aren’t going away, and they aren’t automatically jumping into the cloud, either. A recent survey conducted by Uptime Institute found that enterprise-owned data centers still host 71% of enterprise IT assets. Paul Schlattman, ESD’s Senior Vice President and Consulting Practice Leader in Chicago, notes that large enterprise companies like Boeing are sending just 7% to 10% of their critical IT applications to the cloud.
But the economics of data centers are evolving, driven by seemingly insatiable demand for information, content, and computing. Data centers “are at a scale and size that are unprecedented,” says Terence Deneny, Vice President–Mission Critical with Structure Tone. At the same time, servers are getting more powerful, racks denser, and the cost of computing cheaper.
Consequently, companies are scrutinizing data centers’ total cost of ownership, and are using more sophisticated metrics and tools to anticipate their processing needs with greater precision and to match those needs with infrastructure requirements.
Such calculations are informing companies’ decisions about whether owning and operating a data center is preferable to leasing space and power; whether to build new or retrofit existing facilities; and how much operational and computing autonomy they want to relinquish or hold onto.
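One widely used efficiency metric in total-cost calculations of this kind is power usage effectiveness (PUE), the ratio of total facility power to IT power. The article doesn’t name PUE specifically, and the load figures below are illustrative:

```python
def pue(total_facility_kw, it_load_kw):
    """Power usage effectiveness: total facility power divided by IT power.
    1.0 is the theoretical ideal; legacy sites often run near 2.0."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# A legacy room drawing 1,800 kW to support 900 kW of IT load:
legacy_pue = pue(1800, 900)   # 2.0
# An economized facility drawing 1,080 kW for the same IT load:
modern_pue = pue(1080, 900)   # 1.2
```

Everything above 1.0 is overhead (cooling, power conversion, lighting), which is why the cooling and temperature-tolerance advances described above translate directly into lower operating cost.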
The good news for AEC firms is that companies—including the largest tech firms with expanding data center networks—are showing a willingness to share information about their demand peaks and valleys as a prelude to making those decisions. “That was unheard of before,” says Christopher McLean, PE, LEED AP BD+C, Director of Mission Critical Projects with Vanderweil Engineers.
The next wave of expertise and opportunity for AEC firms could be to use that information to help clients understand exactly what they need from data centers now and in the future, and to recommend what type of data center best suits the client’s business model, states Robert Sty, PE, SCPM, LEED AP, Mission Critical Studio Leader with SmithGroupJJR.
While several massive data center projects are under construction, most of what AEC firms work on these days is retrofits of older facilities. QTS Realty Trust, which provides data center, managed hosting, and cloud services, recently purchased a 360,000-sf data center in Piscataway, N.J., where QTS intends to redevelop portions of the facility to double its total raised floor capacity to 176,000 sf and boost its critical power to more than 26 MW, from 18 MW. Photo courtesy of QTS Realty Trust.
Speed to market sets the stage
Have you heard of SMAC? That’s an acronym for what Schirmacher says are the four core drivers of data center demand growth: social media, mobile, analytics, and cloud. Other industry sources would probably append another “S” to that acronym for “speed to market”—shorthand for how companies adapt to the rapid pace of technological change with minimal operational or delivery disruption.
“It’s all about speed and economics,” says Rajan Battish, Vice President and Mission Critical Expert with CallisonRTKL.
Speed to market requires elasticity. Vanderweil’s McLean says that it’s not uncommon for a data center’s core and shell to be built out entirely, but for its interior infrastructure to be installed in stages as demand warrants or as technology advances.
Companies are also building and renovating with the expectation of eventual expansion. T5 Data Centers in August acquired a new 208,000-sf building in Elk Grove Village, Ill., and the deal included a vacant four-acre lot on which T5 could add another 20 megawatts of capacity as needed. “This offers us speed to market,” Aaron Wangenheim, T5’s COO, told Crain’s Chicago Business.
Retrofits account for the bulk of AEC firms’ data center work these days, and the primary goal often boils down to lowering the cost per kW, says Brian Schafer, Principal with New York-based AE firm Highland Associates.
“We’re identifying cities where we can step into a retrofit environment and take advantage of an existing building’s infrastructure while mitigating our risk,” says Brian Johnston, Chief Technology Officer for QTS Realty Trust, which operates 24 data centers internationally with more than 1,000 customers. QTS is one of the few operators that offer clients the option of customized, co-location, and cloud services.
In August, QTS purchased a 360,000-sf, 18-MW, LEED Gold-certified data center in Piscataway, N.J., from DuPont Fabros Technology for $125 million. The 38-acre property includes an on-site 112kVA substation and solar panels that produce 2 MW of power. QTS will add 8 MW over the next few years and plans to double the facility’s raised floor capacity to 176,000 sf, with more than 26 MW of critical power.
The country’s glut of vacant real estate makes adaptive reuse an enticing yet risky proposition for data center conversions. Not all spaces are created equal, cautions JLL’s Bond. Office buildings, in particular, can be dicey: their floors can’t handle the weight of data center equipment without reinforcement, their ceilings aren’t high enough, they lack sufficient capacity or access to power, water, and telecom connectivity, and they aren’t seismically resistant.
Andrew Wheeler, Senior Vice President of Corporate, Health & Science with RS&H, points out other considerations when evaluating buildings for possible data center conversion: Is the building in a flood zone? What’s the condition of its roof? What’s the building’s power sourcing and availability?
The National Cancer Institute at Fort Detrick (Md.) employs data center infrastructure management (DCIM) strategies. ©Paul Burk Photography/courtesy HDR
Standardization, flexibility bump heads
Speed to market is pushing the data center industry toward greater standardization of facility size and capacity.
“The way the market is going, and the way overall IT is going with all of the devices that have to be connected to something, the cloud and colo guys need to be able to expand rapidly,” says Deneny of Structure Tone.
Standardization might also be a fast-growing industry’s concession that qualified engineers are getting harder to find, say AEC sources. The industry’s sheer velocity is also causing “an uptick in lead times” for products and construction, says Deneny.
Schirmacher believes that AEC firms can succeed in this environment by “productionizing” the process. “The breakthrough in this industry will not come from a better piece of equipment or technology,” says Schirmacher. “It will come from a new way of what and how we deliver that aligns with a business’s needs.”
Battish of CallisonRTKL says that some clients are even asking for “canned” designs. “They don’t want to reinvent the wheel,” he says. But how far can standardization go?
Robert Haley, CISSP, Director of Mission Critical Facilities and IT at HDR, acknowledges that the giant cloud service providers are “completely standardized” in their operations and supply chain. But he notes that colleges, research centers, and the military have “very specific IT needs,” to say nothing of their security concerns. “I’ve seen racks up to 100 kW at some of these sites,” says Haley.
Some companies and institutions also use their data centers as showcases for recruitment. “They are part of the tour,” says Haley, and their designs therefore account for workflow and people flow. One data center he’s seen includes a coffee shop and a water feature.
It would appear that data center clients and their AEC partners are trying to strike a balance between standardization—which helps control costs—and design and construction latitude—which has become imperative in an expanding market. “What used to be maximally modular is now minimally modular for maximum flexibility,” says Vanderweil’s McLean.