Whether it's an earthquake, tornado, blackout, flood or terrorist attack, MasterCard can rest assured that its new data center will remain operational, processing 26 million credit card transactions every day. Built like a fortress, the facility is part of the company's new Global Technology and Operations Center in O'Fallon, Mo., outside of St. Louis.
The 550,000-sq.-ft. campus consolidates the company's five St. Louis operations into three four-story office wings and the three-story data center, housing 1,500 employees in all. Completed last September, the complex is the anchor of the 1,100-acre, $600 million WingHaven residential, commercial and industrial development, spearheaded by McEagle Development, St. Louis. The project includes 2,000 residential units, 400,000 square feet of retail and commercial space, 1 million square feet of office space and 1.2 million square feet of high-tech assembly and laboratory facilities.
According to Jerry McElhatton, president of MasterCard Global Technology and Operations, the company evaluated 20 cities across the United States for the high-tech center, and chose the O'Fallon location because of WingHaven's secure and reliable technology infrastructure, as well as its 'live, learn, work and play' environment. A $44 million tax incentive package from the state also helped.
The $135 million project was led by two St. Louis-based firms: design architect Hellmuth, Obata + Kassabaum Inc. (HOK) and general contractor Paric Corp. Subcontractors included structural engineer EDM Inc., St. Louis, and M/E/P engineer Mazzetti & Associates, San Francisco.
Built to resist most anything
Although steel is often specified for the construction of data centers because it offers greater floor-to-floor heights, the building team selected concrete to create a simple, clean space for sensitive computer equipment.
'The office wings were built using traditional steel framing and metal decking,' says Ralph Cali, principal in charge with HOK. 'The data center, however, is a poured-in-place concrete building. We chose concrete for really one reason: it does not need fireproofing. MasterCard needed a clean space to house data center equipment, so we did not want a steel structure with spray-applied fireproofing. The data center does not have dropped ceilings or windows and the concrete slabs are painted white - it's truly a white box.'
The building team also took measures to reinforce the structure. For instance, says Ron Koenig, lead superintendent with Paric, the incorporation of 'shear keys' strengthens the connection of the concrete walls and decks.
'We created a key, or a void, in the walls at deck height as the walls were poured,' says Koenig. 'We then connected the rebar through the void, so when we poured the concrete for the deck, it tied the wall and deck together with both the rebar and concrete into that key.'
The concrete walls are designed to withstand impacts from objects traveling as fast as 125 miles per hour, and can resist wind uplift forces 20 times greater than those on a standard building. The structure also exceeds the seismic code requirements for the St. Louis area.
'There was so much reinforcement steel in the columns and slabs that it became hard to pour the concrete between all the steel,' recalls Art Ackerman, project director with Paric, who adds that the concrete subcontractor, Jacobsmeyer-Mauldin Construction, Pacific, Mo., employed a special German forming system that allowed the 33-ft.-tall walls to be poured in one lift, instead of stages. This helped the team complete the concrete frame a month ahead of schedule.
Like the walls and floors, the building's roof is structural concrete, designed to take a heavy impact as well as to prevent water leakage. It is an inverted roof membrane assembly system, in which the waterproofing is installed beneath the insulation and ballast material to protect it against sunlight, extreme temperatures, high winds and punctures.
If water were to leak into the building, the floor decks in the computer areas are sloped to carry it to drains located at the perimeter walls. Approximately 3 feet above the concrete floor decks is a raised floor system that can support up to 200 pounds per square foot, plenty of capacity for the computer equipment.
Like many data centers built today, the MasterCard data center is an 'N+1' facility, which means 'that if the requirement is one particular system for the building, two are installed,' says Cali. 'For instance, the requirement here was two backup generators, so we put in three.'
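The N+1 rule Cali describes can be sketched as a simple provisioning calculation. This is a hypothetical illustration of the concept, not a tool used on the project; the function name is invented for clarity.

```python
def n_plus_one(required: int) -> int:
    """N+1 redundancy: install one more unit than the load requires."""
    if required < 1:
        raise ValueError("at least one unit must be required")
    return required + 1

# The article's generator example: a requirement of two backup
# generators leads to three being installed.
print(n_plus_one(2))  # 3
```

The same rule scales to any critical system: one chiller required means two installed, two generators required means three installed.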
The building features three telecommunications lines and two independent electrical feeds from independent substations. If both feeds are cut off during a blackout, earthquake or other disaster, an uninterruptible power supply system will keep the electronics operating until the backup generators can power up to full capacity and be switched over.
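The failover sequence described above follows a fixed priority: utility feeds first, the UPS as a battery bridge during an outage, then the generators once they have spun up to full capacity. The sketch below is an illustrative simplification under those assumptions; the function and its parameters are hypothetical, not part of the building's actual control system.

```python
def select_power_source(feed_a_live: bool, feed_b_live: bool,
                        generators_ready: bool) -> str:
    """Pick the active power source in the priority order described."""
    if feed_a_live or feed_b_live:
        # Either independent substation feed can carry the building.
        return "utility"
    if generators_ready:
        # Load is switched over once generators reach full capacity.
        return "generator"
    # Until then, the UPS batteries bridge the gap.
    return "ups"

print(select_power_source(False, False, False))  # ups
print(select_power_source(False, False, True))   # generator
```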
Similarly, if the building's water supply is severed, reserve water will be drawn from a 2,220-ft.-deep well on site. 'If the chiller plant malfunctions,' adds Cali, 'there is a provision for a truck-mounted, air-cooled chiller to be driven to the site and hooked up.'
The data center houses the chillers, power system and electrical switchgear on the first level, and equipment for data processing and telecommunications on the upper two levels.
'The entire computer infrastructure is designed for expansion,' says Frantz Vincent, MasterCard's facility manager. 'It's a huge plug-and-play system designed for at least three times what's there.'
To protect the computer equipment against unnecessary water damage during a fire, a 'preaction' fire protection system was specified for the space. It incorporates a preaction valve assembly that holds water back from the sprinkler system piping until the detection unit senses a fire condition and initiates the opening of the preaction valve. Water can flow into the piping, but only discharges after the sprinkler head actuates due to the presence of heat at the individual sprinkler.
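The preaction logic amounts to a two-stage interlock: the detection system must open the valve to charge the piping, and an individual sprinkler head must also fuse from heat before any water discharges. A minimal sketch of that logic, with invented function names, is:

```python
def pipe_charged(detector_alarm: bool) -> bool:
    """The preaction valve admits water into the piping only on a
    detector alarm."""
    return detector_alarm

def head_discharges(detector_alarm: bool, head_fused: bool) -> bool:
    """Water discharges only where the piping is charged AND that
    sprinkler head has actuated from heat."""
    return pipe_charged(detector_alarm) and head_fused

# A fused or damaged head alone releases no water -- the piping is
# still dry -- which is what protects the computer equipment from
# accidental discharge.
print(head_discharges(False, True))  # False
print(head_discharges(True, True))   # True
```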
Building team collaboration
Cali and Ackerman concur that an integrated building team was a key to meeting the budget and schedule, as well as the needs of the owner. Besides weekly construction meetings at the job site, HOK hosted several design workshops at its New York City office.
Ackerman recalls one instance where a potential problem was avoided during a workshop. 'We were reviewing the drawings with the clients and someone spotted a problem with the design regarding the location of a concrete shear wall,' he notes. '[The person asked] 'How can I have a big open space with large monitor screens at the end of the building to monitor the network with this big concrete wall going through the space?' It was early enough in the project that it was not a big deal to correct. The structural engineer redesigned the surrounding concrete wall to be stronger and thicker so that the shear wall could be eliminated.'
Coordinating the installation of the computer equipment was also a challenge, says Mike Bray, manager of the data center for MasterCard. The tight project schedule required work phases to be done in parallel, he notes.
'We started to move equipment in and perform tests in late March and were operational by the end of May - all while construction work was being done,' adds Bray. 'So it was vital that the space be very clean.'
To ensure cleanliness in the data center, Paric set up a station at the entrance where construction workers were required to clean their boots and slip on shoe covers to avoid tracking dirt into the building.
'This was typical of our collaboration efforts,' says Ackerman.