Keeping pace

A peek inside a pair of high-tech facilities reveals plumbing and HVAC strategies that integrate well with technology-driven jobs


By Jim Crockett | August 11, 2010

As technology marches forward, delivering buildings capable of meeting end users’ needs is a constant challenge. Nowhere is this more evident than in computer-intensive facilities such as telecom and data centers, hospitals and research facilities, where state-of-the-art equipment is sometimes outdated by the time it’s actually installed. Some things, however, are constant: the need for water, heating and cooling. And although these mechanical systems sit relatively flat on the technology curve, high-tech facilities often demand greater performance than traditional plumbing and HVAC practices can deliver. But the design community has responded, as the following examples illustrate.

Concrete plumbing solution

Located on the campus of the University of Wisconsin in Madison, the Waisman Center has been involved in multidisciplinary research on developmental disabilities for 30 years. But calling it a multi-use facility is an understatement. The first floor alone houses a magnetic resonance imaging scanner, a positron emission tomography scanner and a linear accelerator. The second floor contains an auditorium, lecture halls, administrative services and the main mechanical equipment room for a new tower addition. Auditorium and mechanical space occupy what would be the third floor, and the fourth floor is a biomanufacturing facility where products for clinical phase-one trials are made. The fifth and sixth floors include bench laboratories and support spaces, including researcher offices and warm and cold rooms.

Although plumbing might be one of the last functions one would associate with such high-tech equipment, the plumbing system is the core of all the building’s operations, at least when it came to the center’s latest addition. According to Tom Boehnen, with Madison-based M/E/P engineer Arnold & O’Sheridan, the building team couldn’t just think about one floor. Rather, the interrelation of all functions and spaces had to be at the forefront of planning. Expansion also had to be factored in.

Furthermore, he says a solid plumbing scheme was critical because much of the building’s piping would be locked in concrete. The existing building’s structural system, which uses concrete waffle slabs, was not carried into the addition. Collaboration with the architect and structural engineer became critical and resulted in a concrete joist system that allowed greater integration of the horizontal plumbing distribution necessary to reach the various lab locations without compromising valuable ceiling or floor space.

For example, branch plumbing drain lines are installed within the depth of the concrete joists. Piping systems include domestic hot and cold water, purified water, specialty gases and nuclear injectables, as well as vacuum systems and steam piping.

The piping networks are accommodated by a central vertical riser near the elevator. Collector mains and stacks for chemical drain lines were purposely located on the outside walls at columns, says Boehnen, and the structural design accommodates stack location by connecting concrete joists on the side of the columns. This arrangement leaves a space at the column face to sleeve the stacks through the floor.

Plumbing and piping systems on the first floor are distributed through floor trenches, access flooring and suspended ceilings. Throughout the rest of the building, plumbing and piping systems have vertical risers and horizontal distribution above suspended ceilings. Plumbing stacks are located at permanent vertical elements such as columns and elevators, so that if functions in the building change, the stacks won’t need to be relocated. (For more on the lab’s specialty piping systems, visit www.bdcmag.com.)

Tower of Cool

Another building type constantly fighting to keep abreast of the technology curve is the telecom/data center. Having experienced this firsthand, RTKL Associates Inc., Baltimore, has adapted its own design technology to better serve client needs.

“Server room cooling systems have remained virtually unchanged since the inception of the raised floor,” says Stephen Spinazzola, an RTKL vice president. “Traditional methods of cooling data centers are no longer adequate to keep pace with the increasing power and heat load densities we’re seeing today and expect to see moving forward.”

Indeed, he points to anecdotal evidence showing that conventional data-center cooling design becomes ineffective as power density exceeds 150 watts/sq. ft., or approximately 3.8 kilowatts (kW) per rack. With this in mind, RTKL set out to raise the cooling effectiveness of air-conditioning systems for data centers and computer rooms. The result is a custom computer rack, dubbed the “Tower of Cool” (TOC), that conveys cool air directly to electronic equipment via specially designed doors.

Specifically, Spinazzola explains, the computer rack isolates the heat load of the electronic equipment from the rest of the space. “Conveying cooling air directly to and from the electronic equipment, instead of mixing the cooling air in the space to cool the electronic equipment, doubles the cooling effectiveness of the air-conditioning equipment,” he says.

After picking up equipment heat, return air discharges from the TOC directly to a ceiling plenum, where it is conveyed to a computer room air-conditioning unit (CRACU), a type of unit designed and manufactured specifically for the raised-floor data-center environment.

Spinazzola notes that this arrangement increases the effectiveness of the cooling system: standard-temperature air can still be used to cool the equipment, but much warmer air (around 95°F) returns to the CRACU.

Conventional data center air-conditioning design, he explains, delivers 55°F air to the electronic equipment space, where it mixes with 110°F air discharged from the electronic equipment, producing approximately 70°F air to cool the equipment. Air is returned to the CRACUs at approximately 75°F. The TOC computer rack design permits heat transfer across the cooling coil of the CRACU to be raised by a process known as high Delta-T cooling (HDTC) (see Figure 1).
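The arithmetic behind high Delta-T cooling is easy to check with the standard air-side sensible-heat relation, Q = 1.08 × cfm × ΔT (Btu/hr). The minimal sketch below uses the supply and return temperatures quoted above; the airflow value is purely illustrative, not from the article:

```python
# Back-of-the-envelope check of the high Delta-T cooling (HDTC) claim.
# Q = 1.08 * cfm * dT is the standard air-side sensible-heat relation
# (Btu/hr, airflow in cfm, dT in deg F, at sea-level air density).
# The airflow figure is an assumption; the temperatures are from the article.

AIRFLOW_CFM = 10_000  # assumed airflow through one CRAC unit

def sensible_btu_per_hr(cfm: float, supply_f: float, return_f: float) -> float:
    """Sensible heat carried by an airstream, in Btu/hr."""
    return 1.08 * cfm * (return_f - supply_f)

conventional = sensible_btu_per_hr(AIRFLOW_CFM, 55.0, 75.0)  # 20 F rise
hdtc = sensible_btu_per_hr(AIRFLOW_CFM, 55.0, 95.0)          # 40 F rise

print(f"Conventional: {conventional:,.0f} Btu/hr")   # 216,000
print(f"HDTC:         {hdtc:,.0f} Btu/hr")           # 432,000
print(f"Ratio:        {hdtc / conventional:.1f}x")   # 2.0x at the same airflow
```

Doubling the return-to-supply temperature difference doubles the heat removed per unit of airflow, which is the basis for the twofold figure cited below.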

Thus, with each CRACU providing twice the cooling heat transfer for the same airflow, the number of CRACUs for a particular facility can be reduced by half. A standard chilled-water coil will double in capacity with a control valve change to increase water flow, Spinazzola notes. This leads to significant reductions—as much as 16%—in operating costs for a data center’s cooling plant. The use of HDTC racks also has the potential to reduce the cost of construction by 7% because less overall space is required.
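On the water side, the rule-of-thumb counterpart is Q = 500 × gpm × ΔT. A minimal sketch of the energy balance behind that control-valve remark, with flow and temperature-rise values that are illustrative assumptions rather than source data:

```python
# Water-side energy balance behind the "increase the water flow" remark.
# Q = 500 * gpm * dT is the rule-of-thumb chilled-water relation (Btu/hr);
# the constant 500 folds in water density, specific heat and unit conversions.
# The flow and temperature-rise values below are illustrative assumptions.

def coil_load_btu_per_hr(gpm: float, dt_water_f: float) -> float:
    """Heat absorbed by chilled water crossing a coil, in Btu/hr."""
    return 500.0 * gpm * dt_water_f

base = coil_load_btu_per_hr(gpm=60.0, dt_water_f=12.0)      # 360,000 Btu/hr
doubled = coil_load_btu_per_hr(gpm=120.0, dt_water_f=12.0)  # 720,000 Btu/hr
print(f"Capacity ratio at 2x flow: {doubled / base:.1f}x")  # 2.0x
```

Carrying twice the heat away at the same water temperature rise takes twice the flow, which is why a valve change, rather than new coils, can keep pace with the doubled air-side duty.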

Real-world numbers

RTKL claims the HDTC concept can effectively cool approximately 7.4 kW of electronic equipment in a single rack, and that the TOC can operate without adverse effects through the normal day-to-day operations of an active data center. That, of course, raises the question of how such testing translates into real-world numbers and benefits. With that in mind, the following cost model was developed, based on a server-farm prototype RTKL designed for a large telecommunications company in 1999.

Spinazzola claims the use of HDTC should allow electronic equipment density to increase while providing more effective cooling. For the purpose of this analysis, he says, two circuits per cabinet (approximately 191 watts/sq. ft.) were used, an increase of 38% over the base prototype.

As with any formal comparison, he adds, there must be at least one constant between the base and the alternate. In this analysis, the size of the electrical service for the electronic equipment is the constant, since the electrical service is the highest-cost element of the project. With the electrical service, and the associated quantity of circuits, held constant, the variables are the size of the building, the number of cabinets and the size of the cooling plant.

With the electrical service size fixed at 6,600 circuits, and using two circuits per cabinet in the HDTC prototype, the cabinet count is reduced from the base prototype, as is the associated raised-floor area. Additionally, the HDTC prototype reduces the quantity of CRACUs from 122 to 61. This not only lowers the cost of the mechanical system, but also brings down the cost of the electrical system significantly: the HDTC approach reduces power requirements by approximately 450 kW.

Of further benefit, Spinazzola claims, the reduction in CRACUs also shrinks the chiller plant through a reduction in fan motor heat, a net reduction in power requirements of approximately 360 kW. Combined, these factors yield a total power reduction of 810 kW.
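Translated into energy dollars at the $0.08/kWh rate used in the cost analysis below, the 810 kW works out roughly as follows. The sketch assumes round-the-clock operation, so it is an upper bound; equipment rarely draws its full load continuously:

```python
# Rough translation of the quoted power savings into annual energy dollars.
# Assumes continuous (24/7) operation, so the result is an upper bound.

CRACU_SAVINGS_KW = 450    # from halving the CRACU count (122 -> 61)
CHILLER_SAVINGS_KW = 360  # from the smaller chiller plant (less fan heat)
RATE_PER_KWH = 0.08       # energy rate used in the article's cost analysis
HOURS_PER_YEAR = 8_760

total_kw = CRACU_SAVINGS_KW + CHILLER_SAVINGS_KW   # 810 kW
annual_kwh = total_kw * HOURS_PER_YEAR             # 7,095,600 kWh
print(f"{total_kw} kW saved -> {annual_kwh:,} kWh/yr "
      f"-> ${annual_kwh * RATE_PER_KWH:,.0f}/yr")  # ~$567,648/yr
```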

“The bottom line is that the TOC concept will allow owners and operators of data centers to install more electronic equipment into each TOC and cool it more effectively and efficiently,” says Spinazzola.

Base Prototype

• Gross building area: 127,500 sq. ft.

• Raised-floor area: 91,700 sq. ft.

• Equipment load: 12.7 Megawatts (6,600 circuits, approx. 138 watts/sq. ft.)

• Cooling plant: 5,140 tons (5.0 Megawatts connected electrical load)

• Airside cooling: 122 CRACUs on the raised floor

• 4,250 cabinets (approximately) with 1.5 circuits per cabinet

HDTC Prototype

• Gross building area: 105,000 sq. ft.

• Raised-floor area: 66,400 sq. ft.

• Equipment load: 12.7 Megawatts (6,600 circuits, approx. 191 watts/sq. ft.)

• Cooling plant: 4,620 tons (4.3 Megawatts connected electrical load)

• Airside cooling: 61 CRACUs on the raised floor

• 3,150 cabinets (approximately) with two circuits per cabinet

Based on the above data, the following cost analysis compares the base and HDTC prototypes at an energy cost of $0.08/kWh:

Base Prototype

Shell @ $35/sq. ft. (127,500 sq. ft.) $4,463,000

Infrastructure $52,048,000

Cabinets (4,720 @ $1,025/Cab.) $4,838,000

Energy (NPV 10 years, 6%) $62,712,000

Net Present Value $124,061,000

HDTC Prototype

Shell @ $35/sq. ft. (105,000 sq. ft.) $3,675,000

Infrastructure $48,822,000

Cabinets (3,150 @ $2,010/Cab.) $6,331,500

Energy (NPV 10 years, 6%) $59,080,500

Net Present Value $117,909,000

NPV Savings: $6,152,000 (10% of first cost)
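For readers who want to retrace the bottom line, the sketch below assembles the published first costs and energy NPVs, then back-solves the implied annual energy spend with a standard 10-year, 6% level-annuity factor. The back-solved annual figures are inferences from the published NPVs, not source data:

```python
# Reassembling the 10-year, 6% net-present-value comparison above.
# First costs and energy NPVs are the article's published figures; the
# implied annual energy spend is back-solved via a level-annuity factor
# and is therefore an inference, not source data.

DISCOUNT, YEARS = 0.06, 10
annuity = (1 - (1 + DISCOUNT) ** -YEARS) / DISCOUNT   # ~7.36

prototypes = {
    "Base": {"shell": 4_463_000, "infra": 52_048_000,
             "cabinets": 4_838_000, "energy_npv": 62_712_000},
    "HDTC": {"shell": 3_675_000, "infra": 48_822_000,
             "cabinets": 6_331_500, "energy_npv": 59_080_500},
}

totals = {}
for name, p in prototypes.items():
    first_cost = p["shell"] + p["infra"] + p["cabinets"]
    totals[name] = first_cost + p["energy_npv"]
    print(f"{name}: first cost ${first_cost:,.0f}, "
          f"NPV ${totals[name]:,.0f}, "
          f"implied energy ~${p['energy_npv'] / annuity:,.0f}/yr")

savings = totals["Base"] - totals["HDTC"]
print(f"NPV savings: ${savings:,.0f}")   # $6,152,000, ~10% of base first cost
```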
