Architectural concrete as we know it today was invented in the 19th century. It reached new heights in the U.S. after World War II, when mid-century modernism was in vogue, following in the footsteps of a European aesthetic that expressed structure and permanence through the exposed material. Concrete was treated as a monolithic miracle: waterproof, structurally versatile, and visually expressive.
Drawing on contractors' experience with infrastructure work, builders combined cast-in-place concrete with precast elements to replace natural stone on façades. Architects designed exposed concrete façades, cantilevered concrete balconies, and their associated slabs as if the material were uniformly waterproof, which it was not. Thermal conductivity went unaddressed, and no one discussed embodied carbon back then.
The history of concrete construction between 1950 and 1970 offers architects and construction professionals a framework for rehabilitating these buildings today using both time-tested and emerging technologies. Most exposed architectural concrete in the U.S. appeared in buildings commissioned by institutions, especially universities, which expanded rapidly after World War II. Planning for these structures began in the 1950s, and the first wave of buildings was in place by 1965. Many were built with perimeter radiation for heating and without ducts for air conditioning; comfort standards were less exacting then, and energy conservation was a minor concern.
After reading this article, you should be able to:
+ Describe the history of mid-century modern concrete buildings
+ Identify the primary sources of deterioration in concrete buildings
+ Discuss methods for diagnosing and repairing concrete structures
+ List the advantages of reinforced concrete construction