
BIM and Information Technology

Solving the data conundrum with better tools to capture, share, and analyze information

At a recent Thornton Tomasetti symposium, experts showed how designs and projects can be improved by granular information that’s accessible to more users. 


By John Caulfield, Senior Editor | November 24, 2015

Dozens of AEC tech experts, including Nathan Miller, Founder of Proving Ground, gathered recently for Thornton Tomasetti’s AEC Technology Symposium and Hackathon. Images courtesy Thornton Tomasetti

“We haven’t really cracked the nut to get people to talk to one another.” That’s Nathan Miller, the former Case consultant who is the founder of Proving Ground, a startup that specializes in helping businesses leverage data to inform their building projects.

Miller made this observation during Thornton Tomasetti’s third annual AEC Technology Symposium and Hackathon, where he and other speakers attempted to address the question of whether technology tools are helping or hindering communication and production among AEC firms and their clients.

While they didn’t provide definitive answers, the speakers took turns describing the necessity and value of creating products that facilitate the exchange and understanding of data-driven design by all levels of users.

A “pleasant interface” is what any tool should strive for, said Andrew Heumann, an Associate and Design Computation Leader with NBBJ, who discussed how computational tools are evolving and engaging with design teams. One example he pointed to is the Human UI (user interface) plug-in, which makes the widely used Grasshopper algorithmic modeling software feel more like a customized app and “less intimidating” to designers.

While some of the presentations were thinly veiled product pitches, they didn’t detract from the symposium’s overriding message that technology tools are useful primarily when they smooth a path to a satisfying end result.

‘The future is a lot flatter than before. Open source software gives the freedom to use, study, modify, and share information.’

—Gareth Price, Ready Set Rocket

“Stop designing around the tool and start designing to the interface,” said Owen Derby, Technical Product Manager and Software Engineer with Flux, whose platform provides cloud-based collaboration tools to exchange data and streamline complex design workflows.

The symposium, held on September 25 at Baruch College in New York, touched on how different tools can be used to collect, analyze, and disseminate information for the purposes of design and construction.

Here’s a recap of some of the hot topics discussed at the event:

 

Sensors monitor a building’s heartbeat.

Constantine Kontokosta, PE, AICP, RICS, Deputy Director, Academics, of the NYU Center for Urban Science + Progress, quoted urban activist Jane Jacobs—about cities being laboratories—as a starting point to discuss how sensing technologies are helping researchers, urban planners, developers, and AEC teams understand the pulse of cities in order to predict future changes.

Data, he explained, can and should be teased out of just about anything: smartphones, social media, lighting patterns and plume rates, carbon and steam emissions, even taxi rides. CUSP has been working with New York City on a test project called Quantified Communities in two neighborhoods, Hudson Yards and Lower Manhattan. It uses expanded sensor networks to collect and analyze real-time data on how the neighborhoods are performing, as a step toward urban planning that accounts for how people live, work, and play.
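For a sense of what this kind of neighborhood-level aggregation involves, here is a minimal Python sketch that rolls a few heterogeneous readings into a single performance indicator. It is illustrative only, not CUSP’s actual pipeline; the metrics, weights, and figures are hypothetical.

```python
# Minimal sketch (not CUSP's actual pipeline): roll heterogeneous urban data
# feeds up into a single neighborhood performance indicator.
# Feed names and readings are hypothetical.
from statistics import mean

# Hypothetical real-time readings, already aggregated per neighborhood.
readings = {
    "Hudson Yards":    {"pedestrian_count": 1450, "noise_db": 68, "kwh_per_sf": 1.9},
    "Lower Manhattan": {"pedestrian_count": 2100, "noise_db": 72, "kwh_per_sf": 1.4},
}

# Direction of "better" for each metric: +1 higher is better, -1 lower is better.
direction = {"pedestrian_count": 1, "noise_db": -1, "kwh_per_sf": -1}

def normalize(metric, value):
    """Scale a raw value to 0-1 against the range observed across neighborhoods."""
    values = [r[metric] for r in readings.values()]
    lo, hi = min(values), max(values)
    if hi == lo:
        return 0.5
    score = (value - lo) / (hi - lo)
    return score if direction[metric] == 1 else 1 - score

for hood, metrics in readings.items():
    index = mean(normalize(m, v) for m, v in metrics.items())
    print(f"{hood}: performance index {index:.2f}")
```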

KieranTimberlake, a research-centered design practice, recently converted a historic bottling plant in Philadelphia into its new architectural studio. Christopher Connock, a Researcher and Prototyper at the firm, said the project team determined where the MEP system would be taxed the most by measuring temperature and humidity across the building with 124 interior surface sensors, 56 relative humidity/temperature sensors, 60 ceiling sensors, and 120 surface and core structure slab sensors. The project ended up dispensing with a conventional HVAC system in favor of an exhausted cooling system.
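The underlying analysis can be illustrated with a short Python sketch that ranks zones by how far their logged conditions drift from a comfort setpoint, as a proxy for where mechanical systems would be taxed most. This is not KieranTimberlake’s workflow; the zone names, setpoints, readings, and weighting are hypothetical.

```python
# Minimal sketch (not KieranTimberlake's actual workflow): flag the zones whose
# logged conditions drift farthest from a comfort setpoint, as a proxy for
# where mechanical systems would be taxed most. All inputs are hypothetical.
from statistics import mean

SETPOINT_TEMP_F = 72.0
SETPOINT_RH_PCT = 45.0

# Hypothetical hourly (temperature degF, relative humidity %) readings per zone.
zone_readings = {
    "studio_north": [(71.2, 44), (74.8, 51), (76.1, 55)],
    "studio_south": [(72.3, 46), (72.9, 47), (73.4, 48)],
    "slab_core":    [(69.5, 40), (70.1, 41), (70.6, 42)],
}

def thermal_stress(readings):
    """Average absolute deviation from the temperature and humidity setpoints."""
    temp_dev = mean(abs(t - SETPOINT_TEMP_F) for t, _ in readings)
    rh_dev = mean(abs(rh - SETPOINT_RH_PCT) for _, rh in readings)
    return temp_dev + 0.5 * rh_dev  # weight humidity less than temperature

ranked = sorted(zone_readings, key=lambda z: thermal_stress(zone_readings[z]), reverse=True)
for zone in ranked:
    print(f"{zone}: stress score {thermal_stress(zone_readings[zone]):.1f}")
```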

 

Pictured (l. to r.): Josh Wentz, Technical Product Manager of BuildingOS, Lucid; Constantine Kontokosta, PE, AICP, RICS, Deputy Director, Academics, of the NYU Center for Urban Science + Progress; Owen Derby, Technical Product Manager, Software Engineer with Flux Factory; Nathan Miller, Founder, Proving Ground; and Robert Otani, PE, LEED AP BD+C, Principal with Thornton Tomasetti.

 

But sensors still aren’t universally applied as tools for data collection and analysis. Josh Wentz, Technical Product Manager of BuildingOS with Lucid, lamented that the vast majority of the five million commercial buildings in the U.S. still lack automation technologies, and that what is out there is usually fragmented, often proprietary, and disconnected from other buildings.

With the emergence of the Internet of Things (IoT), there now are “great opportunities,” Wentz said, to standardize the language of buildings and their occupants’ activities in order to quantify the value of different post-construction performance metrics. Lucid currently aggregates and synthesizes data from 10,000 buildings and 50,000 devices via its cloud-based BuildingOS interface, and plans to launch an application programming interface (API) for the product next year.
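Since that API had not yet been released at the time of the symposium, the following Python sketch is purely generic: it shows one common first step in aggregating building data, normalizing meter readings reported in different units into kWh so devices and buildings can be compared. The device records and conversion factors are illustrative, not Lucid data.

```python
# Generic sketch (not the BuildingOS API, which had not been released at press
# time): normalize meter readings reported in different units into kWh so
# buildings and devices can be compared on the same footing.
# Device records and conversion factors are hypothetical.

KWH_PER_UNIT = {
    "kwh": 1.0,
    "mwh": 1000.0,
    "therm": 29.3,  # approximate site-energy conversion for natural gas
}

# Hypothetical raw readings from heterogeneous devices.
raw_readings = [
    {"building": "Library", "device": "main_meter", "value": 1.2, "unit": "mwh"},
    {"building": "Library", "device": "boiler", "value": 40.0, "unit": "therm"},
    {"building": "Gym", "device": "main_meter", "value": 650.0, "unit": "kwh"},
]

def to_kwh(reading):
    """Convert one reading to kWh using the unit lookup table."""
    return reading["value"] * KWH_PER_UNIT[reading["unit"]]

totals = {}
for r in raw_readings:
    totals[r["building"]] = totals.get(r["building"], 0.0) + to_kwh(r)

for building, kwh in sorted(totals.items()):
    print(f"{building}: {kwh:.0f} kWh")
```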

 

Open source collaboration moves the needle.

In September, Thornton Tomasetti’s CORE Studio released Spectacles, an HTML5 BIM Web viewer designed to be hacked, extended, and modified. This platform is one of several that CORE Studio has developed as open source projects.

“The future is a lot flatter than before,” observed Gareth Price, Technical Director for Ready Set Rocket, a digital marketing agency. “Open source software gives the freedom to use, study, modify, and share information.”

During the symposium, a number of speakers touched on the advantages of open source for producing the best tools. Matt Jezyk, Senior Product Line Manager, AEC Conceptual Design Products, with Autodesk, explained how his company’s team developed its Dynamo suite as a side project in collaboration with architects, engineers, and programmers. “It came out of the needs of the AEC community,” said Jezyk.

He said Autodesk is now building more of its tools so they can interface with other tools. For example, its VRX (virtual reality exchange) platform “has become a collaborative tool” through which users can share BIM models. He added that Dynamo and other Autodesk products are freely accessible on GitHub.com.

Other speakers, though, lamented that open source is still more of an ideal than a reality for AEC firms and their clients, who view data as proprietary and a competitive advantage.

 

Challenging assumptions with better data.

“Failure is the new R&D,” Price exclaimed, to make a point about the value of experimentation in pushing the industry forward. Stephen Van Dyck, a Partner at LMN Architects, and Scott Crawford, a Design Technologist at the firm, confirmed that notion as they discussed the evolving role of R&D in their firm’s recent public works projects.

LMN had three months to deliver the design for the Global Center for Health Innovation in Cleveland. So it employed a plug-in that allowed for fabrication solutions within a design study that ultimately became the building. In the process, LMN delivered the façade system on time at $65/sf.

Luc Wilson runs the X-Information Modeling think tank at Kohn Pedersen Fox, which focuses on using urban data and digital analysis tools. During the symposium, Wilson spoke about the efficacy of creating diagnostic tools that are capable of conducting an “urban MRI” by integrating a project’s competing objectives, visualizing data, and coming up with multiple options.
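As a rough illustration of weighing competing objectives across multiple options, the Python sketch below scores hypothetical massing options on floor area and street daylight and ranks the best-balanced candidates. It is not KPF’s tooling; the option data, metrics, and weights are invented for the example.

```python
# Minimal sketch (not KPF's actual tools): score massing options against
# competing objectives and surface the best-balanced candidates.
# Option data, objectives, and weights are hypothetical.

# Each hypothetical option: gross floor area (sf) and street daylight retained (%).
options = {
    "option_a": {"gfa_sf": 1_600_000, "street_daylight_pct": 62},
    "option_b": {"gfa_sf": 1_450_000, "street_daylight_pct": 75},
    "option_c": {"gfa_sf": 1_700_000, "street_daylight_pct": 48},
}

WEIGHTS = {"gfa_sf": 0.5, "street_daylight_pct": 0.5}

def normalized(metric, value):
    """Scale a value to 0-1 against the range observed across all options."""
    values = [o[metric] for o in options.values()]
    lo, hi = min(values), max(values)
    return 0.5 if hi == lo else (value - lo) / (hi - lo)

def score(option):
    """Weighted sum of normalized metrics; higher is better for both metrics here."""
    return sum(WEIGHTS[m] * normalized(m, v) for m, v in option.items())

for name, option in sorted(options.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{name}: balanced score {score(option):.2f}")
```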

 

Ana Garcia Puyol, Computational Designer with Thornton Tomasetti, speaks to the crowd.

 

In one case study Wilson presented, One Vanderbilt in New York, the city cared most about preserving pedestrian space and street daylighting, whereas the developer was primarily interested in maximizing the value of the floor area. To reconcile those objectives, KPF tested a variety of designs to find better-performing options within the design scheme. It also engaged city planners on zoning to test their assumptions. The same was true of its Lower Residential Block project, part of a master plan at London’s Covent Garden.

“We needed to show why the historic block typology wouldn’t work,” said Wilson. KPF did this by comparing the density of the block with the densities of similar blocks in China and New York, then calibrating the block typology to the preferred density.
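The density comparison Wilson describes boils down to simple arithmetic: floor area ratio (FAR), or gross floor area divided by block area. The Python sketch below computes FAR for a proposed block and benchmarks it against reference blocks; all figures are placeholders, not project data.

```python
# Minimal sketch of the density comparison described above: compute floor area
# ratio (FAR = gross floor area / block area) for a proposed block and compare
# it with reference blocks. All figures are placeholders, not project data.

blocks = {
    "proposed_block":     {"gfa_sf": 900_000,   "block_area_sf": 120_000},
    "reference_new_york": {"gfa_sf": 1_500_000, "block_area_sf": 110_000},
    "reference_shanghai": {"gfa_sf": 1_200_000, "block_area_sf": 140_000},
}

def far(block):
    """Floor area ratio: gross floor area divided by block area."""
    return block["gfa_sf"] / block["block_area_sf"]

target = far(blocks["proposed_block"])
for name, block in blocks.items():
    print(f"{name}: FAR {far(block):.1f}")
print(f"Proposed block is at {target / far(blocks['reference_new_york']):.0%} "
      f"of the New York reference density.")
```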

Related Stories

AEC Tech | May 9, 2016

Is the nation’s grand tech boom really an innovation funk?

Despite popular belief, the country is not in a great age of technological and digital innovation, at least when compared to the last great innovation era (1870-1970).

Big Data | May 5, 2016

Demand for data integration technologies for buildings is expected to soar over the next decade

A Navigant Research report takes a deeper dive to examine where demand will be strongest by region and building type. 

BIM and Information Technology | May 2, 2016

How HDR used computational design tools to create Omaha's UNO Baxter Arena

Three years after writing a white paper about designing an arena for the University of Nebraska Omaha, HDR's Matt Goldsberry says it's time to cherry-pick the best problem-solving workflows.

Drones | Apr 25, 2016

The Tremco SkyBEAM UAV is the first to be approved by the FAA for nighttime commercial operation

The SkyBEAM UAV is used for identifying energy leaks, rooftop damage, deteriorating façades, and safety issues without requiring scaffolding or cranes.

AEC Tech | Apr 15, 2016

Should architects learn to code?

Even if learning to code does not personally interest you, the growing demand for having these capabilities in an architectural business cannot be overlooked, writes computational design expert Nathan Miller.

Building Tech | Apr 12, 2016

Should we be worried about a tech slowdown?

Is the U.S. in an innovative funk, or is this just the calm before the storm?

BIM and Information Technology | Apr 8, 2016

Turner streamlines construction progress tracking using predictive visual data analytics

The construction giant teams with a computer science and engineering professor to develop a clever drone- and rover-based construction monitoring tool.

BIM and Information Technology | Apr 5, 2016

Interactive 3D map shows present and future Miami skyline

The Downtown Miami Interactive 3-D Skyline Map lets users see the status of every downtown office, retail, residential, and hotel project. 

AEC Tech | Mar 31, 2016

Deep Learning + AI: How machines are becoming master problem solvers

Besides revolutionary changes to the world’s workforce, artificial intelligence could have a profound impact on the built environment and the AEC industry.


