Solving the data conundrum with better tools to capture, share, and analyze information

At a recent Thornton Tomasetti symposium, experts showed how designs and projects can be improved by granular information that’s accessible to more users. 


By John Caulfield, Senior Editor | November 24, 2015

Dozens of AEC tech experts, including Nathan Miller, Founder of Proving Ground, gathered recently for Thornton Tomasetti’s AEC Technology Symposium and Hackathon. Images courtesy Thornton Tomasetti

“We haven’t really cracked the nut to get people to talk to one another.” That’s Nathan Miller, a former CASE consultant and founder of Proving Ground, a startup that specializes in helping businesses leverage data to inform their building projects.

Miller made this observation during Thornton Tomasetti’s third annual AEC Technology Symposium and Hackathon, where he and other speakers attempted to address the question of whether technology tools are helping or hindering communication and production among AEC firms and their clients.

While they didn’t provide definitive answers, the speakers took turns describing the necessity and value of creating products that facilitate the exchange and understanding of data-driven design by all levels of users.

A “pleasant interface” is what any tool should strive for, said Andrew Heumann, an Associate and Design Computation Leader with NBBJ, who discussed how computational tools are evolving and engaging with design teams. One example he pointed to is the Human UI (user interface) plug-in, which makes the widely used Grasshopper algorithmic modeling software behave more like a customized app and feel “less intimidating” to designers.

While some of the presentations were thinly veiled product pitches, they didn’t detract from the symposium’s overriding message that technology tools are useful primarily when they smooth a path to a satisfying end result.

‘The future is a lot flatter than before. Open source software gives the freedom to use, study, modify, and share information.’

—Gareth Price, Ready Set Rocket

“Stop designing around the tool and start designing to the interface,” said Owen Derby, Technical Product Manager and Software Engineer with Flux, whose platform provides cloud-based collaboration tools to exchange data and streamline complex design workflows.

The symposium, held on September 25 at Baruch College in New York, touched on how different tools can be used to collect, analyze, and disseminate information for the purposes of design and construction.

Here’s a recap of some of the hot topics discussed at the event:

 

Sensors monitor a building’s heartbeat.

Constantine Kontokosta, PE, AICP, RICS, Deputy Director, Academics, of the NYU Center for Urban Science + Progress, quoted urban activist Jane Jacobs—about cities being laboratories—as a starting point to discuss how sensing technologies are helping researchers, urban planners, developers, and AEC teams understand the pulse of cities in order to predict future changes.

Data, he explained, can and should be teased out of just about anything: smartphones, social media, lighting patterns and plume rates, carbon and steam emissions, even taxi rides. CUSP has been working with New York City on a pilot project called Quantified Communities in two neighborhoods, Hudson Yards and Lower Manhattan. It uses expanded sensor networks to collect and analyze real-time data on how neighborhoods are performing, as a step toward urban planning that reflects how people live, work, and play.

KieranTimberlake, a research-centered design practice, recently converted a historic bottling plant in Philadelphia into its new architectural studio. Christopher Connock, a Researcher and Prototyper at the firm, said the project team determined where the MEP system would be taxed the most by measuring temperature and humidity across the building with 124 interior surface sensors, 56 temperature/relative humidity sensors, 60 ceiling sensors, and 120 surface and core structural slab sensors. The project ultimately dispensed with a conventional HVAC system in favor of an exhausted cooling system.
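The kind of analysis Connock described can be sketched in miniature. The example below is a hypothetical illustration, not KieranTimberlake's actual workflow: given temperature/humidity readings grouped by zone, it ranks zones by how far their conditions drift from an assumed comfort setpoint, a rough proxy for where a mechanical system would be taxed the most.

```python
# Hypothetical sketch: rank building zones by thermal "stress,"
# i.e., average deviation of sensor readings from a comfort setpoint.
from statistics import mean

SETPOINT_TEMP_F = 72.0   # assumed comfort setpoint (deg F)
SETPOINT_RH_PCT = 45.0   # assumed target relative humidity (%)

def zone_stress(readings):
    """Average absolute deviation from the setpoints for one zone.

    `readings` is a list of (temp_F, rh_pct) tuples from that zone's sensors.
    The 0.5 weight on humidity is an arbitrary illustrative choice.
    """
    return mean(
        abs(t - SETPOINT_TEMP_F) + 0.5 * abs(rh - SETPOINT_RH_PCT)
        for t, rh in readings
    )

def rank_zones(zones):
    """Return zone names ordered from most to least thermally stressed."""
    return sorted(zones, key=lambda z: zone_stress(zones[z]), reverse=True)

# Made-up readings for three hypothetical zones:
zones = {
    "south-facade": [(79.5, 38.0), (81.0, 35.0)],
    "core":         [(72.5, 44.0), (73.0, 46.0)],
    "north-facade": [(70.0, 50.0), (69.5, 52.0)],
}
print(rank_zones(zones))  # most-stressed zone first
```

With hundreds of sensors, the same ranking idea tells a design team which zones drive mechanical sizing decisions.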

 

Pictured (l. to r.): Josh Wentz, Technical Product Manager of BuildingOS, Lucid; Constantine Kontokosta, PE, AICP, RICS, Deputy Director, Academics, of the NYU Center for Urban Science + Progress; Owen Derby, Technical Product Manager, Software Engineer with Flux Factory; Nathan Miller, Founder, Proving Ground; and Robert Otani, PE, LEED AP BD+C, Principal with Thornton Tomasetti.

 

But sensors still aren’t universally applied as tools for data collection and analysis. Josh Wentz, Technical Product Manager of BuildingOS with Lucid, lamented how the vast majority of the five million commercial buildings in the U.S. still lacks automated technologies. And what’s out there is usually fragmented, often proprietary, and disconnected from other buildings.

With the emergence of the Internet of Things (IoT), there now are “great opportunities,” Wentz said, to standardize the language of buildings and their occupants’ activities in order to quantify the value of different post-construction performance metrics. Lucid currently has 10,000 buildings and 50,000 devices from which it is aggregating and synthesizing data via its BuildingOS cloud-based interface. Lucid plans to launch an application programming interface (API) for this product next year.
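The "standardize the language of buildings" problem Wentz described boils down to normalization: devices report the same quantity in different units and formats, and a platform must convert them to a common schema before rolling them up. The sketch below is illustrative only and does not reflect Lucid's actual BuildingOS API; the unit table and reading format are assumptions.

```python
# Illustrative sketch: normalize energy readings from heterogeneous
# devices to a common unit (kWh), then total them per building.

UNIT_TO_KWH = {          # assumed conversion factors
    "kWh": 1.0,
    "Wh": 0.001,
    "MJ": 1.0 / 3.6,     # 1 kWh == 3.6 MJ
}

def normalize(reading):
    """Convert one device reading (a dict) to kWh."""
    return reading["value"] * UNIT_TO_KWH[reading["unit"]]

def rollup(readings):
    """Total energy per building, in kWh."""
    totals = {}
    for r in readings:
        totals[r["building"]] = totals.get(r["building"], 0.0) + normalize(r)
    return totals

# Made-up readings from three hypothetical devices:
readings = [
    {"building": "A", "value": 1200.0, "unit": "Wh"},
    {"building": "A", "value": 3.6,    "unit": "MJ"},
    {"building": "B", "value": 2.0,    "unit": "kWh"},
]
print(rollup(readings))  # totals per building, in kWh
```

Once readings share a schema, cross-building comparisons and post-construction performance metrics become straightforward aggregations.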

 

Open source collaboration moves the needle.

In September, Thornton Tomasetti’s CORE Studio released Spectacles, an HTML5 BIM Web viewer designed to be hacked, extended, and modified. This platform is one of several that CORE Studio has developed as open source projects.

“The future is a lot flatter than before,” observed Gareth Price, Technical Director for Ready Set Rocket, a digital marketing agency. “Open source software gives the freedom to use, study, modify, and share information.”

During the symposium, a number of speakers touched on the advantages of open source for producing the best tools. Matt Jezyk, Senior Product Line Manager, AEC Conceptual Design Products, with Autodesk, explained how his company’s team developed its Dynamo suite as a side project in collaboration with architects, engineers, and programmers. “It came out of the needs of the AEC community,” said Jezyk.

He said Autodesk is now building more of its tools so they can interface with other tools. For example, its VRX (virtual reality exchange) platform “has become a collaborative tool” through which users can share BIM models. He added that Dynamo and other Autodesk products are freely accessible on GitHub.com.

Other speakers, though, lamented that open source is still more of an ideal than a reality for AEC firms and their clients, who view data as proprietary and a competitive advantage.

 

Challenging assumptions with better data.

“Failure is the new R&D,” Price exclaimed, to make a point about the value of experimentation in pushing the industry forward. Stephen Van Dyck, a Partner at LMN Architects, and Scott Crawford, a Design Technologist at the firm, confirmed that notion as they discussed the evolving role of R&D in their firm’s recent public works projects.

LMN had three months to deliver the design for the Global Center for Health Innovation in Cleveland. So the firm employed a plug-in that supported both fabrication solutions and a design study that ultimately became the building. In the process, LMN delivered a façade system on time at $65/sf.

Luc Wilson runs the X-Information Modeling think tank at Kohn Pedersen Fox, which focuses on using urban data and digital analysis tools. During the symposium, Wilson spoke about the efficacy of creating diagnostic tools that are capable of conducting an “urban MRI” by integrating a project’s competing objectives, visualizing data, and coming up with multiple options.

 

Ana Garcia Puyol, Computational Designer with Thornton Tomasetti, speaks to the crowd.

 

In one case study Wilson presented, One Vanderbilt in New York, the city cared most about preserving pedestrian space and street daylighting, whereas the developer was primarily interested in optimizing the value of the surface area. To reconcile those objectives, KPF tested a variety of designs to improve performance within the design scheme. It also engaged city planners on zoning to test their assumptions. The same was true of its Lower Residential Block project, part of a master plan at London’s Covent Garden.

“We needed to show why the historic block typology wouldn’t work,” said Wilson. KPF did this by comparing the density of this block with the densities of similar blocks in China and New York, then calibrating the block typology to the preferred block density.
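The reconciliation of competing objectives that Wilson described can be sketched as a simple scoring exercise. The example below is a hypothetical illustration, not KPF's actual tooling: massing options are scored on two competing objectives (pedestrian daylight and developable floor area), filtered by an assumed zoning cap on floor area ratio (FAR), and the best weighted option is kept. All names and numbers are made up.

```python
# Hypothetical sketch: pick the best massing option under competing
# objectives, subject to a zoning constraint.

options = [
    # (name, daylight score 0-1 (higher is better), floor area sq ft, FAR)
    ("slab tower",   0.82, 1_100_000, 21.0),
    ("tapered form", 0.74, 1_350_000, 26.5),
    ("full block",   0.55, 1_500_000, 30.0),
]

MAX_FAR = 27.0                 # assumed zoning cap
W_DAYLIGHT, W_AREA = 0.5, 0.5  # assumed objective weights

def score(opt, max_area):
    """Weighted score: daylight plus floor area normalized to the largest option."""
    _, daylight, area, _ = opt
    return W_DAYLIGHT * daylight + W_AREA * (area / max_area)

def best_option(options):
    """Discard options over the FAR cap, then return the highest-scoring one."""
    feasible = [o for o in options if o[3] <= MAX_FAR]
    max_area = max(o[2] for o in options)
    return max(feasible, key=lambda o: score(o, max_area))

print(best_option(options)[0])
```

Real tools like the ones Wilson described evaluate far more objectives and thousands of options, but the structure, constrain then score then rank, is the same.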

