In popular video games, 3D sound accompanies 3D action as the player moves through the virtual environment. Gaming audio has become sonically sophisticated, recreating a rich audiovisual experience. But in games like Fortnite, Call of Duty, and others, all the 3D sounds are fake: they are as imaginary as the places the video game characters inhabit.
What if we could report the scientifically accurate acoustic response in real 3D virtual environments? Would that be useful in design?
There’s a problem with the way we approach acoustics in design today: acoustics are often divorced from experience because they’re difficult for clients to understand. Most countries have their own mandatory acoustic legislation, and we (acoustic engineers) are called in to make sure projects comply with these requirements and regulations, meeting the minimum standards and sometimes exceeding them.
Our industry and regulators have abstracted acoustics to a numerical rating to make these regulations possible. But the flip side of numerical ratings is that for clients and design teams, these numbers are hard to relate to. It is hard for many to conceive how the predicted acoustic properties of spaces will translate to the sound experience in built form. For example, very few non-acousticians can guess a decibel level of something they hear. Ratings aren’t the same as experiencing sound firsthand.
How can clients, designers, or stakeholders understand acoustic comfort in a space when they’re shown design changes that will decrease the reverberation time from 1.2 seconds to 0.7 seconds, or raise an absorption coefficient from 0.2 to 0.9? How can they understand the value of investing beyond the minimum requirements?
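To make those numbers slightly more concrete: reverberation time is commonly estimated with Sabine’s equation, RT60 ≈ 0.161 · V / A, where V is room volume and A is the total absorption (each surface area times its absorption coefficient). The sketch below uses made-up room dimensions and coefficients purely for illustration; it is not our tool’s model.

```python
# Sketch of Sabine's equation. The room geometry and absorption
# coefficients below are illustrative assumptions, not project data.

def rt60_sabine(volume_m3, surface_areas_m2, absorption_coeffs):
    """RT60 ~ 0.161 * V / A, where A = sum(S_i * alpha_i)."""
    total_absorption = sum(s * a for s, a in zip(surface_areas_m2, absorption_coeffs))
    return 0.161 * volume_m3 / total_absorption

# A 10 m x 8 m x 3 m room (240 m^3): floor, ceiling, four walls.
surfaces = [80, 80, 30, 30, 24, 24]                 # areas in m^2
bare     = [0.10, 0.15, 0.10, 0.10, 0.10, 0.10]    # hard, reflective finishes
treated  = [0.10, 0.90, 0.10, 0.10, 0.10, 0.10]    # absorptive ceiling added

print(f"bare:    {rt60_sabine(240, surfaces, bare):.2f} s")     # ~1.25 s
print(f"treated: {rt60_sabine(240, surfaces, treated):.2f} s")  # ~0.43 s
```

Swapping one surface’s coefficient shows how a single treatment decision moves the headline number, which is exactly the kind of change clients struggle to imagine without hearing it.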
Complexity of acoustics
Acoustics is a complex science because it has a significant subjective component.
I tell my clients that just because a project meets minimum requirements doesn’t mean it will be acoustically comfortable for everyone. I’ve long looked for a way to communicate how spaces sound during the design phase. Previously, I used a SketchUp plug-in that let clients hear the sound properties of a design, but the 3D visuals lacked detail, and the tool is no longer available.
Inspired by gaming
With the goal of giving everyone the chance to experience the meaning of acoustic concepts, I looked at developing an internal tool for communicating acoustic conditions. With video games in mind, I looked for a gaming engine that could provide the platform to allow for an accurate acoustic simulation in a virtual reality environment.
To do this, I asked one of the world’s premier middleware companies (whose software can tap into the power of a gaming engine to develop apps) whether it would be possible to create the tool I envisioned. They said yes. They provided us with a path to develop a new tool with a video game engine at its core, creating a new way to present acoustic conditions: in virtual reality.
Now, our clients and design team members can put on a headset and listen to the spaces as they walk through them in virtual reality. It’s a game changer. We still produce the necessary report with its numerical ratings to demonstrate regulatory compliance, but our clients get a much better understanding of the sound of their space and options by listening to it. And that means happier clients.
We are already using the tool to refine designs for better acoustic comfort. For example, we recently walked a client through the design for a 39th-floor executive conference room with glazing on three sides, timber surfaces, and a conference table. We modeled the acoustics and demonstrated that all the hard, reflective surfaces would create significant echo, making the room an uncomfortable echo chamber for meetings. We also modeled it with an acoustic intervention (curtains over various reflective surfaces) to show that, with some modification, it could be used as a multipurpose meeting space. So rather than convincing the client with technical descriptions of the room, we had them walk through the space and listen to the different options.
Status of the tool
We’ve taken acoustic VR from an idea to an experiment to a powerful tool we can use on real projects. But it is all a work in progress. Right now, our acoustic VR modeling service is more of a boutique offering we are using to find solutions on thorny acoustic projects. We need to streamline the tool, make it seamless, intuitive, and easier to deploy. Our goal is to apply acoustic VR on every project.
The tool has some powerful features that give us the ability to immerse the user in the acoustic experience. Four highlights:
1. We can look at multiple acoustic scenarios in one space.
We can present multiple options for acoustic treatments—one with bare surfaces, one with wall treatments, and one with an acoustic ceiling. We can pre-calculate and preload over a dozen scenarios into the tool.
2. We can switch between acoustic scenarios with the click of a button.
The ability to compare these scenarios (to “A/B” them in audiophile lingo) is a crucial feature as it gives the user a chance to hear, in real time, differences in the acoustical properties of designs.
3. We present the acoustics in a detailed VR environment.
With the VR goggles on—or using a laptop as if you were playing a video game—the user can walk anywhere in the space and experience how acoustic response changes. We’ve already calculated every position in the room and its acoustic impulse response. That gives the user a degree of freedom that simulates reality more closely.
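Under the hood, auralization of this kind typically works by convolving a dry (anechoic) recording with the impulse response at the listener’s position. The sketch below uses a toy synthetic impulse response, an exponentially decaying noise burst, as a stand-in for a simulated room response; it illustrates the convolution step, not our tool’s actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
sr = 48_000  # sample rate in Hz

# Toy impulse response: exponentially decaying noise, a crude stand-in
# for a room response produced by an acoustic model or measurement.
t = np.arange(int(0.7 * sr)) / sr            # 0.7 s reverberant tail
ir = rng.standard_normal(t.size) * np.exp(-t / 0.1)
ir /= np.abs(ir).sum()                       # normalize so levels stay sane

# Dry source: a short 440 Hz tone standing in for an anechoic voice clip.
dry = np.sin(2 * np.pi * 440 * np.arange(int(0.2 * sr)) / sr)

# Auralized ("wet") signal = dry signal convolved with the impulse response.
wet = np.convolve(dry, ir)

print(wet.size)  # dry length + IR length - 1 samples
```

With one impulse response precomputed per listening position, moving through the space just means swapping which response feeds the convolution.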
4. We can simulate multiple sources.
The software was originally designed to feature one person’s voice as an audio reference, but in its new iteration we can feature multiple sound sources. We can even listen to stereo sources simultaneously!
Four future applications for acoustic VR
We can envision numerous uses for the acoustic virtual reality tools we’re developing, some even outside the traditional role of the design industry.
The school music room: Perhaps the first place where we’d like to apply an affordable version of acoustic VR is on interior design for music rooms at new schools. We can use the tool to evaluate some of the standard music room designs and improve them. Or we can develop a new set of optimized flexible music rooms.
Standard on every project: There’s a big reason we don’t rigorously analyze every project for acoustics: budget. If we can make this acoustic VR service truly affordable, we can offer it on every project. In fact, we will be able to show side-by-side comparisons for cost/acoustic strategy while presenting possible scenarios in acoustic VR. That way, the client and design team can hear the difference, decide what sounds good to them, and evaluate the investment.
Modeling environmental sounds: We’re looking at porting the algorithm, which currently works indoors, to the outside world, e.g., integrating environmental sounds and moving sources, from highway traffic to railroads, into our model. If we can do this, we can predict how a new road will sound in a residential development and present the results in a VR model.
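One basic ingredient of such outdoor predictions is geometric spreading: a point source (a single vehicle) drops roughly 6 dB per doubling of distance, while a long line source (steady highway traffic) drops roughly 3 dB. The sketch below shows that textbook free-field relationship only; it ignores ground effects, barriers, and air absorption, and the levels and distances are made up.

```python
import math

def level_at_distance(level_ref_db, dist_ref_m, dist_m, source="point"):
    """Free-field geometric spreading: 20*log10 ratio for a point source,
    10*log10 for an (idealized) line source such as steady road traffic."""
    factor = 20 if source == "point" else 10
    return level_ref_db - factor * math.log10(dist_m / dist_ref_m)

# Hypothetical 80 dB reference level measured 10 m from the road,
# predicted at an 80 m setback:
print(round(level_at_distance(80, 10, 80, "point"), 1))  # ~61.9 dB
print(round(level_at_distance(80, 10, 80, "line"), 1))   # ~71.0 dB
```

The gap between the two curves is one reason a highway sounds so different from a single passing car, and why moving-source modeling matters for residential predictions.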
Heritage sonics and audiovisual replicas: We are now able to model rooms extremely accurately. Using photogrammetry—which in simple terms is taking thousands of pictures of a room—we can make a virtual visual record of a space while we simultaneously capture the sound of the room. This means we can record and create virtual versions of places featuring highly accurate interior details and acoustic properties. We can, therefore, create heritage acoustic models of real places, even historic places, like Abbey Road Studios or the Whispering Gallery at St. Paul’s Cathedral. We can store historic acoustic records on the cloud forever, making sure that both aural and visual characteristics are kept intact. Theoretically, we could model spaces such as a musician’s recording studio so that users can experience sound just as the artist did during its creation, providing a new way of experiencing music.