3D Audio and Virtual Reality
Here at Prism Acoustics, we are developing our own software for predicting room acoustics parameters. Our first step is to release an accurate, easy-to-use package for acoustic consultants to use in their calculations. After the initial release, the aim is to add auralisation functionality. In layman’s terms, auralisation creates a virtual ‘sound field’ of a space that can give the listener the impression of actually being in that space (see more about our own software here).
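At its simplest, auralisation can be thought of as convolving a dry (anechoic) recording with a room impulse response (RIR), so the listener hears the recording as if it were played in that room. The sketch below illustrates the idea only; the signal and RIR values are made up for the example and have nothing to do with our own software.

```python
def convolve(dry, rir):
    """Discrete convolution: each RIR tap adds a delayed, scaled echo."""
    out = [0.0] * (len(dry) + len(rir) - 1)
    for i, s in enumerate(dry):
        for j, h in enumerate(rir):
            out[i + j] += s * h
    return out

# Toy dry signal and a toy RIR: direct sound (tap at 0)
# plus two weaker reflections (taps at 2 and 4 samples).
dry = [1.0, 0.5, 0.25, 0.125]
rir = [1.0, 0.0, 0.6, 0.0, 0.3]

wet = convolve(dry, rir)  # the "auralised" signal
print(wet)
```

In practice the RIR is either measured in the real room or predicted by acoustic modelling software, and the convolution is done per ear (or per loudspeaker channel) to create the 3D impression.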
Software using 3D auralisation techniques in virtual reality could be used to demonstrate the acoustical qualities of music spaces that have not yet been built, such as a concert hall. It could also show the improvement in speech intelligibility gained by installing absorption in a classroom.
The topic of 3D audio and virtual reality in acoustics is an interesting one. It allows consultants to show clients their work without relying on graphs, and in the future it could even be used in film, games and other media.
A large, multi-disciplinary team of researchers has recently carried out some particularly impressive work at Notre-Dame Cathedral in Paris, using 3D audio and virtual reality techniques to create a ‘ghost orchestra’.
The ‘ghost orchestra’ is positioned in a visual and audible model of the Cathedral. The recordings were made using a 45-channel close-miked capture of live orchestral music in the space. A 3D computer model of the space was then created and used in a geometrical acoustics algorithm, and the model was calibrated against room acoustic measurements taken in the space.
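To give a flavour of what a geometrical acoustics algorithm does: one classic approach, the image-source method, treats each wall reflection as coming from a mirrored copy of the source, so reflection arrival times follow from straight-line distances. The sketch below is a deliberately simplified one-dimensional toy (two parallel rigid walls), not the Notre-Dame team’s actual model, and all positions and dimensions are invented for the example.

```python
C = 343.0  # approximate speed of sound in air, m/s

def image_source_delays(s, r, L, n_max):
    """Arrival times (ms) of the direct sound and early reflections
    in a 1-D 'room' with rigid walls at x=0 and x=L, computed with
    the image-source method: images lie at 2nL + s and 2nL - s."""
    times = set()
    for n in range(-n_max, n_max + 1):
        for img in (2 * n * L + s, 2 * n * L - s):
            times.add(abs(img - r) / C * 1000.0)
    return sorted(times)

# Source 2 m from one wall, listener 8 m, walls 10 m apart.
delays = image_source_delays(s=2.0, r=8.0, L=10.0, n_max=2)
print([round(t, 1) for t in delays])  # first entry is the direct sound
```

A real room model extends this to three dimensions, adds frequency-dependent wall absorption, and (as in the Notre-Dame work) is calibrated so the predicted decay matches measurements from the actual space.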
For those interested in reading more about the work, this article from ScienceDaily provides an insight into the current and potential applications of 3D audio, both for professionals in acoustics and in everyday use.
Acoustical Society of America. “Improving virtual reality and exploring ear shape effects on 3-D sound.” ScienceDaily. ScienceDaily, 26 June 2017. <www.sciencedaily.com/releases/2017/06/170626093626.htm>.