One of the new features of iOS 16, and something that was highlighted again during Wednesday’s Apple event, is personalized spatial audio. Once you install the iOS 16 update, which rolls out on September 12th, you can create a custom sound profile that should enhance the sense of immersion and overall spatial audio experience you get from AirPods.
To create this personalized tuning, Apple uses the iPhone’s front-facing TrueDepth camera to scan your ears. The process, which involves holding your iPhone about four to eight inches from the side of your head, takes less than a minute, and the resulting data is then used to optimize spatial audio for your unique ear shape. “The way we all perceive sound is unique based on the size and shape of our heads and ears,” said Apple’s Mary-Ann Rau during the keynote. “Personalized spatial audio delivers the most immersive listening experience by placing precise sounds in space that are attuned to you.”
But Apple isn’t the first company to go down this path. Sony has been offering “Personalized 360 Reality Audio” for supported music services like Amazon Music, Tidal, Deezer and Nugs.net since 2019. Conceptually it’s very similar: both Sony and Apple try to determine your ear structure and adjust spatial audio processing to account for the unique folds and contours of your ears. The goal is to keep that 3D audio experience and remove any audio quirks that detract from the feeling.
Here’s how Sony explained the benefits to me in June, courtesy of Kaz Makiyama, vice president of video and sound at Sony Electronics:
Humans are able to identify spatial sound sources by the subtle shifts in intensity and timing of the sound entering the left and right ear from the sound source. Also, the sound can depend on our head and ear shape. By analyzing and reproducing the characteristics of both ears by photographing the ears, this technology makes it possible to reproduce the sound field when using headphones.
Sony’s approach, however, is a bit more cumbersome than Apple’s, whose ear scan is built right into iOS settings. To create a personalized sound field with Sony’s products, you need to take an actual photo of each ear using the Headphones Connect app and your phone’s camera.
These images are uploaded to Sony’s servers for analysis – and Sony then keeps them for an additional 30 days for internal research and feature improvements. The company says the ear images are not personally linked to you during this window.
But that doesn’t mean Apple has completely nailed the ear scan process. During the iOS 16 beta period, some people on social media and Reddit noted that the process can feel finicky, with one ear sometimes failing to register. The truth of the matter is that there’s no foolproof way to do this that is both quick and produces an accurate read of your ear shape.
Still, there seems to be a consensus that the effort is worth it: these personalized profiles often make a noticeable difference and can enhance your perception of spatial audio. And Apple doesn’t take actual photos: the TrueDepth camera captures a depth map of your head and ears, much like Face ID learns your facial features.
Apple’s website notes that once you create a personalized spatial audio profile on an iPhone, it syncs to your other Apple devices, including Macs and iPads, for a consistent experience. That won’t work until at least October, though, since syncing requires the upcoming macOS and iPadOS updates. Personalized spatial audio is supported on the third-generation AirPods, both generations of AirPods Pro, and AirPods Max.
Apple never claimed to have invented anything new with personalized spatial audio. The company’s executives have regularly said that their goal is to deliver the best execution of meaningful features, even when others, in this case Sony, were already pushing in that direction.