When you enter the building that houses Apple’s audio lab, venture just beyond reception and you’ll encounter a massive vintage stereo setup. The deck and accompanying speakers were a gift from Steve Jobs to the team of engineers who work in this office. The group sees the old-school tech as a source of inspiration, but also as a reminder of Jobs’ obsession with both music and sound.
More than inspiration, though, the stereo reminds the experts in software, acoustics and sound design of how important sound is to everything Apple builds.
Inside, I was led into a maze of nondescript halls, weaving from room to room with a trio of Apple engineers as my guides. I was in for a rare peek into the company’s product development facilities — a step further behind the curtain than what’s typically allowed during Apple events.
Validating the AirPods hearing test
Billy Steele for Engadget
As Apple’s audio team works to correct and calibrate AirPods audio for natural variations in ear geometry, it uses a collection of audiometric booths to check its work. These rooms look like small, windowless offices. The walls are covered in sound-dampening panels and there’s a single workstation with a Mac and various tools for hearing analysis. If you’ll recall, one of the company’s major ambitions with AirPods has been the end-to-end hearing health experience that it debuted last year. In order to validate its claims of a “clinical grade hearing test,” engineers use devices you might see in an audiologist's office, like audiometers. These spaces aren’t unlike the small booths you’ve probably sat in for a professionally administered hearing test.
In one booth alone, the team ran thousands of tests on the feature to ensure that the hearing screening in your pocket was as accurate as what you could get from a doctor. Not only does this allow AirPods users to set up a hearing aid at home (if needed), but it also creates an accurate hearing profile so that you can hear music the way it was intended.
Another important step in the product design process was making sure there was a tuning baseline for every person who listens to music with AirPods. Everyone hears various frequencies differently, so there needs to be an adjustment to achieve the desired consistency. With the hearing test and accompanying audio profile, Apple then has a starting point for making both technical and artistic decisions. It’s here that the technology and liberal arts expertise among the audio lab team start to mix.
When your office is a tuning studio
The media tuning lab works on any product Apple makes that can reproduce recorded audio, including the iPhone, Mac and iPad. In order to bridge the gap between the art and science of that pursuit, this team comes from a variety of backgrounds, from live concert sound to Broadway sound design and even traditional acoustic engineering. The various tuning studios in this area are set up like music creation rooms: complete mini studios with various instruments scattered around, a prime seat for listening in the back and a desk replacing the recording engineer’s sound board. As a nice touch, they're all named after famous recording studios (like Abbey Road).
The main idea is that the tuning team needs to reference what the recorded content sounded like at the time it was created. That, in turn, provides a better picture of the artist’s intention that can then be applied to products like the AirPods Pro 3. Thanks to the combination of the ear tips’ seal and the computational audio inside Apple’s latest earbuds, the tuning engineers believe these AirPods provide the most authentic sound in the company’s lineup thus far, because the team has been able to reduce so much of the variation across users and fits.
To create a sound profile that’s exciting for customers and still maintains all of that authenticity, the media tuning team listens to thousands of hours of music, movies, podcasts and YouTube videos in mono, stereo and Dolby Atmos. There’s also a lot of vinyl lining the shelves of these tuning studios. During the development process, the team will test multiple versions of hardware with tons of tuning variations using computational audio. The goal is for all of those tuning decisions to translate across users, so that everyone hears the same sound from Apple’s products.
Beyond the listening done through speakers and headphones, microphones are also important to the tuning work. In order to create features like the studio-quality audio recording on AirPods Pro, the team captured clips from the earbuds in the studio and out in the real world before comparing them with benchmarks from high-end recording mics. That analysis allows the engineers to translate pro-grade audio features for consumer products like AirPods. Studio-quality audio, for example, replaces a lavalier with your earbuds for iPhone videos. It won’t ever replace a studio microphone, obviously, but it does put more capable audio tools in your pocket.
The completely silent room
An important part of testing Apple’s audio products, and features like spatial audio, is to use them in a completely silent room. Known as an anechoic chamber, it’s a room within a room that’s physically separated from the rest of the building. This is essential because things like footsteps in the hall or cars driving by outside can create vibrational noise that would otherwise be transmitted into the chamber.
Inside, foam wedges on the walls, floor and ceiling absorb all sound that’s emitted in the space. There’s no echo (hence the name “anechoic”), so voices and claps just die. In fact, you have to walk on a suspended grid that looks like wire fencing, because the true "floor" of the room is more foam wedges meant to absorb sonic reflections from below. It’s an off-putting space to spend time in, since it looks like something out of a sci-fi movie — not to mention the lack of reverb.
Over the decades that Apple has been designing and manufacturing electronics, the company has learned a lot about all of the unwanted noise its devices make. The anechoic chamber allows a dedicated team of acoustic engineers to listen very carefully to products like AirPods to determine if any sounds are unintended, and to collaborate with other engineering teams to make sure a product isn’t doing anything the company doesn’t want it to do.
The anechoic chamber is also a vital part of spatial audio development. In its current configuration, there’s a chair in the room with a ring of tiny speakers around it. Engineers study the variable physiology of test subjects, like the way sound bounces off the body and around the inside of their ears. To then create the perception of sound coming from a particular direction, the team uses computational audio and signal processing to create the ideal angle for a person’s hearing signature. This sort of analysis was directly responsible for Personalized Spatial Audio, which takes a scan from an iPhone camera and analyzes it with various models and algorithms to tailor the sound to each person.
Fantasia Lab: Verifying ANC, transparency mode and spatial audio
The last stop on my tour was the most visually and sonically appealing. This room is called the Fantasia Lab, named for the first film that used surround sound. The name also speaks to the Apple engineers’ ability to generate (or simulate) any sound they can think of with the room’s spherical speaker setup. The audio lab team used this room to verify features on the AirPods Pro 3, including transparency mode, active noise cancellation (ANC) and spatial audio.
The array of dozens of loudspeakers enables the engineers to assess whether environmental sounds in transparency mode are as accurate and natural as possible. The team will have someone sit inside the sphere and indicate which direction the audio is coming from in order to eliminate any issues with the feature. To gauge ANC performance, different types of sounds at various volumes are piped in. This gives the engineers insight into the workings of the adaptation and oversight algorithms, the bits of software employed to make sure the ANC is steadily and effectively blocking as much noise as possible. And for spatial audio, the team will play sounds at different locations and angles from real speakers before trying to recreate the perception that the sound is coming from the same place inside the AirPods.
I was able to take a seat for a few seconds to get a sense of what the Fantasia Lab is capable of. One of the engineers played a live recording of a concert in spatial audio. With speakers all around me, sound was coming from all directions — including the roar of the crowd singing along. I closed my eyes and I was there, vibing to Omar Apollo with tens of thousands of people. Except, of course, I wasn’t. I was surrounded by speakers in a small room with four other people, through a maze of corridors, tucked into one of Apple’s myriad buildings around Cupertino.
Imagine my disappointment when I opened my eyes.