
Left and Right Ears Not Created Equal as Newborns Process Sound, UCLA/University of Arizona Scientists Discover


Challenging decades of scientific belief that the decoding of sound originates from a preferred side of the brain, UCLA and University of Arizona scientists have demonstrated that right-left differences in the auditory processing of sound start at the ear.

Reported in the Sept. 10 edition of Science, the new research could hold profound implications for rehabilitation of persons with hearing loss in one or both ears, and help doctors enhance speech and language development in hearing-impaired newborns.

"Frombirth, the ear is structured to distinguish between various types of sounds andto send them to the optimal side in the brain for processing," said YvonneSininger, visiting professor of head and necksurgery at the David Geffen School of Medicine at UCLA. "Yet no one has lookedclosely at the role played by the ear in processing auditory signals."

Scientists have long understood that the auditory regions of the two halves of the brain sort out sound differently. The left side dominates in deciphering speech and other rapidly changing signals, while the right side leads in processing tones and music. Because of how the brain's neural network is organized, the left half of the brain controls the right side of the body, and the left ear is more directly connected to the right side of the brain.

Prior research had assumed that a mechanism arising from cellular properties unique to each brain hemisphere explained why the two sides of the brain process sound differently. But Sininger's findings suggest that the difference is inherent in the ear itself.

"Wealways assumed that our left and right ears worked exactly the same way," shesaid. "As a result, we tended to thinkit didn't matter which ear was impaired in a person. Now we see that it mayhave profound implications for the individual's speech and languagedevelopment."

Working with co-author Barbara Cone-Wesson, associate professor of speech and hearing sciences at the University of Arizona, Sininger studied tiny amplifiers in the outer hair cells of the inner ear.

"Whenwe hear a sound, tiny cells in our ear expand and contract to amplify thevibrations," Sininger said. "The inner hair cells convert the vibrations toneural cells and send them to the brain, which decodes the input."

"Theseamplified vibrations also leak back out to the ear in a phenomena callotoacoustic emission (OAE)," Sininger said. "We measured the OAE by inserting amicrophone in the ear canal."

In a six-year study, the UCLA/University of Arizona team evaluated more than 3,000 newborns for hearing ability before they left the hospital. Sininger and Cone-Wesson placed a tiny probe device in the baby's ear to test its hearing. The probe emitted a sound and measured the ear's OAE.

The researchers measured the babies' OAE with two types of sound: first rapid clicks, then sustained tones. They were surprised to find that the left ear provides extra amplification for tones like music, while the right ear provides extra amplification for rapid sounds timed like speech.
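To make that comparison concrete, here is a minimal sketch of the underlying logic: average the OAE amplitudes recorded from each ear for each stimulus type, then see which ear responds more strongly. The amplitude values, units, and helper names below are hypothetical placeholders for illustration only, not data or software from the study.

```python
# Illustrative sketch only: placeholder values, not the study's actual data or analysis code.
from statistics import mean

# Hypothetical OAE amplitudes (in dB) recorded from each ear for each stimulus type.
oae_measurements = {
    ("left", "click"): [5.1, 4.8, 5.3],
    ("right", "click"): [6.4, 6.1, 6.6],   # clicks: stronger response expected on the right
    ("left", "tone"): [7.2, 7.0, 7.5],     # tones: stronger response expected on the left
    ("right", "tone"): [5.9, 6.0, 5.7],
}

def ear_advantage(stimulus: str) -> float:
    """Mean right-ear amplitude minus mean left-ear amplitude for a given stimulus."""
    right = mean(oae_measurements[("right", stimulus)])
    left = mean(oae_measurements[("left", stimulus)])
    return right - left

for stimulus in ("click", "tone"):
    diff = ear_advantage(stimulus)
    side = "right" if diff > 0 else "left"
    print(f"{stimulus}: {side}-ear advantage of {abs(diff):.2f} dB")
```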

"Wewere intrigued to discover that the clicks triggered more amplification in thebaby's right ear, while the tones induced more amplification in the baby's leftear," Sininger said. "This parallels how the brain processes speech and music,except the sides are reversed due to the brain's cross connections."

"Ourfindings demonstrate that auditory processing starts in the ear before it isever seen in the brain," Cone-Wesson said. "Even at birth, the ear isstructured to distinguish between different types of sound and to send it tothe right place in the brain."

Previous research supports the team's new findings. For example, earlier studies show that children with impairment in the right ear encounter more trouble learning in school than children with hearing loss in the left ear.

"If aperson is completely deaf, our findings may offer guidelines to surgeons forplacing a cochlear implant in the individual's left or right ear and influencehow cochlear implants or hearing aids are programmed to process sound,"Cone-Wesson said. "Sound-processing programs for hearing devices could beindividualized for each ear to provide the best conditions for hearing speechor music."

"Ournext step is to explore parallel processing in brain and ear simultaneously,"Sininger said. "Do the ear and brain work together or independently in dealingwith stimuli? How does one-sided hearing loss affect this process? And finally,how does hearing loss compare to one‑sided loss in the right or left ear?"

The National Institute on Deafness and Other Communication Disorders funded the study.

-UCLA-

