Read my lips: Study finds lip-reading is easier when they’re yours
From George Costanza farcically misreading the lips of Jerry’s deaf girlfriend in an episode of Seinfeld to the YouTube channel “Bad Lip Reading,” which left many Wash. U. students giggling behind their MacBooks in Olin Library this election season, faulty lip-reading skits have found their niche in a wide range of popular comedies, sitcoms and spoofs.
Lip-reading, also known as “visual-only speech perception,” is actually rather difficult for most people. Just watch the news on mute and try to pick out what the newscaster is saying.
But how does lip-reading really work?
According to Wash. U. audiologist Nancy Murray, lip-reading activates both the occipital and auditory cortices of the brain—regions responsible for visual processing and spoken language processing, respectively. However, the auditory cortex activates only if the lip movements are perceived as speech. Tell a participant that the person they are watching is not speaking, and only the occipital cortex lights up.
Working with research psychologist Joel Myerson, Murray has been examining exactly what factors into a successful lip-reading attempt. In their most recent study, they found that people may have better luck looking in the mirror: on average, participants were better at lip-reading their own sentences than other people’s.
First, they recorded each research participant speaking hundreds of random, nonsensical sentences constructed from a list of 36 words. Sentences included phrases like “the frog watched the girl” and “the duck watched the boy.” Two to three weeks later, participants returned to try their hand at reading lips in the recordings of ten people, including themselves.
On average, participants lip-read their own sentences better than they read others’, even when the phrases were complicated or confusing.
“The results were surprising,” Murray said. “Who sits in front of a mirror and watches themselves talk all the time?”
The study is actually one of the first to test the link between language perception and language production. But freshman Kjartan Brownell said he personally doesn’t find the results particularly startling.
“That doesn’t surprise me,” Brownell said. “I know which words and phrases I tend to use in everyday conversation, so I’d know what to look for when lip-reading myself.”
Although certain people seem to be naturally adept at reading lips, according to Murray, genetics may not be the only component.
“It is unknown whether lip reading ability is an innate skill, or can be improved by practice,” Murray said.
Those with congenital deafness have an advantage in lip-reading over those with normal hearing, and that advantage increases with age. But someone with hearing loss who has practiced lip-reading may not necessarily maintain that advantage over someone of the same age with normal hearing.
Moving forward, Murray and her colleagues plan to start offering lip-reading lessons and track participants’ learning curves.
But at least for now, bad lip-reading seems to be here to stay.