A new autism study out of Vanderbilt University is getting a ringing endorsement from both parents and people on the spectrum. For the first time, researchers have found that autistic children seem to have trouble integrating sight and sound information.

While a speaker’s mouth movements and words go together for a typically developing person, the researchers found that for many autistic people there’s a delay between what they see and what they hear. “It’s as if they’re watching a badly dubbed movie where the words and the pictures don’t match up,” according to a report on The CBS Evening News.

Mark Wallace, lead author of the study, told ScienceDaily, “one of the classic pictures of children with autism is they have their hands over their ears. We believe that one reason for this may be that they are trying to compensate for their changes in sensory function by simply looking at one sense at a time. This may be a strategy to minimize the confusion between the senses.”

Wallace’s team has developed a video game aimed at helping autistic kids practice putting sight and sound together by speeding up delayed auditory processing. Perhaps the researchers’ most startling suggestion is that sensory integration is not only a real issue for autistic kids—it may be the core deficit that gives rise to classic autism symptoms like impaired language and social communication.

Wallace notes that while “there is a huge amount of effort and energy going into the treatment of children with autism, virtually none of it is based on a strong empirical foundation tied to sensory function. If we can fix this deficit in early sensory function, then maybe we can see benefits in language and communication and social interactions.”

Blogger Rachel Kenyon, who was diagnosed on the spectrum as an adult and has a 7-year-old daughter also on the spectrum, says the study is on-target.

While she was thought to be typically developing growing up, Kenyon says, “It seemed that I was always a step behind socially or conversationally.” She now watches TV with closed captioning, finding that “I can keep up much more easily with processing the events, emotions, and actions taking place on the screen.”

She has also come to realize that she developed compensatory skills so she could understand conversations: One involves finding a visual point to focus on other than the person’s mouth while they are speaking. “For instance,” she says, “I might choose to stare at a speaker’s freckle or piece of jewelry so that I can use the remaining mental focus on listening, interpreting and constructing appropriate responses.” It’s easy to see how this could be viewed as the lack of eye contact common in kids and adults on the spectrum.

Kenyon says of the study, “Any time we can broadly share and explain just one aspect of autism in plainly understood language, such as this study does, is a step closer to acceptance. As we move closer to acceptance and understanding, we move proportionally closer to developing educational approaches and technological advances that can help autistics such as my daughter and even myself improve quality of life.”

Joslyn Gray, another autism mom who blogs at stark. raving. mad. mommy., agrees. “I’m so excited about this research,” she says. “I’ll be following this to learn more about therapies that help kids learn to close that gap that makes audio and visual input out of sync for them. But more importantly, I’m excited about helping neurotypical people understand my daughter’s experience.”