
Wednesday, November 16, 2005

Eye Tracking Machines and Autism: A Business Case - Part deux


This is interesting: Andrew Meltzoff and Rechele Brooks at the University of Washington have come up with new results on how kids learn to talk. In this new study they show that...
"... babies who simultaneously followed the eyes of the researcher and made vocalizations when they were 10 or 11 months old understood an average of 337 words at 18 months old while the other babies understood an average of 195 words.

"The sounds they are making are very simple, but some children are looking and making these sounds spontaneously," said Brooks. "They are creating a social interaction or a link. There seems to be something special about the vocalization when they are looking at the toy. They are using social information to pick out what we are focusing on. They can't vocalize words, but they are carefully watching where we are looking. We think they are using social information and getting a boost in figuring out the social and language world together."

"Although the babies are too young to talk to us, those individual babies who are most attuned to our eye gaze are the same babies who pick up language faster more than half a year later," said Meltzoff. "This is a fascinating connection between the social and linguistic world and suggests that language acquisition is supported by preverbal social interaction.

"To do this a baby has an important social regularity to master: follow mom's eyes and you can discover what she is talking about. This study shows that babies first master this social information between 10 and 11 months of age, and it may be no coincidence that there is a language explosion soon thereafter. It is as if babies have broken the code of what mom is talking about and words begin pouring out of the baby to the parents' delight," he said.

The UW researchers are following the same group of babies to see if gaze-following and vocalization at an early age predict increased language understanding and use at 24 and 30 months of age..."

In the study of autism, scientists generally believe there is a dysfunction in the eye tracking of faces. This is all the more important since, for babies, watching faces and gestures is the elementary process that enables learning, as shown in another paper by Rao and Meltzoff. In short, researchers think that this deficiency in face tracking is responsible for an inability to learn much about social cues and even language, leading to a diagnosis of Autism or PSOD.

Yet another interesting finding, by Aysenil Belger and Gabriel Dichter, reveals that the processing of faces in the brain of autistic kids is similar to that of normal kids. As it turns out, Kevin Pelphrey shows that a gaze-following deficiency is the issue, not the processing of face information, as I initially thought [second part of the SCM talk]. In his investigation, Pelphrey shows where gaze processing is localized in the brain:

using event-related functional MRI (fMRI), we show that in autism, brain regions involved in gaze processing, including the superior temporal sulcus (STS) region, are not sensitive to intentions conveyed by observed gaze shifts.... We conclude that lack of modulation of the STS region by gaze shifts that convey different intentions contributes to the eye gaze processing deficits associated with autism.


It seems there is an increasingly strong business case for eye tracking software every day.
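
To make that concrete, here is a minimal Python sketch of how eye tracking software might score the kind of gaze-following-plus-vocalization behavior described in the UW study. Everything below (the Trial record, its field names, the scoring functions) is my own illustration under assumed, hand-labeled trial data; none of it is taken from the papers above.

    # Hypothetical sketch: scoring gaze following from per-trial eye tracking logs.
    # Assumes each trial records where the experimenter turned her eyes, where the
    # infant first looked, and whether the infant vocalized during that look.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Trial:
        experimenter_target: str           # e.g. "left_toy" or "right_toy"
        infant_first_look: Optional[str]   # first object the infant fixates, or None
        infant_vocalized: bool             # did the infant vocalize during the look?

    def gaze_following_score(trials: list[Trial]) -> float:
        """Fraction of trials where the infant looked at the same target
        the experimenter turned her eyes toward."""
        if not trials:
            return 0.0
        followed = sum(t.infant_first_look == t.experimenter_target for t in trials)
        return followed / len(trials)

    def look_and_vocalize_score(trials: list[Trial]) -> float:
        """Fraction of trials combining correct gaze following with a vocalization,
        the pattern the UW study links to larger vocabularies at 18 months."""
        if not trials:
            return 0.0
        both = sum(
            t.infant_first_look == t.experimenter_target and t.infant_vocalized
            for t in trials
        )
        return both / len(trials)

    if __name__ == "__main__":
        session = [
            Trial("left_toy", "left_toy", True),
            Trial("right_toy", "right_toy", False),
            Trial("left_toy", "right_toy", False),
            Trial("right_toy", None, False),
        ]
        print(f"gaze following: {gaze_following_score(session):.2f}")    # 0.50
        print(f"look + vocalize: {look_and_vocalize_score(session):.2f}")  # 0.25

On this made-up four-trial session the sketch reports a gaze-following rate of 0.50 and a look-and-vocalize rate of 0.25; a real system would of course derive the trial labels from raw fixation coordinates and audio rather than hand coding.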
