“Look who's talking!” Gaze Patterns for Implicit and Explicit Audio-Visual Speech Synchrony Detection in Children With High-Functioning Autism
Corresponding Author
Ruth B. Grossman
Emerson College, Department of Communication Sciences and Disorders, 120 Boylston Street, Boston, Massachusetts
University of Massachusetts Medical School Shriver Center, 200 Trapelo Rd, Waltham, Massachusetts
Address for correspondence and reprints: Ruth B. Grossman, Emerson College, 120 Boylston Street, Boston, MA 02116. E-mail: [email protected]

Erin Steinhart
University of Massachusetts Medical School Shriver Center, 200 Trapelo Rd, Waltham, Massachusetts

Teresa Mitchell
University of Massachusetts Medical School Shriver Center, 200 Trapelo Rd, Waltham, Massachusetts

William McIlvane
University of Massachusetts Medical School Shriver Center, 200 Trapelo Rd, Waltham, Massachusetts
Abstract
Conversation requires integration of information from faces and voices to fully understand the speaker's message. To detect auditory-visual asynchrony of speech, listeners must integrate visual movements of the face, particularly the mouth, with auditory speech information. Individuals with autism spectrum disorder may be less successful at such multisensory integration, despite their demonstrated preference for looking at the mouth region of a speaker. We showed participants, individuals with and without high-functioning autism (HFA) aged 8–19, a split-screen video of two identical individuals speaking side by side. Only one of the speakers was in synchrony with the corresponding audio track, and synchrony switched between the two speakers every few seconds. Participants were asked either to watch the video without further instructions (implicit condition) or to specifically watch the in-sync speaker (explicit condition). We recorded which part of the screen and face their eyes targeted. Both groups looked at the in-sync video significantly more with explicit instructions. However, participants with HFA looked at the in-sync video less than typically developing (TD) peers and did not increase their gaze time as much as TD participants in the explicit task. Importantly, the HFA group looked significantly less at the mouth than their TD peers, and significantly more at non-face regions of the image. There were no between-group differences for eye-directed gaze. Overall, individuals with HFA spend less time looking at the crucially important mouth region of the face during auditory-visual speech integration, which is maladaptive gaze behavior for this type of task. Autism Res 2015, 8: 307–316. © 2015 International Society for Autism Research, Wiley Periodicals, Inc.
References
- Bar-Haim, Y., Shulman, C., Lamy, D., & Reuveni, A. (2006). Attention to eyes and mouth in high-functioning children with autism. Journal of Autism and Developmental Disorders, 36(1), 131–137.
- Bebko, J.M., Weiss, J.A., Demark, J.L., & Gomez, P. (2006). Discrimination of temporal synchrony in intermodal events by children with autism and children with developmental disabilities without autism. Journal of Child Psychology and Psychiatry, 47(1), 88–98.
- Buchan, J.N., Paré, M., & Munhall, K.G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1–13.
- de Gelder, B., Vroomen, J., & van der Heide, L. (1991). Face recognition and lip-reading in autism. European Journal of Cognitive Psychology, 3(1), 69–86.
- de Boer-Schellekens, L., Eussen, M., & Vroomen, J. (2013). Diminished sensitivity of audiovisual temporal order in autism spectrum disorder. Frontiers in Integrative Neuroscience, 7.
- Dunn, L.M., & Dunn, L.M. (1997). Peabody Picture Vocabulary Test (3rd ed.). Circle Pines, MN: American Guidance Service.
- Foss-Feig, J., Kwakye, L., Cascio, C., Burnette, C., Kadivar, H., Stone, W., et al. (2010). An extended multisensory temporal binding window in autism spectrum disorders. Experimental Brain Research, 203(2), 381–389.
- Grossman, R.B., Schneps, M.H., & Tager-Flusberg, H. (2009). Slipped lips: Onset asynchrony detection of auditory-visual language in autism. Journal of Child Psychology and Psychiatry, 50(4), 491–497.
- Grossman, R.B., Smith, A., Steinhart, E., & Mitchell, T. (2012). Visual scanning of facial expression with high and low intensity. Poster presented at Gatlinburg Conference, Annapolis, MD.
- Hall, G.B., Szechtman, H., & Nahmias, C. (2003). Enhanced salience and emotion recognition in autism: A PET study. American Journal of Psychiatry, 160(8), 1439–1441.
- Hampson, E., van Anders, S.M., & Mullin, L.I. (2006). A female advantage in the recognition of emotional facial expressions: Test of an evolutionary hypothesis. Evolution and Human Behavior, 27(6), 401–416.
- Iarocci, G., & McDonald, J. (2006). Sensory integration and the perceptual experience of persons with autism. Journal of Autism & Developmental Disorders, 36(1), 77–90.
- Iarocci, G., Rombough, A., Yager, J., Weeks, D. J., & Chua, R. (2010). Visual influences on speech perception in children with autism. Autism, 14(4), 305–320.
- Ijsseldijk, F. (1992). Speechreading performance under different conditions of video image, repetition, and speech rate. Journal of Speech Language & Hearing Research, 35(2), 466–471.
- Irwin, J. (2006). Audiovisual speech integration in children with autism spectrum disorders. Journal of the Acoustical Society of America, 120(5, Pt 2), 3348.
- Jones, W., Carr, K., & Klin, A. (2008). Absence of preferential looking to the eyes of approaching adults predicts level of social disability in 2-year-old toddlers with autism spectrum disorder. Archives of General Psychiatry, 65(8), 946–954.
- Joseph, R.M., & Tanaka, J. (2003). Holistic and part-based face recognition in children with autism. Journal of Child Psychology and Psychiatry, 44(4), 529–542.
- Klin, A., Jones, W., Schultz, R., Volkmar, F., & Cohen, D. (2002). Visual fixation patterns during viewing of naturalistic social situations as predictors of social competence in individuals with autism. Archives of General Psychiatry, 59(9), 809–816.
- Kozlowski, L., & Cutting, J. (1977). Recognizing the sex of a walker from a dynamic point-light display. Perception & Psychophysics, 21(6), 575–580.
- Kwakye, L.D., Foss-Feig, J.H., Cascio, C.J., Stone, W.L., & Wallace, M.T. (2011). Altered auditory and multisensory temporal processing in autism spectrum disorders. Frontiers in Integrative Neuroscience, 4, 129.
- Langdell, T. (1978). Recognition of faces: An approach to the study of autism. Journal of Child Psychology and Psychiatry, 19(3), 255–268.
- Lansing, C.R., & McConkie, G.W. (2003). Word identification and eye fixation locations in visual and visual-plus-auditory presentations of spoken sentences. Perception & Psychophysics, 65(4), 536–552.
- Lord, C., Rutter, M., DiLavore, P.C., & Risi, S. (1999). Autism Diagnostic Observation Schedule - WPS (ADOS-WPS). Los Angeles, CA: Western Psychological Services.
- Magnée, M.J., de Gelder, B., van Engeland, H., & Kemner, C. (2008). Audiovisual speech integration in pervasive developmental disorder: Evidence from event-related potentials. Journal of Child Psychology and Psychiatry, 49(9), 995–1000.
- Marassa, L.K., & Lansing, C.R. (1995). Visual word recognition in two facial motion conditions: Full-face versus lips-plus-mandible. Journal of Speech Language & Hearing Research, 38(6), 1387–1394.
- Massaro, D.W. (1998). Perceiving talking faces: From speech perception to a behavioral principle. Cambridge, MA: MIT Press, Bradford Books.
- McGurk, H., & MacDonald, J. (1976). Hearing lips and seeing voices. Nature, 264(5588), 746–748.
- Medeiros, K., & Winsler, A. (2014). Parent–child gesture use during problem solving in autistic spectrum disorder. Journal of Autism and Developmental Disorders, 44(8), 1946–1958.
- Neumann, D., Spezio, M.L., Piven, J., & Adolphs, R. (2006). Looking you in the mouth: Abnormal gaze in autism resulting from impaired top-down modulation of visual attention. Social Cognitive and Affective Neuroscience, 1(3), 194–202.
- Nishimura, M., Rutherford, M.D., & Maurer, D. (2008). Converging evidence of configural processing of faces in high-functioning adults with autism spectrum disorders. Visual Cognition, 16(7), 859–891.
- Nishiyama, T., & Kanne, S. (2014). On the misapplication of the BAPQ in a study of autism. Journal of Autism and Developmental Disorders, 44(8), 2079–2080.
- Nuske, H., Vivanti, G., & Dissanayake, C. (2014). Brief report: Evidence for normative resting-state physiology in autism. Journal of Autism and Developmental Disorders, 44(8), 2057–2063.
- Paré, M., Richler, R., Hove, M., & Munhall, K.G. (2003). Gaze behavior in audiovisual speech perception: The influence of ocular fixations on the McGurk effect. Perception & Psychophysics, 65(4), 553–567.
- Paul, R., Campbell, D., Gilbert, K., & Tsiouri, I. (2013). Comparing spoken language treatments for minimally verbal preschoolers with autism spectrum disorders. Journal of Autism & Developmental Disorders, 43(2), 418–431.
- Pelphrey, K.A., Sasson, N.J., Reznick, J.S., Paul, G., Goldman, B.D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism & Developmental Disorders, 32(4), 249–261.
- Piven, J., & Sasson, N. (2014). On the misapplication of the broad autism phenotype questionnaire in a study of autism. Journal of Autism and Developmental Disorders, 44(8), 2077–2078.
- Roid, G.H., & Miller, L.J. (1997). Leiter International Performance Scale—Revised. Wood Dale, IL: Stoelting Co.
- Rutherford, M., & McIntosh, D. (2007). Rules versus prototype matching: Strategies of perception of emotional facial expressions in the autism spectrum. Journal of Autism and Developmental Disorders, 37(2), 187–196.
- Rutherford, M., & Towns, A. (2008). Scan path differences and similarities during emotion perception in those with and without autism spectrum disorders. Journal of Autism and Developmental Disorders, 38(7), 1371–1381.
- Shams, L., Kamitani, Y., & Shimojo, S. (2004). Modulation of visual perception by sound. In Calvert, G.A., Spence, C., & Stein, B.E. (Eds.), The handbook of multisensory processes (pp. 26–33). Cambridge, MA: MIT Press.
- Smith, E.G., & Bennetto, L. (2007). Audiovisual speech integration and lipreading in autism. Journal of Child Psychology and Psychiatry, 48(8), 813–821.
- Spezio, M.L., Adolphs, R., Hurley, R.S., & Piven, J. (2007). Abnormal use of facial information in high-functioning autism. Journal of Autism & Developmental Disorders, 37(5), 929–939.
- Sumby, W.H., & Pollack, I. (1954). Visual contribution to speech intelligibility in noise. Journal of the Acoustical Society of America, 26, 212–215.
- Summerfield, Q., & McGrath, M. (1984). Detection and resolution of audio-visual incompatibility in the perception of vowels. The Quarterly Journal of Experimental Psychology Section A, 36(1), 51–74.
- Ulloa, E.R., & Pineda, J.A. (2007). Recognition of point-light biological motion: Mu rhythms and mirror neuron activity. Behavioural Brain Research, 183(2), 188–194.
- van der Smagt, M.J., van Engeland, H., & Kemner, C. (2007). Brief report: Can you see what is not there? low-level auditory-visual integration in autism spectrum disorder. Journal of Autism & Developmental Disorders, 37(10), 2014–2019.
- Vatikiotis-Bateson, E., Eigsti, I.M., & Yano, S. (1994). Listener eye movement behavior during audiovisual perception. Proceedings of the Acoustical Society of Japan, 94(3), 679–680.
- Vatikiotis-Bateson, E., Eigsti, I.M., Yano, S., & Munhall, K.G. (1998). Eye movement of perceivers during audiovisual speech perception. Perception & Psychophysics, 60(6), 926–940.
- Zainal, H., Magiati, I., Tan, J.-L., Sung, M., Fung, D.S., & Howlin, P. (2014). A preliminary investigation of the Spence Children's Anxiety Parent Scale as a screening tool for anxiety in young people with autism spectrum disorders. Journal of Autism and Developmental Disorders, 44(8), 1982–1994.