ORIGINAL RESEARCH article

Front. Psychiatry, 23 October 2015
Sec. Child and Adolescent Psychiatry
This article is part of the Research Topic "New perspectives on the interdisciplinary research in ADHD".

Face Scanning in Autism Spectrum Disorder and Attention Deficit/Hyperactivity Disorder: Human Versus Dog Face Scanning

  • 1Departamento de Psicobiologia, Universidade Federal de São Paulo, São Paulo, Brazil
  • 2Programa de Pós Graduação em Educação e Saúde, Universidade Federal de São Paulo, São Paulo, Brazil
  • 3Departamento de Psicologia Experimental, Instituto de Psicologia, Universidade de São Paulo, São Paulo, Brazil

This study used eye tracking to explore attention allocation to human and dog faces in children and adolescents with autism spectrum disorder (ASD), attention deficit/hyperactivity disorder (ADHD), and typical development (TD). Significant differences were found among the three groups. TD participants looked longer at the eyes than ASD and ADHD participants did, irrespective of the faces presented. Despite this difference, the groups were similar in that they all looked longer at the eye than at the mouth areas of interest. The ADHD group gazed longer at the mouth region than the other groups. The groups were also similar in that they all looked longer at the dog than at the human faces. Eye-tracking technology proved to be useful for behavioral investigation in different neurodevelopmental disorders.

Introduction

The study of behavioral and neurophysiological patterns related to visual attention is a promising research field to understand the modulation of attentional performance at different ages and in various neurodevelopmental and neuropsychiatric disorders (1, 2).

Although difficulties in social interaction, empathy, facial expression recognition, and emotional exchange are core symptoms of autism spectrum disorder (ASD), it has been reported that people with attention deficit/hyperactivity disorder (ADHD) may also show impairments in social cognition and mood regulation, which may lead to high levels of peer rejection (3–6). Social cognition impairments observed in children with ADHD usually involve difficulties in understanding emotional cues, especially in negative contexts such as anger, sadness, and disgust; inadequate emotional reactions to perceived emotions; and a poor ability to inhibit and regulate emotional and behavioral responses (7). At the same time, changes in selective and sustained attention, among the most consistent features of the neuropsychological profile of children with ADHD, are also observed in children with ASD (8, 9).

The current DSM-5 (10) does not exclude ASD in the delimitation of the ADHD diagnostic criteria, given the frequent association and comorbidity between the two disorders. The existence of shared biological processes in these two neurodevelopmental conditions has been supported by genetic (11) and neuroimaging (12) studies. Behavioral and neurophysiological measures can help to elucidate these disorders. The identification of objective performance measures as endophenotypic markers underlying the common clinical manifestations may help to improve differential diagnosis.

Attentional modulation in ADHD occurs at different levels, influencing selective attention directed both at external information and at endogenous processes linked to executive control and emotional self-regulation (3). In this sense, research on motivated attention can contribute to a better understanding of the symptom overlap with ASD. Eye movements are privileged pathways for obtaining knowledge about developmental abnormalities, opening new windows into the working of the mind (13). Eye-tracking technology allows non-intrusive, continuous measurement of attention to different types of visual stimuli and can be coupled with other recording devices to obtain a more complete picture of the physiological events that occur in the brain during information processing, improving our understanding of the neurophysiological and behavioral bases of ADHD and ASD. Although eye tracking has long been used to investigate the gaze patterns of typical adults, only recently has it been employed to study individuals with neurodevelopmental disorders. For instance, using eye tracking, Riby and Hancock (14) compared how individuals with autism (ASD) and Williams syndrome (WS) inspected pictures of social scenes. Those with ASD spent less time than is typical viewing people and faces, whereas those with WS showed exaggerated fixations toward faces, and particularly toward the eyes. This study illustrates how eye tracking can provide markers of atypical sociability and visual attention in neurodevelopmental disorders. Pelphrey et al. (15) reported anomalous face processing among children and adults with autism, who spent a greater proportion of their inspection time viewing non-feature areas of the faces and a smaller percentage of time examining core features, such as the nose, mouth, and, in particular, the eyes, in comparison with control participants. Tottenham et al. (16) also found that individuals with ASD made fewer gazes toward the eye region and that this behavioral pattern was accompanied by greater amygdala activation to neutral faces in comparison with controls. Dalton et al. (17) extended the use of eye tracking to the relatives of individuals with autism and found that the gaze fixations and brain activation patterns of unaffected siblings during a face-processing task were more similar to those of the autism group than to those of a matched control group.

Aims of the Present Paper

The present study aimed to analyze face scanning in two neurodevelopmental disorders. Using eye tracking, we compared how children and adolescents with ASD, ADHD, and typical development (TD) scanned faces. In contrast with previous studies, which were conducted with high-functioning ASD individuals [e.g., Ref. (15, 18, 19)], our sample included only low-functioning ASD individuals, considering that almost nothing is known about the low-functioning end of the autism spectrum.

In our study, we compared the scanning of human and dog faces by children and adolescents with ASD, ADHD, and TD. There are well-documented benefits of human–animal interactions for humans of different ages, with and without special mental health conditions, with respect to social attention, social behavior, interpersonal interactions, and mood; stress-related parameters, such as cortisol, heart rate, and blood pressure; self-reported fear and anxiety; and mental and physical health (20). There are reports of changes in prosocial behaviors among autistic children associated with the arrival of a pet in the family (21). It has been proposed that activation of the oxytocin system plays a key role in these beneficial psychological and psychophysiological effects of human–animal interactions. Oxytocin may be released via eye contact in response to a single meeting with a dog (22). When given the choice to interact with a person, a dog, or an object, children with autism interacted most frequently and for the longest amount of time with the dog (23). Dogs may communicate their intentions in a way more readily understandable to people with autism. Temple Grandin (24–26), a high-functioning autistic woman who became a renowned professor at Colorado State University, reported that autistic people are closer to animals than typical people are and that looking into people's eyes is aversive for them.

Photo prints used in a preliminary study, which yielded less gaze aversion to dog faces than to human faces among ASD children (27), were converted into digital pictures. The results of that pilot study are compatible with Temple Grandin's report, suggesting that the gaze of a dog may trigger less emotional activation than a human gaze.

Our intent was to increase the accuracy of our measurements with the use of eye-tracking technology, extending the investigation to samples of ADHD and typically developing children. This type of research could provide behavioral data useful for understanding the shared and distinct neuropsychological endophenotypes underlying the processing of social–emotional cues in ASD and ADHD.

Materials and Methods

Participants

The sample (N = 45) consisted of children and adolescents: 15 typically developing (TD) controls (mean age 9.5 years, SD = 3.8; 9 girls and 6 boys), 15 with ASD (mean age 11.6 years, SD = 2.7; 2 girls and 13 boys), and 15 with ADHD (mean age 9.4 years, SD = 2.3; 3 girls and 12 boys). There was a male predominance in both clinical groups, in line with the known prevalence of these disorders. The participants were evaluated in a Children's Interdisciplinary Neuropsychological Care Center in São Paulo, Brazil, by neuropsychologists, pediatric neurologists, and child psychiatrists using DSM-5 criteria. The ADHD group was composed of individuals with IQs in the normal range (scores above 85 on the WISC-IV), matched to the TD control group. None of the ADHD children were taking psychostimulant medication at the time of assessment. The Child Behavior Checklist (CBCL) was used as a screen for psychiatric comorbidity in both the control and ADHD groups. The ASD group was composed of low-functioning individuals (IQ below 70) with scores between 30 and 50 on the Childhood Autism Rating Scale (CARS) (28).

Visual Stimuli

The stimuli consisted of color photographs of forward-facing male and female human faces and dog faces with neutral expressions, and of neutral control stimuli (clouds and a plant), taken from the Karolinska Directed Emotional Faces (KDEF) set (29) and the International Affective Picture System (IAPS) (30, 31).

Data Capture Procedures

The eye movements of the participants were recorded with an infrared-based eye-tracking system (Tobii TX-300) integrated with a 23″ TFT monitor (screen resolution 1920 × 1080 pixels). The equipment used a 300-Hz tracking frequency to collect information about the location and duration of the participants' gaze fixations on stimuli displayed on the monitor screen (Figure 1). Target static images were presented individually on the monitor for 5 s, separated by a screen showing two rolling dice for 2 s, in order to reduce the effect of environmental distractions and keep participants' attention on the monitor screen. The order in which the target images were viewed was randomized across participants.
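
To make these timings concrete, the following is a minimal sketch of the trial loop (randomized target order, 5-s targets, 2-s distractor screen). The helper function show_image and the file names are hypothetical placeholders, not the presentation software actually used in the study.

```python
# Minimal sketch of the presentation sequence described above (illustrative only).
# show_image() is a hypothetical helper; the eye tracker records gaze in the background.
import random
import time

TARGET_DURATION_S = 5.0      # each target image stays on screen for 5 s
DISTRACTOR_DURATION_S = 2.0  # rolling-dice screen shown between targets for 2 s

def show_image(path, duration_s):
    """Hypothetical helper: display the picture at `path` full screen for `duration_s` seconds."""
    print(f"showing {path}")
    time.sleep(duration_s)

def run_block(target_images, distractor_image="rolling_dice.png"):
    order = list(target_images)
    random.shuffle(order)  # order of target images randomized across participants
    for target in order:
        show_image(distractor_image, DISTRACTOR_DURATION_S)  # keeps attention on the screen
        show_image(target, TARGET_DURATION_S)

run_block(["dog_face.jpg", "male_face.jpg", "female_face.jpg", "clouds.jpg", "plant.jpg"])
```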

Figure 1. Illustration of the experimental situation using a video-based eye-tracking system in our laboratory. Using an eye tracker, we can monitor where the child is looking (photograph of Sarah Kuwano Molinari Salotti).

Tests were conducted in three quiet rooms with similar characteristics, with lights kept at a constant illumination1. The eye-tracking equipment is portable and was moved to each testing location. Participants were tested individually, seated approximately 60 cm from the screen in a room with an illuminance level of 300 lux, in order to guarantee the best gaze accuracy (0.4°) and precision (0.07°), as described in the Tobii TX-300 manual. After a comfortable position was achieved, we asked each child to follow a red ball bouncing around the screen in order to obtain a five-point calibration. The study was conducted only with participants whose calibration was successfully achieved, as attested by the Tobii TX-300. Two research assistants were present during data collection, one on each side of the participant, one controlling the computer and the other dealing with logistical issues, but they did not interfere with viewing behavior.

Ethics

The study was approved by the Human Research Ethics Committee at the Institute of Psychology, University of São Paulo, Brazil. Informed consent was obtained from parents.

Eye-Tracking Data

Two areas of interest (AOIs) of equal size were selected on each stimulus image to determine gaze location: the eye region and the mouth region. Two measures were taken: the number of fixations in each AOI and the total fixation time. Fixations were defined as a gaze of at least 200 ms duration within a 50-pixel radius, as recommended by the Tobii manual2 for static pictures. The ClearView fixation filter (Tobii TX-300) was used for eye-movement classification.

Statistical analyses of the eye-tracking data of the three groups were performed with a general linear mixed model (GLMM). This method was used to analyze the dependent variables total fixation duration and number of fixations. The independent variables were group (TD, ASD, ADHD), type of facial stimulus (dog, man, woman), and AOI (mouth and eyes); age and sex were included as control variables. The first test made with this method in both analyses was the omnibus test of the existence of a significant effect, adopting the 0.05 significance level. If the omnibus test was significant, the fixed effects tested in both analyses were the main effects and the second- and third-order interactions. In these subsequent analyses, Bonferroni correction was applied and the significance level adopted was 0.006. At this second step, significant effects were identified hierarchically, from the higher-order interactions to the main effects. When significant interactions or main effects were identified, post hoc comparison tests were performed. The software used for the analyses was IBM SPSS Statistics 21.
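
The exact algorithm of the ClearView filter is Tobii's own; purely as an illustration of the stated criteria (at least 200 ms of gaze within a 50-pixel radius, sampled at 300 Hz), a generic dispersion-based approximation could look like the sketch below, together with a helper that totals fixation time inside a rectangular AOI. The array layout and AOI coordinates are assumptions.

```python
# Generic dispersion-based fixation classifier (an approximation, not Tobii's ClearView filter).
import numpy as np

SAMPLE_RATE_HZ = 300
MIN_DURATION_S = 0.200
MAX_RADIUS_PX = 50
MIN_SAMPLES = int(MIN_DURATION_S * SAMPLE_RATE_HZ)  # 60 samples at 300 Hz

def classify_fixations(gaze_xy):
    """gaze_xy: (n, 2) NumPy array of screen coordinates in pixels, one row per gaze sample.
    Returns a list of (start_index, end_index, centroid_xy, duration_s) tuples."""
    fixations, start, n = [], 0, len(gaze_xy)
    while start < n:
        end = start + 1
        centroid = gaze_xy[start].astype(float)
        # Grow the window while every sample stays within 50 px of the running centroid.
        while end < n:
            candidate = gaze_xy[start:end + 1]
            cand_centroid = candidate.mean(axis=0)
            if np.max(np.linalg.norm(candidate - cand_centroid, axis=1)) > MAX_RADIUS_PX:
                break
            centroid, end = cand_centroid, end + 1
        if end - start >= MIN_SAMPLES:  # keep only windows lasting at least 200 ms
            fixations.append((start, end, tuple(centroid), (end - start) / SAMPLE_RATE_HZ))
        start = end
    return fixations

def time_in_aoi(fixations, aoi_box):
    """Total fixation time (s) whose centroid falls inside aoi_box = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = aoi_box
    return sum(dur for _, _, (cx, cy), dur in fixations if x0 <= cx <= x1 and y0 <= cy <= y1)
```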

Results

Total Fixation Time

Descriptive statistics summarizing total fixation time as a function of group, type of stimulus, and AOI (Table 1) show that participants spent much of the stimulus presentation time (5 s) with their eyes fixed on the AOIs (around the eyes and the mouth). Average time spent fixating on the eye region ranged from 1.86 to 3.38 s and on the mouth from 0.29 to 1.31 s, regardless of the type of stimulus.

Table 1. Descriptive statistics for total fixation time as a function of group, type of stimulus, and AOI.

The general linear mixed model analysis revealed that the model had explanatory value at the 5% significance level, indicating a significant main effect of type of stimulus (p = 0.003) and a significant Group × AOI interaction (p < 0.001) (Table 2).

Table 2. Summary of the GLMM examining total fixation time as a function of the independent variables group, type of facial stimulus, and area of interest, with age and sex as control variables.
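
Although the analysis itself was run in SPSS, the structure of such a model can be sketched in Python with statsmodels. The file and column names below are assumptions about a long-format data layout (one row per participant × stimulus type × AOI), not the authors' actual variables.

```python
# Approximate Python equivalent of the mixed-model specification (illustrative sketch).
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns: participant, group (TD/ASD/ADHD), stimulus (dog/man/woman),
# aoi (eyes/mouth), age, sex, total_fixation_time, n_fixations.
data = pd.read_csv("fixation_summary.csv")  # hypothetical file name

# Fixed effects: group, stimulus, AOI and their interactions, plus age and sex as covariates;
# random intercept per participant to account for repeated measures.
model = smf.mixedlm(
    "total_fixation_time ~ group * stimulus * aoi + age + sex",
    data,
    groups=data["participant"],
)
result = model.fit()
print(result.summary())
# The same formula can be refitted with n_fixations as the dependent variable.
```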

With respect to type of stimulus, it is notable that, regardless of other factors, participants spent more time gazing at the AOIs of dog images than at those of human images (Table 3). Pairwise comparisons with sequential Sidak correction indicated a significant dog versus male difference (p = 0.003) and a marginally significant dog versus female difference (p = 0.055). No difference was found between the male and female images (p = 0.262).
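
The sequential Sidak procedure orders the p-values and adjusts each one as 1 - (1 - p)^k, where k is the number of comparisons still in play. A minimal sketch using the Holm-Sidak implementation in statsmodels is shown below; the uncorrected p-values are hypothetical placeholders, since the values reported above are already corrected.

```python
# Sequential Sidak (Holm-Sidak) correction for three pairwise comparisons (illustrative).
from statsmodels.stats.multitest import multipletests

labels = ["dog vs. male", "dog vs. female", "male vs. female"]
raw_p = [0.001, 0.020, 0.250]  # placeholder uncorrected p-values

reject, p_adj, _, _ = multipletests(raw_p, alpha=0.05, method="holm-sidak")
for label, p, sig in zip(labels, p_adj, reject):
    print(f"{label}: adjusted p = {p:.3f}, significant = {sig}")
```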

Table 3. Means, SDs, and confidence intervals of total fixation time as a function of type of stimulus.

Heatmaps of the most attended areas of animal and human faces illustrate both the similarities and the differences among TD, ASD, and ADHD individuals (Figure 3).

The interaction effect represented in Figure 2 shows that, regardless of the stimuli, the typical development group gazed longer at the eye region than the two groups with developmental disorders. Pairwise comparisons with sequential Sidak correction revealed statistically significant differences between TD and ASD (p = 0.001) and between TD and ADHD (p = 0.002). In addition, the ADHD group gazed longer at the mouth region than the other groups. Pairwise comparisons with sequential Sidak correction indicated a significant difference between ADHD and ASD (p = 0.025) and a trend toward significance between ADHD and TD (p = 0.075).

Figure 2. Total fixation time as a function of type of stimulus.

Figure 3. Heat maps illustrating the most attended areas of animal and human faces by TD, ASD, and ADHD individuals.
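
Heat maps of this kind are produced automatically by the eye-tracking software, but the underlying idea can be sketched as a duration-weighted kernel density estimate over fixation coordinates. The coordinates and durations below are placeholders, not study data.

```python
# Duration-weighted fixation heat map (sketch; not the Tobii Studio rendering used here).
import numpy as np
from scipy.stats import gaussian_kde
import matplotlib.pyplot as plt

WIDTH, HEIGHT = 1920, 1080                       # screen resolution used in the study

fix_x = np.array([900.0, 950.0, 1010.0, 980.0])  # fixation centroids in pixels (placeholders)
fix_y = np.array([400.0, 420.0, 410.0, 620.0])
fix_dur = np.array([0.30, 0.45, 0.25, 0.20])     # fixation durations in seconds (placeholders)

# 2D kernel density over the screen, weighted by fixation duration.
kde = gaussian_kde(np.vstack([fix_x, fix_y]), weights=fix_dur)
xs, ys = np.mgrid[0:WIDTH:192j, 0:HEIGHT:108j]
density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)

plt.imshow(density.T, origin="upper", extent=(0, WIDTH, HEIGHT, 0), cmap="hot", alpha=0.7)
plt.title("Duration-weighted fixation heat map (sketch)")
plt.show()
```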

Discussion

In this study, we investigated attentional modulation according to face region in samples of ADHD, ASD, and typically developing children by means of eye-tracking procedures. Participants in the ADHD group had considerably higher intellectual and adaptive functioning than those in the ASD group. Nevertheless, the two clinical groups did not differ in fixation time or region of interest, regardless of the kind of stimulus. Compared with typically developing controls, the children in the two clinical groups spent less time viewing the eye region of the faces.

This result seems to reinforce previous evidence that difficulties in processing social cues may be shared by both clinical conditions. For instance, using functional neuroimaging, Christakou et al. (12) found reduced activation in striato-thalamic regions, the superior parietal cortex, and the left dorsolateral prefrontal cortex in children with ADHD and ASD in comparison with an age- and IQ-matched control group. Parts of these regions belong to an endogenous processing system known as the default mode network, which is thought to play an important role in the introspective and adaptive mental activities usually described as "internal mentation" (32). For instance, it is well established that theory of mind, like other cognitive processes related to socialization skills, is supported by this system (33).

Our results may be summarized in three main findings. The first is that all children looked longer at the dog images than at the human images. This apparently greater interest of all participants in dog images suggests a widespread influence of motivational traits on attentional drive in childhood, which can be linked to an evolutionary perspective. A special human interest in other animals can be observed from a very early age (34). Wilson's "biophilia hypothesis" (35) proposes that this attraction to nature and living things is a result of our evolutionary past.

The second concerns the differences in gaze according to face region. All children looked longer at the eyes than at the mouth region, but the duration of fixation on the eyes was lower in the ASD and ADHD groups than in the TD group. There is considerable evidence that the processing of facial expressions by children with ASD differs from that shown by typically developing children. Such differences may be explained by affective factors or by preferences in visual analysis. For instance, an influence of affective valence on the processing of emotions in faces has frequently been reported in children with ASD [e.g., Ref. (36)]. A preference for the left visual hemifield in the early stage of visual analysis of faces was observed in typically developing children but not in children with ASD when looking at human and at dog faces (37). In ADHD, on the other hand, such differences are not usually described.

The third main finding is that the ADHD children focused on the mouth region more than the other two groups. A possible influence of affective valence on the processing of face regions may be considered. For instance, Pelc et al. (37) observed, in a facial emotion recognition task, that children with ADHD had more difficulty with angry and sad faces than with other emotions. Problems in emotion recognition were also identified in boys at risk for ADHD (38): these children confused the emotions of happiness and anger with sadness and took more time to identify them. In our study, although we did not specifically investigate responses to emotional valence, the fact that the ADHD children fixated longer on the mouth region than the other groups may be related to the importance of mouth opening for a more precise distinction between positive and negative emotions.

Our findings concerning a possible motivational effect of interaction with dogs may have clinical implications, for instance, in the planning of alternative behavioral strategies in rehabilitation settings for children with neurodevelopmental disabilities. Theoretical implications include a better understanding of the maturational changes underlying social skill deficits. Finally, we consider that eye-tracking technology proved useful for behavioral investigation even in low-functioning ASD children, and its use in more natural or ecological assessment settings therefore looks promising.

Some factors may limit the generalization of our findings. For instance, we did not include a high-functioning ASD sample for comparison. The most important limitation seems to be the small size and the demographic and clinical heterogeneity of the samples. The ASD children were older than the ADHD children, and participants in the ADHD and ASD groups differed considerably in intellectual and adaptive functioning. Nevertheless, they were similar in some aspects of their scanning of the AOIs of human and dog faces. Other studies will be needed to confirm these findings.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

This research was supported by a CNPq Project Grant (Grant No 301241/2013-9) to EO. We thank Juliana Lucena, André Correa Carvalho, Ana Caroline Aramaki Hitomi, Andressa Gourlat Crossetti, Andriely Darcanchy de Toledo, Bruna Assunção, Guilherme do Espirito Santo Paes, Isabela de Moraes Benzoni, Julia Pizani Leme Ferreira, Juliana Meirelles Hito, Mariana Fischer, and Marina Lavrador for assistance with data collection.

Footnotes

  1. ^The data from TD participants were collected in the Didactic Laboratory of Ethology at the Institute of Psychology of the University of São Paulo, the data from ASD participants were collected in the Paulista School of Special Education at São Bernardo do Campo, and the data from ADHD participants were collected in the Paulista Neuropsychology Center/AFIP.
  2. ^User Manual – Tobii Studio Version 3.2 (http://www.tobii.com).

References

1. Popovic N. Eye-tracking software may reveal autism and other brain disorders. Sci Am (2013) 18:10.

2. West GL, Mendizabal S, Carrière MP, Lippé S. Linear age-correlated development of inhibitory saccadic trajectory deviations. Dev Psychol (2014) 50(9):2285–90. doi: 10.1037/a0037383

3. Barkley RA. Deficient emotional self-regulation is a core component of ADHD. J ADHD Relat Disord (2010) 1(2):5–37.

4. Maedgen JW, Carlson CL. Social functioning and emotional regulation in the attention deficit hyperactivity disorder subtypes. J Clin Child Psychol (2000) 29(1):30–42. doi:10.1207/S15374424jccp2901_4

5. Kotte A, Joshi G, Fried R, Uchida M, Spencer A, Woodworth KY, et al. Autistic traits in children with and without ADHD. Pediatrics (2013) 132(3):e612–22. doi:10.1542/peds.2012-3947

6. Caillies S, Bertot V, Motte J, Raynaud C, Abely M. Social cognition in ADHD: irony understanding and recursive theory of mind. Res Dev Disabil (2014) 35(11):3191–8. doi:10.1016/j.ridd.2014.08.002

7. Williams LM, Hermens DF, Palmer D, Kohn M, Clarke S, Keage H, et al. Misinterpreting emotional expressions in attention-deficit/hyperactivity disorder: evidence for a neural marker and stimulant effects. Biol Psychiatry (2008) 63:917–26. doi:10.1016/j.biopsych.2007.11.022

8. Konst MJ, Matson JL, Goldin R, Rieske R. How does ASD symptomology correlate with ADHD presentations? Res Dev Disabil (2014) 35(9):2252–9. doi:10.1016/j.ridd.2014.05.017

9. Leitner Y. The co-occurrence of autism and attention deficit hyperactivity disorder in children – what do we know? Front Hum Neurosci (2014) 29(8):268. doi:10.3389/fnhum.2014.00268

10. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders. 4 ed. Washington, DC: American Psychiatric Association (2000).

11. Martin J, Cooper M, Hamshere ML, Pocklington A, Scherer SW, Kent L, et al. Biological overlap of attention-deficit/hyperactivity disorder and autism spectrum disorder: evidence from copy number variants. J Am Acad Child Adolesc Psychiatry (2014) 53(7):761–70. doi:10.1016/j.jaac.2014.03.004

12. Christakou A, Murphy CM, Chantiluke K, Cubillo AI, Smith AB, Giampietro V, et al. Disorder-specific functional abnormalities during sustained attention in youth with attention deficit hyperactivity disorder (ADHD) and with autism. Mol Psychiatry (2013) 18:236–44. doi:10.1038/mp.2011.185

13. Karatekin C. Eye tracking studies of normative and atypical development. Dev Rev (2007) 27(3):283–348. doi:10.1016/j.dr.2007.06.006

14. Riby DM, Hancock PJ. Viewing it differently: social scene perception in Williams syndrome and autism. Neuropsychologia (2008) 46(11):2855–60. doi:10.1016/j.neuropsychologia.2008.05.003

15. Pelphrey KA, Sasson NJ, Reznick JS, Paul G, Goldman BD, Piven J. Visual scanning of faces in autism. J Autism Dev Disord (2002) 32:249–61. doi:10.1023/A:1016374617369

16. Tottenham N, Hertzig ME, Gillespie-Lynch K, Gilhooly T, Millner AJ, Casey BJ. Elevated amygdala response to faces and gaze aversion in autism spectrum disorder. Soc Cogn Affect Neurosci (2014) 9(1):106–17. doi:10.1093/scan/nst050

17. Dalton KM, Nacewicz BM, Alexander AL, Davidson RJ. Gaze-fixation, brain activation, and amygdala volume in unaffected siblings of individuals with autism. Biol Psychiatry (2006) 61(4):512–20. doi:10.1016/j.biopsych.2006.05.019

18. Dalton KM, Nacewicz BM, Johnstone T, Schaefer HS, Gernsbacher MA, Goldsmith HH, et al. Gaze fixation and the neural circuitry of face processing in autism. Nat Neurosci (2005) 8(4):519–26. doi:10.1038/nn1421

19. Klin A, Jones W, Schultz R, Volkmar F, Cohen D. Visual fixation patterns during viewing of naturalistic social situations as predictors of social competence in individuals with autism. Arch Gen Psychiatry (2002) 59:809–16. doi:10.1001/archpsyc.59.9.809

20. Beetz A, Uvnäs-Moberg K, Julius H, Kotrschal K. Psychosocial and psychophysiological effects of human-animal interactions: the possible role of oxytocin. Front Psychol (2012) 3:234. doi:10.3389/fpsyg.2012.00234

21. Grandgeorge M, Tordjman S, Lazartigues A, Lemonnier E, Deleau M, Hausberger M. Does pet arrival trigger prosocial behaviors in individuals with autism? PLoS One (2012) 7(8):e41739. doi:10.1371/journal.pone.0041739

22. Carter C, Keverne EB. The neurobiology of social affiliation and pair bonding. In: Pfaff D, editor. Hormones, Brains and Behavior. San Diego, CA: Academic Press (2002). p. 299–337.

23. Prothmann A, Ettrich C, Prothmann S. Preference for, and responsiveness to, people, dogs and objects in children with autism. Anthrozoos (2009) 22:161–71. doi:10.2752/175303709X434185

24. Grandin T. Thinking in Pictures: And Other Reports From My Life With Autism. New York, NY: Random House (1995).

25. Grandin T. My mind is a web browser: how people with autism think. Cerebrum (2000) 2:14–22.

26. Grandin T. How does visual thinking work in the mind of a person with autism? A personal account. Philos Trans Roy Soc B Biol Sci (2009) 364:1437–42. doi:10.1098/rstb.2008.0297

27. Chelini M-OM, Lacerda JR, Mangabeira V, Otta E. Indivíduos com autismo evitam menos o olhar de um cão do que de outro ser humano [Individuals with autism avoid a dog's gaze less than that of another human being]. Caderno de Resumos da 42a Reunião Anual da Sociedade Brasileira de Psicologia, VIII Congresso Iberoamericano de Psicologia. São Paulo: Sociedade Brasileira de Psicologia (2012).

28. Schopler E, Reichler RJ, DeVellis RF, Daly K. Toward objective classification of childhood autism: childhood autism rating scale (CARS). J Autism Dev Disord (1980) 10:91–103. doi:10.1007/BF02408436

29. Lundqvist D, Flykt A, Öhman A. The Karolinska Directed Emotional Faces – KDEF, CD ROM from Department of Clinical Neuroscience, Psychology Section. Stockholm: Karolinska Institutet (1998).

30. Lang PJ, Bradley MM, Cuthbert BN. International Affective Picture System (IAPS): Affective Ratings of Pictures and Instruction Manual. Technical Report A-8. Gainesville, FL: University of Florida (2008).

31. Ribeiro RL, Pompéia S, Bueno OFA. Brazilian norms for the international affective picture system (IAPS): brief report. Revista de Psiquiatria do Rio Grande do Sul (2004) 26:190–4. doi:10.1590/S0101-81082004000200008

32. Andrews-Hanna JR. The brain’s default network and its adaptive role in internal mentation. Neuroscientist (2012) 18(3):251–70. doi:10.1177/1073858411403316

33. Washington SD, VanMeter JW. Anterior-posterior connectivity within the default mode network increases during maturation. Int J Med Biol Front (2015) 21(2):207–18.

34. DeLoache JS, Pickard MB, LoBue V. How very young children think about animals. In: McCardle P, McCune S, Griffin JA, Maholmes V, editors. How Animals Affect Us: Examining the Influence of Human–Animal Interaction on Child Development and Human Health. Washington, DC: American Psychological Association (2011). p. 85–99.

35. Wilson EO. Biophilia. Cambridge, MA: Harvard University Press (1984).

36. Wang AT, Dapretto M, Hariri AR, Sigman M, Bookheimer SY. Neural correlates of facial affect processing in children and adolescents with autism spectrum disorder. J Am Acad Child Adolesc Psychiatry (2004) 43(4):481–90. doi:10.1097/00004583-200404000-00015

37. Pelc K, Kornreich C, Foisy ML, Dan B. Recognition of emotional facial expressions in attention-deficit hyperactivity disorder. Pediatr Neurol (2006) 35:93–7. doi:10.1016/j.pediatrneurol.2006.01.014

38. Kats-Gold I, Besser A, Priel B. The role of simple emotion recognition skills among school aged boys at risk of ADHD. J Abnorm Child Psychol (2007) 35:363–78. doi:10.1007/s10802-006-9096-x

Keywords: face scanning, ASD, ADHD, eye tracking, neurodevelopmental disorders

Citation: Muszkat M, de Mello CB, Muñoz POL, Lucci TK, David VF, Siqueira JO and Otta E (2015) Face scanning in autism spectrum disorder and attention deficit/hyperactivity disorder: human versus dog face scanning. Front. Psychiatry 6:150. doi: 10.3389/fpsyt.2015.00150

Received: 29 May 2015; Accepted: 08 October 2015;
Published: 23 October 2015

Edited by:

Ashok Mysore, St. John’s Medical College Hospital, India

Reviewed by:

Ganesan Venkatasubramanian, National Institute of Mental Health and Neurosciences, India
Christopher Gibbins, Children and Women’s Hospital of British Columbia, Canada

Copyright: © 2015 Muszkat, de Mello, Muñoz, Lucci, David, Siqueira and Otta. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Emma Otta, emmaotta@gmail.com

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.