Submitted:
02 February 2026
Posted:
03 February 2026
Abstract
Keywords:
1. Introduction
2. Materials and Methods
2.1. Participants
2.2. Background for Screening Individuals with Developmental Prosopagnosia
2.3. Materials
2.3.1. Face Trait Implicit Association Task (IAT): A Novel Version with Female Composite Facial Stimuli [11]
2.4. Data Analysis
2.4.1. Crawford Approach
3. Results
3.1. How Do Individuals with Developmental Prosopagnosia Perform on Face-Based Implicit Extraversion Trait Judgements?
3.2. Single Case Analyses
4. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| DP | Developmental Prosopagnosia |
| IAT | Implicit Association Task |
| CFMT | Cambridge Face Memory Task |
| CFPT | Cambridge Face Perception Task |
| FFT | Famous Face Test |
| AQ | Autism-Spectrum Quotient |
| TAS20 | Toronto Alexithymia Scale 20 |
| PI20 | Prosopagnosia Index 20 |
| FaReS | Face Research Swansea |
| BF | Bayes Factor |
References
1. Kannan, C.; Jones, A.L.; Towler, J.; Tree, J.J. Do young and older adult populations perform equivalently across different automatic face-trait judgements? Evidence for differential impacts of ageing. PLOS ONE 2025, 20, e0322165.
2. Knutson, K.M.; DeTucci, K.A.; Grafman, J. Implicit attitudes in prosopagnosia. Neuropsychologia 2011, 49, 1851–1862.
3. Rezlescu, C.; Susilo, T.; Barton, J.J.; Duchaine, B. Normal social evaluations of faces in acquired prosopagnosia. Cortex 2014, 50, 200–203.
4. Todorov, A.; Duchaine, B. Reading trustworthiness in faces without recognizing faces. Cogn. Neuropsychol. 2008, 25, 395–410.
5. Crawford, J.R.; Garthwaite, P.H.; Ryan, K. Comparing a single case to a control sample: Testing for neuropsychological deficits and dissociations in the presence of covariates. Cortex 2011, 47, 1166–1178.
6. Baron, R.J. Mechanisms of human facial recognition. Int. J. Man-Machine Stud. 1981, 15, 137–178.
7. Little, A.C.; Perrett, D.I. Using composite images to assess accuracy in personality attribution to faces. Br. J. Psychol. 2007, 98, 111–126.
8. Penton-Voak, I.S.; Pound, N.; Little, A.C.; Perrett, D.I. Personality judgments from natural and composite facial images: More evidence for a “kernel of truth” in social perception. Soc. Cogn. 2006, 24, 607–640.
9. Todorov, A.; Olivola, C.Y.; Dotsch, R.; Mende-Siedlecki, P. Social attributions from faces: Determinants, consequences, accuracy, and functional significance. Annu. Rev. Psychol. 2015, 66, 519–545.
10. Rule, N.; Ambady, N. First impressions of the face: Predicting success. Soc. Pers. Psychol. Compass 2010, 4, 506–516.
11. Greenwald, A.G.; McGhee, D.E.; Schwartz, J.L.K. Measuring individual differences in implicit cognition: The implicit association test. J. Pers. Soc. Psychol. 1998, 74, 1464–1480.
12. Greenwald, A.G.; Nosek, B.A.; Banaji, M.R. Understanding and using the Implicit Association Test: I. An improved scoring algorithm. J. Pers. Soc. Psychol. 2003, 85, 197–216.
13. Jones, A.L.; Tree, J.J.; Ward, R. Personality in faces: Implicit associations between appearance and personality. Eur. J. Soc. Psychol. 2018, 49, 658–669.
14. Bate, S.; Tree, J.J. The definition and diagnosis of developmental prosopagnosia. Q. J. Exp. Psychol. 2017, 70, 193–200.
15. Duchaine, B.; Nakayama, K. The Cambridge Face Memory Test: Results for neurologically intact individuals and an investigation of its validity using inverted face stimuli and prosopagnosic participants. Neuropsychologia 2006, 44, 576–585.
16. Behrmann, M.; Avidan, G. Congenital prosopagnosia: face-blind from birth. Trends Cogn. Sci. 2005, 9, 180–187.
17. Susilo, T.; Duchaine, B. Advances in developmental prosopagnosia research. Curr. Opin. Neurobiol. 2013, 23, 423–429.
18. Duchaine, B.C.; Weidenfeld, A. An evaluation of two commonly used tests of unfamiliar face recognition. Neuropsychologia 2003, 41, 713–720.
19. Avidan, G.; Behrmann, M. Impairment of the face processing network in congenital prosopagnosia. Front. Biosci. 2014, 6, 236–257.
20. Barton, J.J.; Corrow, S.L. Recognizing and identifying people: A neuropsychological review. Cortex 2016, 75, 132–150.
21. DeGutis, J.; Bahierathan, K.; Barahona, K.; Lee, E.; Evans, T.C.; Shin, H.M.; Mishra, M.; Likitlersuang, J.; Wilmer, J.B. What is the prevalence of developmental prosopagnosia? An empirical assessment of different diagnostic cutoffs. Cortex 2023, 161, 51–64.
22. Humphreys, K.; Avidan, G.; Behrmann, M. A detailed investigation of facial expression processing in congenital prosopagnosia as compared to acquired prosopagnosia. Exp. Brain Res. 2006, 176, 356–373.
23. Palermo, R.; Willis, M.L.; Rivolta, D.; McKone, E.; Wilson, C.E.; Calder, A.J. Impaired holistic coding of facial expression and facial identity in congenital prosopagnosia. Neuropsychologia 2011, 49, 1226–1235.
24. Bruce, V.; Young, A. Understanding face recognition. Br. J. Psychol. 1986, 77, 305–327.
25. Haxby, J.V.; Hoffman, E.A.; Gobbini, M.I. The distributed human neural system for face perception. Trends Cogn. Sci. 2000, 4, 223–233.
26. Bennetts, R.J.; Gregory, N.J.; Bate, S. Both identity and non-identity face perception tasks predict developmental prosopagnosia and face recognition ability. Sci. Rep. 2024, 14, 1–14.
27. Bate, S.; Cook, S.J.; Duchaine, B.; Tree, J.J.; Burns, E.J.; Hodgson, T.L. Intranasal inhalation of oxytocin improves face processing in developmental prosopagnosia. Cortex 2014, 50, 55–63.
28. Sutherland, C.A.; Oldmeadow, J.A.; Santos, I.M.; Towler, J.; Burt, D.M.; Young, A.W. Social inferences from faces: Ambient images generate a three-dimensional model. Cognition 2013, 127, 105–118.
29. Todorov, A.; Baron, S.G.; Oosterhof, N.N. Evaluating face trustworthiness: a model based approach. Soc. Cogn. Affect. Neurosci. 2008, 3, 119–127.
30. Crawford, J.R.; Garthwaite, P.H. Investigation of the single case in neuropsychology: confidence limits on the abnormality of test scores and test score differences. Neuropsychologia 2002, 40, 1196–1208.
31. Anwyl-Irvine, A.L.; Massonnié, J.; Flitton, A.; Kirkham, N.; Evershed, J.K. Gorilla in our midst: An online behavioral experiment builder. Behav. Res. Methods 2019, 52, 388–407.
32. Bowles, D.C.; McKone, E.; Dawel, A.; Duchaine, B.; Palermo, R.; Schmalzl, L.; Rivolta, D.; Wilson, C.E.; Yovel, G. Diagnosing prosopagnosia: Effects of ageing, sex, and participant–stimulus ethnic match on the Cambridge Face Memory Test and Cambridge Face Perception Test. Cogn. Neuropsychol. 2009, 26, 423–455.
33. Bobak, A.K.; Pampoulov, P.; Bate, S. Detecting superior face recognition skills in a large sample of young British adults. Front. Psychol. 2016, 7, 1378.
34. Barton, J.J.S. Structure and function in acquired prosopagnosia: Lessons from a series of 10 patients with brain damage. J. Neuropsychol. 2008, 2, 197–225.
35. Duchaine, B.; Germine, L.; Nakayama, K. Family resemblance: Ten family members with prosopagnosia and within-class object agnosia. Cogn. Neuropsychol. 2007, 24, 419–430.
36. Shah, P.; Gaule, A.; Sowden, S.; Bird, G.; Cook, R. The 20-item prosopagnosia index (PI20): a self-report instrument for identifying developmental prosopagnosia. R. Soc. Open Sci. 2015, 2, 140343.
37. Bate, S.; Bennetts, R.J.; Gregory, N.; Tree, J.J.; Murray, E.; Adams, A.; Bobak, A.K.; Penton, T.; Yang, T.; Banissy, M.J. Objective patterns of face recognition deficits in 165 adults with self-reported developmental prosopagnosia. Brain Sci. 2019, 9, 133.
38. De Renzi, E.; Faglioni, P.; Grossi, D.; Nichelli, P. Apperceptive and associative forms of prosopagnosia. Cortex 1991, 27, 213–221.
39. Fox, C.J.; Oruç, I.; Barton, J.J.S. It doesn't matter how you feel. The facial identity aftereffect is invariant to changes in facial expression. J. Vis. 2008, 8, 11.
40. Eimer, M.; Gosling, A.; Duchaine, B. Electrophysiological markers of covert face recognition in developmental prosopagnosia. Brain 2012, 135, 542–554.
41. Baron-Cohen, S.; Wheelwright, S.; Skinner, R.; Martin, J.; Clubley, E. The Autism-Spectrum Quotient (AQ): Evidence from Asperger syndrome/high-functioning autism, males and females, scientists and mathematicians. J. Autism Dev. Disord. 2001, 31, 5–17.
42. Minio-Paluello, I.; Porciello, G.; Pascual-Leone, A.; Baron-Cohen, S. Face individual identity recognition: a potential endophenotype in autism. Mol. Autism 2020, 11, 1–16.
43. Schultz, R.T. Developmental deficits in social perception in autism: the role of the amygdala and fusiform face area. Int. J. Dev. Neurosci. 2005, 23, 125–141.
44. Duchaine, B.; Murray, H.; Turner, M.; White, S.; Garrido, L. Normal social cognition in developmental prosopagnosia. Cogn. Neuropsychol. 2009, 26, 620–634.
45. Donnellan, M.B.; Oswald, F.L.; Baird, B.M.; Lucas, R.E. The Mini-IPIP scales: Tiny-yet-effective measures of the Big Five factors of personality. Psychol. Assess. 2006, 18, 192–203.
46. Kramer, R.S.S.; Ward, R. Internal facial features are signals of personality and health. Q. J. Exp. Psychol. 2010, 63, 2273–2287.
47. Back, M.D.; Schmukle, S.C.; Egloff, B. Predicting actual behavior from the explicit and implicit self-concept of personality. J. Pers. Soc. Psychol. 2009, 97, 533–548.
48. Crawford, J.R.; Garthwaite, P.H. Comparison of a single case to a control or normative sample in neuropsychology: Development of a Bayesian approach. Cogn. Neuropsychol. 2007, 24, 343–372.

| Measure | Mean | Std. Deviation | Minimum | Maximum |
|---|---|---|---|---|
| Age | 53 | 14.40 | 18 | 81 |
| CFMT | 33.11 | 4.86 | 24 | 43 |
| CFPT | 27.72 | 4.44 | 20.67 | 36.67 |
| FFT | 44.73 | 15.38 | 15.39 | 70 |
| PI20 | 80.44 | 7.44 | 61 | 92 |
| AQ | 18 | 6.97 | 4 | 31 |
| Block | No. of trials | Function | Item assigned to left key | Item assigned to right key |
|---|---|---|---|---|
| 1 | 20 | Practice | Jane images | Mary images |
| 2 | 20 | Practice | Extraverted words | Introverted words |
| 3 | 20 | Test | Jane images + extraverted words | Mary images + introverted words |
| 4 | 40 | Test | Jane images + extraverted words | Mary images + introverted words |
| 5 | 20 | Practice | Mary images | Jane images |
| 6 | 20 | Test | Mary images + extraverted words | Jane images + introverted words |
| 7 | 40 | Test | Mary images + extraverted words | Jane images + introverted words |
| Step | Improved Algorithm | Present Study Implementation |
|---|---|---|
| 1 | Use data from B3, B4, B6, & B7 | Use data from B3, B4, B6, & B7 |
| 2 | Eliminate trials with latencies > 10,000 ms; eliminate subjects for whom more than 10% of trials have latency less than 300 ms | Eliminate trials with latencies > 10,000 ms; eliminate subjects for whom more than 10% of trials have latency less than 300 ms |
| 3 | Use all trials | Use all trials |
| 4 | No extreme-value treatment (beyond Step 2) | No extreme-value treatment (beyond Step 2) |
| 5 | Compute mean of correct latencies for each block | Compute mean of correct latencies for each block |
| 6 | Compute one pooled SD for all trials in B3 & B6; another for B4 & B7 | Compute one pooled SD for all trials in B3 & B6; another for B4 & B7 |
| 7 | Replace each error latency with block mean (computed in Step 5) + 600 ms | Retain full latency from stimulus onset to corrected response (implicit error penalty) |
| 8 | No transformation | No transformation |
| 9 | Average the resulting values for each of the four blocks | Average the resulting values for each of the four blocks |
| 10 | Compute two differences: B6 - B3 and B7 - B4 | Compute two differences: B6 - B3 and B7 - B4 |
| 11 | Divide each difference by its associated pooled trials SD from Step 6 | Divide each difference by its associated pooled trials SD from Step 6 |
| 12 | Average the two quotients from Step 11 | Average the two quotients from Step 11 |
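The scoring steps above can be sketched in Python. The trial format (dicts with `block`, `rt`, and `correct` keys) and the function name are illustrative assumptions, not the study's actual implementation; note that in the present study's variant, error trials retain the full latency to the corrected response, so no +600 ms error replacement is applied.

```python
from statistics import mean, stdev

def iat_d_score(trials):
    """Sketch of the improved IAT D-score algorithm (Greenwald, Nosek,
    & Banaji, 2003), adapted per the steps table above.

    `trials` is an assumed format: dicts with keys "block" (3, 4, 6, or 7),
    "rt" (latency in ms), and "correct". In the present-study variant,
    error trials keep their full latency, so all trials enter the means.
    """
    # Step 2a: eliminate trials with latencies > 10,000 ms
    trials = [t for t in trials if t["rt"] <= 10_000]
    # Step 2b: exclude the subject if >10% of trials are faster than 300 ms
    if sum(t["rt"] < 300 for t in trials) > 0.10 * len(trials):
        return None

    def rts(*blocks):
        return [t["rt"] for t in trials if t["block"] in blocks]

    # Step 6: one pooled SD over all trials in B3 & B6, another for B4 & B7
    # (sample SD used here as a simplification)
    sd_36, sd_47 = stdev(rts(3, 6)), stdev(rts(4, 7))
    # Steps 5 and 9-11: block means, differences, and standardized quotients
    d1 = (mean(rts(6)) - mean(rts(3))) / sd_36
    d2 = (mean(rts(7)) - mean(rts(4))) / sd_47
    # Step 12: average the two quotients
    return (d1 + d2) / 2
```

A positive D indicates slower responding in the B6/B7 pairings relative to B3/B4, i.e., a stronger implicit association in the B3/B4 direction.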
| Participant | DP IAT D score | Crawford t | p-value | Percentile |
|---|---|---|---|---|
| 1 | 0.190 | 0.276 | 0.783 | 56.2 |
| 2 | 0.153 | 0.194 | 0.846 | 54.2 |
| 3 | 0.263 | 0.442 | 0.659 | 59.8 |
| 4 | 0.886 | 1.981 | 0.050 | 97.6 |
| 5 | -0.156 | -0.353 | 0.725 | 34.4 |
| 6 | -0.538 | -1.309 | 0.192 | 13.0 |
| 7 | 0.011 | -0.020 | 0.984 | 50.8 |
| 8 | -0.209 | -0.474 | 0.636 | 33.0 |
| 9 | 0.012 | -0.018 | 0.986 | 50.7 |
| 10 | 0.364 | 0.676 | 0.500 | 63.4 |
| 11 | 0.207 | 0.300 | 0.765 | 57.2 |
| 12 | -0.227 | -0.516 | 0.606 | 31.0 |
| 13 | -0.019 | -0.041 | 0.967 | 49.2 |
| 14 | 0.081 | -0.002 | 0.998 | 52.0 |
| 15 | 0.093 | -0.074 | 0.941 | 52.5 |
| 16 | -0.041 | -0.089 | 0.929 | 48.4 |
| 17 | -0.004 | -0.012 | 0.990 | 49.8 |
| 18 | 0.020 | -0.001 | 0.999 | 51.2 |
| 19 | 0.113 | -0.024 | 0.981 | 53.3 |
| 20 | 0.042 | -0.060 | 0.953 | 51.7 |
| 21 | 0.212 | 0.312 | 0.755 | 57.4 |
| 22 | -0.132 | -0.297 | 0.767 | 36.2 |
| 23 | -0.074 | -0.166 | 0.868 | 40.6 |
| 24 | 0.013 | -0.018 | 0.986 | 50.7 |
| 25 | -0.110 | -0.254 | 0.800 | 38.0 |
| 26 | 0.260 | 0.437 | 0.663 | 59.6 |
| 27 | 0.008 | -0.023 | 0.982 | 50.5 |
| 28 | -0.083 | -0.185 | 0.853 | 39.7 |
| 29 | 0.007 | -0.024 | 0.981 | 50.5 |
| 30 | -0.114 | -0.262 | 0.794 | 37.8 |
| 31 | 0.171 | 0.235 | 0.814 | 55.0 |
| 32 | 0.175 | 0.244 | 0.808 | 55.2 |
| 33 | -0.073 | -0.164 | 0.870 | 40.5 |
| 34 | 0.026 | -0.009 | 0.993 | 51.6 |
| 35 | 0.062 | -0.041 | 0.967 | 51.1 |
| 36 | 0.044 | -0.058 | 0.954 | 51.8 |
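The single-case statistics in the table above rest on Crawford's modified t-test, which compares one individual's score against a small control sample with df = n − 1. The sketch below shows the basic Crawford & Howell form; the function names, the numeric integration of the Student-t CDF, and the example control data are all illustrative, not the study's code.

```python
import math

def t_pdf(x, df):
    """Student-t probability density."""
    c = math.gamma((df + 1) / 2) / (math.sqrt(df * math.pi) * math.gamma(df / 2))
    return c * (1 + x * x / df) ** (-(df + 1) / 2)

def t_cdf(t, df, steps=20000):
    """Student-t CDF via Simpson's rule (adequate for a sketch;
    a stats library routine would normally be used)."""
    lo = -60.0  # density below this bound is negligible here
    if t <= lo:
        return 0.0
    h = (t - lo) / steps
    s = t_pdf(lo, df) + t_pdf(t, df)
    for i in range(1, steps):
        s += (4 if i % 2 else 2) * t_pdf(lo + i * h, df)
    return s * h / 3

def crawford_howell(case, controls):
    """Crawford & Howell modified t-test: compare one case's score
    to a control sample; returns (t, two-tailed p, estimated percentile)."""
    n = len(controls)
    m = sum(controls) / n
    sd = math.sqrt(sum((x - m) ** 2 for x in controls) / (n - 1))
    t = (case - m) / (sd * math.sqrt((n + 1) / n))
    cdf = t_cdf(t, n - 1)
    p_two = 2 * min(cdf, 1 - cdf)
    return t, p_two, 100 * cdf  # percentile estimates abnormality of the score
```

With a case score equal to the control mean, t is 0, the two-tailed p is 1, and the estimated percentile is 50, mirroring the near-chance rows in the table.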
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.