Preprint, Version 1. Preserved in Portico. This version is not peer-reviewed.

Deciphering Deception – the Impact of AI Deepfakes on Human Cognition and Emotion

Version 1 : Received: 31 January 2024 / Approved: 2 February 2024 / Online: 2 February 2024 (07:50:05 CET)

How to cite: Qureshi, J.; Khan, S. Deciphering Deception – the Impact of AI Deepfakes on Human Cognition and Emotion. Preprints 2024, 2024020135. https://doi.org/10.20944/preprints202402.0135.v1

Abstract

The emergence of AI-powered deepfakes, deceptively realistic synthetic media, raises unprecedented challenges for human information processing and decision-making. This paper explores the cognitive and emotional consequences of exposure to deepfakes, examining how these digitally fabricated experiences can reshape perception, trust, and social interactions. By reviewing existing research and outlining potential methodological approaches, this paper aims to establish a framework for investigating the multifaceted impact of deepfakes on the human brain.

Background

The digital age has ushered in an era of unprecedented information abundance and accessibility. Yet this abundance breeds vulnerability, particularly with the emergence of AI-powered deepfakes. These hyper-realistic synthetic media creations pose a potent threat to the trust and integrity of information sources. Deepfakes have infiltrated domains ranging from entertainment and social media to politics and journalism, blurring the line between truth and fabrication. Consequently, understanding the cognitive and emotional consequences of encountering deepfakes is crucial for navigating this increasingly deceptive landscape.

Objectives

This research aims to:
- Investigate how exposure to deepfakes impacts cognitive functions, including perception, attention, memory, and decision-making.
- Elucidate the emotional responses elicited by deepfakes and their underlying neural mechanisms.
- Evaluate the role of individual differences, such as susceptibility to deception and emotional sensitivity, in mediating the impact of deepfakes.
- Develop a comprehensive framework for understanding the multifaceted influence of deepfakes on the human brain.

Results (anticipated)

This research is expected to yield insights into:
- The specific cognitive and emotional processes altered by deepfakes, providing evidence for potential mechanisms of deception and manipulation.
- The neural circuitry involved in detecting and responding to deepfakes, informing the development of more effective detection methods.
- The influence of individual differences (e.g., age, cognitive style) on susceptibility to deepfakes, informing targeted interventions and educational efforts.

Conclusion

By deciphering the impact of deepfakes on the human brain, this research will offer valuable tools for mitigating the harmful consequences of misinformation and promoting responsible use of this powerful technology. Ultimately, understanding how deepfakes manipulate our cognitive and emotional landscape is not merely a scientific pursuit but a critical step toward ensuring a future where truth prevails in the digital world.

Keywords

AI deepfakes; human cognition; emotion; perception; decision-making; neuroimaging; psychophysiology; behavioral analysis; individual differences; misinformation; ethical implications

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
