“Deepfakes” are videos in which the (usually human) subject has been digitally altered to appear to do or say something that they never actually did or said. Sometimes these manipulations produce innocuous novelties (e.g., testing what it would look like if Will Smith had been cast as “Neo” in the film The Matrix), but far more dangerous use cases have been observed (e.g., fake footage of Ukrainian President Volodymyr Zelenskyy instructing Ukrainian military forces to surrender on the battlefield). Building the knowledge and tools needed to defend against the potential harms of these videos is likely to rely on contributions from a broad coalition of disciplines, many of which are represented in the GVU. In this week’s Brown Bag presentation, we will offer some real-time demonstrations of deepfake technology and present findings from our work, which has largely focused on the psychological factors influencing deepfake detection.
John Stasko is a Regents Professor in the School of Interactive Computing at the Georgia Institute of Technology, where he has been on the faculty since 1989. He works in the areas of information visualization and visual analytics, approaching each from a human-computer interaction perspective. Stasko received the IEEE Visualization and Graphics Technical Committee (VGTC) Visualization Technical Achievement Award in 2012 and was inducted into the ACM CHI Academy in 2016 and IEEE Visualization Academy in 2019. He was named an IEEE Fellow in 2014 and an ACM Fellow in 2023.
Richard Catrambone is a Professor in the School of Psychology at the Georgia Institute of Technology. He received his B.A. from Grinnell College and his Ph.D. in Experimental Psychology from the University of Michigan. His research interests include problem solving, educational technology, and human-computer interaction. He explores how to create instructional materials that help learners understand how to approach problems in a meaningful way rather than simply memorizing a set of steps that cannot easily be transferred to novel problems. He served on the Cognitive Science Society governing board from 2011 to 2016 and was chair of the Society in 2015.
Zack Tidler is a doctoral student in the School of Psychology with an emphasis in engineering psychology. His primary research interest is in developing new measures of human cognitive ability that consider tool usage, but he has developed a secondary research program which is focused on the study of deepfake video detection in humans. He is a former president of the Georgia Tech Chapter of the Human Factors and Ergonomics Society. His work on deepfake detection has been featured in the College of Sciences newsletter and the 2021 issue of the GT Alumni Magazine.
How to watch: If you can't attend, please watch the Live Stream or view the Recording (available 30 days after the event). If you have any questions, email us at firstname.lastname@example.org.