Funding: Supported by the National Natural Science Foundation of China (Grant Number: 61962010).
Abstract: Deepfake-generated fake faces, commonly used in identity-related attacks such as political propaganda, celebrity impersonation, evidence forgery, and familiar fraud, pose new threats to society. Although current deepfake generators strive for high visual realism, they do not replicate the biometric signals indicative of cardiac activity. Addressing this gap, many researchers have developed detection methods based on biometric characteristics. These methods use classification networks to analyze both temporal and spectral features of the remote photoplethysmography (rPPG) signal and achieve high detection accuracy. In the spectral analysis, however, existing approaches often consider only the power spectral density and neglect the amplitude spectrum, although both are crucial for assessing cardiac activity. We introduce a novel method that extracts rPPG signals from multiple regions of interest and processes them with the Fast Fourier Transform (FFT). The resulting time- and frequency-domain samples are organized into matrices to create Matrix Visualization Heatmaps (MVHM), which are then used to train an image classification network. We also explore various combinations of time- and frequency-domain representations of the rPPG signal and the impact of attention mechanisms. Experimental results show that our algorithm achieves a detection accuracy of 99.22% in identifying fake videos, significantly outperforming mainstream algorithms and demonstrating the effectiveness of the Fourier Transform and attention mechanisms in detecting fake faces.
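The MVHM construction described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes per-ROI rPPG traces have already been extracted, and the function name, feature layout, and sampling rate are illustrative only.

```python
import numpy as np

def build_mvhm(rppg_signals):
    """Arrange time- and frequency-domain rPPG samples into a matrix heatmap.

    rppg_signals: array of shape (num_rois, num_samples), one rPPG trace per
    region of interest. Returns a 2-D matrix that can be rendered as a heatmap
    image and fed to an image classification network.
    """
    rows = []
    for sig in rppg_signals:
        sig = (sig - sig.mean()) / (sig.std() + 1e-8)   # normalize each trace
        spectrum = np.fft.rfft(sig)                      # FFT of the rPPG trace
        amplitude = np.abs(spectrum)                     # amplitude spectrum
        power = amplitude ** 2 / len(sig)                # power spectral density
        # concatenate time-domain and spectral features into one matrix row
        rows.append(np.concatenate([sig, amplitude, power]))
    return np.stack(rows)                                # (num_rois, feature_len)

# Example: 8 regions of interest, 300 frames (about 10 s at 30 fps)
rppg_traces = np.random.randn(8, 300)
mvhm = build_mvhm(rppg_traces)
print(mvhm.shape)
```

Each row of the resulting matrix combines one region's temporal signal with its amplitude spectrum and power spectral density, so the heatmap exposes both spectral views that the abstract identifies as crucial for assessing cardiac activity.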