Abstract
This study proposes a dynamic face recognition system to address the problem that, in existing static face recognition, the recognized individual must wait and cooperate with the system. The system uses the RetinaFace and FaceNet algorithms for dynamic face detection and recognition, respectively, and both are optimized for high recognition accuracy and real-time performance. Specifically, GhostNet serves as the backbone network for RetinaFace detection, and Adaptive-NMS (Non-Maximum Suppression) is used for face bounding-box regression; for FaceNet recognition, MobileNetV1 serves as the backbone network, and a joint loss function combining Triplet loss and cross-entropy loss is used for face classification. The optimized algorithms perform well in both detection and recognition. The improved RetinaFace algorithm achieves detection accuracies of 93.35%, 90.84%, and 80.43% on the WiderFace dataset, with a frame rate of up to 53 frame·s^(-1). For dynamic face detection, the average detection accuracy is 96% at 21 frame·s^(-1). When the FaceNet threshold is set to 1.15, the recognition rate peaks at 98.23%. The dynamic recognition system achieves an average recognition accuracy of 98% at up to 20 frame·s^(-1). The experimental results demonstrate that the proposed system resolves the wait-and-cooperate requirement of static face recognition while achieving high recognition accuracy and real-time performance.
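To illustrate the two recognition-side ideas the abstract names — a joint loss combining Triplet loss with cross-entropy, and matching by thresholding the embedding distance at 1.15 — here is a minimal NumPy sketch. The margin, the loss weight `alpha`, and all function names are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    # Pull anchor toward the same-identity (positive) embedding and
    # push it away from the different-identity (negative) embedding
    # until their squared distances differ by at least `margin`.
    d_ap = np.sum((anchor - positive) ** 2, axis=-1)
    d_an = np.sum((anchor - negative) ** 2, axis=-1)
    return np.maximum(d_ap - d_an + margin, 0.0).mean()

def cross_entropy_loss(logits, labels):
    # Numerically stable softmax cross-entropy over identity classes.
    z = logits - logits.max(axis=-1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def joint_loss(anchor, positive, negative, logits, labels, alpha=1.0):
    # Weighted sum of the metric-learning and classification terms;
    # alpha is an assumed balancing weight.
    return triplet_loss(anchor, positive, negative) \
        + alpha * cross_entropy_loss(logits, labels)

def same_person(emb_a, emb_b, threshold=1.15):
    # The abstract's decision rule: two faces match when the Euclidean
    # distance between their embeddings falls below the threshold 1.15.
    return np.linalg.norm(emb_a - emb_b) < threshold
```

In training, the triplet term shapes the embedding space while the cross-entropy term supervises identity classification; at inference only `same_person` is needed, comparing a probe embedding against enrolled embeddings.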
Authors
LI Yunpeng; XI Zhihong (College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, China)
Source
Electronic Science and Technology (《电子科技》), 2024, No. 12, pp. 79-86 (8 pages)
Funding
National Natural Science Foundation of China (62001136).