Abstract
Research on food-related tasks such as food image retrieval and classification has become an increasingly active topic in multimedia analysis and applications. Most current methods extract visual features from the whole image, but background noise in food images makes these features insufficiently robust, which degrades retrieval and classification performance. To address this, we propose a food image retrieval and classification method based on Faster R-CNN. We first detect candidate food regions with Faster R-CNN, and then extract visual features from these regions with a convolutional neural network (CNN); suppressing background interference in this way yields more discriminative features. In addition, annotated food images selected from the Visual Genome dataset are used to fine-tune Faster R-CNN so as to guarantee the accuracy of food region detection. Experiments are conducted on the Dish-233 dataset, which contains 233 dish classes and 49,168 food images. Comprehensive evaluation shows that visual feature extraction based on Faster R-CNN food region detection effectively improves the performance of food image retrieval and classification.
Automatic understanding of food images has various applications in different fields, such as food intake monitoring and food calorie estimation. Thus, research on food-related tasks such as food image retrieval and classification has recently become one of the hot topics in multimedia analysis and applications. Existing methods mainly extract visual features from the whole food image for further analysis; the extracted features lack robustness because of background interference in the images. To solve this problem, we propose a food retrieval and classification method based on Faster R-CNN (Region-based Convolutional Neural Network). We first detect candidate food regions using Faster R-CNN, and then adopt a CNN to extract visual features from the detected regions. Features extracted in this way are more discriminative because background interference is reduced. Furthermore, we select annotated food images from the Visual Genome dataset to fine-tune Faster R-CNN and guarantee its detection performance. We conduct experiments on two datasets: Food-101 with 101 classes and 10,641 food images, and Dish-233 with 233 dishes and 49,168 images. Extensive evaluation demonstrates the effectiveness of the proposed Faster R-CNN based food visual feature extraction method for food image retrieval and classification.
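A minimal Python sketch of the pipeline described in the abstract is given below. It is not the authors' implementation: it uses an off-the-shelf torchvision Faster R-CNN in place of the detector fine-tuned on Visual Genome food regions, a generic ResNet-50 as the CNN feature extractor, and an assumed score threshold; retrieval is illustrated as cosine-similarity ranking of region features.

```python
# Sketch (assumptions noted above): detect the highest-scoring candidate
# region, crop it, extract a CNN feature from the crop, and rank a gallery
# of database features by cosine similarity for retrieval.
import torch
import torch.nn.functional as F
import torchvision
from torchvision import transforms
from PIL import Image

# Generic pretrained detector; the paper's fine-tuning on food regions
# from Visual Genome is omitted here.
detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

# CNN feature extractor: ResNet-50 with the classification head removed.
backbone = torchvision.models.resnet50(weights="DEFAULT")
backbone.fc = torch.nn.Identity()
backbone.eval()

to_tensor = transforms.ToTensor()
normalize = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def food_region_feature(image: Image.Image, score_thresh: float = 0.5) -> torch.Tensor:
    """Return an L2-normalized CNN feature of the most confident detected region.

    Falls back to the whole image when no detection passes the threshold.
    """
    detections = detector([to_tensor(image)])[0]
    boxes, scores = detections["boxes"], detections["scores"]
    if len(scores) > 0 and scores[0] >= score_thresh:
        x1, y1, x2, y2 = boxes[0].round().int().tolist()
        region = image.crop((x1, y1, x2, y2))
    else:
        region = image
    feat = backbone(normalize(region).unsqueeze(0)).squeeze(0)
    return F.normalize(feat, dim=0)  # 2048-d feature

@torch.no_grad()
def retrieve(query_feat: torch.Tensor, gallery_feats: torch.Tensor, top_k: int = 5):
    """Rank gallery images by cosine similarity to the query region feature."""
    sims = gallery_feats @ query_feat  # features are already L2-normalized
    return torch.topk(sims, k=min(top_k, len(sims)))
```

For classification, the same region feature could instead be fed to a softmax classifier trained on the dish labels; the retrieval ranking above is only one of the two evaluated uses.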
Source
《南京信息工程大学学报(自然科学版)》
CAS
2017, No. 6, pp. 635-641 (7 pages)
Journal of Nanjing University of Information Science & Technology(Natural Science Edition)
Funding
National Natural Science Foundation of China (61532018, 61602437, 61672497, 61472229, 61202152)
Beijing Municipal Science and Technology Program (D161100001816001)
Natural Science Foundation of Shandong Province (ZR2017MF02)
Science and Technology Development Plan of Shandong Province (2016ZDJS02A11, 2014GGX101035, 2014BSB01020)