Abstract
Lifelog is a digital record of an individual's daily life. It collects, records, and archives a large amount of unstructured data; therefore, techniques are required to organize and summarize those data for easy retrieval. Lifelogging has been utilized for diverse applications including healthcare, self-tracking, and entertainment, among others. With regard to image-based lifelogging, even though most users prefer to present photos with facial expressions that allow us to infer their emotions, there have been few studies on lifelogging techniques that focus on users' emotions. In this paper, we develop a system that extracts users' own photos from their smartphones and configures their lifelogs with a focus on their emotions. We design an emotion classifier based on convolutional neural networks (CNN) to predict the users' emotions. To train the model, we create a new dataset by collecting facial images from the CelebFaces Attributes (CelebA) dataset and labeling their facial emotion expressions, and by integrating parts of the Radboud Faces Database (RaFD). Our dataset consists of 4,715 high-resolution images. We propose the Representative Emotional Data Extraction Scheme (REDES) to select representative photos based on inferring users' emotions from their facial expressions. In addition, we develop a system that allows users to easily configure diaries for a special day and summarize their lifelogs. Our experimental results show that our method is able to effectively incorporate emotions into lifelogs, allowing an enriched experience.
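For readers unfamiliar with this setup, the sketch below illustrates one way a CNN-based facial emotion classifier of the kind described above could be structured in PyTorch. The layer sizes, the 64x64 input resolution, and the seven-class emotion set are assumptions made for illustration only; the abstract does not specify the architecture actually used in the paper.

```python
# Minimal sketch of a CNN facial emotion classifier (illustrative only; the
# network shape, input size, and label set are assumptions, not the paper's).
import torch
import torch.nn as nn

NUM_EMOTIONS = 7  # assumed label set, e.g., neutral/happy/sad/angry/surprised/fearful/disgusted


class EmotionCNN(nn.Module):
    def __init__(self, num_classes: int = NUM_EMOTIONS):
        super().__init__()
        # Three convolution blocks downsample a 64x64 RGB face crop to an 8x8 feature map.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),    # 64 -> 32
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32 -> 16
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
        )
        # Fully connected head mapping the flattened features to emotion logits.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = EmotionCNN()
    faces = torch.randn(4, 3, 64, 64)   # a batch of 4 cropped face images
    logits = model(faces)               # shape: (4, NUM_EMOTIONS)
    predicted = logits.argmax(dim=1)    # predicted emotion index per face
    print(predicted)
```

In a pipeline like the one the paper describes, the per-photo emotion predictions from such a classifier would then feed a selection step (REDES in the paper) that picks representative photos for the lifelog.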
Funding
Supported by the MSIT (Ministry of Science and ICT), Korea, under the Grand Information Technology Research Center Support Program (IITP-2020-2015-0-00742)
Artificial Intelligence Graduate School Program (Sungkyunkwan University, 2019-0-00421)
The ICT Creative Consilience Program (IITP-2020-2051-001) supervised by the IITP
Supported by the NRF of Korea (2019R1C1C1008956, 2018R1A5A1059921) to J.J.Whan.