Funding: Supported by the EU-Japan coordinated R&D project on "Culture Aware Robots and Environmental Sensor Systems for Elderly Support," commissioned by the Ministry of Internal Affairs and Communications of Japan and the EC Horizon 2020 Research and Innovation Programme (737858), and by the Air Force Office of Scientific Research (AFOSR-AOARD/FA2386-19-1-4015).
Abstract: With the increasing presence of robots in our daily lives, there is a growing need for strategies that achieve high-quality interaction between robots and users by enabling robots to understand users' mood, intention, and other internal states. In human-human interaction, personality traits strongly influence behavior, decisions, and mood. We therefore propose an efficient computational framework that endows a robot with the capability to understand a user's personality traits from the user's nonverbal communication cues, represented by three visual features (head motion, gaze, and body motion energy) and three vocal features (voice pitch, voice energy, and mel-frequency cepstral coefficients, MFCC). In this study, the Pepper robot serves as a communication robot that interacts with each participant by asking questions while extracting the nonverbal features from the participant's habitual behavior using its on-board sensors. In parallel, each participant's personality traits are assessed with a questionnaire. We then train ridge regression and linear support vector machine (SVM) classifiers using the nonverbal features and the questionnaire-derived personality trait labels, and evaluate the classifiers' performance. The proposed models showed promising binary classification performance in recognizing each of the Big Five personality traits of the participants from individual differences in their nonverbal communication cues.
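To illustrate the classification step described in the abstract, the following minimal Python sketch (not the authors' implementation) trains ridge and linear SVM classifiers for one binary Big Five trait label and evaluates them with cross-validation; the feature matrix, label vector, and dimensions are hypothetical placeholders standing in for the summary statistics of the six nonverbal cues.

```python
# Minimal sketch, assuming per-participant feature vectors derived from the six
# nonverbal cues (head motion, gaze, body motion energy, voice pitch, voice
# energy, MFCC) and binary trait labels from the questionnaire. All data here
# are random placeholders, not the study's data.
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import RidgeClassifier
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 18))      # hypothetical: 40 participants x 18 features
y = rng.integers(0, 2, size=40)    # hypothetical: high/low label for one trait

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for name, clf in [("ridge", RidgeClassifier()), ("linear SVM", LinearSVC())]:
    # Standardize features before fitting the linear model.
    model = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
    print(f"{name}: mean CV accuracy = {scores.mean():.2f}")
```

In practice, one such binary classifier would be trained per Big Five trait, with feature standardization and cross-validation guarding against the small-sample setting typical of human-robot interaction studies.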