Abstract: The aim of this paper is to create an interface for human-robot interaction. Specifically, musical performance parameters (i.e. vibrato expression) of the Waseda Flutist Robot No. 4 Refined IV (WF-4RIV) are to be manipulated. This research focused on enabling the WF-4RIV to interact with human players (musicians) in a natural way. In this paper, as a first approach, a vision processing algorithm was developed that is able to track the 3D orientation and position of a musical instrument. In particular, the robot acquires image data through two cameras attached to its head. Using color histogram matching and a particle filter, the positions of the musician's hands on the instrument are tracked. Analysis of these data determines the orientation and location of the instrument. These parameters are mapped to manipulate the musical expression of the WF-4RIV, more specifically sound vibrato and volume values. The authors present preliminary experiments to determine whether the robot can dynamically change musical parameters (i.e. vibrato, etc.) while interacting with a human player. The experimental results confirm the feasibility of the interaction during the performance, although further research must be carried out to consider the physical constraints of the flutist robot.
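The abstract does not give implementation details, so the following is only a minimal sketch of the kind of pipeline it describes: a particle filter whose observation likelihood is a color-histogram match against a reference hand patch, followed by a toy mapping from the two tracked hand positions to vibrato and volume values. All function names, constants, and the mapping itself are hypothetical illustrations, not taken from the paper.

```python
# Illustrative sketch (not the authors' implementation): color-histogram
# particle filter for hand tracking, plus a toy orientation-to-expression map.
import numpy as np
import cv2

NUM_PARTICLES = 200   # hypothetical particle count
PATCH = 20            # half-size of each candidate patch, in pixels


def hue_histogram(img_bgr, cx, cy):
    """Normalized hue histogram of the patch centered at (cx, cy)."""
    hsv = cv2.cvtColor(img_bgr[cy - PATCH:cy + PATCH, cx - PATCH:cx + PATCH],
                       cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [32], [0, 180]).ravel()
    return hist / (hist.sum() + 1e-9)


def track_hand(frame, particles, ref_hist):
    """One particle-filter step: predict, weight by color similarity, resample."""
    h, w = frame.shape[:2]
    # Predict with a simple random-walk motion model, clipped to the frame
    # so every candidate patch stays fully inside the image.
    particles = particles + np.random.normal(0.0, 8.0, particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], PATCH, w - PATCH - 1)
    particles[:, 1] = np.clip(particles[:, 1], PATCH, h - PATCH - 1)
    # Weight each particle by Bhattacharyya similarity to the reference histogram.
    weights = np.empty(len(particles))
    for i, (px, py) in enumerate(particles):
        cand = hue_histogram(frame, int(px), int(py))
        weights[i] = np.sum(np.sqrt(cand * ref_hist))
    weights = (weights + 1e-12) / (weights + 1e-12).sum()
    estimate = weights @ particles          # weighted-mean position estimate
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], estimate


def map_to_expression(left_hand, right_hand, frame_height=480):
    """Toy mapping: instrument tilt -> vibrato depth, hand height -> volume."""
    dx, dy = right_hand - left_hand
    tilt = np.degrees(np.arctan2(dy, dx))                  # rough orientation
    vibrato = float(np.clip(abs(tilt) / 45.0, 0.0, 1.0))   # hypothetical scaling
    mean_y = (left_hand[1] + right_hand[1]) / 2.0
    volume = float(np.clip(1.0 - mean_y / frame_height, 0.0, 1.0))
    return vibrato, volume
```

In use, `ref_hist` would be initialized from a seed hand patch in the first frame, one tracker would run per hand in each of the two camera views, and the stereo pair of 2D estimates would then be combined to recover the 3D position and orientation of the instrument before mapping to the robot's expression parameters.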