The research on the development of the anthropomorphic flutist robot at Waseda University has focused on emulating, from an engineering point of view, the anatomy and physiology of the human organs involved in flute playing, on facilitating the symbiosis between humans and robots (i.e. active interaction between musician and musical robot), and on proposing novel applications for humanoid robots (i.e. music education). As a result of this research, the Waseda Flutist Robot No. 4 Refined IV has been developed and a musical-based interaction system implemented, so that the robot is capable of interacting with musicians by processing both aural and visual cues. However, there is a trade-off between the duration of the flute sound produced by the robot and its sound pressure (volume). In fact, the robot is only capable of sustaining low-pitched sounds for long periods. In addition, a husky sound is detected while playing high-pitched sounds. From our discussions with professional players, this effect is caused by the inner shape of the oral cavity: the conversion efficiency from the air exhaled by the artificial lungs to the produced sound is too low. We have therefore obtained MR images of the heads of professional players in order to re-design the oral cavity of the flutist robot. A total of five prototypes were tested, and the best one was selected and integrated into the Waseda Flutist Robot No. 4 Refined VI (WF-4RVI). A set of experiments was proposed in order to verify the improvements in the conversion efficiency as well as in the sound evaluation function score. From the experimental results, we verified the improvements compared with the previous version of the flutist robot.
At Waseda University, since 1990, the authors have been developing anthropomorphic musical performance robots as a means of understanding human motor control, introducing novel ways of interaction between musical partners and robots, and proposing applications for humanoid robots. In this paper, the design of biologically-inspired control architectures for both an anthropomorphic flutist robot and a saxophone-playing robot is described. As for the flutist robot, the authors have focused on implementing an auditory feedback system that improves the robot's calibration procedure so that all the notes are played correctly during a performance. In particular, the proposed auditory feedback system is composed of three main modules: an Expressive Music Generator, a Feed-Forward Air Pressure Control System and a Pitch Evaluation System. As for the saxophone-playing robot, a pressure-pitch controller (based on feedback error learning) was proposed and implemented to improve the sound produced by the robot during a musical performance. For both cases, a set of experiments is described to verify the improvements achieved by these biologically-inspired control approaches.
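The pitch-evaluation step of such an auditory feedback loop can be illustrated with a minimal sketch (Python; the function names, gains and sign convention are hypothetical, not the robot's actual implementation): estimate the produced fundamental by autocorrelation, then nudge the air-pressure command against the pitch error, assuming pitch rises with blowing pressure.

```python
import numpy as np

def estimate_pitch(signal, fs):
    """Estimate the fundamental frequency (Hz) via autocorrelation."""
    sig = signal - np.mean(signal)
    corr = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    d = np.diff(corr)
    start = np.nonzero(d > 0)[0][0]        # skip past the zero-lag peak
    lag = start + np.argmax(corr[start:])  # strongest periodic lag
    return fs / lag

def correct_pressure(pressure, f_est, f_target, gain=0.001):
    """Adjust the air-pressure command against the pitch error in cents
    (hypothetical gain; assumes pitch rises with pressure)."""
    cents = 1200.0 * np.log2(f_est / f_target)
    return pressure - gain * cents

# demo: a synthetic 440 Hz tone sampled at 44.1 kHz
fs = 44100
t = np.arange(4096) / fs
f_est = estimate_pitch(np.sin(2 * np.pi * 440.0 * t), fs)
```

In a calibration loop, the corrected pressure for each note would be stored and reused by the feed-forward controller on the next performance.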
The research on humanoid robots designed to play musical instruments has a long tradition in the field of robotics. Over the past decades, several research groups have been developing anthropomorphic and automated machines able to give live musical performances, both for understanding human performance itself and for creating novel means of musical expression. In particular, humanoid robots are being designed to roughly simulate the dexterity of human players and to display higher-level perceptual capabilities that enhance the interaction with musical partners. In this chapter, the concept and implementation of an interactive musical system for multimodal musician-humanoid interaction are detailed.
Our research is related to the development of an anthropomorphic saxophonist robot that reproduces the human organs involved in saxophone playing. This approach aims at understanding human motor control from an engineering point of view and at enabling communication between humans and robots in musical terms. In previous research, we presented the Waseda Saxophonist Robot No. 2 (WAS-2), which improved the design of the lip and finger mechanisms. In addition, a feed-forward air pressure controller with dead-time compensation and an overblowing correction controller were implemented. However, the pressure range was too limited to reproduce dynamic effects of the sound (e.g. decrescendo), a delay in the response of the finger mechanism was detected (due to the use of a wire-driven mechanism), and deviations of the pitch during saxophone playing were observed. Therefore, in this paper, we present the Waseda Saxophonist Robot No. 2 Refined (WAS-2R). In particular, the shape of the oral cavity has been re-designed to increase the sound pressure range, and potentiometers were embedded in the fingers to reduce the delayed dynamic response of the wire-driven mechanism. In addition, a Pressure-Pitch Controller has been implemented to reduce the deviation of the sound pitch, using a feedback error learning algorithm for a multiple-input multiple-output system. A set of experiments was proposed to verify the effectiveness of the re-designed mechanisms and the improved control strategy. From the experimental results, we confirmed that the sound pressure range was extended to reproduce the decrescendo effect, that the response delay of the finger mechanism was reduced, and that the deviations of the sound pitch were diminished.
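The feedback-error-learning idea behind the Pressure-Pitch Controller can be sketched in its simplest single-input/single-output form (the robot's actual controller is multiple-input multiple-output and the static plant, gains and learning rate here are hypothetical): the output of the feedback controller serves as the training signal for a feedforward inverse model, so that as learning proceeds the feedforward path takes over and the feedback correction shrinks.

```python
def feedback_error_learning(a=2.0, kp=0.2, eta=0.1, r=1.0, steps=500):
    """Toy SISO feedback error learning: the feedback controller's
    output trains a feedforward inverse model of the plant y = a*u."""
    w = 0.0      # feedforward weight; should converge toward 1/a
    y = 0.0      # plant output from the previous step
    errs = []
    for _ in range(steps):
        e = r - y              # tracking error seen by the feedback loop
        u_fb = kp * e          # simple proportional feedback
        u_ff = w * r           # learned feedforward command
        y = a * (u_ff + u_fb)  # static "plant"
        w += eta * u_fb * r    # FEL rule: feedback output drives learning
        errs.append(abs(r - y))
    return w, errs

w, errs = feedback_error_learning()
```

With a = 2, the weight converges toward 1/a = 0.5 and the residual feedback correction vanishes, which is the property the FEL scheme exploits to reduce steady pitch deviations.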
In the Nordic countries, snow on solar panels considerably affects solar energy production, and detecting its effects can be an issue given the outdoor climate conditions. Our research aims to develop intelligent control systems for photovoltaic systems that adapt to variable environmental conditions, one of which is the presence of snow or other forms of pollution. In this paper, we investigate whether it is possible to detect snow on an (off-grid) solar panel without using additional external sensors, as well as how snow affects the efficiency, current and power of a grid-connected solar panel. For this purpose, a set of experiments was proposed and carried out in a PV solar park. © 2021 IEEE.
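One sensorless detection strategy consistent with this aim — a hypothetical sketch, not the authors' published method — is to compare the panel's measured electrical output against a clear-sky expectation and flag snow when the shortfall is large; all thresholds below are illustrative.

```python
def detect_snow(measured_power, expected_power,
                ratio_threshold=0.2, min_expected=50.0):
    """Flag probable snow cover when the measured output (W) falls far
    below a clear-sky estimate (W). Thresholds are hypothetical."""
    if expected_power < min_expected:
        return False  # night or deep overcast: too little signal to decide
    return measured_power < ratio_threshold * expected_power
```

A real deployment would derive `expected_power` from a clear-sky irradiance model and the panel's rating, and would require the shortfall to persist for hours to reject passing clouds.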
Up to now, the embodiment of bodily-kinaesthetic, perceptual and cognitive capabilities in assistive robots has been scarcely studied. This research aims to incorporate and develop the concept of robotic human science and to enable its application in a human-friendly robot for assistive purposes. In this paper, the authors describe improvements to the velocity control of the two-wheeled inverted pendulum of the hWalk, adding an LQR compensator for the wheel angular velocity to the existing PID controller. In addition, an algorithm based on integrating the measured motor current was proposed to detect ramps, so that the robot can cope with inclined surfaces. Experiments were carried out to verify the effectiveness of the proposed velocity control as well as the feasibility of the proposed ramp-detection algorithm.
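The ramp-detection idea — integrating the measured motor current and thresholding the result — can be sketched as follows. This is a minimal illustration with hypothetical constants, using a leaky integrator so that zero-mean sensor noise on flat ground does not accumulate; the paper's actual thresholds and filtering are not reproduced here.

```python
def detect_ramp(currents, dt=0.01, threshold=0.5, leak=0.99):
    """Leaky integral of the measured motor current (A). A sustained
    extra torque demand, as when climbing a ramp, drives the integral
    past the threshold; zero-mean noise on flat ground does not.
    All constants are hypothetical."""
    integral = 0.0
    for i in currents:
        integral = leak * integral + i * dt
        if abs(integral) > threshold:
            return True
    return False
```

On detection, the controller could switch to a gain set tuned for inclined surfaces, which is the use case the paper describes.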
This workshop is dedicated to discussing and exploring the specific interdisciplinary aspects of bodily human-robot interaction, and to establishing a common ground for this area as a recognized and continuing research topic. Bodily interaction with robots and robotic devices is partially established in niche applications such as exoskeletons, assistive devices and advanced machines for physical training, where bodily interaction is the application itself. Bodily interaction is expected to play a broader role in human-robot interaction, for instance in manufacturing and in social and entertainment robotics. The direct exchange of force and motion in bodily interaction creates a range of engineering challenges, but also entwines engineering directly with topics that traditionally reside in the realm of the health and humanistic sciences, from biomechanics to humans' social responses to the prompting and responses of physical interaction.
The authors are developing anthropomorphic musical performance robots as an approach to understanding human motor control and to enhancing human-robot interaction from an engineering point of view. For this purpose, since 1990 we have been developing an anthropomorphic flutist robot. As one of our long-term goals, we aim to develop an anthropomorphic musical performance robot capable of playing different kinds of woodwind instruments, e.g. flute and saxophone. In this paper, improvements to the mechanical design and sensing system of the Waseda Flutist Robot No. 4 Refined V (WF-4RV) are detailed. As for the sensing system, an array of sensors has been designed to detect the lip pressure distribution. In addition, the lips and lung have been re-designed to enable the flutist robot to play the saxophone.