Development of the Waseda Flutist Robot No. 4 Refined IV: Implementation of a Real-Time Interaction System with Human Partners
Karlstad University, Faculty of Technology and Science, Department of Physics and Electrical Engineering. ORCID iD: 0000-0002-6865-7346
2008 (English). In: 2008 2nd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob 2008): Proceedings of a meeting held 19-22 October 2008, Scottsdale, Arizona. IEEE conference proceedings, 2008, p. 421-426. Conference paper, published paper (refereed)
Abstract [en]

The aim of our research is to develop an anthropomorphic flutist robot that, on the one hand, reproduces the human motor skills required for playing the flute and, on the other hand, displays cognitive capabilities for interacting with other (human) musicians. In this paper, we detail the recent mechanical improvements to the Waseda Flutist Robot (WF-4RIV) that enhance the realistic production of the flute sound. In particular, improved lips, oral cavity, and tonguing are introduced and their mechanisms described: the ability to deform the lip shape in 3-DOF allows us to accurately control the characteristics of the air stream (width, thickness, and angle), and an improved tonguing mechanism (1-DOF) has been designed to reproduce double tonguing. Furthermore, we present the implementation of a real-time interaction system with human partners. As a first approach, we developed a vision-processing algorithm to track the 3D orientation and position of a musical instrument: image data is recorded using two cameras attached to the head of the robot and processed in real time. The proposed algorithm is based on color-histogram matching and particle-filter techniques to follow the position of a musician's hands on an instrument. Data analysis enables us to determine the orientation and location of the instrument, and we map these parameters to control musical performance parameters of the WF-4RIV, such as sound vibrato and sound volume. A set of experiments was conducted to verify the effectiveness of the proposed tracking system during interaction with a human player. We conclude that the quality of the musical performance of the WF-4RIV and its capability to interact with musical partners have been significantly improved by the implementation of the techniques proposed in this paper.
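The tracking approach described in the abstract follows the standard bootstrap particle filter: predict each hypothesis with motion noise, weight it by an observation likelihood (in the paper, a color-histogram similarity on the hands), and resample. The sketch below is a minimal illustrative implementation, not the authors' code; the 1D state, the `observe` callback, and all parameter values are assumptions standing in for the paper's image-based likelihood.

```python
import random


def particle_filter_step(particles, observe, motion_std=2.0):
    """One predict-weight-resample cycle of a bootstrap particle filter.

    `particles` is a list of scalar state hypotheses (e.g. the x-coordinate
    of a hand in the image); `observe(x)` returns a non-negative likelihood,
    which in the paper's setting would be a color-histogram matching score.
    Returns the resampled particle set and the weighted-mean state estimate.
    """
    # Predict: diffuse each particle with Gaussian motion noise.
    predicted = [x + random.gauss(0.0, motion_std) for x in particles]
    # Weight: score each hypothesis against the current observation.
    weights = [observe(x) + 1e-12 for x in predicted]  # epsilon avoids /0
    total = sum(weights)
    weights = [w / total for w in weights]
    # Estimate: weighted mean of the predicted particles.
    estimate = sum(w * x for w, x in zip(weights, predicted))
    # Resample: draw a new set proportional to the weights.
    resampled = random.choices(predicted, weights=weights, k=len(predicted))
    return resampled, estimate
```

Repeating this step once per camera frame concentrates the particles around the most likely hand position; the resulting position and orientation estimates could then be mapped to performance parameters such as vibrato depth or volume.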

Place, publisher, year, edition, pages
IEEE conference proceedings, 2008. p. 421-426
National Category
Robotics
Identifiers
URN: urn:nbn:se:kau:diva-18318
DOI: 10.1109/BIOROB.2008.4762798
ISBN: 978-1-4244-2882-3 (print)
ISBN: 978-1-4244-2883-0 (print)
OAI: oai:DiVA.org:kau-18318
DiVA, id: diva2:591948
Conference
2nd IEEE RAS & EMBS - International Conference on Biomedical Robotics and Biomechatronics, 19-22 October 2008, Scottsdale, Arizona
Available from: 2013-01-21. Created: 2013-01-21. Last updated: 2015-06-17. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text

Authority records

Solis, Jorge
