Development of the Waseda Flutist Robot No. 4 Refined IV: Implementation of a Real-Time Interaction System with Human Partners
Karlstad University, Faculty of Technology and Science, Department of Physics and Electrical Engineering. ORCID iD: 0000-0002-6865-7346
2008 (English). In: 2008 2nd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob 2008): Proceedings of a meeting held 19-22 October 2008, Scottsdale, Arizona, IEEE conference proceedings, 2008, pp. 421-426. Conference paper, published paper (peer reviewed)
Abstract [en]

The aim of our research is to develop an anthropomorphic flutist robot that, on the one hand, reproduces the human motor skills required for playing the flute and, on the other hand, displays the cognitive capabilities needed to interact with other (human) musicians. In this paper, we detail recent mechanical improvements to the Waseda Flutist Robot (WF-4RIV) that enhance the realistic production of the flute sound. In particular, improved lips, oral cavity and tonguing are introduced and their mechanisms described: the ability to deform the lip shape with 3 DOF allows us to accurately control the characteristics of the air stream (width, thickness and angle), and an improved tonguing mechanism (1 DOF) has been designed to reproduce double tonguing. Furthermore, we present the implementation of a real-time interaction system with human partners. As a first approach, we developed a vision-processing algorithm to track the 3D orientation and position of a musical instrument: image data is recorded by two cameras attached to the head of the robot and processed in real time. The proposed algorithm is based on color histogram matching and particle filter techniques to follow the position of a musician's hands on an instrument. Analysis of these data enables us to determine the orientation and location of the instrument, and we map these parameters to control musical performance parameters of the WF-4RIV, such as vibrato and sound volume. A set of experiments was carried out to verify the effectiveness of the proposed tracking system during interaction with a human player. We conclude that the quality of the musical performance of the WF-4RIV and its capability to interact with musical partners have been significantly improved by the implementation of the techniques proposed in this paper.
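The abstract describes the tracking pipeline only at a high level. As an illustration of the general technique it names (color histogram matching as the particle likelihood, followed by resampling), the following minimal NumPy sketch tracks a colored patch in a synthetic frame. All function names, parameter values and the synthetic setup are assumptions made for illustration; this is not the authors' implementation.

```python
import numpy as np

def color_histogram(patch, bins=8):
    """Normalized RGB histogram of an image patch (H x W x 3, values 0..255)."""
    hist, _ = np.histogramdd(
        patch.reshape(-1, 3),
        bins=(bins, bins, bins),
        range=((0, 256), (0, 256), (0, 256)),
    )
    hist = hist.ravel()
    return hist / max(hist.sum(), 1e-9)

def bhattacharyya(h1, h2):
    """Similarity in [0, 1] between two normalized histograms."""
    return float(np.sum(np.sqrt(h1 * h2)))

def particle_filter_step(particles, frame, target_hist,
                         patch=8, motion_std=4.0, rng=None):
    """One predict/update/resample cycle over (x, y) particle positions."""
    rng = np.random.default_rng(0) if rng is None else rng
    h, w = frame.shape[:2]
    # Predict: random-walk motion model, clipped so patches stay in-frame.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], 0, w - patch)
    particles[:, 1] = np.clip(particles[:, 1], 0, h - patch)
    # Update: weight each particle by histogram similarity to the target.
    weights = np.empty(len(particles))
    for i, (x, y) in enumerate(particles.astype(int)):
        roi = frame[y:y + patch, x:x + patch]
        weights[i] = bhattacharyya(color_histogram(roi), target_hist)
    # Small floor keeps the weights a valid distribution even if all are zero.
    weights = (weights + 1e-12) / (weights + 1e-12).sum()
    # Resample particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]
```

For example, tracking a red 8x8 square in a black 64x64 frame, repeated `particle_filter_step` calls concentrate the particle cloud around the square, and the cloud mean serves as the position estimate; in the robot's case, such an estimate of the instrument's pose would then be mapped to performance parameters such as vibrato and volume.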

Place, publisher, year, edition, pages
IEEE conference proceedings, 2008. pp. 421-426
HSV category
Identifiers
URN: urn:nbn:se:kau:diva-18318
DOI: 10.1109/BIOROB.2008.4762798
ISBN: 9781424428823 (print)
ISBN: 978-1-4244-2883-0 (print)
OAI: oai:DiVA.org:kau-18318
DiVA, id: diva2:591948
Conference
2nd IEEE RAS & EMBS - International Conference on Biomedical Robotics and Biomechatronics, 19-22 October 2008, Scottsdale, Arizona
Available from: 2013-01-21 Created: 2013-01-21 Last updated: 2015-06-17 Bibliographically approved

Open Access in DiVA

Full text is not available in DiVA

Other links

Publisher's full text

Person

Solis, Jorge
