Teshome, Teshome Delellegn
Karlstad University, Faculty of Technology and Science, Department of Physics and Electrical Engineering.
3-D Gesture Based Module for an Intelligently Controlled Service Robot Using a Motion Sensor (2018). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.

Abstract [en]

    The study of gesture-recognition-based sensing modalities for Human-Robot Interaction (HRI) has seen favorable success of late, but most often at the expense of naturalness in the gestures used, mainly because of awkward and bulky recognition systems, a lack of real-time performance, and requirements for static conditions. However, recent advancements in Natural User Interfaces by industry and the research community have produced new and interesting products, such as Microsoft's Kinect, that have captivated the attention of roboticists for their intended use in HRI. Coupled with advances in artificial intelligence and machine learning grounded in soft-computing paradigms, the potential of these natural user interfaces is amplified even further.

    The aim of this thesis is to design and implement a 3-D gesture-based recognition system capable of running on a mobile platform, and to incorporate this interface into a mobile service robot that can navigate its environment reasonably well while interacting with its human user and carrying out the user's commands. The fundamental distinction of this work from previous related work is the ability to recognize 3-D pose and motion gestures made under both static and dynamic conditions, while at the same time being used for HRI by a service robot.

    The gesture recognition module is implemented as a Time Delay Neural Network (TDNN) trained to recognize 3-D motion patterns in gesture sequences captured via the Kinect motion sensor. The navigation control system of the service robot is designed using a layered behavior-based control architecture consisting of three layers: a supervision layer, a behavioral layer, and a locomotion layer. Fuzzy logic is used both to design robust individual behaviors and to arbitrate between and fuse them, while a two-degree-of-freedom PID controller executes the commanded locomotion.
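    The key idea of a TDNN is that each output frame sees a fixed window (the "delay") of past input frames, with the same weights applied at every time step. A minimal sketch of one such layer is below; the layer sizes, window length, and joint features are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not the thesis's actual setup):
n_frames, n_features = 30, 12   # e.g. 4 tracked joints x (x, y, z) from Kinect
delay, n_hidden = 5, 8          # each hidden unit sees a window of 5 frames

X = rng.standard_normal((n_frames, n_features))           # one gesture sequence
W = rng.standard_normal((delay * n_features, n_hidden)) * 0.1
b = np.zeros(n_hidden)

def tdnn_layer(X, W, b, delay):
    """Slide a delay-window over time and apply the same weights at every
    position (weight sharing across time, the defining TDNN property)."""
    windows = np.stack([X[t:t + delay].ravel()
                        for t in range(len(X) - delay + 1)])
    return np.tanh(windows @ W + b)

H = tdnn_layer(X, W, b, delay)
print(H.shape)  # (26, 8): one hidden vector per valid window position
```

    Stacking such layers and pooling over time yields a classifier that is shift-invariant along the time axis, which is what makes the architecture suited to motion gestures whose exact start frame is unknown.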

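    The fuzzy behavior fusion described above can be illustrated as follows: each behavior proposes a motor command together with an activation degree in [0, 1], and the fused command is the activation-weighted average (a common defuzzification-style scheme). The membership function and command values here are hypothetical, not the thesis's actual rule base.

```python
def obstacle_activation(distance_m):
    """Ramp-style membership: fully active when an obstacle is closer
    than 0.5 m, fading out to zero by 2.0 m (illustrative values)."""
    return max(0.0, min(1.0, (2.0 - distance_m) / 1.5))

def fuse(behaviors):
    """behaviors: list of (activation, (speed, turn_rate)) tuples.
    Returns the activation-weighted average command."""
    total = sum(a for a, _ in behaviors) or 1.0
    speed = sum(a * cmd[0] for a, cmd in behaviors) / total
    turn = sum(a * cmd[1] for a, cmd in behaviors) / total
    return speed, turn

# Avoid-obstacle wants to slow down and turn; go-to-goal wants to drive straight.
avoid = (obstacle_activation(0.8), (0.1, 0.6))
goal = (0.5, (0.4, 0.0))
speed, turn = fuse([avoid, goal])
```

    The resulting command leans toward avoidance when the obstacle is near and toward the goal behavior otherwise; the downstream PID loop would then track this fused set-point.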
    Experiments verifying the performance of the gesture recognition module show an average recognition rate of 94.45% on the training dataset and 91.06% on a previously unseen testing dataset. For gestures performed under both static and dynamic conditions, these results are as expected and deemed satisfactory with respect to the design specifications. To showcase the implemented 3-D gesture recognition perceptual module and its integration into the layered behavior-based control architecture, a series of tests was performed as navigation and task-execution examples. The results show the service robot interacting and performing simulated task scenarios with varying degrees of success; all in all, the results were encouraging given the design specifications and the scope set out at the onset of the project.
