ZatLab Gesture Recognition Framework: Machine Learning Results
The main problem this work addresses is the real-time recognition of gestures, particularly in the complex domain of artistic performance. By recognizing a performer's gestures, one can map them to diverse controls, from lighting control to the creation of visuals, sound control, or even music creation, thus allowing performers real-time manipulation of creative events. The work presented here takes on this challenge with a multidisciplinary approach, combining known principles of how humans recognize gestures with computer science methods to complete the task. This paper builds on previous publications and presents in detail the Gesture Recognition Module of the ZatLab Framework and the results obtained by its Machine Learning (ML) algorithms. It provides a brief review of previous work in the area, followed by a description of the framework design and the results of the recognition algorithms.
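The subject keywords below list Dynamic Time Warping (DTW) among the recognition techniques. As a minimal illustration of how DTW compares a captured gesture trajectory against stored templates (a sketch of the general technique, not the authors' implementation; sequences here are one-dimensional, whereas real gestures would be series of joint positions), consider:

```python
# Minimal dynamic time warping (DTW) sketch: aligns two sequences of
# possibly different lengths and returns the cumulative alignment cost.

def dtw_distance(a, b):
    """Return the DTW alignment cost between 1-D sequences a and b."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = best cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])              # local distance
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# Hypothetical templates: a captured gesture is classified by the
# template with the lowest DTW cost.
templates = {"wave": [0.0, 1.0, 0.0, -1.0, 0.0],
             "push": [0.0, 0.5, 1.0, 1.0, 1.0]}
captured = [0.0, 0.9, 0.1, -1.1, 0.0]
best = min(templates, key=lambda name: dtw_distance(captured, templates[name]))
```

DTW's tolerance to local stretching and compression of the time axis is what makes it attractive for gestures, which are rarely performed at exactly the template's speed.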
Year of publication: 2016
Authors: Baltazar, André
Published in: International Journal of Creative Interfaces and Computer Graphics (IJCICG). - IGI Global, ISSN 1947-3125, ZDB-ID 2703160-3. - Vol. 7.2016, 2 (01.07.), p. 11-24
Publisher: IGI Global
Subjects: Computer Vision | DTW | Gesture Recognition | HCI | HMM | Interactive Performance | Kinect | Machine Learning
Similar items by subject
- Interactive 360 Degree Holographic Installation (Alves, Ricardo Martins, 2017)
- Sixth Sense Technology: Advances in HCI as We Approach 2020 (AlKassim, Zeenat, 2017)
- State of the Art of Audio- and Video-Based Solutions for AAL (Aleksic, Slavisa, 2022)