Abstract
Human-computer interaction is not limited to the traditional input combination of keyboard and mouse. Natural user interfaces take advantage of input devices that react to real-world actions such as gestures or motion. We present our experiments with prototypes including a multi-touch tabletop display, a Kinect sensor, and a treadmill. Most of the prototypes are used in the context of virtual reality, where the use of a natural interface increases user immersion. Immersion is further enhanced when the interface and the action it controls share the same semantics. Our experience confirms the intuitiveness of these interfaces but also documents their limitations.

This work is licensed under a Creative Commons Attribution 4.0 International License.
Copyright (c) 2012 International Journal of Information Technology Applications
