Virtual Effort: An Advanced User Interface that Combines Various Visual Information with a Kinetic System for Virtual Object Manipulations

Abstract:

Present-day virtual environments are highly advanced and can display complex representations of simulated objects together with their surroundings. Each virtual object can be endowed with multiple physical properties, so the entire virtual environment tends to appear very realistic. However, human interaction with objects in virtual environments is constrained from the user's perspective and differs greatly from the way humans interact with objects in the physical world. This paper presents research that enables more natural human-computer communication, achieved by combining the visual information displayed to the user with kinetic information captured from the user's hand by means of a depth sensor. The presented method is used to perform an experiment that consists in grasping, moving, and dropping a virtual object and analyzing the hand trajectory in comparison with the trajectory of the virtual object.
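
The fragment below is a rough sketch of the grasp-move-drop interaction loop and the hand-versus-object trajectory comparison the abstract describes. It is an illustrative assumption, not the paper's implementation: the pinch heuristic, the thresholds, and all names are hypothetical, and the fingertip positions that a depth sensor would normally supply are synthesized so the script runs stand-alone.

# Illustrative sketch only, NOT the paper's implementation: a generic
# pinch-based grasp/move/drop loop with trajectory logging. Fingertip
# positions would normally come from a depth-sensor hand tracker; here
# they are synthesized so the script runs stand-alone.
import math

GRASP_DIST = 0.03   # assumed pinch threshold: thumb-index gap in metres
PICKUP_DIST = 0.05  # assumed max hand-object distance for a grasp

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def simulate(frames, obj_pos):
    """frames: per-frame (thumb_tip, index_tip) 3-D points; returns trajectories."""
    grasped = False
    hand_traj, obj_traj = [], []
    for thumb, index in frames:
        hand = tuple((t + i) / 2 for t, i in zip(thumb, index))  # pinch centre
        pinching = dist(thumb, index) < GRASP_DIST
        if pinching and not grasped and dist(hand, obj_pos) < PICKUP_DIST:
            grasped = True            # hand closed near the object: grasp
        elif grasped and not pinching:
            grasped = False           # hand opened: object is dropped here
        if grasped:
            obj_pos = hand            # grasped object follows the hand
        hand_traj.append(hand)
        obj_traj.append(obj_pos)
    return hand_traj, obj_traj

def mean_deviation(hand_traj, obj_traj):
    """Mean point-wise distance between hand and object trajectories."""
    return sum(dist(h, o) for h, o in zip(hand_traj, obj_traj)) / len(hand_traj)

if __name__ == "__main__":
    # Synthetic capture: the hand sweeps along x, pinching between frames 10-40.
    frames = []
    for k in range(50):
        x = 0.01 * k
        gap = 0.01 if 10 <= k <= 40 else 0.08   # closed vs. open hand
        frames.append(((x, 0.0, 0.5), (x + gap, 0.0, 0.5)))
    hand, obj = simulate(frames, obj_pos=(0.10, 0.0, 0.5))
    print(f"mean hand/object deviation: {mean_deviation(hand, obj):.4f} m")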

Pages: 497-502

Online since: July 2013

Copyright: © 2013 Trans Tech Publications Ltd. All Rights Reserved
