A methodological framework to assess the accuracy of virtual reality hand-tracking systems: A case study with the Meta Quest 2.
Hand-tracking
Metaverse
Tracking precision
VR delay
Virtual reality
Journal
Behavior research methods
ISSN: 1554-3528
Abbreviated title: Behav Res Methods
Country: United States
ID NLM: 101244316
Publication information
Publication date: 13 Feb 2023
History:
accepted: 2022-12-09
entrez: 2023-02-13
pubmed: 2023-02-14
medline: 2023-02-14
Status: ahead of print
Abstract
Optical markerless hand-tracking systems incorporated into virtual reality (VR) headsets are transforming the ability to assess fine motor skills in VR. This promises to have far-reaching implications for the increased applicability of VR across scientific, industrial, and clinical settings. However, so far there is little data on the accuracy, delay, and overall performance of these hand-tracking systems. Here we present a novel methodological framework based on a fixed grid of targets, which can be easily applied to measure a system's absolute positional error and delay. We also demonstrate a method to assess finger joint-angle accuracy. We used this framework to evaluate the Meta Quest 2 hand-tracking system. Our results showed an average fingertip positional error of 1.1 cm and an average finger joint-angle error of 9.6°.
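The abstract summarizes two of the metrics the framework reports: fingertip positional error against a fixed grid of targets, and end-to-end tracking delay. As a minimal illustrative sketch (not the authors' implementation; the function names, sampling rate, and sample data below are assumptions), these two quantities could be computed as:

```python
import numpy as np

def positional_error(tracked, targets):
    """Mean Euclidean distance between tracked fingertip positions
    and their corresponding fixed grid targets (same units as input)."""
    tracked = np.asarray(tracked, dtype=float)
    targets = np.asarray(targets, dtype=float)
    return np.linalg.norm(tracked - targets, axis=1).mean()

def estimate_delay(reference, tracked, fs):
    """Estimate tracking delay (s) as the lag maximizing the
    cross-correlation between a reference motion signal and the
    tracked signal, both sampled at fs Hz."""
    r = np.asarray(reference, dtype=float) - np.mean(reference)
    t = np.asarray(tracked, dtype=float) - np.mean(tracked)
    lag = np.argmax(np.correlate(t, r, mode="full")) - (len(r) - 1)
    return lag / fs

# Hypothetical data: three fingertips, each 1.1 cm off its target,
# and a tracked pulse arriving 5 samples after the reference at 200 Hz.
targets = np.zeros((3, 3))
tracked_tips = targets + np.array([0.011, 0.0, 0.0])
print(round(positional_error(tracked_tips, targets), 6))  # 0.011

reference = np.zeros(100); reference[10] = 1.0
tracked_sig = np.zeros(100); tracked_sig[15] = 1.0
print(estimate_delay(reference, tracked_sig, 200.0))  # 0.025
```

Note that integer-sample cross-correlation only resolves delay to the nearest frame; the cited Yamaoka et al. (2019) method refines this to sub-sample precision.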
Identifiers
pubmed: 36781700
doi: 10.3758/s13428-022-02051-8
pii: 10.3758/s13428-022-02051-8
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Copyright information
© 2023. The Author(s).
References
Abdlkarim, D., Ortenzi, V., Pardi, T., Filipovica, M., Wing, A., Kuchenbecker, K. J., & Di Luca, M. (2021). PrendoSim: Proxy-hand-based robot grasp generator. In 18th International Conference on Informatics in Control, Automation and Robotics. SciTePress Digital Library.
Block, H. J., & Bastian, A. J. (2011). Sensory weighting and realignment: independent compensatory processes. Journal of Neurophysiology, 106(1), 59–70.
doi: 10.1152/jn.00641.2010
pubmed: 21490284
pmcid: 3129718
Boian, R., Sharma, A., Han, C., Merians, A., Burdea, G., Adamovich, S., ..., Poizner, H. (2002). Virtual reality-based post-stroke hand rehabilitation. In Medicine Meets Virtual Reality 02/10, pp. 64–70. IOS Press.
Castiello, U. (2005). The neuroscience of grasping. Nature Reviews Neuroscience, 6(9), 726–736.
doi: 10.1038/nrn1744
pubmed: 16100518
Droettboom, M., et al. (2003). Matplotlib: Open-source MATLAB-like tool for Python.
Elliott, M. T., Karim, D., Clark, T., Masood, A., Kearney, R., & Metcalfe, A. (2017). Feasibility of using the Leap Motion hand gesture sensor for tracking wrist fracture physiotherapy. In WIN Annual Conference.
Foulkes, A. J., & Miall, R. C. (2000). Adaptation to visual feedback delays in a human manual tracking task. Experimental Brain Research, 131(1), 101–110.
doi: 10.1007/s002219900286
pubmed: 10759175
Gruen, R., Ofek, E., Steed, A., Gal, R., Sinclair, M., & Gonzalez-Franco, M. (2020). Measuring system visual latency through cognitive latency on video see-through AR devices. In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 791–799. IEEE.
Han, S., Liu, B., Cabezas, R., Twigg, C. D., Zhang, P., Petkau, J., ..., et al. (2020). Megatrack: monochrome egocentric articulated hand-tracking for virtual reality. ACM Transactions on Graphics (TOG), 39(4), 87–1.
doi: 10.1145/3386569.3392452
Jones, E., Oliphant, T., Peterson, P., & et al. (2001). SciPy: Open-source scientific tools for Python.
Khademi, M., Hondori, H. M., McKenzie, A., Dodakian, L., Lopes, C. V., & Cramer, S. C. (2014). Free-hand interaction with Leap Motion controller for stroke rehabilitation. 1663–1668.
Kim, W.-S., Cho, S., Ku, J., Kim, Y., Lee, K., Hwang, H.-J., & Paik, N.-J. (2020). Clinical application of virtual reality for upper limb motor rehabilitation in stroke: review of technologies and clinical evidence. Journal of Clinical Medicine, 9(10), 3369.
doi: 10.3390/jcm9103369
pubmed: 33096678
pmcid: 7590210
Krakauer, J. W. (2009). Motor learning and consolidation: the case of visuomotor rotation. Progress in Motor Control, 405–421.
Niehorster, D. C., Li, L., & Lappe, M. (2017). The accuracy and precision of position and orientation tracking in the HTC VIVE virtual reality system for scientific research. i-Perception, 8(3), 2041669517708205.
doi: 10.1177/2041669517708205
pubmed: 28567271
pmcid: 5439658
Ortenzi, V., Filipovica, M., Abdlkarim, D., Pardi, T., Takahashi, C., Wing, A., ..., Kuchenbecker, K. J. (2022). Robot, Pass me the tool: Handle visibility facilitates task-oriented handovers. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 1–9).
Park, J., Byun, S.-C., & Lee, B.-U. (2009). Lens distortion correction using ideal image coordinates. IEEE Transactions on Consumer Electronics, 55(3), 987–991.
doi: 10.1109/TCE.2009.5278053
Rinderknecht, M. D., Kim, Y., Santos-Carreras, L., Bleuler, H., & Gassert, R. (2013). Combined tendon vibration and virtual reality for post-stroke hand rehabilitation. In 2013 World Haptics Conference (WHC), pp. 277–282. IEEE.
Scheggi, S., Meli, L., Pacchierotti, C., & Prattichizzo, D. (2015). Touch the virtual reality: using the Leap Motion controller for hand tracking and wearable tactile devices for immersive haptic rendering. 1–1.
Schröder, M., Maycock, J., Ritter, H., & Botsch, M. (2014). Real-time hand tracking using synergistic inverse kinematics. In 2014 IEEE International Conference on Robotics and Automation (ICRA), pp. 5447–5454. IEEE.
Real-time latency tests of a Qualisys system in the Sensory-Motor Systems Lab at ETH, Zurich, Switzerland, April (2019).
Voigt-Antons, J.-N., Kojic, T., Ali, D., & Möller, S. (2020). Influence of hand tracking as a way of interaction in virtual reality on user experience. In 2020 Twelfth International Conference on Quality of Multimedia Experience (QoMEX), pp. 1–4. ISSN: 2472-7814.
Yamaoka, K., Scheibler, R., Ono, N., & Wakabayashi, Y. (2019). Sub-sample time delay estimation via auxiliary-function-based iterative updates. In 2019 IEEE Workshop on Applications of Signal Processing to Audio and Acoustics (WASPAA), pp. 130–134. IEEE.
Zhang, F., Bazarevsky, V., Vakunov, A., Tkachenka, A., Sung, G., Chang, C.-L., & Grundmann, M. (2020). MediaPipe hands: On-device real-time hand tracking. arXiv:2006.10214.