Advances in Air Gesture and Handwriting Recognition for Human-Computer Interaction: A Quantitative Approach

Authors

  • Prof. Snehal Javheri, Yash Rajiv Ghute, Nandan Anerao, Akshay Sudhakar Shinde

Abstract

In recent years, air gesture and in-air handwriting recognition have emerged as transformative modalities in human-computer interaction (HCI), enabling touchless and intuitive communication between users and machines. This paper presents a comprehensive quantitative evaluation of recognition techniques for air gestures and in-air handwriting. We benchmark multiple classical and deep learning models—including HMM, DTW, CNN-LSTM, and Transformer architectures—on publicly available datasets such as SHREC, DHG-14/28, and AirGest. Performance metrics including accuracy, latency, F1-score, and computational efficiency are analyzed in real-world HCI contexts. Additionally, we assess the impact of sensor types (e.g., depth cameras, IMUs), data preprocessing filters, and trajectory reconstruction techniques on system performance. Case studies in automotive control, healthcare, and AR/VR platforms demonstrate practical applications and real-time viability. Experimental results show that Transformer-based models achieve up to 96.1% accuracy with latency under 40 ms, suggesting strong potential for real-time deployment. The paper also highlights challenges such as user variability, occlusion robustness, and dataset imbalance, and provides future directions for optimizing adaptive, low-power HCI systems.
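To illustrate one of the classical baselines named above, the following is a minimal dynamic time warping (DTW) sketch for matching a gesture trajectory against labeled templates. The function names and toy templates are illustrative assumptions, not taken from the paper; real systems would operate on multi-dimensional sensor trajectories.

```python
import numpy as np

def dtw_distance(a, b):
    """DTW distance between two 1-D trajectories (smaller = more similar)."""
    n, m = len(a), len(b)
    # Accumulated-cost matrix with an extra row/column of infinities as boundary.
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # Each cell extends the cheapest of the three admissible warping steps.
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def classify(query, templates):
    """Nearest-template classification: label whose template has minimal DTW cost."""
    return min(templates, key=lambda label: dtw_distance(query, templates[label]))
```

Because DTW tolerates local timing differences, a slowly and a quickly drawn version of the same gesture map to the same template, which is why it remains a common baseline against learned models such as CNN-LSTM or Transformers.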

 

Published

07.01.2025

How to Cite

Advances in Air Gesture and Handwriting Recognition for Human-Computer Interaction: A Quantitative Approach. (2025). International Journal of Open Publication and Exploration, ISSN: 3006-2853, 13(1), 19-30. https://ijope.com/index.php/home/article/view/193
