Deep Learning for Real-time Affective Hand Gesture Recognition in EMASPEL
Mohamed Ben Ammar1, Jihane Ben Slimane2, Taoufik Saidani3, Refka Ghodhbani4

1Dr. Mohamed Ben Ammar, Department of Information Systems, Faculty of Computing and Information Technology, Northern Border University, Rafha, Saudi Arabia.

2Jihane Ben Slimane, Department of Computer Sciences, Faculty of Computing and Information Technology, Northern Border University, Rafha, Saudi Arabia, and National Engineering School of Tunis, LR11ES20 Analysis, Design and Control of Systems Laboratory, University of Tunis El Manar, Tunis, Tunisia.

3Taoufik Saidani, Department of Computer Sciences, Faculty of Computing and Information Technology, Northern Border University, Rafha, Saudi Arabia.

4Refka Ghodhbani, Department of Computer Sciences, Faculty of Computing and Information Technology, Northern Border University, Rafha, Saudi Arabia.

Manuscript received on 16 February 2024 | Revised Manuscript received on 21 February 2024 | Manuscript Accepted on 15 March 2024 | Manuscript published on 30 March 2024 | PP: 25-34 | Volume-12 Issue-6, March 2024 | Retrieval Number: 100.1/ijrte.F801212060324 | DOI: 10.35940/ijrte.F8012.12060324

© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: This research advances personalized learning through real-time affective hand gesture recognition in EMASPEL (Emotional Multi-Agents System for Peer-to-peer E-Learning), an educational platform. Our deep learning model, an ensemble of convolutional and recurrent neural networks, recognizes the emotions conveyed by student hand gestures, capturing both their spatial and temporal patterns. This emotional profile enables EMASPEL to tailor its interactions with precision: addressing frustration, nurturing curiosity, and sustaining student engagement. Students benefit from personalized learning environments, with improved outcomes and a stronger connection to their educational journey, while teachers, equipped with real-time emotional insights, can provide targeted support and cultivate a more inclusive, responsive classroom. Beyond gestures, we envision integrating multimodal data, including facial expressions, voice analysis, and potentially physiological sensors, to build a richer picture of student emotions and cognitive states. Continuous refinement through rigorous longitudinal studies will deepen this understanding and ensure responsible implementation. Ultimately, this research reframes education as a dynamic, personalized process in which technology serves as a bridge between teacher and student, fostering not only academic success but a lifelong love of knowledge.

Keywords: Real-time Affective Hand Gesture Recognition, Deep Learning, EMASPEL
Scope of the Article: Deep Learning
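
To make the convolutional-plus-recurrent architecture described in the abstract concrete, the following is a minimal illustrative sketch in PyTorch of a CNN + LSTM pipeline that classifies short hand-gesture clips into affective categories. It is not the authors' implementation: the frame size (64x64 grayscale), layer widths, number of emotion classes, and the name GestureEmotionNet are hypothetical assumptions chosen only for demonstration.

# Illustrative sketch only; sizes and class names are assumptions, not EMASPEL's actual model.
import torch
import torch.nn as nn

class GestureEmotionNet(nn.Module):
    """CNN extracts per-frame spatial features; an LSTM models their temporal order."""

    def __init__(self, num_classes: int = 5, feat_dim: int = 128, hidden: int = 64):
        super().__init__()
        # Per-frame spatial feature extractor (assumes 64x64 grayscale frames).
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, feat_dim), nn.ReLU(),
        )
        # Temporal model over the sequence of frame features.
        self.rnn = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, clips: torch.Tensor) -> torch.Tensor:
        # clips: (batch, time, channels=1, height=64, width=64)
        b, t, c, h, w = clips.shape
        feats = self.cnn(clips.view(b * t, c, h, w)).view(b, t, -1)
        _, (last_hidden, _) = self.rnn(feats)   # final hidden state summarizes the clip
        return self.head(last_hidden[-1])       # per-class emotion logits

if __name__ == "__main__":
    model = GestureEmotionNet()
    dummy = torch.randn(2, 10, 1, 64, 64)       # 2 clips of 10 frames each
    print(model(dummy).shape)                   # torch.Size([2, 5])

In this sketch the CNN summarizes the spatial content of each frame and the LSTM aggregates those summaries over time, mirroring the spatial-temporal split of gesture information described in the abstract.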