Chapter 10

Deep Bidirectional LSTM for Emotion Detection through Mobile Sensor Analysis

D. Roja Ramani

Computer Science and Engineering, New Horizon College of Engineering, Bangalore, India

Naveen Chandra Gowda

School of Computer Science and Engineering, REVA University, Bengaluru, India

S. Sreejith

Department of Artificial Intelligence and Data Science, New Horizon College of Engineering, Bangalore, India

Shrikant Tangade

Department of Mathematics, University of Padova, Padova, Italy

First published: 16 February 2025

Summary

Advances in sensor and information technology now allow machines not only to recognize but also to analyze human emotions. Emotion recognition has become a focal point across many domains, with facial expressions, speech patterns, behavior, and physiological signals serving as complementary cues for understanding human affect. Captured by different sensors, these signals enable accurate identification and drive progress in affective computing. This research investigates Deep Bidirectional Long Short-Term Memory (LSTM) networks for emotion detection through mobile sensor analysis. Emotion detection underpins several applications, most notably human-robot interaction. Traditional approaches fuse data from multiple sources, such as physiological signals, environmental inputs, and video feeds, and typically depend on manual feature engineering, a process with inherent limits. The proposed technique instead uses deep learning, with an iterative process that integrates insights from a wide range of sensor data types and modalities, eliminating the need for time-consuming manual feature engineering. The dataset combines information from on-body, ambient, and location sensors, providing a comprehensive view of changing human emotional states.
The approach surpasses traditional deep neural networks and ensemble methods that depend on feature engineering, achieving an average accuracy of 97%. The study highlights the potential of Deep Bidirectional LSTM for accurate and effective emotion detection through mobile sensor analysis.
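The summary names the core technique, a bidirectional LSTM over sensor sequences, without giving code. As a minimal pure-Python sketch of the idea (hypothetical weights and dimensions, not the authors' implementation): an LSTM is run over the sensor sequence forwards and backwards, and the two final hidden states are concatenated into a feature vector that a classifier could consume.

```python
# Hypothetical sketch: bidirectional-LSTM feature extraction over a sensor
# sequence. Weight shapes and sizes are illustrative, not from the chapter.
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def init_weights(n_in, n_hid, seed=0):
    """Random weights for the four LSTM gates: input, forget, cell, output."""
    rng = random.Random(seed)
    def mat(r, c):
        return [[rng.uniform(-0.1, 0.1) for _ in range(c)] for _ in range(r)]
    return {g: {"Wx": mat(n_hid, n_in), "Wh": mat(n_hid, n_hid),
                "b": [0.0] * n_hid} for g in "ifgo"}

def lstm_step(x, h, c, W):
    """One LSTM time step; returns the new hidden and cell states."""
    def lin(g):  # affine map W_x x + W_h h + b for gate g
        out = []
        for j in range(len(h)):
            s = W[g]["b"][j]
            s += sum(W[g]["Wx"][j][k] * x[k] for k in range(len(x)))
            s += sum(W[g]["Wh"][j][k] * h[k] for k in range(len(h)))
            out.append(s)
        return out
    i = [sigmoid(v) for v in lin("i")]    # input gate
    f = [sigmoid(v) for v in lin("f")]    # forget gate
    g = [math.tanh(v) for v in lin("g")]  # candidate cell state
    o = [sigmoid(v) for v in lin("o")]    # output gate
    c_new = [f[j] * c[j] + i[j] * g[j] for j in range(len(h))]
    h_new = [o[j] * math.tanh(c_new[j]) for j in range(len(h))]
    return h_new, c_new

def bilstm_features(seq, W_fwd, W_bwd, n_hid):
    """Concatenate the final hidden states of forward and backward passes."""
    def run(s, W):
        h, c = [0.0] * n_hid, [0.0] * n_hid
        for x in s:
            h, c = lstm_step(x, h, c, W)
        return h
    return run(seq, W_fwd) + run(list(reversed(seq)), W_bwd)

# Example: a 5-step sequence of 3 sensor channels -> 2 * 4 = 8 features.
seq = [[0.1, -0.2, 0.3], [0.0, 0.1, 0.2], [0.5, 0.4, -0.1],
       [0.2, 0.2, 0.0], [-0.3, 0.1, 0.1]]
features = bilstm_features(seq, init_weights(3, 4, seed=1),
                           init_weights(3, 4, seed=2), n_hid=4)
print(len(features))  # 8
```

The backward pass lets each output state see the whole sequence in both directions, which is what lets the model pick up temporal context from raw sensor streams without hand-crafted features. In practice one would use a framework implementation (e.g., a stacked bidirectional LSTM) rather than this loop-based sketch.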
