Social Perception of Pedestrians and Virtual Agents using Movement Features

We present novel algorithms for identifying emotion, deception, dominance, and friendliness characteristics of pedestrians based on their motion behaviors. We also propose models for conveying emotions, friendliness, and dominance traits in virtual agents. We present applications of our algorithms to simulate interpersonal relationships between virtual characters, facilitate socially-aware robot navigation, identify perceived emotions and deception from videos of walking individuals, and increase the sense of presence in scenarios involving multiple virtual agents.

Recent Projects

Detecting Deception from Walking

Dominance Modeling for Virtual Characters

Dominance Modeling for Robot Navigation

Identifying Emotions from Walking

Conveying Emotions in Virtual Agents

Friendliness Modeling for Virtual Agents

Generating Personalized Walking Gaits

Datasets

We present two datasets of walking videos with extracted gaits and associated labels.

EWalk (Emotion Walk)

This dataset consists of videos of walking individuals with gaits and labeled emotions.

DeceptiveWalk

This dataset consists of videos of walking individuals with gaits, gestures, and deceptive behavior labels.

Detecting Deception with Gait and Gesture

We present a data-driven deep neural algorithm for detecting deceptive walking behavior using non-verbal cues such as gaits and gestures. Using gait and gesture data from our novel DeceptiveWalk dataset, we train an LSTM-based deep neural network to obtain deep features. We then combine psychology-based gait and gesture features with these deep features to detect deceptive walking with an accuracy of 88.41%. To the best of our knowledge, ours is the first algorithm to detect deceptive behavior using the non-verbal cues of gait and gesture.

Randhavane, T., Bhattacharya, U., Kapsaskis, K., Gray, K., Bera, A., & Manocha, D. (2020). The Liar's Walk: Detecting Deception with Gait and Gesture.
PDF Video (MP4, 81.6 MB)
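
A minimal PyTorch sketch of the kind of classifier described above: an LSTM encodes a walking sequence into deep features, which are concatenated with handcrafted gait and gesture features before classification. The layer sizes, feature dimensions, and the DeceptionClassifier name are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class DeceptionClassifier(nn.Module):
    def __init__(self, pose_dim=48, handcrafted_dim=29, hidden=128):
        super().__init__()
        # LSTM over per-frame poses (16 joints x 3 coordinates = 48 values).
        self.lstm = nn.LSTM(pose_dim, hidden, batch_first=True)
        self.head = nn.Sequential(
            nn.Linear(hidden + handcrafted_dim, 64),
            nn.ReLU(),
            nn.Linear(64, 2),  # deceptive vs. natural
        )

    def forward(self, pose_seq, handcrafted):
        # pose_seq: (batch, time, pose_dim); handcrafted: (batch, handcrafted_dim)
        _, (h_n, _) = self.lstm(pose_seq)
        deep = h_n[-1]                              # last hidden state as deep features
        fused = torch.cat([deep, handcrafted], dim=1)
        return self.head(fused)

model = DeceptionClassifier()
logits = model(torch.randn(4, 120, 48), torch.randn(4, 29))
print(logits.shape)  # torch.Size([4, 2])
```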

Modeling Data-Driven Dominance Traits for Virtual Characters using Gait Analysis

We present a data-driven algorithm for generating gaits of virtual characters with varying dominance traits. Our gait dominance classification algorithm can classify the dominance traits of gaits with 73% accuracy. We also present an application of our approach that simulates interpersonal relationships between virtual characters. To the best of our knowledge, ours is the first practical approach to classifying gait dominance and generating dominance traits in virtual characters.

Randhavane, T., Bera, A., Kubin, E., Gray, K., & Manocha, D. (2019). Modeling Data-Driven Dominance Traits for Virtual Characters using Gait Analysis.
PDF Video (MP4, 100 MB)
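
As a rough illustration of the classification side, the sketch below computes a few handcrafted gait features from 3D joint positions and trains a standard SVM to predict dominance labels. The joint indices, feature choices, and the gait_features helper are hypothetical; the paper's actual feature set and classifier may differ.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def gait_features(joints):
    """joints: (time, 16, 3) array of 3D joint positions."""
    root = joints[:, 0]                    # assume joint 0 is the pelvis
    speed = np.linalg.norm(np.diff(root, axis=0), axis=1).mean()
    # Expansiveness proxy: shoulder-to-shoulder spread over time.
    spread = np.linalg.norm(joints[:, 1] - joints[:, 2], axis=1).mean()
    height = joints[:, 3, 1].mean()        # assume joint 3 is the head
    return np.array([speed, spread, height])

# X: one feature row per gait; y: dominance labels from a perception study.
X = np.stack([gait_features(np.random.rand(240, 16, 3)) for _ in range(20)])
y = np.random.randint(0, 2, size=20)       # placeholder labels
clf = make_pipeline(StandardScaler(), SVC()).fit(X, y)
print(clf.predict(X[:3]))
```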

Pedestrian Dominance Modeling for Socially-Aware Robot Navigation

We present a Pedestrian Dominance Model that identifies the dominance characteristics of pedestrians based on motion behaviors corresponding to trajectory, speed, and personal space. Prior studies in the psychology literature indicate that people are more comfortable around others who exhibit complementary movement behaviors. Our algorithm leverages these findings by enabling robots to exhibit responses that complement pedestrian dominance.

Randhavane, T., Bera, A., Kubin, E., Wang, A., Gray, K., & Manocha, D. (2019). Pedestrian Dominance Modeling for Socially-Aware Robot Navigation. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2019).
PDF Video (MP4, 19.7 MB)
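
The complementary-response idea can be sketched as follows: estimate a dominance score from a pedestrian's trajectory, speed, and personal-space features, then select robot navigation parameters that complement it. The weights, threshold, and parameter names here are illustrative assumptions, not the paper's calibrated model.

```python
from dataclasses import dataclass

@dataclass
class PedestrianFeatures:
    speed: float            # m/s
    path_directness: float  # 0..1, how straight the trajectory is
    personal_space: float   # preferred clearance in meters

def dominance_score(f: PedestrianFeatures) -> float:
    # Faster, more direct walkers keeping less clearance read as more dominant.
    return 0.5 * f.speed + 0.3 * f.path_directness - 0.2 * f.personal_space

def complementary_params(score: float) -> dict:
    if score > 0.8:   # dominant pedestrian -> submissive robot response
        return {"max_speed": 0.6, "clearance": 1.5, "yields": True}
    else:             # submissive pedestrian -> more assertive robot
        return {"max_speed": 1.2, "clearance": 0.8, "yields": False}

ped = PedestrianFeatures(speed=1.6, path_directness=0.9, personal_space=0.5)
print(complementary_params(dominance_score(ped)))
```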

Identifying Emotions from Walking using Affective and Deep Features

We present a new data-driven model and algorithm to identify the perceived emotions of individuals based on their gaits. Combining affective features derived from psychological findings with deep features learned by an LSTM, we classify a walker's perceived emotional state as one of four emotions: happy, sad, angry, or neutral, with an accuracy of 80.07%. We also present the EWalk (Emotion Walk) dataset, which consists of videos of walking individuals with gaits and labeled emotions. To the best of our knowledge, this is the first gait-based model to identify perceived emotions from videos of walking individuals.

Randhavane, T., Bhattacharya, U., Kapsaskis, K., Gray, K., Bera, A., & Manocha, D. (2018). Identifying Emotions from Walking using Affective and Deep Features.
PDF Video (MP4, 49.8 MB)
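
The sketch below illustrates what psychology-based affective features for a 16-joint gait might look like, e.g., head tilt (slumped vs. upright posture), body expansiveness, and walking pace. The joint indices and the specific feature list are assumptions for illustration, not the paper's exact features.

```python
import numpy as np

# Hypothetical joint indices for a 16-joint skeleton.
HEAD, NECK, L_SHOULDER, R_SHOULDER, PELVIS = 0, 1, 2, 5, 8

def affective_features(joints):
    """joints: (time, 16, 3) array of 3D joint positions over a walk."""
    spine = joints[:, NECK] - joints[:, PELVIS]
    head = joints[:, HEAD] - joints[:, NECK]
    # Head-tilt angle relative to the spine (slumped vs. upright).
    cos = np.einsum("ij,ij->i", spine, head) / (
        np.linalg.norm(spine, axis=1) * np.linalg.norm(head, axis=1) + 1e-8)
    head_tilt = np.degrees(np.arccos(np.clip(cos, -1, 1))).mean()
    # Body expansiveness: shoulder spread over time.
    spread = np.linalg.norm(
        joints[:, L_SHOULDER] - joints[:, R_SHOULDER], axis=1).mean()
    # Gait pace: mean pelvis speed between frames.
    pace = np.linalg.norm(np.diff(joints[:, PELVIS], axis=0), axis=1).mean()
    return np.array([head_tilt, spread, pace])

print(affective_features(np.random.rand(240, 16, 3)))
```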

EVA: Generating Emotional Behavior of Virtual Agents using Expressive Features of Gait and Gaze

We present EVA, a novel real-time algorithm for generating virtual agents with various emotions. Our approach uses non-verbal movement cues such as gaze and gait to convey happy, sad, angry, or neutral emotions. Our studies suggest that EVA's gait and gaze behaviors can considerably increase the sense of presence in scenarios with multiple virtual agents. Our results also indicate that both gait and gaze features contribute to the perception of emotions in virtual agents.

Randhavane, T., Bera, A., Kapsaskis, K., Sheth, R., Gray, K., & Manocha, D. (2018). EVA: Generating Emotional Behavior of Virtual Agents using Expressive Features of Gait and Gaze. To appear in the ACM Symposium on Applied Perception (ACM SAP 2019).
PDF Video (MP4, 64.6 MB)
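
A minimal sketch of the mapping EVA implies: a target emotion selects expressive gait and gaze parameters for a virtual agent. The parameter names and values below are illustrative assumptions rather than the paper's calibrated model, and the agent dictionary stands in for a real animation controller.

```python
GAIT_GAZE_PARAMS = {
    # emotion: walk-speed scale, posture openness, gaze behavior
    "happy":   {"speed": 1.15, "openness": 0.9, "gaze": "mutual"},
    "sad":     {"speed": 0.80, "openness": 0.3, "gaze": "averted"},
    "angry":   {"speed": 1.25, "openness": 1.0, "gaze": "direct"},
    "neutral": {"speed": 1.00, "openness": 0.6, "gaze": "neutral"},
}

def apply_emotion(agent: dict, emotion: str) -> dict:
    """Configure a virtual agent's gait and gaze (hypothetical controller API)."""
    p = GAIT_GAZE_PARAMS[emotion]
    agent["walk_speed"] = p["speed"]
    agent["posture_openness"] = p["openness"]
    agent["gaze_mode"] = p["gaze"]
    return agent

print(apply_emotion({}, "sad"))
```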

FVA: Modeling Perceived Friendliness of Virtual Agents Using Movement Characteristics

We present a new approach for improving the perceived friendliness and warmth of a virtual agent in an AR environment by generating appropriate movement characteristics. Our algorithm is based on a novel data-driven friendliness model computed using a user study and psychological findings. We investigated user perception in an AR setting and observed that an FVA yields a statistically significant improvement in perceived friendliness and social presence compared to an agent without friendliness modeling.

Randhavane, T., Bera, A., Kapsaskis, K., Gray, K., & Manocha, D. (2018). FVA: Modeling Perceived Friendliness of Virtual Agents Using Movement Characteristics. To appear in IEEE Transactions on Visualization and Computer Graphics (TVCG), Special Issue for the 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2019.
PDF Video (MP4, 26.6 MB)
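
As a toy illustration of a data-driven friendliness model, the sketch below fits a linear regressor from movement characteristics to user-study friendliness ratings. The features, ratings, and resulting weights are placeholder assumptions, not the paper's actual model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Rows: [waving gestures per minute, gaze-contact fraction, approach speed (m/s)]
X = np.array([[2.0, 0.8, 1.0],
              [0.0, 0.2, 1.4],
              [1.0, 0.6, 1.1],
              [3.0, 0.9, 0.9]])
y = np.array([4.5, 1.8, 3.6, 4.8])   # placeholder friendliness ratings (1-5)

model = LinearRegression().fit(X, y)
print(model.predict([[1.5, 0.7, 1.0]]))  # predicted friendliness for a new agent
```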

Generating Virtual Avatars with Personalized Walking Gaits

We present a novel algorithm for automatically synthesizing personalized walking gaits for a human user from noisy motion capture data. The overall approach is robust and can generate personalized gaits from commodity sensors with little or no artistic intervention.

Narang, S., Best, A., Shapiro, A., & Manocha, D. (2017, October). Generating Virtual Avatars with Personalized Walking Gaits using Commodity Hardware. In Proceedings of the Thematic Workshops of ACM Multimedia 2017 (pp. 219-227).
PDF Video (MP4, 28.6 MB)
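
Since commodity sensors produce noisy joint trajectories, a natural first step in such a pipeline is temporal smoothing. The sketch below applies a Savitzky-Golay filter per joint coordinate; the filter parameters are illustrative assumptions, not the paper's actual reconstruction pipeline.

```python
import numpy as np
from scipy.signal import savgol_filter

def smooth_gait(joints, window=15, order=3):
    """joints: (time, 16, 3) noisy 3D joint positions from a commodity sensor."""
    # Smooth each joint coordinate independently along the time axis.
    return savgol_filter(joints, window_length=window, polyorder=order, axis=0)

noisy = np.cumsum(np.random.randn(240, 16, 3) * 0.01, axis=0)  # synthetic noise
clean = smooth_gait(noisy)
print(clean.shape)  # (240, 16, 3)
```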

EWalk: Emotion Walk

We present a new public domain dataset, EWalk, containing videos of walking individuals. We also provide their gaits in the form of 3D positions of 16 joints, along with emotion labels obtained from a perception study.

Randhavane, T., Bera, A., Kapsaskis, K., Sheth, R., Gray, K., & Manocha, D. (2018).
Dataset Sample
For full dataset, please contact the author: tanmay@cs.unc.edu
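
For illustration, a gait in this format can be represented as a (time, 16, 3) array of joint positions paired with an emotion label; the container below is a hypothetical sketch, not the dataset's actual file layout.

```python
import numpy as np
from dataclasses import dataclass

EMOTIONS = ("happy", "sad", "angry", "neutral")

@dataclass
class EWalkSample:
    gait: np.ndarray   # (time, 16, 3) array of 3D joint positions
    emotion: str       # one of EMOTIONS

sample = EWalkSample(gait=np.zeros((240, 16, 3)), emotion="neutral")
assert sample.gait.shape[1:] == (16, 3) and sample.emotion in EMOTIONS
print(sample.emotion, sample.gait.shape)
```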

DeceptiveWalk

We present a new public domain deception dataset, DeceptiveWalk, which contains 1144 annotated gaits and gestures collected from 88 individuals performing deceptive and natural walking.

Randhavane, T., Bhattacharya, U., Kapsaskis, K., Gray, K., Bera, A., & Manocha, D. (2019).
For full dataset, please contact the author: tanmay@cs.unc.edu