Social Perception of Pedestrians and Virtual Agents
using Movement Features

We present novel algorithms for identifying emotion, dominance, and friendliness characteristics of pedestrians based on their motion behaviors. We also propose models for conveying emotions, friendliness, and dominance traits in virtual agents. We present applications of our algorithms to simulate interpersonal relationships between virtual characters, facilitate socially-aware robot navigation, identify perceived emotions from videos of walking individuals, and increase the sense of presence in scenarios involving multiple virtual agents.

We also present EWalk (Emotion Walk), a dataset of videos of walking individuals annotated with their gaits and labeled emotions.

Recent Projects

Dominance Modeling for Virtual Characters

Dominance Modeling for Robot Navigation

Identifying Emotions from Walking

Conveying Emotions in Virtual Agents

Generating Personalized Walking Gaits

Friendliness Modeling for Virtual Agents

Modeling Data-Driven Dominance Traits for Virtual Characters using Gait Analysis

We present a data-driven algorithm for generating gaits of virtual characters with varying dominance traits. Our gait dominance classification algorithm can classify the dominance traits of gaits with 73% accuracy. We also present an application of our approach that simulates interpersonal relationships between virtual characters. To the best of our knowledge, ours is the first practical approach to classifying gait dominance and generating dominance traits in virtual characters.
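To illustrate the kind of posture cues a gait dominance classifier can operate on, here is a minimal sketch in NumPy. The joint indices, the two features (chest expansion and upright head posture), and the linear scoring are illustrative assumptions, not the paper's actual feature set or classifier.

```python
import numpy as np

# Hypothetical joint indices in a 16-joint skeleton (illustration only).
HEAD, NECK, L_SHOULDER, R_SHOULDER, ROOT = 0, 1, 2, 3, 8

def dominance_features(gait):
    """gait: (frames, 16, 3) array of 3D joint positions.

    Returns two posture cues the psychology literature associates
    with dominance: mean shoulder width (chest expansion) and mean
    head height above the root joint (upright vs. slouched posture).
    """
    shoulder_width = np.linalg.norm(
        gait[:, L_SHOULDER] - gait[:, R_SHOULDER], axis=1).mean()
    head_height = (gait[:, HEAD, 1] - gait[:, ROOT, 1]).mean()
    return np.array([shoulder_width, head_height])

def dominance_score(gait, weights=(1.0, 1.0)):
    """Weighted sum of posture cues; higher = perceived as more dominant."""
    return float(dominance_features(gait) @ np.asarray(weights))
```

In practice such hand-crafted features would feed a trained classifier rather than a fixed weighted sum.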

Randhavane, T., Bera, A., Kubin, E., Gray, K., & Manocha, D. (2018). Modeling Data-Driven Dominance Traits for Virtual Characters using Gait Analysis
PDF Video (MP4, 100 MB)

Pedestrian Dominance Modeling for Socially-Aware Robot Navigation

We present a Pedestrian Dominance Model to identify the dominance characteristics of pedestrians based on their motion behaviors corresponding to trajectory, speed, and personal space. Prior studies in the psychology literature indicate that people are more comfortable around others who exhibit complementary movement behaviors. Our algorithm leverages these findings by enabling robots to exhibit responses complementary to pedestrian dominance.
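A toy sketch of the complementary-response idea, assuming a dominance score in [0, 1] is produced upstream from trajectory, speed, and personal-space features. The parameter names and ranges are illustrative, not the paper's navigation model.

```python
def complementary_params(pedestrian_dominance,
                         max_speed=1.5, min_speed=0.5,
                         min_clearance=0.5, max_clearance=1.5):
    """Map a pedestrian's dominance score in [0, 1] to a complementary
    robot response: yield more space and slow down around dominant
    pedestrians, move faster and closer around submissive ones."""
    d = min(max(pedestrian_dominance, 0.0), 1.0)
    speed = max_speed - d * (max_speed - min_speed)        # m/s
    clearance = min_clearance + d * (max_clearance - min_clearance)  # m
    return {"speed": speed, "clearance": clearance}
```

The two linear maps keep the robot's behavior a smooth, monotone function of perceived dominance, so small estimation errors do not cause abrupt changes in motion.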

Randhavane, T., Bera, A., Kubin, E., Wang, A., Gray, K., & Manocha, D. (2019). Pedestrian Dominance Modeling for Socially-Aware Robot Navigation. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2019).
PDF Video (MP4, 19.7 MB)

Identifying Emotions from Walking using Affective and Deep Features

We present a new data-driven model and algorithm to identify the perceived emotions of individuals based on their gaits. Using affective features computed from psychological findings and deep features learned using an LSTM, we classify the emotional state of a human into one of four emotions (happy, sad, angry, or neutral) with an accuracy of 80.07%. We also present the EWalk (Emotion Walk) dataset, which consists of videos of walking individuals with their gaits and labeled emotions. To the best of our knowledge, this is the first gait-based model to identify perceived emotions from videos of walking individuals.
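To make the "affective feature" idea concrete, here is a minimal sketch of two hand-crafted cues computed from 3D joints; in the paper, such features are combined with deep (LSTM) features before classification. The joint indices and the specific cues are illustrative assumptions, not the paper's exact feature set.

```python
import numpy as np

# Hypothetical joint indices in a 16-joint skeleton (illustration only).
HEAD, NECK, L_HAND, R_HAND = 0, 1, 6, 7

def affective_features(gait, fps=30.0):
    """gait: (frames, 16, 3) joint positions for one walk cycle.

    Returns one posture cue and one movement cue:
    - head drop below the neck (sadness tends to lower the head),
    - mean hand speed (anger/happiness tend to raise arm movement).
    """
    head_drop = (gait[:, NECK, 1] - gait[:, HEAD, 1]).mean()
    hand_vel = np.diff(gait[:, [L_HAND, R_HAND]], axis=0) * fps
    hand_speed = np.linalg.norm(hand_vel, axis=-1).mean()
    return np.array([head_drop, hand_speed])
```

A classifier trained on these features alone would capture the psychology-derived cues; the deep features add sequence patterns that hand-crafted cues miss.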

Randhavane, T., Bhattacharya, U., Kapsaskis, K., Gray, K., Bera, A., & Manocha, D. (2018). Identifying Emotions from Walking using Affective and Deep Features
PDF Video (MP4, 49.8 MB)

EVA: Generating Emotional Behavior of Virtual Agents using Expressive Features of Gait and Gaze

We present a novel, real-time algorithm, EVA, for generating virtual agents with various emotions. Our approach uses non-verbal movement cues, namely gaze and gait, to convey one of four emotions: happy, sad, angry, or neutral. Our studies suggest that the use of EVA, and its gazing behavior in particular, can considerably increase the sense of presence in scenarios with multiple virtual agents. Our results also indicate that both gait and gaze features contribute to the perception of emotions in virtual agents.
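A toy sketch of how a target emotion could be mapped to expressive gait and gaze parameters, in the spirit of EVA's non-verbal cues. The parameter names and values below are illustrative placeholders, not the paper's learned model.

```python
# Multipliers relative to a neutral walk; gaze modes are labels a
# character controller would interpret (all values are assumptions).
EXPRESSIVE_PARAMS = {
    "happy":   {"speed": 1.2, "arm_swing": 1.3, "gaze": "mutual"},
    "sad":     {"speed": 0.8, "arm_swing": 0.7, "gaze": "averted"},
    "angry":   {"speed": 1.3, "arm_swing": 1.4, "gaze": "direct"},
    "neutral": {"speed": 1.0, "arm_swing": 1.0, "gaze": "casual"},
}

def expressive_params(emotion):
    """Look up gait/gaze parameters for one of the four emotions."""
    return EXPRESSIVE_PARAMS[emotion.lower()]
```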

Randhavane, T., Bera, A., Kapsaskis, K., Sheth, R., Gray, K., & Manocha, D. (2018). EVA: Generating Emotional Behavior of Virtual Agents using Expressive Features of Gait and Gaze. To appear in the ACM Symposium on Applied Perception (ACM SAP 2019).
PDF Video (MP4, 64.6 MB)

FVA: Modeling Perceived Friendliness of Virtual Agents Using Movement Characteristics

We present a new approach to improve the friendliness and warmth of a virtual agent in an AR environment by generating appropriate movement characteristics. Our algorithm is based on a novel data-driven friendliness model computed using a user study and psychological characteristics. In a user study conducted in an AR setting, an FVA produced a statistically significant improvement in perceived friendliness and the user's sense of social presence compared to an agent without friendliness modeling.

Randhavane, T., Bera, A., Kapsaskis, K., Gray, K., & Manocha, D. (2018). FVA: Modeling Perceived Friendliness of Virtual Agents Using Movement Characteristics. To appear in IEEE TVCG, Special Issue for the 2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2019).
PDF Video (MP4, 26.6 MB)

Generating Virtual Avatars with Personalized Walking Gaits

We present a novel algorithm for automatically synthesizing personalized walking gaits for a human user from noisy motion capture data. The overall approach is robust and can generate personalized gaits with little or no artistic intervention using commodity sensors.
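One preprocessing step implied by "noisy motion capture data" is temporal smoothing of per-joint trajectories before gait extraction. Here is a minimal moving-average sketch; the window size is an illustrative choice, and the paper's pipeline is not reproduced here.

```python
import numpy as np

def smooth_trajectories(mocap, window=5):
    """mocap: (frames, joints, 3) noisy joint positions.

    Returns a same-shape array smoothed along the time axis with a
    moving average; edge padding avoids shrinking the sequence.
    """
    kernel = np.ones(window) / window
    padded = np.pad(mocap, ((window // 2, window - 1 - window // 2),
                            (0, 0), (0, 0)), mode="edge")
    return np.apply_along_axis(
        lambda t: np.convolve(t, kernel, mode="valid"), 0, padded)
```

A real pipeline would likely use a filter with better frequency response (e.g. Savitzky-Golay) so sharp gait events such as heel strikes are not over-smoothed.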

Narang, S., Best, A., Shapiro, A., & Manocha, D. (2017, October). Generating Virtual Avatars with Personalized Walking Gaits using Commodity Hardware. In Proceedings of the Thematic Workshops of ACM Multimedia 2017 (pp. 219-227).
PDF Video (MP4, 28.6 MB)

EWalk: Emotion Walk

We present a new public domain dataset, EWalk, with videos of walking individuals. We also provide their gaits in the form of 3D positions of 16 joints, along with emotion labels obtained from a perception study.
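A sketch of how an EWalk-style sample (3D positions of 16 joints per frame, plus an emotion label) might be represented and validated. The field names and the root-centering step are assumptions for illustration, not the dataset's actual schema.

```python
import numpy as np

EMOTIONS = ("happy", "sad", "angry", "neutral")

def validate_sample(joints, emotion):
    """joints: (frames, 16, 3) array; emotion: one of EMOTIONS.

    Returns the joints centered on the first joint per frame (assumed
    to be the root here), making gaits translation-invariant.
    """
    joints = np.asarray(joints, dtype=float)
    if joints.ndim != 3 or joints.shape[1:] != (16, 3):
        raise ValueError(f"expected (frames, 16, 3), got {joints.shape}")
    if emotion not in EMOTIONS:
        raise ValueError(f"unknown emotion label: {emotion}")
    return joints - joints[:, :1, :]
```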

Randhavane, T., Bera, A., Kapsaskis, K., Sheth, R., Gray, K., & Manocha, D. (2018).
Dataset Sample
For full dataset, please contact the author: tanmay@cs.unc.edu