Publications by authors named "Tanaya Guha"

We explore the efficacy of multimodal behavioral cues for explainable prediction of personality and interview-specific traits. We utilize elementary head-motion units named kinemes, atomic facial movements termed action units, and speech features to estimate these human-centered traits. Empirical results confirm that kinemes and action units enable the discovery of multiple trait-specific behaviors while also making the predictions explainable.
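As a rough illustration of how behavioral features can support explainable trait prediction, the sketch below fits a ridge regression on synthetic kineme, action-unit, and speech features and reads off the largest-magnitude weights as the trait-relevant behaviors. All data, feature names, and the choice of ridge regression are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

# Hypothetical sketch: explainable trait prediction from behavioral features.
# The feature groups (kineme counts, action-unit activations, speech stats)
# follow the abstract; the data, trait labels, and ridge model are synthetic.
rng = np.random.default_rng(1)
n = 200
kinemes = rng.poisson(3.0, (n, 5))         # counts of 5 head-motion units
aus     = rng.random((n, 4))               # mean activation of 4 action units
speech  = rng.normal(size=(n, 3))          # e.g. pitch/energy statistics
X = np.hstack([kinemes, aus, speech]).astype(float)

# Synthetic trait score driven mostly by one kineme and one action unit
y = 0.8 * X[:, 0] + 1.5 * X[:, 6] + rng.normal(0, 0.1, n)

# Ridge regression: w = (X^T X + lam I)^{-1} X^T y
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Explainability: the largest-magnitude weights name the behaviors
names = [f"kineme_{i}" for i in range(5)] + [f"AU_{i}" for i in range(4)] \
      + [f"speech_{i}" for i in range(3)]
top = sorted(zip(names, w), key=lambda t: -abs(t[1]))[:2]
print([t[0] for t in top])
```

With this synthetic setup the two dominant weights recover the kineme and action unit that actually drive the trait, which is the sense in which a linear model over behavioral units is self-explaining.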

Individuals with Autism Spectrum Disorder (ASD) are known to have significantly limited social interaction abilities, which often manifest in non-verbal communication cues such as facial expressions and atypical eye-gaze responses. While prior work has leveraged pupil response for ASD screening, little work has examined how emotion stimuli influence pupil response in this context. In this paper, we design, develop, and evaluate a lightweight LSTM (Long Short-Term Memory) model that captures pupil responses (pupil diameter, fixation duration, and fixation location) during social interaction with a virtual agent and detects ASD from short interaction sessions.
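To make the modeling idea concrete, here is a minimal numpy sketch of a single LSTM cell run over a sequence of per-frame pupil features, with the final hidden state mapped to a session-level probability. The feature names follow the abstract; all dimensions, weights, and the single-cell architecture are illustrative assumptions rather than the paper's trained model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. W: (4H, D), U: (4H, H), b: (4H,)."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[:H])           # input gate
    f = sigmoid(z[H:2*H])        # forget gate
    o = sigmoid(z[2*H:3*H])      # output gate
    g = np.tanh(z[3*H:])         # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def classify_session(seq, W, U, b, w_out, b_out):
    """Run the LSTM over a (T, D) pupil-feature sequence and
    return a session-level score from the final hidden state."""
    H = U.shape[1]
    h = np.zeros(H)
    c = np.zeros(H)
    for x in seq:
        h, c = lstm_step(x, h, c, W, U, b)
    return sigmoid(w_out @ h + b_out)

rng = np.random.default_rng(0)
D, H = 4, 8                      # 4 pupil features (diameter, fixation
W = rng.normal(0, 0.1, (4*H, D)) # duration, fixation x/y), 8 hidden units
U = rng.normal(0, 0.1, (4*H, H))
b = np.zeros(4*H)
w_out = rng.normal(0, 0.1, H)
b_out = 0.0

seq = rng.normal(size=(30, D))   # toy 30-frame interaction session
p = classify_session(seq, W, U, b, w_out, b_out)
print(round(float(p), 3))
```

In practice the weights would be learned from labeled sessions; the sketch only shows how a short interaction becomes a single detection score.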

We introduce the problem of multi-camera trajectory forecasting (MCTF), which involves predicting the trajectory of a moving object across a network of cameras. While multi-camera setups are widespread in applications such as surveillance and traffic monitoring, existing trajectory forecasting methods typically focus on single-camera trajectory forecasting (SCTF), limiting their use in such applications. Furthermore, a single camera restricts the available field of view, making long-term trajectory forecasting impossible.
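The single-camera limitation can be seen with even the simplest SCTF baseline: a constant-velocity extrapolation of image coordinates. This is an illustrative baseline, not the paper's MCTF method; once the object leaves the camera's field of view, such a forecaster has nothing left to extrapolate, which is exactly the gap MCTF addresses.

```python
import numpy as np

def constant_velocity_forecast(track, horizon):
    """Constant-velocity SCTF baseline.
    track: (T, 2) observed 2D positions; returns (horizon, 2) predictions."""
    track = np.asarray(track, dtype=float)
    v = track[-1] - track[-2]                 # last observed displacement
    steps = np.arange(1, horizon + 1)[:, None]
    return track[-1] + steps * v

obs = [[0, 0], [1, 2], [2, 4]]                # object moving with velocity (1, 2)
print(constant_velocity_forecast(obs, 3).tolist())
# -> [[3.0, 6.0], [4.0, 8.0], [5.0, 10.0]]
```

A multi-camera forecaster would instead have to predict which camera the object re-appears in and where, rather than extrapolating within one view.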

Research shows that neurotypical individuals struggle to interpret the emotional facial expressions of people with Autism Spectrum Disorder (ASD). The current study uses motion-capture to objectively quantify differences between the movement patterns of emotional facial expressions of individuals with and without ASD. Participants volitionally mimicked emotional expressions while wearing facial markers.

Several studies have established that facial expressions of children with autism are often perceived as atypical, awkward or less engaging by typical adult observers. Despite this clear deficit in the quality of facial expression production, very little is understood about its underlying mechanisms and characteristics. This paper takes a computational approach to studying details of facial expressions of children with high functioning autism (HFA).

Children with Autism Spectrum Disorder (ASD) are known to have difficulty in producing and perceiving emotional facial expressions. Their expressions are often perceived as atypical by adult observers. This paper focuses on data-driven ways to analyze and quantify atypicality in facial expressions of children with ASD.

This paper explores the effectiveness of sparse representations obtained by learning an overcomplete basis (dictionary) in the context of action recognition in videos. Although this work concentrates on recognizing human movements (physical actions as well as facial expressions), the proposed approach is fairly general and can be used to address other classification problems. In order to model human actions, three overcomplete dictionary learning frameworks are investigated.
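To illustrate what a sparse representation over an overcomplete dictionary looks like, the toy example below greedily codes a feature vector with a few atoms via matching pursuit. The random dictionary and the choice of matching pursuit are illustrative assumptions; the paper investigates three dictionary-learning frameworks, which this sketch does not reproduce.

```python
import numpy as np

def matching_pursuit(D, x, k):
    """Greedily pick k atoms from dictionary D (unit-norm columns) to code x."""
    r = x.astype(float).copy()
    code = np.zeros(D.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ r)))   # best-correlated atom
        a = float(D[:, j] @ r)                # its coefficient
        code[j] += a
        r -= a * D[:, j]                      # deflate the residual
    return code, r

rng = np.random.default_rng(2)
D = rng.normal(size=(16, 64))                 # 64 atoms in 16-D: overcomplete
D /= np.linalg.norm(D, axis=0)                # unit-norm atoms

x = 2.0 * D[:, 5] - 1.0 * D[:, 40]            # signal built from two atoms
code, resid = matching_pursuit(D, x, k=4)
print(np.linalg.norm(resid) < np.linalg.norm(x))
```

The resulting code has at most k nonzero entries, and it is this sparsity pattern over learned atoms that serves as the feature for classifying actions.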
