Translating our Academic Research Legacy

Our technology and approach build on our team's rich academic research legacy, which has attracted tens of millions of dollars in funding from major grant-making agencies in the US and worldwide.

Our team has published 600+ papers, generating 64k citations.

[Office of Naval Research (ONR) logo]

Our landmark 2009 multimodal depression study started the field.

The world's first multimodal psychopathology paper was published by our co-founder and Chief Scientist Jeff Cohn and colleagues: "Detecting depression from facial actions and vocal prosody," 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, Amsterdam, Netherlands, 2009, pp. 1-7.

We've published 600+ papers, which have received 64k citations

(see selected publications on our methods below)

Detecting history of depression in mother-adolescent child interaction from multimodal feature selection

Bilalpur, M., 2023

Infant AFAR: Automated facial action recognition in infants.

Ertugrul, I.O., 2022

Facial expression manipulation for personalized facial action estimation.

Niinuma, K., 2022

Automated facial expression recognition

Lien, J.J., 1998

Facial action units and head dynamics in longitudinal interviews reveal symptom severity of ocd and depression

Darzi, A., 2021

Long-term ecological assessment of intracranial electrophysiology synchronized to behavioral markers in Obsessive-Compulsive Disorder.

Provenza, N.R., 2021

Deep brain stimulation for depression informed by intracranial recordings.

Sheth, S., 2022

A Novel Framework for Network-Targeted Neuropsychiatric Deep Brain Stimulation

Allawala, A.B., 2021

Systematic evaluation of design choices for deep facial action coding across pose.

Niinuma, K., 2021

A scoping review of machine learning in psychotherapy research

Aafjes-van Doorn, 2021

A person- and time- varying autoregressive model to capture interactive infant-mother head movement dynamics.

Chen, M., 2020

Reconsidering the Duchenne Smile: Formalizing and Testing Hypotheses About Eye Constriction and Positive Emotion

Girard, J.M., 2020

Crossing domains for AU coding: Perspectives, approaches, and measures.

Ertugrul, I.O., 2020

Interpretation of Depression Detection Models via Feature Selection Methods

Alghowinem, S., 2020

D-PAttNet: Dynamic Patch-Attentive Deep Network for Action Unit Detection

Ertugrul, I. O., 2019

Gram matrices formulation of body shape motion: An application for depression severity assessment.

Daoudi, M., 2019

Automated measurement of head movement synchrony during dyadic depression severity interviews.

Bhatia, S., 2019

Dynamics of Face and Head Movement in Infants with and without Craniofacial Microsomia: An Automatic Approach

Hammal, Z., 2019

Learning facial action units with spatiotemporal cues and multi-label sampling

Chu, W. S., 2019

Guest Editorial: The Computational Face

Escalera, S., 2018

Automated face coding: A computer-vision based method of facial expression analysis.

Cohn, J.F., 1997

Multimodal assessment of depression from behavioral signals.

Cohn, J.F., 2018

Detecting depression severity by interpretable representations of motion dynamics.

Kacem, A., 2018

Dynamic multimodal measurement of depression severity using deep autoencoding

Dibeklioğlu, H., 2018

Objective measurement of head movement differences in children with and without autism spectrum disorder

Martin, K.B., 2018

Facial expressiveness in infants with and without craniofacial microsomia: Preliminary findings.

Hammal, Z., 2018

Viewpoint-Consistent 3D Face Alignment

Tulyakov, S., 2018

Dense 3D face alignment from 2D video for real-time use

Jeni, L. A., 2017

Cross-cultural depression recognition from vocal biomarkers

Alghowinem, S., 2015

Survey on RGB, 3D, Thermal, and Multimodal Approaches for Facial Expression Recognition: History, Trends, and Affect-Related Applications

Corneanu, C.A., 2016

Editorial of special issue on spontaneous facial behaviour analysis

Zafeiriou, S., 2016

Confidence Preserving Machine for Facial Action Unit Detection

Zeng, J., 2016

Automatic Measurement of Head and Facial Movement for Analysis and Detection of Infants’ Positive and Negative Affect

Hammal, Z., 2015

Spontaneous facial expression in unscripted social interactions can be measured automatically

Girard, J.M., 2014

Estimating smile intensity: A better way

Girard, J.M., 2015

Multimodal detection of depression in clinical interviews

Dibeklioğlu, H., 2015

Extraversion and the Rewarding Effects of Alcohol in a Social Context

Fairbairn, C. E., 2015

Speech volume indexes sex differences in the social-emotional effects of alcohol.

Fairbairn, C. E., 2015

Automated audiovisual depression analysis

Girard, J.M., 2015

Joint patch and multi-label learning for facial action unit detection

Zhao, K., 2015

Open Challenges in Modelling, Analysis and Synthesis of Human Behaviour in Human–Human and Human–Machine Interactions

Vinciarelli, A., 2015

Predicting Ad Liking and Purchase Intent: Large-Scale Analysis of Facial Responses to Ads

McDuff, D., 2014

Dyadic behavior analysis in depression severity assessment interviews

Scherer, S., 2014

Nonverbal social withdrawal in depression: Evidence from manual and automatic analyses

Girard, J.M., 2014

A high-resolution spontaneous 3D dynamic facial expression database

Zhang, X., 2013

Spatial and Temporal Linearities in Posed and Spontaneous Smiles

Trutoiu, L. C., 2014

Interpersonal coordination of head motion in distressed couples

Hammal, Z., 2014

Facial Action Unit Event Detection by Cascade of Tasks

Ding, X., 2013

Darwin’s Duchenne: Eye Constriction during Infant Joy and Distress

Mattson, W. I., 2013

Relative body parts movement for automatic depression analysis

Joshi, J., 2013

Facial expression as an indicator of pain in critically ill intubated adults during endotracheal suctioning.

Arif-Rahu, M., 2013

Selective Transfer Machine for Personalized Facial Action Unit Detection

Chu, W-S., 2013

DISFA: A spontaneous facial action intensity database

Mavadati, S.M., 2013

Detecting depression severity from intra- and interpersonal vocal prosody

Yang, Y., 2012

The effects of alcohol on the emotional displays of Whites in interracial groups.

Fairbairn, C.E., 2013

In the pursuit of effective affective computing: The relationship between features and registration

Chew, S.W., 2012

Alcohol and group formation: A multimodal investigation of the effects of alcohol on emotion and social bonding

Sayette, M.A., 2012

The Eyes Have It: Making Positive Expressions More Positive and Negative Expressions More Negative

Messinger, D.S., 2012

Painful monitoring: Automatic pain monitoring using the UNBC-McMaster shoulder pain expression archive database

Lucey, P., 2012

Children’s Depressive Symptoms in Relation to EEG Frontal Asymmetry and Maternal Depression

Feng, X., 2012

Dynamic Cascades with Bidirectional Bootstrapping for Action Unit Detection in Spontaneous Facial Behavior

Zhu, Y., 2011

Automatically detecting pain in video through facial action units

Lucey, P., 2011

Something in the Way We Move: Motion Dynamics, Not Perceived Sex, Influence Head Movements in Conversation

Boker, S.M., 2011

Spontaneous facial expression in a small group can be automatically measured: An initial demonstration

Cohn, J.F., 2010

Advances in Behavioral Science Using Automated Facial Image Analysis and Synthesis [Social Sciences]

Cohn, J.F., 2010

Deformable model fitting by regularized landmark mean-shift

Saragih, J.M., 2011

Non-rigid face tracking with enforced convexity and local appearance consistency constraint

Lucey, S., 2010

Multi-PIE

Gross, R., 2010

Effects of damping head movement and facial expression in dyadic conversation using real–time facial expression tracking and synthesized avatars

Boker, S.M., 2009

Detecting depression from facial actions and vocal prosody

Cohn, J.F., 2009

Visual and multimodal analysis of human spontaneous behaviour: Introduction to the Special Issue

Pantic, M., 2009

Efficient constrained local model fitting for non-rigid face alignment

Lucey, S., 2009

Use of Active Appearance Models for Analysis and Synthesis of Naturally Occurring Behavior

Cohn, J.F., 2009

Mapping and Manipulating Facial Expression

Theobald, B.-J., 2009

Automated Measurement of Facial Expression in Infant–Mother Interaction: A Pilot Study

Messinger, D.S., 2009

All Smiles are Not Created Equal: Morphology and Timing of Smiles Perceived as Amused, Polite, and Embarrassed/Nervous

Ambadar, Z., 2009

Puckering and blowing facial expressions in people with facial movement disorders

Denlinger, R.L., 2008

Infant Smiling Dynamics and Perceived Positive Emotion

Messinger, D.S., 2008

Children’s Affect Expression and Frontal EEG Asymmetry: Transactional Associations with Mothers’ Depressive Symptoms

Forbes, E.E., 2008

The painful face – Pain expression recognition using active appearance models

Ashraf, A.B., 2007

Long-term stability of electroencephalographic asymmetry and power in 3 to 9 year-old children

Vuga, M., 2008

Multi-View AAM Fitting and Construction

Ramnath, K., 2008

Robust Biometric Person Identification Using Automatic Classifier Fusion of Speech, Mouth, and Face Experts

Fox, N.A., 2007

Automated Facial Image Analysis: Detecting Improvement in Abnormal Facial Movement After Treatment With Botulinum Toxin A

Rogers, C.R., 2007

Meticulously detailed eye region model and its application to analysis of facial images

Moriyama, T., 2006

Children's affect regulation during a disappointment: Psychophysiological responses and relation to parent history of depression

Forbes, E.E., 2006

Long-term stability of frontal electroencephalographic asymmetry in adults with a history of depression and controls

Vuga, M., 2006

Maternal depression, child frontal asymmetry, and child affective behavior as factors in child behavior problems

Forbes, E.E., 2006

Affect-modulated startle in adults with childhood-onset depression: Relations to bipolar course and number of lifetime depressive episodes

Forbes, E.E., 2005

Facial expression analysis: Preliminary results of a new image-processing based method.

Cohn, J.F., 1996

Automatic Recognition of Eye Blinking in Spontaneously Occurring Behavior

Cohn, J.F., 2003

Regional Patterns of Brain Activity in Adults With a History of Childhood-Onset Depression: Gender Differences and Clinical Variability

Miller, A., 2002

Recognizing Action Units for Facial Expression Analysis

Tian, Y.I., 2001

Detection, tracking, and classification of action units in facial expression

Lien, J.J.J., 2000

Specific Impairment of Smiling Increases the Severity of Depressive Symptoms in Patients with Facial Neuromuscular Disorders

VanSwearingen, J.M., 1999

Automated face analysis by feature point tracking has high concurrent validity with manual FACS coding

Cohn, J.F., 1999

Subtly different facial expression recognition and emotion expression intensity estimation

Lien, J.J., 1998

Automatically recognizing facial expressions in spatio-temporal domain using hidden Markov models

Lien, J.J., 1997

Vocal timing in face-to-face interaction of clinically depressed and nondepressed mothers and their 4-month-old infants

Zlochower, A.J., 1996
