Scientific discoveries pave the way for technology that responds appropriately to our expressions.
1739
Hume argues that emotions drive choice and well-being.
“Reason is and ought only to be the slave of the passions” - David Hume.
At Hume AI, we take this as a guiding principle behind ethical AI: to address our needs, technology should be guided by how we express them.
Recognizing the need to map out the internal states that animate thought and action, David Hume also proposed a taxonomy of over 16 emotions, but he lacked the scientific evidence to test it.
1872
Darwin surveys human expression.
Darwin described similarities and differences in over 20 facial, bodily, and vocal expressions across species, cultures, and stages of life.
The Expression of the Emotions in Man and Animals was his third major work.
Darwin lacked the statistical methods to test his hypotheses about human expression, but 150 years later, studies are confirming many of his observations.
1969
Ekman documents six facial expressions.
Paul Ekman traveled the world to study whether six expressions were universally recognized.
By focusing on a narrow set of behaviors, Ekman was able to use the limited data available to him to confirm some of Darwin’s ideas.
However, this narrow focus on six possible meanings for expressions also introduced what we call the 30% problem: for the next 50 years, scientists studied less than 30% of what people express.
1970 - 2020
Scientists reduce expression even further.
While some scientists assume six possible meanings of expressions, others attempt to derive taxonomies of expression from data.
However, due to small sample sizes and statistical limitations, these attempts lead to even more reductive theories.
Some scientists endorse “core affect”: the notion that the meanings of expressive behaviors are largely captured by how pleasant or unpleasant and calm or aroused they seem.
2022
The full spectrum of expression.
The scientists behind Hume are pioneering data-driven approaches to expression.
We explore the full range of signals in the voice, face, and body using computational methods, large-scale experiments, and diverse datasets. We've collected millions of reactions to videos, music, and art; analyzed the brain mechanisms of emotion; investigated expressions in ancient sculptures; and used deep learning to measure expressions in videos from around the world.
Our findings reveal over 30 dimensions of expressive behavior.
Our Research
Mapping Expression
Exploring the full spectrum of expression
We’ve introduced new datasets and statistical methods to explore the dimensions of meaning that explain the feelings we report in different situations, the patterns of brain activity they evoke, physiological responses like goosebumps, and nuanced expressions in the face, body, and voice.
The Face and Body
Over 28 nuanced facial expressions
With millions of reactions captured, studied, and mapped, our work reveals that facial expressions are more than three times more diverse and complex than previously assumed. These findings pave the way for vast improvements in technologies that interpret facial expression.
The Voice
The power of laughs, cries, and sighs
The voice is revealing itself to be an even more important medium for nonverbal expression than the face. Our computational work reveals dozens of kinds of vocal emotional expression, from sighs and gasps to grunts and growls, opening the door to technologies that better understand and communicate with us to improve our well-being.
Papers
Learn more by exploring our publications
Deep learning reveals what vocal bursts express in different cultures
What the face displays: Mapping 28 emotions conveyed by naturalistic expression
Sixteen facial expressions occur in similar contexts worldwide
Mapping the passions: Toward a high-dimensional taxonomy of emotional experience and expression
The primacy of categories in the recognition of 12 emotions in speech prosody across two cultures
Semantic Space Theory: Data-driven insights into basic emotions
Mapping 24 emotions conveyed by brief human vocalization
The neural representation of visually evoked emotion is high-dimensional, categorical, and distributed across transmodal brain regions
What music makes us feel: At least 13 dimensions organize subjective experiences associated with music across different cultures
GoEmotions: A dataset of fine-grained emotions
Self-report captures 27 distinct categories of emotion bridged by continuous gradients
Universal facial expressions uncovered in art of the ancient Americas: A computational approach
Emotional expression: Advances in basic emotion theory
The MuSe 2022 Multimodal Sentiment Analysis Challenge: Humor, Emotional Reactions, and Stress
The ACII 2022 Affective Vocal Bursts Workshop & Competition: Understanding a critically understudied modality of emotional expression
The ICML 2022 Expressive Vocalizations Workshop and Competition: Recognizing, generating, and personalizing vocal bursts
Intersectionality in emotion signaling and recognition: The influence of gender, ethnicity, and social class
How emotions, relationships, and culture constitute each other: Advances in social functionalist theory