Paula M. Niedenthal

Paula M. Niedenthal is a social psychologist and professor of psychology at the University of Wisconsin–Madison, where she also completed her undergraduate studies, receiving a Bachelor's degree in psychology. She earned her Ph.D. at the University of Michigan before joining the psychology faculties of Johns Hopkins University and Indiana University. Until recently, she served as Director of Research in the National Centre for Scientific Research at the Université Blaise Pascal in Clermont-Ferrand, France. Niedenthal's research spans several levels of analysis of emotional processes, including emotion–cognition interaction and representational models of emotion. She has authored more than 80 articles and chapters, as well as several books, and is a fellow of the Society for Personality and Social Psychology.

Honors and awards

 * Theoretical Innovation Award, Society for Personality and Social Psychology, 2004
 * Distinguished Teacher Award Finalist, Johns Hopkins University, 1992
 * Teaching Excellence Award, Johns Hopkins University, 1991
 * Sigma Xi Scientific Research Award, University of Michigan, 1986-1987

Niedenthal Emotions Lab
Upon arriving at the University of Wisconsin–Madison as a full professor in 2011, Niedenthal assembled the Niedenthal Emotions Lab to continue her earlier research and to pursue new research questions relating to human emotion. The lab consists of undergraduate research assistants, graduate students under Niedenthal's supervision, and other interested or collaborating parties. Undergraduates are given the chance to develop research projects, collect and analyze data, and experience the entire process of research in social psychology.

Embodying emotions
Niedenthal devotes a large portion of her research to the embodiment of emotions. Embodiment refers both to actual bodily states and to simulations of experience in the brain's modality-specific systems for perception, action, and introspection. One of Niedenthal's studies focuses primarily on this central sense of embodiment: the brain's modality-specific systems, which include the sensory systems that underlie perception of a current situation, the motor systems that underlie action, and the introspective systems that underlie conscious experiences of emotion, motivation, and cognitive operations. Niedenthal draws on several theories of embodied cognition to explain how such phenomena can be grounded in these systems and in bodily states. The main idea underlying theories of embodied cognition is that cognitive representations and operations are fundamentally grounded in their physical context, with cognition relying heavily on the brain's modality-specific systems and actual bodily states. Niedenthal argues that it is becoming increasingly essential to demonstrate the causal role of embodiment in higher cognition. Drawing on Wilson's distinction between "online" and "offline" embodiment, she has suggested that embodiment plays a large role in the processing of emotional information: it is present both when individuals respond to real emotional objects and when they represent the meanings of emotional symbols (e.g., words).

Facial expressions and embodiment
Niedenthal has demonstrated that mimicry plays a causal role in the processing of emotional expression. Consistent with the embodiment hypothesis, participants in her study who were free to mimic facial expressions detected changes in emotional expression earlier and more efficiently, for any facial expression, than those who were prevented from mimicking. Further studies have shown that, when making judgments, individuals embodied the relevant discrete emotion, as indicated by their facial expressions. Experiencing particular emotions stimulates different facial muscles depending on the response. Using electromyographic recording, Niedenthal was able to observe the participants' resulting facial activity; four muscles that were activated were the orbicularis oculi, the zygomaticus, the corrugator, and the levator. It is also noted that embodiment does not occur when the information can be processed through association or perceptual features.

Facial expressions and attachment
According to Niedenthal, the perception of facial expressions of emotion can be affected by emotional states, among many other factors. With respect to attachment orientation, Niedenthal has proposed that lower-level cognitive processes of perception, such as the timing of the perceived offset of a facial expression, influence the perception of facial expressions of emotions such as happiness, anger, fear, and sadness.

Emotional response categorization
Emotional response categorization is defined as "the mental grouping together of objects and events that elicit the same emotion, and the treatment of those objects and events as 'the same kind of thing'". It enables an individual to choose an appropriate action based on their own categorization: by understanding the meaning of an object in light of past experiences, the individual can imagine their response to an event or object and anticipate the outcome.

Emotional response categorization theory
Proposed in 1999 by Niedenthal et al., emotional response categorization theory holds that emotional states provide a form of conceptual coherence: objects or events that pertain to the same emotion are grouped together into categories. The theory consists of three claims:
 * 1) Basic emotions provide the mental structure for emotional response categorization.
 * 2) The experience of a basic emotion (happiness, sadness, fear, anger, etc.) leads to the process of emotional response equivalence, allowing categorization to occur.
 * 3) Selective attention to the emotional response associated with an object or event results in emotional response categorization.

Structure of emotion
Emotional state and arousal are important in facilitating memory, in that objects and events can be distinguished into high- and low-arousal categories. These arousal categories, based on basic emotions, form the basis for the emotional learning that is essential for understanding.

Basic emotions organize concepts into categories
An individual is more likely to assign an emotional experience to a category based on relevance and emotional response categorization. Niedenthal proposes that an experienced emotional state increases the use of all emotional response categories, not just the one being experienced.

Selective attention to emotional response
Niedenthal and colleagues contend that, while experiencing an emotional state, an individual selectively attends to stimuli that carry greater emotional meaning and importance. These previously experienced emotional responses have a larger impact on judgments and behaviors.

An individual experiencing a strong emotional state due to naturally occurring events will organize their perceptual and conceptual worlds differently from someone in a neutral state. In other words, when experiencing a strong emotional state, an individual is more likely to associate the experienced emotion with the targeted objects or events, and to associate a wider range of objects and events with that specific emotion than a person in a neutral state would. It is important to note that emotional response categorization is context-dependent, in that the resulting response depends greatly on an individual's past experiences.

Emotion congruence and selective perception
One particular area of Niedenthal's research asks whether we are more attuned to perceiving things that are congruent with our mood. For example, a 1994 study by Niedenthal and Marc Setterlund suggests that happiness and sadness have emotion-congruent effects on selective perception. In that study, participants wore earphones and listened to music throughout the experiment. Half of the participants heard classical music intended to induce a happy mood (the allegro from Mozart's Eine kleine Nachtmusik and parts of Vivaldi's Concerto in C Major), and half heard classical music intended to induce a sad mood (Mahler's Adagietto and the adagio from Rachmaninov's Piano Concerto No. 2 in C Minor). Subjects were then asked to perform a lexical decision task: letter strings were flashed on a screen, some of which were real words and some non-words (pronounceable strings not found in the dictionary, such as "blang"). The real words fell into five categories: happy words, positive words unrelated to happiness, neutral words, negative words unrelated to sadness, and sad words.

Niedenthal and Setterlund found that the music did induce happy or sad moods, and that participants in a happy mood were quicker at identifying happy words than sad words. These findings are in line with the emotion-congruence thesis and support the conclusion that our existing moods and emotions lead us to selectively perceive emotion-congruent objects and events.

Neural computation
A 2008 study by Niedenthal and colleagues challenges research on the anger superiority effect, which holds that angry faces are easier to detect than happy faces in a crowd of neutral ones. This effect was thought to have evolved over phylogenetic development because the ability to quickly detect a threat in a given environment is beneficial. Niedenthal et al. cite recent research questioning whether hidden perceptual, rather than emotional, factors might account for this so-called anger superiority effect. Consistent with those findings, they tested the anger superiority effect using a neural analysis of human faces in two different simulations designed to separate emotional and perceptual processes. They found that a perceptual bias produces the faster, more accurate identification of angry faces.