ID: 555085.0, MPI für Psycholinguistik / Yearbook 2011
Perceptual cues in nonverbal vocal expressions of emotion
Authors: Sauter, Disa; Eisner, Frank; Calder, Andrew J.; Scott, Sophie K.
Date of Publication (YYYY-MM-DD): 2010-11
Title of Journal: Quarterly Journal of Experimental Psychology
Volume: 63
Issue / Number: 11
Start Page: 2251
End Page: 2272
Review Status: Peer-review
Audience: Not Specified
Intended Educational Use: No
Abstract / Description: Work on facial expressions of emotions (Calder, Burton, Miller, Young, & Akamatsu, 2001) and emotionally inflected speech (Banse & Scherer, 1996) has successfully delineated some of the physical properties that underlie emotion recognition. To identify the acoustic cues used in the perception of nonverbal emotional expressions like laughter and screams, an investigation was conducted into vocal expressions of emotion, using nonverbal vocal analogues of the “basic” emotions (anger, fear, disgust, sadness, and surprise; Ekman & Friesen, 1971; Scott et al., 1997), and of positive affective states (Ekman, 1992, 2003; Sauter & Scott, 2007). First, the emotional stimuli were categorized and rated to establish that listeners could identify and rate the sounds reliably and to provide confusion matrices. A principal components analysis of the rating data yielded two underlying dimensions, correlating with the perceived valence and arousal of the sounds. Second, acoustic properties of the amplitude, pitch, and spectral profile of the stimuli were measured. A discriminant analysis procedure established that these acoustic measures provided sufficient discrimination between expressions of emotional categories to permit accurate statistical classification. Multiple linear regressions with participants' subjective ratings of the acoustic stimuli showed that all classes of emotional ratings could be predicted by some combination of acoustic measures and that most emotion ratings were predicted by different constellations of acoustic features. The results demonstrate that, similarly to affective signals in facial expressions and emotionally inflected speech, the perceived emotional character of affective vocalizations can be predicted on the basis of their physical features.
Free Keywords: emotion; voice; vocalizations; acoustics; non-verbal behaviour
External Publication Status: published
Document Type: Article
Communicated by: Karin Kastens
Affiliations: MPI für Psycholinguistik
External Affiliations: External Organizations
Identifiers: URL: http://pubman.mpdl.mpg.de/pubman/item/escidoc:1021...
DOI: 10.1080/17470211003721642
LOCALID: escidoc:102143
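
The multiple linear regressions described in the abstract, in which perceived emotion ratings are predicted from measured acoustic properties, can be illustrated with a minimal sketch. This is not the authors' analysis code; the acoustic feature names and all numeric values below are invented toy data, chosen so that the ratings are an exact linear function of the features.

```python
def solve(A, b):
    """Solve the linear system A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def fit(X, y):
    """Least-squares multiple linear regression via the normal equations.

    Returns [b0, b1, ..., bk] such that y ≈ b0 + b1*x1 + ... + bk*xk.
    """
    Z = [[1.0] + row for row in X]  # prepend an intercept column
    n, p = len(Z), len(Z[0])
    XtX = [[sum(Z[r][i] * Z[r][j] for r in range(n)) for j in range(p)] for i in range(p)]
    Xty = [sum(Z[r][i] * y[r] for r in range(n)) for i in range(p)]
    return solve(XtX, Xty)

# Hypothetical acoustic measures per stimulus: [scaled mean pitch, scaled amplitude]
X = [[0.2, 0.1], [0.5, 0.4], [0.9, 0.8], [0.4, 0.6]]
# Hypothetical arousal ratings, constructed so that y = 1 + 2*pitch + 3*amplitude
y = [1.7, 3.2, 5.2, 3.6]

beta = fit(X, y)
print(beta)  # recovers the coefficients [1.0, 2.0, 3.0] (up to floating-point error)
```

The abstract's point is that different emotions are predicted by different constellations of such coefficients; in practice one regression would be fitted per rating scale, and a discriminant analysis (not shown) would use the same acoustic measures for category classification.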