Contemporary affective multimedia databases are simple repositories of audiovisual multimedia documents, such as pictures, sounds, videos and text, with described general semantics and emotional content.
Two important features distinguish affective multimedia databases from other multimedia repositories:
- the purpose of the multimedia documents, and
- the emotion representation of the multimedia documents.
Multimedia documents in affective multimedia databases are aimed at inducing or stimulating emotions in exposed subjects. As such they are usually referred to as stimuli. All multimedia documents (i.e. stimuli) in affective multimedia databases have specified semantic and emotional content. Two predominant theories used to describe emotion are the discrete category model and the dimensional model of affect.
All of these databases have been characterized according to at least one of these models. Other distinctive attributes of affective multimedia databases are:
- prevalent usage in psychological, psychophysiological and neuroscience research;
- interaction with domain tests and questionnaires;
- ethical concerns;
- harder and more diverse demands on document retrieval;
- the need for richer and more accurate semantic descriptors;
- the need to bridge the semantic gap;
- scalability and interoperability;
- virtual reality hardware support, and;
- an advanced user interface for less computer-proficient users, with separate visualization systems for stimuli exposure supervisors and subjects.
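The two emotion models and the retrieval demands described above can be made concrete with a small sketch. The record layout below is hypothetical (the field names, rating scales and sample values are illustrative assumptions, not taken from IAPS or any real dataset); it shows how a single stimulus can carry both a discrete-category label and dimensional valence-arousal ratings, and how dimensional annotations support retrieval by proximity to a target affective state:

```python
from dataclasses import dataclass

# Illustrative sketch only: field names and the 1-9 rating scales are
# assumptions modeled on common self-assessment conventions, not the
# actual schema of IAPS, IADS or any other database discussed here.

@dataclass
class Stimulus:
    doc_id: str
    modality: str    # e.g. "picture", "sound", "video", "text"
    category: str    # discrete emotion model: e.g. "fear", "joy"
    valence: float   # dimensional model: 1 (unpleasant) .. 9 (pleasant)
    arousal: float   # dimensional model: 1 (calm) .. 9 (excited)

def retrieve(stimuli, valence, arousal, k=3):
    """Return the k stimuli closest to a target point in valence-arousal space."""
    return sorted(
        stimuli,
        key=lambda s: (s.valence - valence) ** 2 + (s.arousal - arousal) ** 2,
    )[:k]

stimuli = [
    Stimulus("p001", "picture", "fear", 2.1, 6.8),
    Stimulus("p002", "picture", "joy", 7.9, 5.2),
    Stimulus("s001", "sound", "sadness", 2.8, 3.4),
]

# Example query: stimuli suitable for inducing a pleasant, moderately calm state.
for s in retrieve(stimuli, valence=7.0, arousal=4.0, k=2):
    print(s.doc_id, s.category)
```

A production system would of course combine such dimensional queries with semantic descriptors, which is precisely where the semantic-gap and retrieval challenges listed above arise.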
The International Affective Picture System (IAPS) and the International Affective Digital Sounds system (IADS) are two of the most cited tools in the area of affective stimulation.
These databases were created with three goals in mind:
- Better experimental control of emotional stimuli;
- Easier comparison of results across studies;
- Facilitating direct replication of undertaken studies.
The same standardization principles are shared among other similar affective multimedia databases. Apart from the IAPS and IADS, the most important readily available affective multimedia databases are the Geneva Affective PicturE Database (GAPED), Nencki Affective Picture System (NAPS), Dataset for Emotion Analysis using EEG, Physiological and Video Signals (DEAP), NimStim Face Stimulus Set, Pictures of Facial Affect (POFA), Karolinska Directed Emotional Faces (KDEF), Japanese and Caucasian Facial Expressions of Emotion and Neutral Faces (JACFEE and JACNeuF), the CAlifornia Facial expressions of Emotion (CAFE), Yale Face Database, Yale Face Database B, Japanese Female Facial Expression (JAFFE) Database, Facial Expressions and Emotion Database (FEED), Radboud Faces Database (RaFD), Affective Norms for English Words (ANEW), Affective Norms for English Texts (ANET) and SentiWordNet.
Additional audio-visual affective multimedia databases with category or dimensional emotion annotations are listed in Table 1. As can be seen in Table 1, facial expression databases are by far the most numerous modality among affective multimedia databases. Although facial expression databases are employed in emotion elicitation, they are also often used for face recognition and face detection. Databases serving any of these three purposes are commonly called face databases. A more detailed overview of these databases is available from (Gross, 2005).
Table 1. The list of the most often used collections of audiovisual stimuli. Some datasets had multiple revisions in the designated time frame. Owner refers to an institution that distributes a specific dataset.
| Dataset | Modality | Owner | Period |
|---|---|---|---|
| IAPS | Picture | University of Florida, The Center for the Study of Emotion and Attention | 1997-2008 |
| GAPED | Picture | University of Geneva, Swiss Center for Affective Sciences | 2011 |
| NAPS | Picture | Nencki Institute of Experimental Biology, Polish Academy of Sciences | 2013 |
| IADS | Sound | University of Florida, The Center for the Study of Emotion and Attention | 1999 |
| DEAP | Video | Queen Mary University of London | 2012 |
| NimStim | Facial expression | The MacArthur Foundation Research Network on Early Experience and Brain Development | 2009 |
| POFA | Facial expression | Paul Ekman Group | 1976-1993 |
| KDEF | Facial expression | Karolinska Institutet, Department of Clinical Neuroscience, Section of Psychology | 1998 |
| CAFE | Facial expression | University of California | 2001 |
| Yale Face Database | Facial expression | Yale University | 1997 |
| Yale Face Database B | Facial expression | Yale University | 2001 |
| JAFFE | Facial expression | Kyushu University, Psychology Department | 1998 |
| ANEW | Text | University of Florida, The Center for the Study of Emotion and Attention | 1999 |
| ANET | Text | University of Florida, The Center for the Study of Emotion and Attention | 1999-2007 |