Document Type

Conference Paper


Available under a Creative Commons Attribution Non-Commercial Share Alike 4.0 International Licence


Computer Sciences

Publication Details

The 4th International Conference on Cybernetics and Information Technologies, Systems and Applications: CITSA 2007 jointly with The 5th International Conference on Computing, Communications and Control Technologies: CCCT 2007


This paper details experimental procedures designed to elicit real emotional responses from participants within a controlled acoustic environment. The experiments use Mood Induction Procedures (MIPs), specifically MIP 4, to implement a cooperative task involving two participants. These cooperative tasks are designed to engender emotional responses of activation and evaluation from the participants, who are situated in separate isolation booths; this reduces unwanted noise in the signal, prevents the participants from being distracted, and ensures a cleanly recorded audio signal. The audio is recorded at a professional level of quality (24-bit/192 kHz). The emotional dimensions of each audio recording will be evaluated using listening tests in conjunction with the FeelTrace tool, providing a statistical evaluation of these recordings that will be used to compile an emotional speech corpus. This corpus can then be analysed to define a set of rules for the detection of basic emotional dimensions in speech.
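As background to the evaluation step described above: FeelTrace records continuous listener ratings as positions in the two-dimensional activation-evaluation plane (both axes conventionally scaled to [-1, 1]). The sketch below is purely illustrative and not from the paper; it shows one plausible way to summarise pooled rater samples for a single recording into mean and spread per dimension, the kind of per-recording statistic a corpus annotation might store. The function name, data layout, and sample values are all hypothetical.

```python
# Illustrative sketch (not from the paper): aggregating hypothetical
# FeelTrace-style ratings. Each sample is an (activation, evaluation)
# pair in [-1, 1]; pooling across listeners and averaging yields one
# summary point per recording.
from statistics import mean, stdev

def summarise_ratings(traces):
    """traces: list of (activation, evaluation) samples pooled across raters."""
    activation = [a for a, _ in traces]
    evaluation = [e for _, e in traces]
    return {
        "activation_mean": mean(activation),
        "activation_sd": stdev(activation),
        "evaluation_mean": mean(evaluation),
        "evaluation_sd": stdev(evaluation),
    }

# Hypothetical samples from listeners rating one recording:
samples = [(0.6, 0.2), (0.5, 0.3), (0.7, 0.1), (0.4, 0.25)]
summary = summarise_ratings(samples)
print(summary["activation_mean"])  # 0.55
```

A real corpus pipeline would additionally need to time-align each rater's trace with the audio before pooling; that step is omitted here for brevity.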


European Union