University of Reading Research Data Archive

The Reading Everyday Emotion Database (REED)

Description

We developed a set of audio-visual recordings of emotions called the Reading Everyday Emotion Database (REED). Twenty-two native British English adults (12 females, 10 males) from a diverse age range and with drama/acting experience were recorded producing utterances of various lengths in spoken and sung conditions in 13 emotions (neutral, the 6 basic emotions, and 6 complex emotions) using everyday recording devices (e.g., laptops, mobile phones, etc.). All the recordings were validated by a separate, independent group of raters (n = 155 adults), and the database consists only of recordings that were recognised above chance.

This dataset contains metadata about the files in the database (in the README file). It also contains:
- data_validation.csv -- Data for the validation task
- UoR-DataAccessAgreement-000336.pdf -- The REED Data Access Agreement
- example_clips.zip -- Example clips of the REED
- InfoSheet.pdf -- Information sheet (the copy used for participants involved in the REED study)
- ConsentForm.pdf -- Consent form (the copy used for participants involved in the REED study)

Not available in current dataset:
- The REED (will be sent to users once they have signed the Data Access Agreement)

To request access to the REED, please complete a data access request at https://redcap.link/data-request. If your application meets the criteria, a Data Access Agreement (see "UoR-DataAccessAgreement-000336.pdf" for a copy) will be sent to your nominated institutional signatory. Once the agreement is signed and returned, the Research Data Service team will arrange secure access to the dataset.

Resource Type: Dataset
Creators: Ong, Jia Hoong (ORCID: https://orcid.org/0000-0003-1503-8311), Leung, Florence (ORCID: https://orcid.org/0000-0001-9136-1263) and Liu, Fang (ORCID: https://orcid.org/0000-0002-7776-0222)
Rights-holders: University of Reading
Data Publisher: University of Reading
Publication Year: 2021
Data last accessed: 26 June 2022
DOI: https://doi.org/10.17864/1947.000336
Metadata Record URL: https://researchdata.reading.ac.uk/id/eprint/336
Organisational units: Life Sciences > School of Psychology and Clinical Language Sciences
Participating Organisations: University of Reading
Access restrictions: The complete REED database is available to authorised users subject to a Data Access Agreement between the University of Reading and a recipient organisation. A copy of the Data Access Agreement is included with this item.
A subset of example clips from the database is made available for use under a Creative Commons Attribution-NonCommercial 4.0 International Licence (https://creativecommons.org/licenses/by-nc/4.0/). Those clips are provided in the 'example_clips.zip' file, and only those clips may be used for publication and presentation purposes.
Keywords: emotion, audio-visual, database, stimuli set, speech, song

