How to cite this Dataset
Ong, Jia Hoong, Leung, Florence and Liu, Fang (2022): The Reading Everyday Emotion Database (REED): version 2.0. University of Reading. Dataset. https://doi.org/10.17864/1947.000407
This is the latest version of this item.
Description
We developed the Reading Everyday Emotion Database (REED), a set of audio-visual recordings of emotions. Twenty-two native British English adults (12 females and 10 males), spanning a wide age range and all with drama/acting experience, were recorded producing utterances of various lengths in both spoken and sung conditions across 13 emotions (neutral, the 6 basic emotions, and 6 complex emotions), using everyday recording devices (e.g., laptops, mobile phones). All recordings were validated by a separate, independent group of raters (n = 168 adults).
This dataset contains metadata about the files in the database (in the README file). In addition, it contains the following files (a brief inspection sketch follows the lists below):
- REED_validation_summary.csv -- Data for the validation task
- UoR-DataAccessAgreement-000407.pdf -- The REED Data Access Agreement
- example_clips.zip -- Example clips from the REED
Not available in the current dataset:
- The REED itself (sent to users once they have signed the Data Access Agreement)
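For orientation, here is a minimal sketch of how the two supplementary files included with this record might be inspected. It assumes Python with pandas installed; the column names of REED_validation_summary.csv are not documented here, so the sketch only prints what is present rather than assuming any particular structure.

```python
# Minimal inspection sketch (assumes Python 3 with pandas installed).
# File names match those listed above; no assumptions are made about the
# columns inside REED_validation_summary.csv -- they are simply printed.
import zipfile

import pandas as pd

# Load the validation task data and show its shape and column names.
validation = pd.read_csv("REED_validation_summary.csv")
print(validation.shape)
print(validation.columns.tolist())

# List the example clips bundled with this record.
with zipfile.ZipFile("example_clips.zip") as archive:
    for name in archive.namelist():
        print(name)
```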
To request access to the REED, please complete a data access request at https://redcap.link/data-request. If your application meets the criteria, a Data Access Agreement (see "UoR-DataAccessAgreement-000407.pdf" for a copy) will be sent to your nominated institutional signatory. Once the agreement is signed and returned, the Research Data Service team will arrange secure access to the dataset.
| Resource Type | Dataset |
|---|---|
| Creators | Ong, Jia Hoong (ORCID: https://orcid.org/0000-0003-1503-8311), Leung, Florence (ORCID: https://orcid.org/0000-0001-9136-1263) and Liu, Fang (ORCID: https://orcid.org/0000-0002-7776-0222) |
| Rights-holders | University of Reading |
| Data Publisher | University of Reading |
| Publication Year | 2022 |
| Data last accessed | 17 November 2024 |
| DOI | https://doi.org/10.17864/1947.000407 |
| Metadata Record URL | https://researchdata.reading.ac.uk/id/eprint/407 |
| Organisational units | Life Sciences > School of Psychology and Clinical Language Sciences |
| Participating Organisations | University of Reading |
| Keywords | emotion, audio-visual, database, stimuli set, speech, song |
| Data Availability | RESTRICTED |
| Restrictions | This dataset is wholly or partly restricted. The dataset may be provided to authorised users subject to a Data Access Agreement between the University of Reading and a recipient organisation. A copy of the Data Access Agreement is included with this item. To request access to the dataset, please complete a data access request at https://redcap.link/data-request |
Available Versions of this Item
- The Reading Everyday Emotion Database (REED). (deposited 08 Dec 2021 13:32)
- The Reading Everyday Emotion Database (REED): version 2.0. (deposited 02 Aug 2022 12:51) [Currently Displayed]