How to cite this Dataset
Ong, Jia Hoong, Leung, Florence and Liu, Fang (2021): The Reading Everyday Emotion Database (REED). University of Reading. Dataset. https://doi.org/10.17864/1947.000336
Description
We developed a set of audio-visual recordings of emotions called the Reading Everyday Emotion Database (REED). Twenty-two native British English adults (12 females, 10 males) from a diverse age range, all with drama/acting experience, were recorded producing utterances of various lengths in spoken and sung conditions in 13 emotions (neutral, the 6 basic emotions, and 6 complex emotions) using everyday recording devices (e.g., laptops, mobile phones). All recordings were validated by a separate, independent group of raters (n = 155 adults), and the database consists only of recordings that were recognised above chance.
This dataset contains metadata about the files in the database (in the README file). It also contains:
- data_validation.csv -- Data for the validation task (see the loading sketch after the access instructions below)
- UoR-DataAccessAgreement-000336.pdf -- The REED Data Access Agreement
- example_clips.zip -- Example clips of the REED
- InfoSheet.pdf -- Information sheet (the copy used for participants involved in the REED study)
- ConsentForm.pdf -- Consent form (the copy used for participants involved in the REED study)
Not available in the current dataset:
- The REED recordings (will be sent to users once they have signed the Data Access Agreement)
To request access to the REED, please complete a data access request at https://redcap.link/data-request. If your application meets the criteria, a Data Access Agreement (see "UoR-DataAccessAgreement-000336.pdf" for a copy) will be sent to your nominated institutional signatory. Once the agreement is signed and returned, the Research Data Service team will arrange secure access to the dataset.
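Once access has been granted, the validation data can be inspected with standard tools. The snippet below is a minimal sketch, assuming Python with pandas and that data_validation.csv sits in the working directory; the column layout is documented in the README rather than reproduced here, so the script simply reports whatever columns it finds.

```python
# Minimal sketch: inspect data_validation.csv (assumes pandas is installed
# and the file is in the current working directory; the column schema is
# documented in the README, so nothing is assumed about it here).
import pandas as pd

validation = pd.read_csv("data_validation.csv")

# Report the overall shape and the column names so the structure can be
# checked against the README before any analysis.
print(f"{len(validation)} rows x {len(validation.columns)} columns")
print("Columns:", list(validation.columns))
print(validation.head())
```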
| Resource Type: | Dataset |
| --- | --- |
| Creators: | Ong, Jia Hoong (ORCID: https://orcid.org/0000-0003-1503-8311), Leung, Florence (ORCID: https://orcid.org/0000-0001-9136-1263) and Liu, Fang (ORCID: https://orcid.org/0000-0002-7776-0222) |
| Rights-holders: | University of Reading |
| Data Publisher: | University of Reading |
| Publication Year: | 2021 |
| Data last accessed: | 20 November 2024 |
| DOI: | https://doi.org/10.17864/1947.000336 |
| Metadata Record URL: | https://researchdata.reading.ac.uk/id/eprint/336 |
| Organisational units: | Life Sciences > School of Psychology and Clinical Language Sciences |
| Participating Organisations: | University of Reading |
| Keywords: | emotion, audio-visual, database, stimuli set, speech, song |
| Data Availability: | RESTRICTED |
| Restrictions: | This dataset is wholly or partly restricted. The dataset may be provided to authorised users subject to a Data Access Agreement between the University of Reading and a recipient organisation. A copy of the Data Access Agreement is included with this item. To request access to the dataset, please complete a data access request at https://redcap.link/data-request |
Available Versions of this Item
- The Reading Everyday Emotion Database (REED). (deposited 08 Dec 2021 13:32)