We developed the Reading Everyday Emotion Database (REED), a set of audio-visual recordings of emotions. Twenty-two native British English adults (12 female, 10 male) from a diverse age range, all with drama/acting experience, were recorded producing utterances of various lengths in spoken and sung conditions, in 13 emotions (neutral, the 6 basic emotions, and 6 complex emotions), using everyday recording devices (e.g., laptops and mobile phones). All recordings were validated by a separate, independent group of raters (n = 168 adults).

This dataset contains metadata about the files in the database (in the README file). In addition, it contains:

- REED_validation_summary.csv -- Data for the validation task
- UoR-DataAccessAgreement-000407.pdf -- The REED Data Access Agreement
- example_clips.zip -- Example clips from the REED

Not available in the current dataset:

- The REED itself (sent to users once they have signed the Data Access Agreement)

To request access to the REED, please complete a data access request at https://redcap.link/data-request. If your application meets the criteria, a Data Access Agreement (see "UoR-DataAccessAgreement-000407.pdf" for a copy) will be sent to your nominated institutional signatory. Once the agreement is signed and returned, the Research Data Service team will arrange secure access to the dataset.