Designing and Building a 1st stage dataset for embodied music-making (musicking).

dc.cclicence: CC-BY-NC
dc.contributor.author: Vear, Craig
dc.date.accessioned: 2020-07-09T10:59:30Z
dc.date.available: 2020-07-09T10:59:30Z
dc.date.issued: 2020-06-01
dc.description.abstract: The aim of this research project was to design and develop a dataset that captured embodied music-making (hereafter musicking (Small 1989)) for use in human-computer interaction between AI and human musicians. The proposed solution to this challenge was to capture embodied musicking through data harvesting of physical and sonic elements from a musician embodied in the flow of musicking. This report describes the small-scale, first-stage proof-of-concept design, development and deployment of such a system. The hard question was how to develop a dataset that could inform the perception of a music-AI so that it could co-create within the realtime flow of musicking with other machines and/or human musicians. The results highlight that any mode of capture other than real-world musicking would be a setup for failure.
dc.funder: No external funder
dc.identifier.citation: Vear, C., (2018) Designing and Building a 1st stage dataset for embodied music-making (musicking). Available online: https://fbf4c877-a633-43a6-a8f8-182ae6c0f74b.filesusr.com/ugd/9a47a8_bf85e2bac03a45a8b2563d23b26349d9.pdf
dc.identifier.uri: https://fbf4c877-a633-43a6-a8f8-182ae6c0f74b.filesusr.com/ugd/9a47a8_bf85e2bac03a45a8b2563d23b26349d9.pdf
dc.identifier.uri: https://dora.dmu.ac.uk/handle/2086/19974
dc.language.iso: en
dc.peerreviewed: No
dc.researchinstitute: Institute of Creative Technologies (IOCT)
dc.subject: creative ai
dc.subject: embodied intelligence
dc.title: Designing and Building a 1st stage dataset for embodied music-making (musicking).
dc.type: Working Paper
