Designing and Building a 1st stage dataset for embodied music-making (musicking).
The aim of this research project was to design and develop a dataset that captured embodied music-making (hereafter musicking (Small 1989)) for use in human-computer interaction between AI and human musicians. The proposed solution to this challenge was to capture embodied musicking by harvesting physical and sonic data from a musician embodied in the flow of musicking. This report describes the small-scale, first-stage proof-of-concept design, development and deployment of such a system. The hard question was how to develop a dataset that could inform the perception of a music-AI so that it could co-create within the realtime flow of musicking with other machines and/or human musicians. The results highlight that any mode of capture other than real-world musicking would be a setup for failure.
Citation: Vear, C. (2018) Designing and Building a 1st stage dataset for embodied music-making (musicking). Available online: https://fbf4c877-a633-43a6-a8f8-182ae6c0f74b.filesusr.com/ugd/9a47a8_bf85e2bac03a45a8b2563d23b26349d9.pdf
Research Institute: Institute of Creative Technologies (IOCT), Leicester Media School
Peer Reviewed: No