SIMPLE MAPPINGS, EXPRESSIVE MOVEMENT: A QUALITATIVE INVESTIGATION INTO THE END-USER MAPPING DESIGN OF EXPERIENCED MID-AIR MUSICIANS
Brown, D., Nash, C., and Mitchell, T. (2018) Digital Creativity.
In a New Interface for Musical Expression (NIME), the design of the relationship between a musician’s actions and the instrument’s sound response is critical in creating instruments that facilitate expressive music performance. A growing body of NIMEs exposes this design task to the end performers themselves, opening up the possibility of new insights into NIME mapping design: what can be learned from the mapping design strategies of practising musicians? This research contributes a qualitative study of four highly experienced users of an end-user mapping instrument, examining their mapping practice. The study reveals that the musicians focus on designing simple, robust mappings that minimise errors, embellishing these control gestures with theatrical ancillary gestures that express metaphors. However, musical expression is hindered by the unintentional triggering of musical events. From these findings, a set of heuristics is presented that can be applied in the future development of NIMEs.
UNDERSTANDING USER-DEFINED MAPPING DESIGN IN MID-AIR MUSICAL PERFORMANCE
Brown, D., Nash, C., and Mitchell, T. (2018) In: Proceedings of the Fifth International Conference on Movement and Computing (MOCO 2018). Genoa, Italy. ACM.
Modern gestural interaction and motion capture technology is frequently incorporated into Digital Musical Instruments (DMIs) to enable new methods of musical expression. A major topic of interest in this domain concerns how a performer’s actions are linked to the production of sound. Some DMI developers choose to design these mapping strategies themselves, while others expose this design space to performers. This work explores the latter scenario, studying the user-defined mapping strategies of a group of experienced mid-air musicians drawn from a rare community of DMI practitioners. Participants are asked to design mappings for a piece of music to determine what factors influence their choices. The findings reveal that novice performers spend little time reviewing their mapping choices, spend more time practising, and design mappings that adhere to musical metaphors. Experienced performers edit mappings continuously and focus on the ergonomics of their mapping designs.
A USER EXPERIENCE REVIEW OF MUSIC INTERACTION EVALUATIONS
Brown, D., Nash, C., and Mitchell, T. (2017) In: Proceedings of New Interfaces for Musical Expression (NIME) 2017. Copenhagen, Denmark.
The need for thorough evaluations is an emerging area of interest and importance in music interaction research. Because much DMI evaluation is concerned with exploring subjective experience, including ergonomics, action-sound mappings and control intimacy, User Experience (UX) methods are increasingly being utilised to analyse an individual’s experience of new musical instruments, from which we can extract meaningful, robust findings and, subsequently, generalised and useful recommendations. However, many music interaction evaluations remain informal. In this paper, we provide a meta-review of 132 papers from the 2014–2016 proceedings of the NIME, SMC and ICMC conferences to collate the aspects of UX research that are already present in the music interaction literature, and to highlight methods from UX’s widening field of research that have not yet been explored. Our findings show that usability and aesthetics are the primary focus of evaluations in music interaction research, while other important components of the user experience, such as enchantment, motivation and frustration, are frequently, if not always, overlooked. We argue that these factors are prime areas for future research in the field, and that their consideration in design and evaluation could lead to a better understanding of NIMEs and other computer music technology.
GESTURECHORDS: TRANSPARENCY IN GESTURALLY CONTROLLED DIGITAL MUSICAL INSTRUMENTS THROUGH ICONICITY AND CONCEPTUAL METAPHOR
Brown, D., Nash, C., and Mitchell, T. (2016) In: Proceedings of Sound and Music Computing (SMC) 2016. Hamburg, Germany.
This paper presents GestureChords, a mapping strategy for chord selection in freehand gestural instruments. The strategy maps chord variations to a series of hand postures using the concepts of iconicity and conceptual metaphor, influenced by their use in American Sign Language (ASL) to encode meaning in gestural signs. The mapping uses the conceptual metaphors MUSICAL NOTES ARE POINTS IN SPACE and INTERVALS BETWEEN NOTES ARE SPACES BETWEEN POINTS, which are mapped respectively to the number of extended fingers in a performer’s hand and the abduction or adduction between them. The strategy is incorporated into a digital musical instrument and evaluated in a preliminary study of its transparency to both performers and spectators, which gave promising results for the technique.
LEIMU: GLOVELESS MUSIC INTERACTION USING A WRIST MOUNTED LEAP MOTION
Brown, D., Nash, C., and Mitchell, T. (2016) In: Proceedings of New Interfaces for Musical Expression (NIME) 2016. Brisbane, Australia.
Camera-based motion tracking has become a popular enabling technology for gestural human-computer interaction. However, the approach suffers from several limitations, which have been shown to be particularly problematic when employed within musical contexts. This paper presents Leimu, a wrist mount that couples a Leap Motion optical sensor with an inertial measurement unit to combine the benefits of wearable and camera-based motion tracking. Leimu is designed, developed and then evaluated using discourse analysis and statistical analysis methods. Qualitative results indicate that users consider Leimu to be an effective interface for gestural music interaction, and the quantitative results demonstrate that the interface offers improved tracking precision over a Leap Motion positioned on a tabletop.
THE APPLICATION OF ESTABLISHED GESTURAL LANGUAGES IN THE CONTROL MAPPINGS OF FREE-HAND GESTURAL MUSICAL INSTRUMENTS
Brown, D. (2016) In: Proceedings of the International Conference on Live Interfaces (ICLI) 2016. Brighton, UK.
The “mapping problem” is a long-standing issue in the development of digital musical instruments, and occurs when an instrument’s musical response fails to reflect the performer’s gestural actions. This paper describes ongoing doctoral research that seeks to address this issue by studying the existing gestural languages of Soundpainting and American Sign Language in order to inform the control mappings of free-hand gestural musical instruments. The research seeks to contribute a framework of mapping strategies influenced by these gestural languages, as well as developments in the field of gesture recognition algorithms and novel evaluation methods for free-hand gestural musical instruments.