
Music Interaction Design and predictive models

0504 Music Interaction Design

What is Interaction Design?

  • The practice of designing interactive digital products and environments

  • Who interacts with whom?
    humans with machines and software (HCI)

  • Where does interaction design enter a computer music system?

    • gestures via an input device
    • other I/O devices; machine-to-machine (M2M)
  • Setting up an interaction

    • capture the gesture
    • design a mapping system
    • transmit the gestural and musical signals between the devices
    • related areas: creative programming, electronics, telecommunication protocols
  • Examples of machine-to-machine (M2M) interaction

    • the Stanford Laptop Orchestra
    • position/gesture -> music parameters -> central node -> synthesis -> music system (not a product; built for this purpose)
    • IMU motion tracker (controller, feedback, flex sensors, connection via OSC)
    • Leap Motion (hand-tracking device)
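The notes above mention OSC as the link between devices in an M2M setup. As a minimal sketch of what travels over the wire, the snippet below hand-encodes an OSC message with only the standard library; the address pattern `/gesture/x` and the function names are illustrative assumptions, and in practice a library such as python-osc would normally be used instead.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad bytes to a multiple of 4, as OSC requires."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message carrying 32-bit float arguments."""
    msg = osc_pad(address.encode("ascii"))                      # address pattern, e.g. "/gesture/x"
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))   # type tag string, e.g. ",f"
    for f in floats:
        msg += struct.pack(">f", f)                             # big-endian 32-bit float
    return msg

packet = osc_message("/gesture/x", 0.5)
```

The resulting `packet` could then be sent as a UDP datagram to whatever port the receiving synthesis engine listens on.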
  • AI for music interaction

    • what if the computer could learn our way of interacting with it?
    • e.g. Wekinator, an interactive machine learning system that automatically learns a mapping from gesture space to music space
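The idea of learning a gesture-space-to-music-space mapping from examples could be sketched as below. This toy nearest-neighbour regressor is an illustrative stand-in, not Wekinator's actual algorithm, and the gesture features and synthesis parameters shown are assumed for the example.

```python
import math

def nearest_neighbour_map(examples, gesture):
    """Map a gesture vector to synthesis parameters by copying the
    parameters of the closest training example (a toy stand-in for a
    Wekinator-style learned mapping)."""
    _, params = min(examples, key=lambda ex: math.dist(ex[0], gesture))
    return params

# training pairs: (gesture features, synthesis parameters)
examples = [
    ((0.0, 0.0), {"pitch": 220.0, "cutoff": 400.0}),
    ((1.0, 1.0), {"pitch": 880.0, "cutoff": 4000.0}),
]

# a new gesture near the first example inherits its sound parameters
params = nearest_neighbour_map(examples, (0.1, 0.2))
```

Systems like Wekinator use more capable regressors and classifiers, but the workflow is the same: record gesture/parameter pairs, train, then map live input.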
  1. Deep Predictive Models in Interactive Music, https://arxiv.org/pdf/1801.10492.pdf

    • how is deep learning involved in music performance?
    • especially digital musical instruments
    • musical predictions
    • difference between mapping and modelling
      • Mapping refers to connecting the control and sensing components of a musical instrument to parameters in the sound synthesis component
      • Modelling refers to capturing a representation of a musical process
  2. How to generate music?

    • ANN -> RNN -> LSTM: successive improvements (LSTM cells learn distant dependencies)
    • RNNs with LSTM cells were later used by Eck and Schmidhuber to generate blues music
    • Markov models generate music by estimating the emission probabilities of future notes from those preceding them
    • the polyphonic chorales of J. S. Bach have also been modelled by RNNs
    • differences between these models ???
    • these models learn much about the temporal structure of music, and how melodies and harmonies can be constructed
    • Simon and Oore's Performance RNN goes further, generating dynamics and rhythmic expression (rubato) simultaneously with polyphonic music
    • WaveNet: generates raw audio samples directly
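The Markov approach mentioned above (emission probabilities of future notes conditioned on preceding ones) can be sketched with a first-order model; the note names and the tiny training melody are assumptions for illustration.

```python
import random
from collections import defaultdict

def train_markov(notes):
    """Count transitions between consecutive notes and normalise them
    into emission probabilities for the next note."""
    counts = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(notes, notes[1:]):
        counts[prev][nxt] += 1
    return {
        prev: {n: c / sum(nxts.values()) for n, c in nxts.items()}
        for prev, nxts in counts.items()
    }

def generate(model, start, length, rng):
    """Sample a melody by repeatedly drawing the next note from the
    learned transition distribution."""
    seq = [start]
    for _ in range(length - 1):
        probs = model.get(seq[-1])
        if not probs:          # dead end: no observed continuation
            break
        notes, weights = zip(*probs.items())
        seq.append(rng.choices(notes, weights=weights)[0])
    return seq
```

An RNN or LSTM replaces the fixed-order transition table with a learned hidden state, which is what lets it capture the longer-range structure the notes describe.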
  3. Predictive models: instrument-level -> performer-level -> ensemble-level