Recently, I have been thinking a lot about how to use new technology to let users immerse themselves inside the music. We are so used to listening to music in “flat” stereo that the opportunities of 6DoF (six degrees of freedom) tracking are very tempting to explore in a musical context. In one of my previous posts, I discussed the need to compose new music for XR environments. Today I want to share my ideas on how to take existing music and translate it into an immersive experience.
First of all, we need a separate track for each musical instrument, which allows us to process them independently. We can treat each instrument as a separate sound object and assign it a location in space using binaural rendering. Then we can program different kinds of audio processing depending on the relation between the user's position and the source position.
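The position-dependent part of that processing might be sketched like this: a minimal example, assuming a 2D floor plan and yaw-only head tracking, that computes a distance-based gain and the source azimuth relative to where the listener is facing. The function name and parameters are hypothetical; a real binaural renderer would feed the azimuth into an HRTF lookup rather than returning it directly.

```python
import math

def source_gain_and_azimuth(user_pos, user_yaw_deg, source_pos, ref_dist=1.0):
    """Per-frame spatialization parameters for one instrument (sketch).

    user_pos, source_pos: (x, z) coordinates on the floor plan.
    user_yaw_deg: listener's facing direction in degrees (0 = +z axis).
    Returns (gain, azimuth_deg), where azimuth is relative to the
    listener's facing direction, positive to the right.
    """
    dx = source_pos[0] - user_pos[0]
    dz = source_pos[1] - user_pos[1]
    # Inverse-distance attenuation, clamped so the gain never exceeds 1.
    dist = max(math.hypot(dx, dz), ref_dist)
    gain = ref_dist / dist
    # Absolute bearing of the source, then rotate by the head yaw.
    azimuth = math.degrees(math.atan2(dx, dz)) - user_yaw_deg
    # Wrap to (-180, 180] so the HRTF lookup sees a canonical angle.
    azimuth = (azimuth + 180.0) % 360.0 - 180.0
    return gain, azimuth

# A source two meters straight ahead: half gain, dead center.
print(source_gain_and_azimuth((0, 0), 0.0, (0, 2)))  # → (0.5, 0.0)
```

Calling this once per tracking frame for each instrument gives the parameters that drive its binaural rendering as the user walks around the scene.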
If we use a device that enables transparent hearing (one that does not occlude sounds from the outside environment), we can employ external speakers in our scene to enhance low frequencies and facilitate the externalization of sounds. Audio devices with transparent hearing also let users interact with each other, which can bring a very important social aspect to the experience.
As the last step, we can add the reverberation of the space the user is in, to create a true blend between the music and the real environment.
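One way to blend the music with the real room is convolution reverb: convolve each dry track with an impulse response measured in the listener's actual space. The sketch below assumes such a measured response is available as a `room_ir` array; the function name and the `wet_mix` parameter are illustrative, not part of any particular library.

```python
import numpy as np

def apply_room_reverb(dry, room_ir, wet_mix=0.3):
    """Blend a dry track with the room's measured impulse response (sketch).

    dry: 1-D array of audio samples for one instrument track.
    room_ir: 1-D impulse response measured in the listener's real room.
    wet_mix: fraction of reverberant signal in the output (0..1).
    """
    # Convolution with the impulse response produces the reverberant signal.
    wet = np.convolve(dry, room_ir)
    # Mix dry and wet; the output is as long as the full reverb tail.
    out = wet_mix * wet
    out[: len(dry)] += (1.0 - wet_mix) * dry
    return out

# Toy example: a single impulse through a two-tap "room".
out = apply_room_reverb(np.array([1.0, 0.0, 0.0]), np.array([1.0, 0.5]),
                        wet_mix=0.5)
```

In practice the same room response would be applied to every spatialized instrument so that the virtual sources appear to share the acoustics of the physical space the user is standing in.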
Last semester I worked with Christy and Makan on an immersive experience using Bose Frames. We implemented 6DoF tracking on them and created an immersive mix of existing stereo music. More details on our project in future posts.