MuSyC is a project that aims to build a music-to-colour synaesthesia visualizer.
Synaesthesia is a neurological phenomenon in which the perception of one stimulus, such as the musical note ‘A’, involuntarily elicits another, seemingly unrelated sensation, such as seeing the colour red. There are various types of synaesthesia, and music-to-colour association is one of the most common. The qualia, or felt quality, of this cross-sensory activation is hard to explain; thankfully, music visualizers have made it easier to show how sound can influence and shape images. As someone who has two types of synaesthesia, I started MuSyC both as an attempt to simulate the creation of art from a synaesthete’s point of view (inspired by Wassily Kandinsky and his paintings) and as an effort to create applied neurotechnology that shows how some synaesthetes experience the world. MuSyC combines my interests in art, neuroscience, music, augmented reality and fabrication into one project.
My initial goal with MuSyC is to make a wearable device that is sensitive to the frequency, or pitch, of a sound and flashes particular colours in response to defined frequency ranges or notes, such as A, B, C, D, E and F. Not only would this device allow non-synaesthetes to experience the world and music through the eyes of a music-colour synaesthete, it also has potential as a tool for performance art in collaboration with orchestras and bands. I hope the device can also serve as an educational tool for those beginning music training and for anyone who needs an additional visual cue to help distinguish between closely related notes. It could further give deaf individuals a way to experience music as a real-time visual light show, allowing them to integrate cues from physical vibrations with the sight of music. Since the inception of the project, and with the help of many talented mentors and collaborators (Luke Jaeger, Shani Mensing, Audrey St John, Cassiel Moroney, Kyoko Sano, Grant Falkenburg, Stella Yang, Arushee Agrawal, Aisvarya Chandrasekar, Henry Stone and Brendan Le), I’ve been able to help create three hardware prototypes as well as a beta website for MuSyC.
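The core mapping the device performs, from a detected frequency to a note and then to a colour, can be sketched in a few lines. This is a minimal illustration, assuming 12-tone equal temperament with A4 = 440 Hz; the colour palette below is hypothetical, since a synaesthete's actual associations are personal, and the fallback rule for sharps is my own simplification rather than part of MuSyC:

```python
import math

# Hypothetical note-to-colour palette; real MuSyC colours would come
# from the synaesthete's own associations.
NOTE_COLOURS = {
    "A": (255, 0, 0),      # red
    "B": (255, 165, 0),    # orange
    "C": (255, 255, 0),    # yellow
    "D": (0, 128, 0),      # green
    "E": (0, 0, 255),      # blue
    "F": (75, 0, 130),     # indigo
    "G": (238, 130, 238),  # violet
}

# Chromatic scale starting from A, so index 0 corresponds to A4 = 440 Hz.
NOTE_NAMES = ["A", "A#", "B", "C", "C#", "D", "D#", "E", "F", "F#", "G", "G#"]

def note_for_frequency(freq_hz, a4=440.0):
    """Return the nearest note name under 12-tone equal temperament."""
    # Each semitone is a factor of 2**(1/12), so the distance in
    # semitones from A4 is 12 * log2(f / 440).
    semitones = round(12 * math.log2(freq_hz / a4))
    return NOTE_NAMES[semitones % 12]

def colour_for_frequency(freq_hz):
    """Map a frequency to an RGB triple; sharps reuse the natural note's colour."""
    note = note_for_frequency(freq_hz)
    return NOTE_COLOURS.get(note, NOTE_COLOURS[note[0]])
```

On the wearable, `colour_for_frequency` would be driven by whatever pitch-detection stage the hardware uses, with the RGB triple sent to the LEDs; for example, `colour_for_frequency(261.63)` (middle C) yields the palette's yellow.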
While I continue to refine and develop better methods for dynamically simulating synaesthesia through both hardware and software, I thought that photographing concerts and illustrating the photographs with the colours and shapes evoked by the music would be an interesting way to build a digital archive on which I can continue to model MuSyC. Using these art pieces (and the associated musical pieces) as reference, I hope to employ more sophisticated computational techniques, such as machine learning, to predict synaesthetic responses to new songs and musical stimuli.
- Project Lead: Misha Oraa Ali
- Collaborators: Sheila Chukwulozie
- Published: Summer 2017