Colourblind Eyeborg: Colours to Sound
The story of the cyborg
The project I have created exists outside the traditional domains of computer culture and physical installation. I have created a new sensation, a cyborgian extension of the human perception system, residing in the brain of one student: Neil Harbisson.
I first met Neil at Dartington College of Arts whilst I was giving a talk on practical cyborg techniques and applications. Neil became very excited about the idea of using digital inputs to augment his senses.
He explained to me that he had a rare condition, achromatopsia (a hereditary vision disorder which affects 1 person in 33,000). One symptom of this is monochromatism, the inability to perceive colour. To him the world was black and white. He explained to me how, in his paintings, he had only ever used black and white paints: “I never used colours to paint because I feel completely distant to them. Colours create a mysterious reaction to people that I still don’t quite understand.”
He described colours to me as “being an energy that I can’t see because it moves too quickly. I’ve imagined colours as fast moving energies.” Neil became curious about the possibilities of a cyborg-like extension of his sensory system: a new input-based prosthesis.
I decided that using Neil’s existing senses as a host for a new artificial sense would be an effective approach. Sound, I felt, would give him a good approximation of colour, as he is a keen musician with very good pitch perception. I was confident that shifting colour into sound would be an appropriate and effective way of re-mapping Neil’s brain, as the natural occurrence of synesthesia seems to suggest that the visual and auditory senses can in some cases overlap.
Colours into sounds
The question of how to convert colours into sounds was a difficult one. After much consideration it became apparent that I would have to create an audio experience that, like the light spectrum, would transcend labels. I used a physical model of transposing light into sound. After all, both light and sound are waves. Although the frequencies of light waves are far too high to hear, it is possible to mathematically transpose them down until they sit within the audible range, so the lowest colour in the spectrum (dark red) becomes the lowest note in the scale. I created colour-to-sound conversion software that would dynamically scale the colours from a miniature wearable camera into audible frequencies. Instead of having one note per colour, I wanted Neil to be able to hear subtle differences in colour: just as the human eye can distinguish between many different kinds of blue, Neil is able to do the same.
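The octave-by-octave transposition described above can be sketched in a few lines. This is a minimal illustration, not the original software: the figure of roughly 430 THz for dark red light is a standard physical approximation, and the audible band edges are nominal values chosen for the example.

```python
def transpose_to_audible(light_hz, hi=20000.0):
    """Halve a light frequency (one octave per halving) until it
    falls at or below the top of the audible band."""
    f = light_hz
    octaves = 0
    while f > hi:
        f /= 2.0   # dropping one octave preserves the "note"
        octaves += 1
    return f, octaves

# Dark red light is roughly 430 THz.
freq, n = transpose_to_audible(430e12)
```

Because each halving preserves the pitch class, the transposed tone can stand in for the original colour while sitting comfortably within human hearing.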
Neil can now perceive 360 different hues, one for each degree on the colour wheel. Each hue was assigned an audible frequency between 384 Hz and 718 Hz. This approach allows him to disregard brighter and darker variations (due to lighting conditions) and also to disregard colour saturation (the camera may over- or under-saturate colours depending on the environment), and instead gives us pure hue perception.
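The hue-only mapping above can be sketched as follows. Extracting the hue channel from the camera's RGB values discards brightness and saturation, exactly as described; the linear interpolation across the 384–718 Hz band is an assumption for illustration, since the text does not specify the original software's exact scaling curve.

```python
import colorsys

F_LO, F_HI = 384.0, 718.0  # audible band stated in the text (Hz)

def hue_degrees(r, g, b):
    # Convert 8-bit RGB camera values to a hue angle (0-359 degrees),
    # discarding brightness (value) and saturation.
    h, _s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return round(h * 360) % 360

def hue_to_frequency(hue):
    # Assumed linear interpolation: hue 0 -> 384 Hz, hue 359 -> 718 Hz.
    return F_LO + (F_HI - F_LO) * (hue / 359.0)
```

Because only the hue angle is used, a bright red and a dim red produce the same tone, which is the property that makes the system robust to changing lighting conditions.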
How it works
Neil was able to run the software on a small laptop that he mounted in a backpack. The laptop was modified to run the software even when the lid was closed, allowing him to wear it comfortably in “sleep” mode. This meant the battery lasted long enough for him to get through a whole school day without recharging.
Surprisingly, within 15 minutes of using the system Neil was able to recognise similarities and differences between hues, something he had never previously been able to do. Ultimately, this project exists not in the software, nor in the domain of so-called “virtual” reality, but in the reality of Neil’s perception of the world, unveiling, quite literally, an invisible architecture of energy.
Neil now wears this “eyborg” 24 hours a day, and has a doctor’s certification that he is a cyborg. He continues to live each day as a colourful cyborg in a colourful world.