This interactive installation uses gesture and expression to explore soundscapes.

“We wanted to design an immersive environment where sound is manipulated through the user’s presence and the kinetic movement of ‘nodes’. We’ve created a soundscape for the user to adapt and explore in intuitive ways that, we hope, sound and look pleasing.

“Using AR computer vision techniques, we’ve built a system that gives us a positional data stream for each node present on our table. With this data we have created different ways of interacting with Reason [1] to manipulate the sounds and textures present in our sonic landscape. We built our own box to house the equipment and covered it with a transparent lid. By placing ‘nodes’ on top, with AR code graphics facing down into the table, we can track the X and Y position of each node, as well as its angle relative to the camera, giving us a constant stream of positional data to manipulate. Our code uses multiple libraries to make this possible, and relies on several input streams that are passed through our code and into Reason to create and shape the sounds heard around the room.”

George Sullivan, Leon Fedden and Henry Clements, BSc Creative Computing
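The pipeline the team describes — detecting a marker, deriving its X/Y position and angle, and normalising that into a control stream for Reason — could be sketched roughly as below. This is a hypothetical illustration, not the team’s actual code: the corner-point input is assumed to come from a fiducial-marker detector (such as OpenCV’s ArUco module), and the function names are invented for the example.

```python
import math

def marker_pose(corners):
    """Derive a marker's centre and orientation from its four detected
    corner points, ordered top-left, top-right, bottom-right, bottom-left
    (the ordering a typical AR marker detector returns).
    Returns (cx, cy, angle_degrees)."""
    xs = [p[0] for p in corners]
    ys = [p[1] for p in corners]
    cx = sum(xs) / 4.0
    cy = sum(ys) / 4.0
    # Angle of the marker's top edge relative to the camera's x-axis.
    tl, tr = corners[0], corners[1]
    angle = math.degrees(math.atan2(tr[1] - tl[1], tr[0] - tl[0]))
    return cx, cy, angle

def to_control_values(cx, cy, angle, frame_w, frame_h):
    """Normalise a pose to three 0..1 values — the kind of constant
    data stream that could then be sent on to a synth (e.g. as MIDI
    CC or OSC messages) to shape a sound."""
    return (cx / frame_w, cy / frame_h, (angle % 360.0) / 360.0)

# An upright 10x10 marker at the table's top-left corner:
cx, cy, angle = marker_pose([(0, 0), (10, 0), (10, 10), (0, 10)])
x_ctrl, y_ctrl, a_ctrl = to_control_values(cx, cy, angle, 640, 480)
```

In a live loop, each frame’s control values would be mapped to parameters of the running Reason patch, so moving or rotating a node on the lid continuously reshapes its sound.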