Artificial Senses
Humans perceive the world primarily through their senses: with our eyes we see, and with our ears we hear. But what if we could transform visual stimuli into auditory stimuli? Our project has focused on this question. We have created a soundscape that is influenced by its environment, producing a relaxing yet unfamiliar experience for its users. Users interact with the sound by moving the artefact. During this dialogue, one never knows whether the artefact will respond to a given action; moreover, when the artefact does ‘decide’ to respond, it is unknown in what way it will do so.

We set off with the idea of building a standing artefact with which the user would interact by presenting objects to it. However, this interaction method seemed rather boring and unintuitive, so we explored the possibility of creating a persona that was able to move around. This still did not fit our ideas, so we decided to keep the ability to move, but leave out the persona.
This resulted in a setup of two computers: an observing computer equipped with a webcam, communicating wirelessly with a PureData patch running on a second, receiving computer. The observing computer runs a trained neural network (NN) and produces the numbers that will later generate sounds. Once a frame is captured and the values are calculated, it sends them to the receiving computer. This computer continuously generates self-made sounds, produced by sine waves and oscillators. Together, these sounds create a...
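As a rough illustration of how the observing computer might pass its values to the patch, here is a minimal Python sketch that sends numbers over UDP in Pd's FUDI text format (semicolon-terminated messages, as read by a `[netreceive]` object). The host address, port number, and the assumption of a UDP `[netreceive]` are ours, not taken from the project itself.

```python
import socket

PD_HOST = "192.168.1.20"   # assumed address of the receiving (sound) computer
PD_PORT = 3000             # assumed port of a UDP [netreceive] in the Pd patch

def to_fudi(values):
    """Encode a list of numbers as one FUDI message, e.g. b'0.4200 0.8700;\\n'."""
    return (" ".join(f"{v:.4f}" for v in values) + ";\n").encode("ascii")

def send_values(sock, values):
    """Send one set of NN output values to the PureData patch."""
    sock.sendto(to_fudi(values), (PD_HOST, PD_PORT))

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # e.g. two numbers produced by the neural network for one captured frame
    send_values(sock, [0.42, 0.87])
```

On the receiving side, the Pd patch would unpack these numbers and route them to the oscillators that shape the soundscape.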