The idea is simple: translate the journey of a person into a continuous stimulation of moods and emotions. The sound resulting from this stimulation is produced by changes in speed, direction, relative distance to A & B, and the compass "look direction", producing a multitrack "song" whose composition the user can play with over time. This is done using an app called AntiMap, which captures your position, speed, and compass direction while you are roaming the city. Processing is used to make a sketch that takes this data and translates it into a visual & sound experience, so you not only see but also hear his or her journey.

With this basic concept we went to the drawing board and started sketching how the display would look and what kind of environment the object would sit in. The object is just a representation of a person's journey. We (Lucas De Sordi & me) found a somewhat similar sketch on the OpenProcessing website (link here). We used this sketch as a reference for our idea because it uses the position of the mouse to produce a certain kind of sound, which we liked. We first made a sketch that turns the movement of the mouse into sound, to test the variation and range of sounds that could be produced. Later this mouse movement would be replaced by the movement of humans in the city.
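The mouse-to-sound test can be sketched roughly like this. This is a minimal illustration, not the original OpenProcessing sketch: it assumes a simple linear remap (like Processing's `map()` function) from screen position to pitch and volume, and the frequency range and window size are made up for the example.

```java
public class MouseSound {
    // Linear remap of v from [inMin, inMax] to [outMin, outMax],
    // mirroring Processing's map() function.
    static float map(float v, float inMin, float inMax, float outMin, float outMax) {
        return outMin + (outMax - outMin) * (v - inMin) / (inMax - inMin);
    }

    // Mouse x controls pitch: left edge = 110 Hz, right edge = 880 Hz
    // (an illustrative three-octave range).
    static float frequency(float mouseX, float width) {
        return map(mouseX, 0, width, 110f, 880f);
    }

    // Mouse y controls volume: top of the window is loud, bottom is silent.
    static float amplitude(float mouseY, float height) {
        return map(mouseY, 0, height, 1f, 0f);
    }

    public static void main(String[] args) {
        System.out.println(frequency(320, 640)); // centre of a 640px window
        System.out.println(amplitude(0, 480));   // mouse at the top edge
    }
}
```

In the actual sketch these values would feed an oscillator (e.g. a Processing sound library); here they are kept as pure functions so the mapping itself is easy to experiment with.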
(Video: Untitled from Arfurqan on Vimeo.)
In the next step (the major coding), the data from AntiMap, the visual display, and the production of sound were all brought together in one simple piece of code. Here are a few screenshots.
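Reading the AntiMap recording into the sketch amounts to parsing one line per sample. The field order below (latitude, longitude, compass heading, speed, timestamp) is an assumption for illustration; check the header of your own AntiMap export for the actual layout.

```java
public class AntimapSample {
    final float lat, lon, compass, speed;
    final long timeMs;

    AntimapSample(float lat, float lon, float compass, float speed, long timeMs) {
        this.lat = lat;
        this.lon = lon;
        this.compass = compass;
        this.speed = speed;
        this.timeMs = timeMs;
    }

    // Parse a comma-separated "lat,lon,compass,speed,time" line into a sample.
    // Field order is hypothetical -- adjust to match the real log file.
    static AntimapSample parse(String line) {
        String[] f = line.split(",");
        return new AntimapSample(
                Float.parseFloat(f[0]), Float.parseFloat(f[1]),
                Float.parseFloat(f[2]), Float.parseFloat(f[3]),
                Long.parseLong(f[4]));
    }

    public static void main(String[] args) {
        AntimapSample s = parse("52.3702,4.8952,180.0,1.4,1200");
        System.out.println(s.compass + " deg at " + s.speed + " m/s");
    }
}
```

Each parsed sample can then drive both the drawing code and the sound mapping, so the visual and the audio stay in sync with the recorded journey.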
Ok, from the start we wanted more zen-like, peaceful, melodic sounds. Since the example we used for the sound was not the right one for that, it was hard for us to change it at this point. Secondly, the sound was going to be produced by the direction of one's head, and this event would also be shown on the display.
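One common way to get that zen-like quality from the head direction, offered here only as a sketch and not as the project's actual code, is to quantise the compass heading onto a pentatonic scale, so every look direction lands on a consonant note:

```java
public class HeadingToNote {
    // A-minor pentatonic frequencies in Hz: A3, C4, D4, E4, G4.
    // The scale and octave are illustrative choices.
    static final float[] SCALE = {220.00f, 261.63f, 293.66f, 329.63f, 392.00f};

    // Map a compass heading in [0, 360) degrees onto one of the five notes:
    // each note covers a 72-degree arc of the circle.
    static float noteFor(float headingDeg) {
        int idx = (int) (headingDeg / 360f * SCALE.length) % SCALE.length;
        return SCALE[idx];
    }

    public static void main(String[] args) {
        System.out.println(noteFor(0));    // facing north
        System.out.println(noteFor(180));  // facing south
    }
}
```

Because a pentatonic scale has no dissonant intervals, any sequence of headings produces a melody that sounds calm, which is closer to the peaceful feel we were after than the raw continuous pitch of the reference sketch.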