Gestur.a is an interactive musical synthesiser app for iOS, currently in the iterative prototyping stage. The synthesiser is controlled through a mixture of affordances: the user's gestural motion, on-screen touch parameters, and microphone input. The purpose of the app is to let users make expressive music on their iPhone in an easy and intuitive way, and to provide a platform where they can share these compositions with friends and collaborators. Gestur.a is built in Xcode and uses the LibPD framework to connect with PureData, the open-source visual programming language it uses for audio processing and synthesis. This is an ongoing project; this essay aims to contextualise it within the field of music technology and instrument and interaction design, and to provide a rationale for, and a description of, the development to date.