Publication Date:
2007
abstract:
In this paper, we present an innovative gestural sensor
interface that allows an electronic music composer to plan and
conduct the musical expressivity of a performer. By musical
expressivity we mean all the execution techniques and
modalities that a performer must follow in order to satisfy
common musical aesthetics, as well as the desiderata of the
composer.
The proposed sensor interface transforms physical
parameters into sound synthesis parameters. It is composed of a
gestural transducer, which measures motion acceleration and
angular velocity, and a mapping module, which transforms a few
measured physical parameters into many specific sound
synthesis parameters. In this work, we focus our attention on
mapping strategies based on neural networks.
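The mapping described above (a few sensor measurements expanded into many synthesis parameters through a neural network) can be sketched as a small feedforward pass. This is a minimal illustration, not the paper's implementation: the layer sizes, the 6-input layout (3-axis acceleration plus 3-axis angular velocity), the 16 output parameters, and the random weights are all assumptions; in the paper's setting the weights would be trained on gesture/sound examples chosen by the composer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 6 physical inputs (3-axis acceleration +
# 3-axis angular velocity) mapped to 16 sound synthesis parameters.
N_IN, N_HIDDEN, N_OUT = 6, 12, 16

# Untrained example weights, for illustration only; the real mapping
# would be learned from composer-supplied training data.
W1 = rng.standard_normal((N_HIDDEN, N_IN)) * 0.5
b1 = np.zeros(N_HIDDEN)
W2 = rng.standard_normal((N_OUT, N_HIDDEN)) * 0.5
b2 = np.zeros(N_OUT)

def map_gesture(sensors: np.ndarray) -> np.ndarray:
    """One forward pass: few physical parameters -> many synthesis parameters."""
    h = np.tanh(W1 @ sensors + b1)                # hidden layer
    return 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))   # squash outputs into (0, 1)

# Example frame: acceleration (m/s^2) and angular velocity (rad/s).
sample = np.array([0.1, -0.3, 9.8, 0.02, 0.0, -0.01])
params = map_gesture(sample)
print(params.shape)  # (16,)
```

The sigmoid on the output keeps every synthesis parameter in a normalized (0, 1) range, ready to be rescaled to whatever units a given synthesis engine expects.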
Iris type:
04.01 Conference proceedings contribution (Contributo in Atti di convegno)
List of contributors: