Publication Date:
2007
Abstract:
Emerging ubiquitous environments raise the need to support multiple interaction modalities on diverse types of devices. Designing multimodal interfaces for ubiquitous environments with development tools is challenging because target platforms support different resources and interfaces. Model-based approaches have been recognized as useful for managing the increasing complexity resulting from the many available interaction platforms; however, they have usually focused on graphical and/or vocal modalities. This paper presents a solution for enabling the development of tilt-based hand-gesture and graphical modalities for mobile devices within a multimodal user interface development tool. The challenges of developing gesture-based applications for various types of devices, including mobile devices, are discussed in detail. The proposed solution is based on a logical description language for hand-gesture user interfaces. This language allows a user interface implementation to be obtained for the target mobile platform. The solution is illustrated with an example application that can be accessed both from the desktop and from a mobile device supporting tilt-based gesture interaction.
Iris type:
02.01 Contributo in volume (Capitolo o Saggio)
Keywords:
Multimodal User Interfaces
List of contributors:
Mantyjarvi, Jani; Paterno', Fabio
Book title:
Task Models and Diagrams for Users Interface Design. 5th International Workshop, TAMODIA 2006. Revised papers.