Over the past few years, multi-touch user interaction has evolved from research prototypes into mass market products. This evolution has mainly been driven by devices such as Apple's iPad or Microsoft's Surface tabletop computer. Nevertheless, the realisation of multi-touch applications is often time consuming and expensive since existing multi-touch development frameworks lack adequate software engineering abstractions. In our research on multi-touch gesture recognition, we are developing a rule-based language to improve the extensibility and reusability of multi-touch gestures. Our Midas solution for declarative gesture definition and detection is based on logical rules that are defined over a set of input facts.
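The idea of rules over input facts can be illustrated by a minimal sketch. Note that this is a hypothetical illustration in plain Java and not the actual Midas rule language: we assume a "fact" is a single touch sample and a "rule" is a predicate over the current set of facts that may yield a gesture name.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of rule-based gesture detection over input facts
// (illustrative only; not the Midas rule language itself).
public class RuleSketch {

    // An input fact: one sampled touch point
    record TouchFact(int id, double x, double y, long time) {}

    // A rule matches a combination of facts and may yield a gesture name
    interface Rule {
        String match(List<TouchFact> facts);
    }

    // Example rule: facts from two distinct touch points -> "two-finger"
    static final Rule TWO_FINGER = facts -> {
        long distinct = facts.stream().map(TouchFact::id).distinct().count();
        return distinct >= 2 ? "two-finger" : null;
    };

    // Evaluate all rules against the current fact base
    static List<String> detect(List<TouchFact> facts, List<Rule> rules) {
        List<String> gestures = new ArrayList<>();
        for (Rule rule : rules) {
            String gesture = rule.match(facts);
            if (gesture != null) {
                gestures.add(gesture);
            }
        }
        return gestures;
    }

    public static void main(String[] args) {
        List<TouchFact> facts = List.of(
            new TouchFact(1, 10, 10, 0),
            new TouchFact(2, 100, 10, 5));
        System.out.println(detect(facts, List.of(TWO_FINGER)));
    }
}
```

Because rules are declarative and independent of each other, new gestures can be added or existing ones reused without touching the detection loop, which is the extensibility benefit the rule-based approach aims for.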
A similar approach can not only be used for multi-touch interaction but also for the development of multimodal interfaces. For this purpose, we have developed Mudra, an innovative multimodal gesture interaction framework that enables the declarative description of gestures for different input modalities and devices, including, for example, the Microsoft Kinect, digital pen and paper solutions or the Emotiv EPOC brain-computer interface. We are confident that our research will support the implementation and investigation of novel multimodal and multi-touch interactions that go beyond the current state of the art.
Based on specific requirements of our pen and paper-based user interfaces, we have developed iGesture, a general gesture recognition framework. The Java-based iGesture solution is suited for application developers who would like to add gesture recognition functionality to their applications as well as for designers of new gesture recognition algorithms. Recently, iGesture has been extended to support new input devices. In addition to traditional screen and mouse-based interaction and digital pen and paper input, iGesture now also supports the Wii Remote and TUIO devices.
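This kind of device extensibility is typically achieved by decoupling input hardware from recognisers behind a common interface. The following is a minimal sketch of that pattern in Java; the names (`InputDevice`, `Point`, `SimulatedDevice`) are our own illustrative assumptions and not the actual iGesture API.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch (not the actual iGesture API) of a device abstraction
// that lets any input source feed the same gesture recognisers.
public class DeviceSketch {

    // A device-independent sample point
    record Point(double x, double y) {}

    // Any input device only has to deliver device-independent points
    interface InputDevice {
        void setListener(Consumer<Point> listener);
    }

    // A simulated device standing in for e.g. a Wii Remote or TUIO source
    static class SimulatedDevice implements InputDevice {
        private Consumer<Point> listener;

        public void setListener(Consumer<Point> listener) {
            this.listener = listener;
        }

        // Push one sample to the registered listener
        void emit(double x, double y) {
            if (listener != null) {
                listener.accept(new Point(x, y));
            }
        }
    }

    public static void main(String[] args) {
        List<Point> stroke = new ArrayList<>();
        SimulatedDevice pen = new SimulatedDevice();
        pen.setListener(stroke::add); // a recogniser would consume this stream
        pen.emit(0, 0);
        pen.emit(5, 5);
        System.out.println(stroke.size()); // the stroke now holds two samples
    }
}
```

Under this design, supporting a new device means implementing one interface; all existing recognisers then work with it unchanged.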
Engineering Gestures for Multimodal User Interfaces,
Florian Echtler, Dietrich Kammer, Davy Vanacken, Lode Hoste and Beat Signer,
Proceedings of EICS 2014, 6th International Conference on Engineering Interactive Computing Systems,
Rome, Italy, June 2014
Design Guidelines for Adaptive Multimodal Mobile Input Solutions,
Bruno Dumas, María Solórzano and Beat Signer,
Proceedings of MobileHCI 2013, 15th International Conference on Human-Computer Interaction with Mobile Devices and Services,
Munich, Germany, August 2013 (acceptance rate: 22%)
Mudra: A Unified Multimodal Interaction Framework,
Lode Hoste, Bruno Dumas and Beat Signer,
Proceedings of ICMI 2011, 13th International Conference on Multimodal Interaction,
Alicante, Spain, November 2011 (acceptance rate: 39%)
Midas: A Declarative Multi-Touch Interaction Framework,
Christophe Scholiers, Lode Hoste, Beat Signer and Wolfgang De Meuter,
Proceedings of TEI 2011, 5th International Conference on Tangible, Embedded and Embodied Interaction,
Funchal, Portugal, January 2011 (acceptance rate: 32%)