Multimodal and Multi-Touch Interaction

Over the past few years, multi-touch user interaction has evolved from research prototypes into mass-market products. This evolution has mainly been driven by devices such as Apple's iPad or Microsoft's Surface tabletop computer. Nevertheless, the realisation of multi-touch applications is often time-consuming and expensive, since existing multi-touch development frameworks lack adequate software engineering abstractions. In our research on multi-touch gesture recognition, we are developing a rule-based language to improve the extensibility and reusability of multi-touch gestures. Our Midas solution for declarative gesture definition and detection is based on logical rules that are defined over a set of input facts.
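To make the idea of rules over input facts concrete, the following minimal Java sketch evaluates a simple condition over a growing fact base. It is an illustration under stated assumptions only: the names Fact, Rule and FactBase, the naive re-evaluation loop and the 200 ms window are all hypothetical and do not reproduce the actual Midas rule language.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Predicate;

    // Illustrative sketch only: Fact, Rule and FactBase are hypothetical
    // names, not the actual Midas rule language or API.
    record Fact(String type, int touchId, double x, double y, long time) {}

    // A rule pairs a condition over the fact base with an action on a match.
    class Rule {
        final String name;
        final Predicate<List<Fact>> condition;
        final Runnable action;

        Rule(String name, Predicate<List<Fact>> condition, Runnable action) {
            this.name = name;
            this.condition = condition;
            this.action = action;
        }
    }

    class FactBase {
        private final List<Fact> facts = new ArrayList<>();
        private final List<Rule> rules = new ArrayList<>();

        void addRule(Rule rule) { rules.add(rule); }

        // Every asserted fact triggers a (naive) re-evaluation of all rules.
        void assertFact(Fact fact) {
            facts.add(fact);
            for (Rule rule : rules) {
                if (rule.condition.test(facts)) rule.action.run();
            }
        }
    }

    public class MidasSketch {
        public static void main(String[] args) {
            FactBase base = new FactBase();

            // "Two-finger touch": two down facts from different fingers
            // within an assumed 200 ms time window.
            base.addRule(new Rule("two-finger-touch", facts -> facts.stream()
                    .filter(f -> f.type().equals("down"))
                    .anyMatch(a -> facts.stream().anyMatch(b ->
                            b.type().equals("down")
                            && a.touchId() != b.touchId()
                            && Math.abs(a.time() - b.time()) < 200)),
                    () -> System.out.println("two-finger touch detected")));

            base.assertFact(new Fact("down", 1, 10, 20, 1000));
            base.assertFact(new Fact("down", 2, 30, 25, 1100)); // rule fires
        }
    }

Expressing gestures as declarative rules of this kind, rather than hard-coding them in event handlers, is what makes them easy to extend and reuse across applications.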
Fig. 1: Multimodal interaction setup

A similar approach can not only be used for multi-touch interaction but also for the development of multimodal interfaces. For this purpose, we have developed Mudra, an innovative multimodal gesture interaction framework that enables a declarative description of gestures across different input modalities and devices, including, for example, the Microsoft Kinect, digital pen and paper solutions or the Emotiv EPOC brain-computer interface. We are confident that our research will support the implementation and investigation of novel multimodal and multi-touch interactions that go beyond the current state of the art.
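The following hypothetical Java sketch illustrates the underlying fusion idea: events from heterogeneous devices are published into one unified stream, and a declarative-style rule combines facts across modalities. The InputEvent and FusionEngine names, the 500 ms window and the point-plus-tap rule are assumptions for illustration only, not the actual Mudra API.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Illustrative sketch only: InputEvent and FusionEngine are hypothetical
    // names, not the actual Mudra API.
    record InputEvent(String modality, String value, long time) {}

    class FusionEngine {
        private static final long WINDOW_MS = 500;  // assumed fusion window
        private final Deque<InputEvent> window = new ArrayDeque<>();

        // All devices (touch, Kinect, digital pen, EPOC, ...) publish into
        // one unified event stream.
        void publish(InputEvent event) {
            window.addLast(event);
            window.removeIf(old -> event.time() - old.time() > WINDOW_MS);
            matchPointAndTap();
        }

        // Declarative-style fusion rule: a Kinect "point" pose combined with
        // a touch "tap" inside the window yields a multimodal selection.
        private void matchPointAndTap() {
            boolean point = window.stream().anyMatch(e ->
                    e.modality().equals("kinect") && e.value().equals("point"));
            boolean tap = window.stream().anyMatch(e ->
                    e.modality().equals("touch") && e.value().equals("tap"));
            if (point && tap) System.out.println("multimodal selection detected");
        }
    }

    public class MudraSketch {
        public static void main(String[] args) {
            FusionEngine engine = new FusionEngine();
            engine.publish(new InputEvent("kinect", "point", 1000));
            engine.publish(new InputEvent("touch", "tap", 1200)); // rule fires
        }
    }

Because every modality feeds the same fact stream, a new device can participate in existing fusion rules without those rules having to be rewritten.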


Based on specific requirements of our pen and paper-based user interfaces, we have developed iGesture, a general gesture recognition framework. The Java-based iGesture solution is suited for application developers who want to add gesture recognition functionality to their applications, as well as for designers of new gesture recognition algorithms. Recently, iGesture has been extended to support new input devices: in addition to traditional screen and mouse-based interaction and digital pen and paper input, it now also supports the Wii Remote and TUIO devices.
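The following minimal Java sketch only illustrates the two extension points mentioned above, namely pluggable recognition algorithms and interchangeable input devices; all names (GestureAlgorithm, InputDevice, GestureRecogniser) are hypothetical and do not reproduce the actual iGesture API.

    import java.util.List;

    // Illustrative sketch only: these interfaces are hypothetical and do not
    // reproduce the actual iGesture API.
    interface GestureAlgorithm {
        // Algorithm designers implement this strategy interface.
        String recognise(List<double[]> strokePoints);
    }

    interface StrokeListener {
        void strokeCaptured(List<double[]> strokePoints);
    }

    interface InputDevice {
        // Device integrations (mouse, digital pen, Wii Remote, TUIO, ...)
        // deliver captured strokes through a common callback.
        void onStroke(StrokeListener listener);
    }

    // Application developers simply combine a device with an algorithm.
    class GestureRecogniser implements StrokeListener {
        private final GestureAlgorithm algorithm;

        GestureRecogniser(InputDevice device, GestureAlgorithm algorithm) {
            this.algorithm = algorithm;
            device.onStroke(this);  // subscribe to strokes from any device
        }

        @Override
        public void strokeCaptured(List<double[]> strokePoints) {
            System.out.println("recognised: " + algorithm.recognise(strokePoints));
        }
    }

    public class IGestureSketch {
        public static void main(String[] args) {
            // Stub device that immediately reports a single two-point stroke.
            InputDevice stub = listener -> listener.strokeCaptured(
                    List.of(new double[]{0, 0}, new double[]{1, 1}));
            new GestureRecogniser(stub, points ->
                    points.size() > 1 ? "line" : "dot");  // prints "recognised: line"
        }
    }

Keeping devices and algorithms behind independent interfaces is what allows new hardware such as the Wii Remote to be supported without changing any recognition algorithm.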
