Midas
Prof. Dr. Beat Signer
Vrije Universiteit Brussel
Department of Computer Science
Pleinlaan 2, 1050 Brussels
(Belgium)
+32 2 629 1239, bsigner@vub.be
Office: PL9.3.60 (Pleinlaan 9)

Midas: A Declarative Multi-Touch Interaction Framework

Multi-touch technology allows users to manipulate digital information directly with their hands. We have observed that mainstream software frameworks offer little support for dealing with the complexity of these new devices: current multi-touch frameworks only provide a narrow range of hardcoded functionality. The development of new multi-touch gestures and their integration with existing gestures is therefore notoriously hard. The main goal of the Midas framework is to provide developers with adequate software engineering abstractions that close the gap between the rapid evolution of multi-touch hardware and the software mechanisms for gesture detection.

Current frameworks force the programmer into an event-driven programming model in which event handlers have to be registered and composed manually. The resulting applications have a control flow that is driven by external events rather than by the sequential structure of the program, which makes program code difficult to reuse, compose and understand.
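The problem can be illustrated with a minimal sketch (hypothetical code, not taken from any particular framework): even a simple double-tap gesture requires the programmer to thread mutable state through manually registered callbacks, so the gesture logic is scattered across handlers rather than stated in one place.

```python
# Illustrative sketch (not the Midas API): detecting a double tap with a
# manually composed event handler. The gesture logic lives in mutable
# state inside the callback, so control flow follows external events
# rather than the sequential structure of the program.

class DoubleTapDetector:
    def __init__(self, max_interval=0.3, on_double_tap=None):
        self.max_interval = max_interval    # max seconds between the two taps
        self.on_double_tap = on_double_tap  # user-supplied callback
        self._last_tap_time = None          # state threaded through handler calls

    def handle_touch_down(self, x, y, t):
        """Event handler to be registered with the input layer."""
        if (self._last_tap_time is not None
                and t - self._last_tap_time <= self.max_interval):
            self._last_tap_time = None
            if self.on_double_tap:
                self.on_double_tap(x, y)
        else:
            self._last_tap_time = t

fired = []
detector = DoubleTapDetector(on_double_tap=lambda x, y: fired.append((x, y)))
detector.handle_touch_down(10, 10, 0.00)
detector.handle_touch_down(11, 10, 0.20)  # within 0.3 s of the first tap
```

Extending such a detector to multi-finger gestures, or composing it with other gestures, quickly multiplies the amount of hidden state and callback wiring.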

In this work, we propose a solution based on research from the complex event processing domain. We advocate the use of a rule language that allows programmers to express gestures declaratively. With such an approach, the programmer only has to describe what constitutes a gesture and no longer needs to be concerned with how it is detected. We present a first step in this direction in the form of a domain-specific language supporting spatio-temporal operators.
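The declarative idea can be sketched as follows (a hedged illustration with hypothetical syntax, not Midas's actual rule language): a gesture is a rule over a fact base of touch events, phrased in terms of spatio-temporal predicates, and a generic engine derives the matches.

```python
# Hedged sketch of a declarative gesture rule (hypothetical, not the
# Midas DSL): facts are touch events, predicates are spatio-temporal
# constraints, and a tiny engine finds all fact pairs satisfying a rule.
import itertools
import math

def within(dt):
    """Temporal predicate: two events occur at most dt seconds apart."""
    return lambda a, b: abs(a["t"] - b["t"]) <= dt

def nearer_than(d):
    """Spatial predicate: two events are at most d units apart."""
    return lambda a, b: math.hypot(a["x"] - b["x"], a["y"] - b["y"]) <= d

def match_rule(facts, predicates):
    """Return all pairs of distinct facts satisfying every predicate."""
    return [(a, b) for a, b in itertools.combinations(facts, 2)
            if all(p(a, b) for p in predicates)]

# "Two-finger tap": two touch-down events close in both time and space.
facts = [
    {"id": 1, "x": 100, "y": 100, "t": 0.00},
    {"id": 2, "x": 130, "y": 105, "t": 0.05},
    {"id": 3, "x": 400, "y": 300, "t": 2.00},  # unrelated later touch
]
two_finger_taps = match_rule(facts, [within(0.1), nearer_than(50)])
```

Note that the rule itself is just the list `[within(0.1), nearer_than(50)]`: the programmer states the constraints, while the matching machinery stays generic and reusable across gestures.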

Complex gestures that are extremely hard to implement with traditional approaches can be expressed as one or more easily understandable rules. The use of a rule language further makes the resulting gestures reusable and easy to compose. Finally, a strong connection to application-level entities allows developers to activate and deactivate gestures depending on their graphical context.
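The connection to application-level entities can be illustrated with a small sketch (a hypothetical API, assumed for illustration only): each widget owns its set of active gesture rules, which can be enabled or disabled independently of other widgets.

```python
# Illustrative sketch (hypothetical API, not Midas): gestures are bound
# to application-level entities, so they can be activated or deactivated
# per graphical context rather than globally.

class Widget:
    def __init__(self, name):
        self.name = name
        self._gestures = {}  # gesture name -> handler

    def activate(self, gesture, handler):
        self._gestures[gesture] = handler

    def deactivate(self, gesture):
        self._gestures.pop(gesture, None)

    def dispatch(self, gesture, *args):
        """Deliver a recognised gesture to this widget, if it is active."""
        handler = self._gestures.get(gesture)
        if handler:
            handler(*args)

log = []
photo = Widget("photo")
photo.activate("pinch", lambda scale: log.append(("zoom", scale)))
photo.dispatch("pinch", 1.5)  # handled: pinch is active on this widget
photo.deactivate("pinch")
photo.dispatch("pinch", 2.0)  # ignored: pinch has been deactivated
```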

Related Publications

  • 2015

  • 2014

  • Engineering Gestures for Multimodal User Interfaces, Proceedings of EICS 2014, 6th International Conference on Engineering Interactive Computing Systems, Rome, Italy, June 2014
    Available: ACM Digital Library
  • 2013

  • 2011

  • Midas: A Declarative Multi-Touch Interaction Framework, Proceedings of TEI 2011, 5th International Conference on Tangible, Embedded and Embodied Interaction, Funchal, Portugal, January 2011 (acceptance rate: 32%)
    Available: ACM Digital Library, presentation
  • Mudra: A Unified Multimodal Interaction Framework, Proceedings of ICMI 2011, 13th International Conference on Multimodal Interaction, Alicante, Spain, November 2011 (acceptance rate: 39%)
    Available: ACM Digital Library