Development of gesture-control software using the Microsoft Kinect sensor to switch between different display layouts in operating rooms.

Gesture Controller with Microsoft Kinect for EIZO OR Monitor

A showcase for MEDICA
EIZO GmbH, specifically its Operating Rooms department, wanted to demonstrate contactless gesture control for its CuratOR Surgical Panel (a customisable wall console for clinical use) as a showcase at MEDICA 2015 in Düsseldorf.
Hands on, or rather “HandsUp”, for CONZE
The goal was to develop a fault-tolerant gesture and voice control concept that would allow the user to switch between different signal layouts from a variety of image sources. These layouts were processed and merged by the EIZO Large Monitor Manager (LMM). The controller software was to be activated with a simple “HandsUp” gesture or a voice command; a swiping gesture or voice command would then switch linearly between the layouts.
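The “HandsUp” activation can be illustrated with a minimal sketch. It assumes simplified vertical joint coordinates passed in as plain numbers, and the margin value is a hypothetical choice, not taken from the project; the real software read joint positions from Kinect Skeletal Tracking:

```csharp
using System;

// Sketch: "HandsUp" activation — both hands raised above the head.
// Joint positions are simplified to vertical (Y) coordinates in metres.
// In the Kinect camera space, Y increases upward, so a raised hand has
// a larger Y value than the head joint.
public static class HandsUpDetector
{
    // Hypothetical margin: hands must be clearly above the head,
    // not merely level with it, to avoid accidental activation.
    private const double MarginMetres = 0.05;

    public static bool IsHandsUp(double headY, double leftHandY, double rightHandY)
    {
        return leftHandY > headY + MarginMetres
            && rightHandY > headY + MarginMetres;
    }
}
```

Requiring both hands reduces false positives from everyday single-arm movements in the operating room.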
Analytical approach with practical result
Everything started with the evaluation of a wide range of sensor devices: Microsoft Kinect, Leap Motion, Thalmic Labs Myo and many other human-computer interaction technologies were checked for their suitability in a medical working environment. To bring EIZO’s vision to life, the software architects at CONZE combined the proven C# development environment with an intuitive user interface based on the WPF GUI framework.

Before coding started, parameters such as positioning, sequence and movement had to be precisely defined for all swiping gestures. Microsoft Kinect’s Skeletal Tracking feature served as the basis for the motion processing: with this technology, the joints of up to six users can be located and tracked simultaneously.

Voice control of the controller software was activated with the command “Hey CuratOR”; inputs and layouts were then switched with the commands “back” and “next” for the respective direction. The communication between the CONZE gesture controller software and the EIZO LMM was realised via a configured HTTP interface to display gestures and image sources separately.
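The swipe detection described above can be sketched as a simple pass over sampled hand positions. The distance threshold and the jitter check are illustrative assumptions, not the project’s actual tuning; in the real software, the X coordinates would come from the tracked hand joint frame by frame:

```csharp
using System;
using System.Collections.Generic;

public enum Swipe { None, Left, Right }

// Sketch: detecting a linear horizontal swipe from sampled hand positions.
// Samples are the hand's X coordinate (metres) over a short time window.
public static class SwipeDetector
{
    // Hypothetical tuning values, not taken from the project:
    private const double MinDistanceMetres = 0.30; // minimum total horizontal travel
    private const double JitterMetres = 0.05;      // tolerated backward wobble per step

    public static Swipe Detect(IReadOnlyList<double> handXSamples)
    {
        if (handXSamples == null || handXSamples.Count < 2) return Swipe.None;

        double travel = handXSamples[handXSamples.Count - 1] - handXSamples[0];
        if (Math.Abs(travel) < MinDistanceMetres) return Swipe.None;

        // Fault tolerance: reject motion that clearly reverses direction
        // mid-swipe, so jittery or hesitant movements do not switch layouts.
        int direction = Math.Sign(travel);
        for (int i = 1; i < handXSamples.Count; i++)
        {
            double step = handXSamples[i] - handXSamples[i - 1];
            if (Math.Sign(step) == -direction && Math.Abs(step) > JitterMetres)
                return Swipe.None;
        }

        return direction > 0 ? Swipe.Right : Swipe.Left;
    }
}
```

A recognised `Swipe.Left` or `Swipe.Right` would then trigger the same layout change as the “back” and “next” voice commands.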


  • Real-time gesture recognition with the Kinect v2 sensor; gesture control possible from a distance of up to 5 metres and for up to 6 people
  • Touchless control via hand gestures (left/right) and voice commands
  • Integration of the Microsoft speech recognition API
  • Switching between different video sources
  • 24/7 operation with low CPU/RAM utilisation
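The HTTP link to the LMM might look roughly like the following sketch. Host, path and parameter name are hypothetical placeholders; the actual LMM interface was configured for the installation:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

// Sketch: pushing a layout switch to the Large Monitor Manager over HTTP.
// The URL scheme below (/api/layout?index=N) is purely illustrative.
public static class LmmClient
{
    public static Uri BuildLayoutRequest(string lmmHost, int layoutIndex)
    {
        // e.g. http://lmm-host/api/layout?index=2 — hypothetical endpoint
        return new Uri($"http://{lmmHost}/api/layout?index={layoutIndex}");
    }

    public static async Task SwitchLayoutAsync(string lmmHost, int layoutIndex)
    {
        using (var client = new HttpClient())
        {
            var response = await client.GetAsync(BuildLayoutRequest(lmmHost, layoutIndex));
            response.EnsureSuccessStatusCode();
        }
    }
}
```

Keeping the gesture recognition and the layout switching behind a plain HTTP boundary lets the controller software and the LMM run and be tested independently.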


  • C# / .NET 4.5 for Microsoft Windows 8
  • Microsoft Kinect v2 hardware and SDK
  • Microsoft Speech SDK


  • 1 Software Architect (.NET)
  • 1 UI Developer (C#)