Sign Language for Cars


Drivers can browse a stored music playlist, zoom in and out of navigation maps, or accept phone calls, all without ever having to look at or touch the center console.

Delphi Gesture Control uses simple hand gestures to control vehicle functions, essentially creating a “sign language” between the driver and the vehicle. These everyday gestures require no visual attention, allowing the driver to stay focused on driving.

Delphi Gesture Control is the first automotive system that offers an intuitive and convenient way to operate in-car functions without relying on voice or touch. The system distinguishes intended hand-gesture commands from random movements, and it is a powerful extension to existing controls such as traditional knobs, touch panels, and voice control.

Here’s how it works: an infrared 3D camera is placed in the headliner inside the overhead console. The driver uses an easy-to-learn, universal gesture language; machine-learning algorithms recognize the motions and distinguish meaningful gestures from other movements not intended as commands.
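The release does not detail Delphi’s algorithms, but the core idea of separating deliberate commands from incidental hand movement can be illustrated with a minimal sketch. The Python below is a hypothetical example, not Delphi’s implementation: all names (GestureRecognizer, the swipe templates, the thresholds) are illustrative assumptions. It scores a tracked 3D hand trajectory for straight, horizontal motion and rejects anything below a confidence threshold as unintended movement.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point3D = Tuple[float, float, float]  # (x, y, z) hand position from the 3D camera


@dataclass
class GestureResult:
    name: str          # recognized gesture, e.g. "swipe_left"
    confidence: float  # how gesture-like the motion was (0..1)


class GestureRecognizer:
    """Toy recognizer: scores a hand trajectory against simple swipe heuristics
    and rejects low-confidence motions as incidental movement (hypothetical)."""

    def __init__(self, min_confidence: float = 0.7, min_travel: float = 0.15):
        self.min_confidence = min_confidence  # reject anything scoring below this
        self.min_travel = min_travel          # metres of lateral travel for a valid swipe

    def classify(self, trajectory: List[Point3D]) -> Optional[GestureResult]:
        if len(trajectory) < 2:
            return None

        dx = trajectory[-1][0] - trajectory[0][0]  # net lateral displacement
        dy = trajectory[-1][1] - trajectory[0][1]  # net vertical displacement

        # Total path length: how far the hand actually moved, for straightness scoring.
        path = sum(
            ((b[0] - a[0]) ** 2 + (b[1] - a[1]) ** 2 + (b[2] - a[2]) ** 2) ** 0.5
            for a, b in zip(trajectory, trajectory[1:])
        )
        if path == 0:
            return None

        # Straight, mostly horizontal motion scores high; wandering motion scores low.
        straightness = (dx ** 2 + dy ** 2) ** 0.5 / path
        horizontal_bias = abs(dx) / (abs(dx) + abs(dy) + 1e-9)
        confidence = straightness * horizontal_bias

        if abs(dx) < self.min_travel or confidence < self.min_confidence:
            return None  # treated as incidental movement, not a command

        name = "swipe_left" if dx < 0 else "swipe_right"
        return GestureResult(name=name, confidence=confidence)


if __name__ == "__main__":
    recognizer = GestureRecognizer()
    # A deliberate right-to-left swipe across the console (positions in metres).
    swipe = [(0.30, 0.0, 0.5), (0.20, 0.01, 0.5), (0.10, 0.0, 0.5), (0.00, 0.01, 0.5)]
    # A short, aimless movement that should be ignored.
    fidget = [(0.10, 0.0, 0.5), (0.12, 0.03, 0.5), (0.09, 0.01, 0.5)]
    print(recognizer.classify(swipe))   # GestureResult(name='swipe_left', ...)
    print(recognizer.classify(fidget))  # None
```

A production system would replace these hand-tuned heuristics with trained machine-learning models over the camera’s depth data, but the reject-below-threshold structure is the same idea the article describes.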

The system is intuitive and convenient, keeping the driver’s eyes on the road to improve safety. It also paves the way for future HMI innovation, including multi-modal concepts that combine voice, eye-glance and gesture technologies.

The system debuted on the 2016 BMW 7 Series sedan and will be added to additional vehicles in the near future. 
