Navigation technology company Garmin and Meta have unveiled an automotive OEM proof of concept that integrates Meta’s Neural Band with Garmin’s Unified Cabin.

The demonstration was presented at the Consumer Electronics Show (CES) 2026.


The set-up combines the US-based technology major’s electromyography (EMG)-based wearable with Garmin’s in-vehicle digital cockpit system.

It focuses on exploring wearable-driven inputs for vehicle infotainment and cabin controls.

Meta’s Neural Band is a wrist-worn device designed to translate the electrical signals produced by muscle activity at the wrist into digital commands.

Its EMG technology currently enables basic interactions such as clicking, scrolling and dial-style adjustments.

Garmin automotive OEM executive vice president and managing director Matt Munn said: “Garmin Automotive OEM is excited to team up with Meta to create a groundbreaking proof of concept for in-vehicle entertainment that supports convenience and situational awareness.”

Over time, the companies say the system could support more advanced gestures, including subtle finger movements that resemble handwriting and can be converted into digital text.

Within the automotive concept, the Neural Band allows passengers to manage selected infotainment features through small hand gestures.

Movements of the thumb, index and middle fingers are detected at the wrist and interpreted as in-car commands, removing the need for physical touchscreens or traditional input devices.
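Neither company has published technical details or an API for the demonstration, but the interaction model described, in which discrete wrist-detected gestures are mapped onto cabin functions, can be illustrated with a minimal sketch. All class names, gesture labels and commands below are hypothetical stand-ins, not the actual Garmin or Meta software.

```python
# Illustrative sketch only: the GestureEvent, InfotainmentController and gesture
# labels below are hypothetical, since no public API exists for this proof of concept.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class GestureEvent:
    """A single gesture decoded from wrist EMG, e.g. a pinch or a dial-style turn."""
    kind: str          # "pinch", "swipe_left", "swipe_right", "dial_turn"
    magnitude: float   # e.g. how far a dial-style rotation travelled


class InfotainmentController:
    """Placeholder for the in-cabin system that receives the decoded commands."""

    def select(self) -> None:
        print("select current item")

    def next_track(self) -> None:
        print("skip to next track")

    def previous_track(self) -> None:
        print("return to previous track")

    def adjust_volume(self, delta: float) -> None:
        print(f"adjust volume by {delta:+.2f}")


def dispatch(event: GestureEvent, cabin: InfotainmentController) -> None:
    """Map a decoded wrist gesture onto an infotainment action."""
    handlers: Dict[str, Callable[[], None]] = {
        "pinch": cabin.select,
        "swipe_right": cabin.next_track,
        "swipe_left": cabin.previous_track,
    }
    if event.kind == "dial_turn":
        cabin.adjust_volume(event.magnitude)
    elif event.kind in handlers:
        handlers[event.kind]()


if __name__ == "__main__":
    cabin = InfotainmentController()
    dispatch(GestureEvent("pinch", 0.0), cabin)        # thumb-to-index pinch selects
    dispatch(GestureEvent("dial_turn", 0.15), cabin)   # wrist "dial" nudges volume up
```

The point of the sketch is simply that, once the wearable has decoded a gesture, routing it to a cabin function is an ordinary event-dispatch problem, with no touchscreen or physical control in the loop.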

Garmin’s Unified Cabin, which forms the other half of the demonstration, is a digital cockpit showcase built around a single control module.

At CES 2026, the platform was shown with features including a digital vehicle key and an AI-powered virtual assistant capable of carrying out several actions from a single voice instruction.

Other elements demonstrated included seat-specific audio and visual settings, enhanced personalisation options, cabin chat, lighting effects and a personal audio sphere, with configurations adaptable to individual OEM needs.

The companies described the collaboration as an automotive OEM proof of concept aimed at developing “next-generation”, wearable-based in-vehicle command and control.

Meta wearables vice president Alex Himel added: “Meta Neural Band and its EMG technology could be the best way to control any device. Once you start using the band regularly, you want it to control more than just your AI glasses.”