Market Watch

Proximity Sensors Enabling Gesture Recognition Primed for Robust Expansion


Nimble sensors that track hand movements in conjunction with touch screens are headed for explosive growth as devices like smartphones, tablets, PCs and other gear increasingly incorporate gesture functionality into their user interfaces (UI), according to a MEMS & Sensors topical report from IHS Inc., a leading global source of critical information and insight.

Global revenue for proximity-based gesture sensors is projected to reach $123 million this year, up from just $42,000 in 2012. The primary application of gesture sensors will be in wireless devices, followed by consumer electronics and automotive. Growth will be impressive in the next few years—as much as 68 percent from 2014 to 2015—with sensor revenue hitting $545 million by 2017.

Current and future gesture solutions on the market are either capacitive or infrared (IR) proximity types. In gesture-based mechanisms, specialized proximity sensors detect movements in two or three dimensions and recognize hand swipes going left, right, up and down. While the current crop of touch screens requires the direct touch of a finger or stylus on the screen, capacitive gesture controls go further, allowing a user to interact with a device without necessarily having to touch the screen. Instead, the user employs intuitive motions, such as waving a hand near the screen to simulate a swiping gesture, to control the device.
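
As a rough illustration of the detection principle (not any vendor's actual algorithm), the sketch below shows how readings from a hypothetical four-channel IR proximity sensor could be turned into left/right/up/down swipe decisions: whichever photodiode sees the reflected light first indicates the side from which the hand entered. All names, thresholds and data formats here are illustrative assumptions.

```python
# Minimal sketch of swipe-direction recognition from a four-channel IR
# proximity sensor (hypothetical layout: left, right, top, bottom photodiodes).
# Sensor names, thresholds and data format are illustrative assumptions,
# not any vendor's actual API.

THRESHOLD = 0.5  # normalized reflection level that counts as "hand present"

def first_crossing(samples, threshold=THRESHOLD):
    """Return the index of the first sample above threshold, or None."""
    for i, value in enumerate(samples):
        if value >= threshold:
            return i
    return None

def detect_swipe(channels):
    """channels: dict mapping 'left'/'right'/'top'/'bottom' to equal-length
    lists of normalized readings sampled over one gesture window."""
    t = {name: first_crossing(samples) for name, samples in channels.items()}
    if any(v is None for v in t.values()):
        return None  # hand never covered every photodiode: no clear swipe

    # Horizontal swipe: compare when the left and right channels triggered.
    if abs(t['left'] - t['right']) >= abs(t['top'] - t['bottom']):
        return 'swipe_right' if t['left'] < t['right'] else 'swipe_left'
    # Vertical swipe: compare the top and bottom channels.
    return 'swipe_down' if t['top'] < t['bottom'] else 'swipe_up'

# Example: a hand entering from the left trips the left channel first.
readings = {
    'left':   [0.0, 0.6, 0.9, 0.9, 0.4],
    'right':  [0.0, 0.0, 0.3, 0.8, 0.9],
    'top':    [0.0, 0.2, 0.6, 0.7, 0.5],
    'bottom': [0.0, 0.2, 0.6, 0.7, 0.5],
}
print(detect_swipe(readings))  # -> 'swipe_right'
```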

Handsets and tablets in wireless are major growth drivers

The Galaxy S4 from Samsung represented the first major push toward gesture interface capability in a handset when the smartphone was released this year. This is a step that others in the industry are likely to follow, IHS believes, thanks to the growing availability of gesture solutions from suppliers like U.S.-based Maxim Integrated Products and soon from both Japan’s Sharp and Taiwan-based Capella Microsystems Inc.

IHS does not believe that gesture sensors will be available in low-end handsets, so gesture functionality will be limited to midrange and high-end cellphones. Handsets will account for $330 million in revenue for gesture sensors by 2017, up from $123 million this year, even though handsets are likely to use just one sensor because of the limited gesture use cases imposed by the size of the handset display.

PC and media tablets, meanwhile, will be the fastest-growing category for gesture sensors, boasting a revenue compound annual growth rate (CAGR) of 76 percent between 2014 and 2017. Unlike handsets, however, tablets could conceivably make use of multiple gesture packages spread around the tablet’s display in order to provide proper functionality.

With handsets and tablets the main devices for gesture-based sensors, wireless communications as a whole will be the major category for gesture functionality. From 2013 to 2017, gesture-based sensor revenue in wireless devices will grow at a sizable 44 percent CAGR.
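
As a back-of-the-envelope illustration of how a compound annual growth rate is computed, the standard CAGR formula can be applied to the report's total-market figures quoted earlier ($123 million in 2013 rising to $545 million in 2017); the 44 percent figure above is IHS's own number for the wireless-only slice, not the output of this check.

```python
# Illustrative arithmetic only: the standard CAGR formula applied to the
# article's total-market revenue figures. The 44 percent wireless-only CAGR
# quoted above is IHS's figure for a narrower category.

def cagr(start_value, end_value, years):
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

total_2013 = 123e6   # $123 million (2013, total market)
total_2017 = 545e6   # $545 million (2017 forecast, total market)

rate = cagr(total_2013, total_2017, years=4)
print(f"Total-market CAGR 2013-2017: {rate:.1%}")  # roughly 45%
```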

Consumer and automotive applications also are a good source of growth

Beyond wireless handsets and media tablets, potential growth for gesture-based sensors will also be found in PC tablets and laptops.

In PC tablets, as in media tablets, opportunity exists for multiple gesture components. Tablets offering Microsoft’s Windows 8 operating system (OS), in particular, could be very friendly to gesture interactions due to the touch-friendly design of the OS.

In the automotive sector, which has already embraced gesture interfaces, a different approach will be employed. Automotive will likely favor using multiple simple photodiode-type sensors with the intelligence contained in one application-specific integrated circuit (ASIC). Proximity/infrared-based gesture solutions could be paired with other solutions, such as the capacitive touch sensors on a display, to offer more robust performance. Such a solution will be able, for example, to sense a finger approaching a large automotive display screen in the dashboard.
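
The sketch below illustrates, in simplified form, the kind of fusion described above: an IR proximity estimate lets the interface react to an approaching finger before the capacitive layer registers contact. The distances, thresholds and names are assumptions for illustration and do not describe the Halios or CUE implementations.

```python
# Illustrative sketch of fusing an IR proximity reading with the capacitive
# touch state on an automotive display: an approaching finger can wake the UI
# or enlarge touch targets before contact is made. Distances, thresholds and
# class names are assumptions, not the Halios or CUE implementation.

from dataclasses import dataclass

APPROACH_DISTANCE_MM = 100  # finger closer than this counts as "approaching"

@dataclass
class DisplayInput:
    proximity_mm: float   # estimated finger distance from the IR/photodiode ASIC
    touch_active: bool    # capacitive controller reports an actual touch

def ui_state(sample: DisplayInput) -> str:
    """Combine the two sensing paths into a single UI state."""
    if sample.touch_active:
        return "touch"            # finger on the glass: handle as a normal tap
    if sample.proximity_mm < APPROACH_DISTANCE_MM:
        return "approach"         # finger near the screen: enlarge targets / wake UI
    return "idle"                 # nothing nearby: keep the quiet driving view

# Example: finger 60 mm away, not yet touching -> the display pre-expands controls.
print(ui_state(DisplayInput(proximity_mm=60, touch_active=False)))  # -> 'approach'
```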

Gesture-based automotive solutions can already be found in Volkswagen's Golf VII, which uses the Halios integrated circuit from Germany's Elmos Semiconductor, and in General Motors' Cadillac User Experience (CUE).

Further ahead: Improved IR proximity performance, camera and ultrasonic recognition

Sensor suppliers are already working on more advanced IR proximity-based gesture solutions, IHS believes, which could include 3-D gesture capabilities (adding motion detection in the z-axis) as well as finger-level accuracy and resolution.

Near the end of the forecast window, camera-based gesture recognition could enter the handset and tablet market.  The main use case at present for camera-based gesture recognition is for high-accuracy and high-resolution gesture performance—the sort offered by products such as Microsoft’s Kinect.

Intel, for instance, is currently pushing a camera-based gesture solution for laptops and desktops that will use software and algorithms to perform useful gesture-based functions, working in combination with IR proximity sensors.

Intel’s approach is in contrast to the use of ultrasonic sensors for gesture recognition in tablets and laptops, which deploys ultrasonic emitters along with an array of microphones that can recognize detailed gestures. Such a solution is promoted by Elliptic Labs.
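
As a greatly simplified illustration of the ultrasonic principle (not Elliptic Labs' actual signal processing), the sketch below converts hypothetical echo delays at two microphones into distance estimates; tracking which side of the array the hand is nearer to over successive chirps yields a coarse swipe direction.

```python
# Toy model of ultrasonic gesture sensing: an emitter sends a chirp and each
# microphone measures the delay of the echo off the hand; delay maps to
# distance, and differences across the microphone array give a coarse hand
# position over time. Not Elliptic Labs' actual signal processing.

SPEED_OF_SOUND_M_S = 343.0  # in air at roughly room temperature

def echo_distance_m(round_trip_delay_s: float) -> float:
    """Distance to the reflecting hand for a co-located emitter/microphone pair."""
    return SPEED_OF_SOUND_M_S * round_trip_delay_s / 2.0

# Hypothetical echo delays (seconds) at a left and a right microphone.
delay_left, delay_right = 1.8e-3, 1.2e-3

d_left, d_right = echo_distance_m(delay_left), echo_distance_m(delay_right)
print(f"left: {d_left:.2f} m, right: {d_right:.2f} m")

# The hand is nearer whichever microphone hears the echo sooner; following
# this difference over successive chirps yields a left/right swipe.
print("hand nearer right side" if d_right < d_left else "hand nearer left side")
```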

The arrival of new solutions will likely add to, rather than replace, existing mechanisms like capacitive and IR proximity. The new arrivals, in turn, will lead to devices that feature sensor fusion as well as potentially application-specific approaches, such as cameras for the highest accuracy, infrared proximity for most other uses, and possibly a combination as needed.
