Sentons CameraBar uses ultrasound to turn your phone’s frame into a zoom slider (XDA Developers)

Smartphones are incredibly versatile portable PCs, but with only a handful of physical buttons on most devices, you have to rely on touchscreen controls for almost everything. When playing games or using the camera app, you end up juggling many different onscreen buttons and sliders, and the combination of limited screen real estate and awkward hand ergonomics makes for a subpar experience. A company called Sentons wants to change this by introducing what it calls “Software-Defined Surfaces” (SDS) in place of physical buttons. Today, the company is introducing CameraBar, a new SDS that uses ultrasound to detect taps and slides on the frame of a phone, mimicking the physical shutter and zoom controls of a traditional camera.

With CameraBar, users get virtual shutter and zoom controls without having to touch the screen, so their fingers never obstruct the viewfinder. In its default configuration, CameraBar listens for a light press on the right side to set the focus, a hard press on the right side to snap a picture, and a slide-to-zoom gesture on the left side for optical zoom. The video embedded below demonstrates CameraBar in action on a retail ASUS ROG Phone 3 unit as well as on custom development hardware.

The ROG Phone 3 shown in the video above is presumably running custom firmware that lets Sentons’ custom camera app react to inputs from the sensors, as the AirTriggers feature on the ASUS ROG Phone cannot currently be mapped to any actions in the stock ASUS Camera app. For this feature to make its way to the ROG Phone, ASUS will have to add support for it through a software update.
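To make the default gesture mapping concrete, here is a minimal sketch of how frame-sensor events might be routed to camera actions. Everything here is an assumption for illustration: Sentons has not published its API, and the `FrameEvent` fields, force thresholds, and action names are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical force thresholds in grams-force; Sentons has not published
# the real activation values, so these numbers are illustrative only.
FOCUS_THRESHOLD_GF = 20.0
SHUTTER_THRESHOLD_GF = 80.0

@dataclass
class FrameEvent:
    side: str           # "left" or "right" edge of the phone
    force_gf: float     # estimated press force, in grams-force
    slide_delta: float  # finger travel along the edge, in millimeters

def map_event_to_action(event: FrameEvent) -> str:
    """Map a frame touch event to a camera action, following the
    default CameraBar configuration described in the article."""
    if event.side == "right":
        if event.force_gf >= SHUTTER_THRESHOLD_GF:
            return "capture"  # hard press on the right: snap a picture
        if event.force_gf >= FOCUS_THRESHOLD_GF:
            return "focus"    # light press on the right: set focus
    elif event.side == "left" and event.slide_delta != 0.0:
        return f"zoom:{event.slide_delta:+.1f}mm"  # slide-to-zoom
    return "none"
```

Since the article notes OEMs can customize the activation region and gestures, a shipping implementation would presumably expose these thresholds and mappings as configuration rather than hard-coding them.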

While the ROG Phone 3 can technically support the gestures shown in this demonstration, Sentons CTO Sam Sheng told XDA that the ideal device for CameraBar will have a larger sliding area to allow finer-grained control of the zoom level. No such device currently exists on the market, though Sentons is in talks with several undisclosed partners who are expected to take this technology to production shortly. The company provides OEMs with a recommended sensor topology, guidance on designing the module, and reference software showing how to implement the feature in the stock camera app. OEMs can customize the gesture activation region, and if they choose to, they can extend the same customization options to the consumer.

OEMs building new smartphones with all-screen designs and “waterfall” displays are expected to be the first to adopt Sentons’ new CameraBar technology, though as previously mentioned, smartphones that already implement Sentons’ existing GamingBar technology (which includes the ROG Phone 3 and Lenovo Legion Phone Duel) can inherit CameraBar’s functions.

Replacing Buttons with Ultrasound

Buttons are a common point of failure in smartphones and a hindrance to achieving a truly all-screen design, so it makes sense for smartphone manufacturers to try to get rid of them. The only problem is finding a worthwhile alternative to a physical button, and we’ve seen a few lackluster attempts at replacing them in the past. Huawei’s Mate 30 Pro used “invisible” touch buttons for the volume rocker, which some users struggled to trigger. HTC’s U12+ featured faux buttons that were similarly frustrating for some reviewers. While Huawei implemented its volume keys capacitively, HTC actually used Sentons’ hardware, though I’m told the U12+ relied on a simple strain-gauge sensor. In contrast, the ROG Phone models from ASUS can sense much lighter touches, under 5 grams-force. Although I haven’t had the opportunity to test the HTC U12+ myself, my experience with the ROG Phone 3 and its customizable AirTriggers gestures has been mostly positive, so I’m looking forward to seeing how technology from Sentons can not only replace the buttons on phones but also augment their functionality.

So how do OEMs actually replace a button with Sentons’ tech? Replicating a physical button on a smartphone using ultrasound involves combining piezoelectric and strain-gauge sensors. Sentons likens its technology to sonar, which uses ultrasonic waves for echolocation. The time-of-flight of the vibration field created by the piezoelectric sensors uniquely determines the position of the user’s finger, while the way the finger couples to the vibrating substrate determines the force it extracts from the sound wave. In other words, ultrasonic waves identify the location, while a strain-gauge sensor measures the level of force applied.
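The time-of-flight idea can be illustrated with a toy one-dimensional calculation. Assuming two receivers at opposite ends of a sensing edge and a known wave speed in the frame material (both numbers below are made up for illustration, not Sentons’ figures), the difference in arrival times pins down the touch position:

```python
SPEED_MM_PER_US = 3.0  # assumed ultrasonic wave speed in the frame, mm/µs
EDGE_LENGTH_MM = 60.0  # assumed length of the sensing edge, mm

def locate_touch(t_left_us: float, t_right_us: float,
                 speed: float = SPEED_MM_PER_US,
                 edge: float = EDGE_LENGTH_MM) -> float:
    """Estimate the finger position along a 1-D edge from the arrival
    times of the touch-induced wave at a sensor on each end.

    With d_left + d_right = edge and
    d_left - d_right = speed * (t_left - t_right),
    solving gives d_left = (edge + speed * dt) / 2.
    """
    dt = t_left_us - t_right_us
    return (edge + speed * dt) / 2.0
```

The real system presumably works in two dimensions with a full vibration field rather than two point receivers, but the same arrival-time geometry underlies it.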

Thus, the principles behind the technology aren’t new. What Sentons is selling to OEMs is its line of SDSwave force-and-touch processors, its machine learning algorithms that separate false touches from intentional taps and gestures, and its ultrasonic strain-gauge sensor. The piezoelectric sensors, though, can be off-the-shelf parts, making them very inexpensive to incorporate into a smartphone design. As long as the material used in the smartphone body is stiff enough for ultrasonic waves to propagate, it can be turned into a virtual touch sensor.
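As a rough illustration of what false-touch rejection could look like at its very simplest, the sketch below requires the force signal to stay above a floor for several consecutive samples before registering a press, so a brief brush against the frame is ignored. Sentons uses trained machine learning models for this; the threshold logic and parameter values here are purely hypothetical stand-ins.

```python
def is_intentional_press(force_samples_gf, min_force_gf=5.0, min_samples=3):
    """Return True if the force stays at or above min_force_gf for at
    least min_samples consecutive readings. A crude debounce, not
    Sentons' actual (ML-based) rejection logic."""
    run = 0
    for force in force_samples_gf:
        run = run + 1 if force >= min_force_gf else 0
        if run >= min_samples:
            return True
    return False
```

A trained model would look at richer features (rise time, contact area, wave damping) instead of a single threshold, which is what lets it tell a deliberate squeeze from a pocket graze.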

Sentons says its ultrasonic sensors can recognize finger taps through glass, plastic, and even millimeters of aluminum, meaning the sensing elements can be mounted on the phone’s mid-plate rather than directly behind where the finger is expected to rest. The caveat is that this only works when the smartphone maker wants to replace “lower performance” buttons like the volume or power keys; replicating gestures that need more precision, such as a slider, will generally require the sensing element to be mounted on the sidewall behind the contact point. The sensing elements are said to be tiny enough to slot in between antenna elements (such as the mmWave antennas placed around the body of a 5G smartphone), and since there are no wires involved, antenna performance isn’t degraded.

The small size of the sensing elements even makes it possible to use them in devices as small as smartwatches and hearables (like true wireless earbuds). For smartwatches, ultrasonic gestures could replace a physically rotating crown or a touch-sensitive capacitive bezel. For true wireless earbuds, ultrasound could bring better tap and gesture detection for music controls. Sentons is currently experimenting with more form factors, with even automotive uses on the table, but no commercial products outside of smartphones use its technology just yet. Sentons is also far from the only company using machine learning to analyze ultrasound for virtual smart sensors: Elliptic Labs has partnered with multiple smartphone makers for its ultrasonic proximity detection tech, so there’s a good chance ultrasound will stick around and become even more widely adopted.
