Apple’s Finger-Controller Patent Is a Glimpse at Mixed Reality’s Future


THIS ARTICLE WAS ORIGINALLY PUBLISHED ON Fast Company ON JAN 24, 2019

For mixed-reality experiences to take off, input will be as crucial as output. An Apple patent suggests that the company is already at work on that challenge.

By Mark Sullivan

Apple’s forte has always been its mastery of the user interface on devices from PCs to phones to smartwatches. Its engineers are now looking past the phone touchscreen toward mixed reality, where the company’s next great UX will very likely be built. A recent patent application gives some tantalizing clues as to how Apple’s people are thinking about aspects of that challenge. It describes a wearable system that senses the movements of your fingers to control digital content in a mixed reality space.

For the past few years, Apple has been laying the groundwork for its mixed-reality work. In 2017 the company released its ARKit framework, which has helped developers build thousands of augmented-reality apps. But the ARKit apps of today must be experienced through the display of an iPhone or iPad, which is a bit clunky. Relatively soon, however, this view will move to a device we wear over our eyes. Apple will not want to sell a bulky headset, so it will likely wait until it’s possible to fit all the needed components into a relatively normal-looking pair of glasses.

The devices and methods for controlling and navigating mixed reality are also still a little clunky, but evolving. The leading devices in the space, like Microsoft’s HoloLens and Magic Leap’s Magic Leap One Creator Edition, let you use hand gestures for navigation and control. But the depth cameras and sensors needed for that tracking take up space, and therefore make for a bulkier headset. One way to shrink the headset is to move the hand-motion sensors somewhere else on the body.

Apple’s patent, called “Finger-mounted Device with Sensors and Haptics,” describes a system of small sensor-laden pieces that fit around the finger just above the fingernail or thumbnail.

Apple patent application image [Source: United States Patent and Trademark Office]

Each finger wearable is outfitted with several types of sensors. Optical sensors measure the movements of the fingertips, while accelerometers capture motion. Together they enable a range of inputs for navigation and control, including “taps, force input, persistent touch input, air gestures, and/or other user input,” the patent says.

The wearables leave the fingertips exposed, and contain sensors that detect the “press” or “roll” of the fingertip on a surface, which could be used for additional input types: “[the finger-mounted device] may allow a user to supply joystick-type input using only lateral movement of the user’s fingertips, may gather force sensor measurements (user finger press force measurements) that are used in controlling other equipment…” the patent reads.
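The joystick-style input the patent describes can be illustrated with a small sketch. Everything here (the function name, thresholds, and units) is a hypothetical illustration, not anything from Apple’s actual design: it simply maps lateral fingertip displacement and press force to a normalized control vector.

```python
# Hypothetical sketch: turning lateral fingertip movement (in mm) and
# press force (in newtons) into a joystick-style control vector plus a
# "click" flag. All names and thresholds are illustrative only.

def joystick_input(dx_mm, dy_mm, press_force_n, deadzone_mm=0.5, max_mm=5.0):
    """Convert lateral fingertip displacement and press force into a
    normalized (x, y) vector in [-1, 1] and a boolean click."""
    def normalize(d):
        # Ignore tiny movements inside the dead zone
        if abs(d) < deadzone_mm:
            return 0.0
        # Clamp to the maximum travel and scale into [-1, 1]
        clamped = max(-max_mm, min(max_mm, d))
        return clamped / max_mm

    x = normalize(dx_mm)
    y = normalize(dy_mm)
    click = press_force_n > 1.0  # arbitrary example threshold
    return x, y, click
```

In this sketch, small jitters of the fingertip are filtered by the dead zone, larger lateral movements steer like a thumbstick, and a firm press registers as a click, roughly the combination of inputs the patent language suggests.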

The wearable pieces can respond to the user’s touches or gestures via built-in haptic feedback units. For example: “A finger-mounted device may be used to control a virtual reality or augmented reality system [or] may provide a user with the sensation of interacting on a physical keyboard when the user is making finger taps on a table surface (e.g., a virtual keyboard surface that is being displayed in alignment with the table surface using a head-mounted display)…” says the patent.
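To make the virtual-keyboard scenario concrete, here is a minimal sketch of how a tap position on a table surface might be resolved to a key on an overlaid virtual keyboard. The layout, key width, and function name are hypothetical illustrations, not details from the patent.

```python
# Hypothetical sketch: resolving a fingertip tap on a table surface to a
# key on a virtual keyboard row displayed in alignment with that surface.
# Key width and layout are illustrative only.

KEY_WIDTH_MM = 18.0
ROW_KEYS = ["Q", "W", "E", "R", "T", "Y", "U", "I", "O", "P"]

def key_for_tap(x_mm):
    """Return which key of a single virtual row was tapped, given the
    tap's horizontal offset (mm) from the row's left edge."""
    index = int(x_mm // KEY_WIDTH_MM)
    if 0 <= index < len(ROW_KEYS):
        return ROW_KEYS[index]
    return None  # tap landed outside the virtual row
```

In a real system, resolving the key would be the trigger for the haptic units the patent describes, giving the user a keypress sensation on a bare table.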

Apple isn’t the only big company tackling the challenge of mixed-reality input. Google has been working on a gesture-based control system called Project Soli for several years. The Soli sensor captures hand motions in three-dimensional space using a miniature radar system. So far, Google has shown applications in which the radar sensor resides in a device like a smartwatch or a TV to enable gesture control, but the technology could also track hand gestures in a mixed-reality context. One expert told me the radar sensor might sit in a wearable such as a smartwatch, emitting a beam toward the user’s hand to detect gestures; it could also be built into a new type of device, perhaps something more like Apple’s finger wearables, or into clothing.

Apple patent application image [Source: United States Patent and Trademark Office]

Google is apparently serious about the Soli technology. It recently asked the Federal Communications Commission to grant it permission to emit the radar beam at a higher power level to pick up finer hand gestures. The FCC granted the request at the end of last year.

Samsung is also actively developing gesture-control systems for mixed reality. The U.S. patent office published a patent application from the South Korean giant describing a virtual-reality headset that uses a combination of 3D sensors and infrared sensors to detect complex movements of the user’s hands. Samsung filed another application in which a similar technology controls content in an augmented-reality system.

At the moment, both Apple’s ARKit and Google’s ARCore create phone-based augmented reality (like Pokémon Go) using the regular 2D cameras on the backs of phones. But phone-based AR could take an additional step before the move to AR glasses. Smartphone makers are showing strong interest in putting 3D depth cameras on the backs of phones to map spaces more accurately for digital content. Bloomberg reported in December that Huawei plans to include Sony 3D cameras on the backs of several of its new phones for 2019. The report cited sources saying the 3D cameras will be used not only for taking 3D pictures and spatial scanning but also for tracking hand gestures.

Apple’s Memoji [Image courtesy of Apple]

Apple’s iPhone X models use a 3D camera on the front of the device for facial recognition, Animoji, and Memoji. But 3D cameras can do much more, especially when mounted on the back of the phone. With a more accurate 3D map of the room, ARKit apps could place content in more realistic and functional ways. Such a camera might also be used to track hand gestures for control. However, Apple may decide to wait until its AR glasses are a reality before supporting gesture controls on the iPhone.

The technology described in Apple’s patents often never makes it to market; when it does, it sometimes arrives in a very different form than the one described. Apple’s finger-mounted-controller patent is most interesting for the challenges it addresses. The company is thinking hard about the next frontier in user interfaces, and about how it might arrive at the right time with the right technology to dominate mixed reality the way it has dominated mobile.