The interface between human and machine has typically been tactile. Even before electricity, levers, cables, rods, and good old-fashioned hand-and-arm strength were required to operate mining, smelting, casting, grinding, polishing, and other equipment. With the advent of electrical power, many of these operations became as simple as flipping a switch or pressing a button.
Modern user interfaces use buttons, switches, touch panels, and even voice for device access and control. Speech recognition has made tremendous inroads for many touchless applications. But the modern world of pathogenic paranoia is causing machine-makers to rethink how humans interact with machines, especially in public places.
Public places have always presented unique challenges for machine interfaces. Subway turnstiles, railings, and banisters are all touched in one way or another. Automatic doors have solved that problem in some places, but complex machines that require more sophisticated user interfaces are especially challenging. For example, most ATM interfaces use a keypad or touch screen to enter PINs or passwords. Both are surfaces that do not self-disinfect, and no one comes around to wipe them down with a swab or alcohol between users.
Touchless technology allows a new and safe way for machines and humans to interact without spreading pathogens. Here we explore how designers can integrate touchless technology for machine interfaces.
Video is becoming a crucial part of secure and safe public interaction interfaces for individual and networked systems. Smartphones already use facial recognition to lock and unlock themselves (Figure 1). Facial recognition software works in real time through a massive network of globally accessible video feeds and is one of many remote-tracking technologies at the disposal of law enforcement and international security groups.
Figure 1: Facial recognition has been used to lock and unlock these smartphones. (Source: HQuality/Shutterstock.com)
3-D cameras offer another option for implementing touchless technology in user interfaces, and they have improved significantly over the years. Both types—fisheye, such as the Samsung Gear 360 devices, and the spatially distanced stereoscopic Microsoft Kinect—have demonstrated impressive capabilities. Gamers and programmers used older planar technologies, such as the original Kinect for the Xbox 360 (launched in 2010), to create impressive body-motion tracking and measurement.
This technology could recognize contrast-determined boundaries, determine centers of mass to create a skeletal wireframe, and track more than one person at a time—an impressive but underutilized technology back then. But it is often the gaming world that pushes the limits of processing power and graphics speeds and resolutions.
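The center-of-mass step described above is conceptually simple: once a silhouette has been segmented from the background by its contrast boundary, the anchor point for a skeletal wireframe is just the mean of the silhouette's pixel coordinates. The following is a minimal sketch of that idea with toy data; the function name and the pixel list are illustrative assumptions, not part of any Kinect SDK.

```python
def center_of_mass(pixels):
    """Centroid of a set of (x, y) silhouette pixels -- the kind of
    per-blob anchor point used to seed a skeletal wireframe."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    return cx, cy

# Toy silhouette: four corner pixels of a square blob.
print(center_of_mass([(0, 0), (4, 0), (0, 4), (4, 4)]))  # (2.0, 2.0)
```

Tracking several people at once amounts to repeating this per segmented blob, frame after frame.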
Modern Kinect 2 sensor technology uses a time-of-flight (ToF) measurement technique in which 3-D depth for each video point is determined by monitoring the reflection time of light from an LED or a laser. Thanks to higher-resolution, high-definition video, it is even more impressive today. Video can even measure heart rate by monitoring blood vessel expansion and contraction on your face, neck, or forehead.
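The ToF arithmetic behind each depth pixel is straightforward: the emitted light travels to the target and back, so depth is half the round-trip distance at the speed of light. A minimal sketch (the function name is illustrative, not from any sensor SDK):

```python
# Time-of-flight depth: light travels out to the target and back,
# so depth is half the round-trip distance at the speed of light.
C = 299_792_458.0  # speed of light, m/s

def tof_depth(round_trip_seconds: float) -> float:
    """Depth in meters from a measured round-trip reflection time."""
    return C * round_trip_seconds / 2.0

# A round trip of about 6.67 ns corresponds to roughly 1 m of depth.
print(tof_depth(6.67e-9))
```

The nanosecond timescales involved are why ToF sensors rely on modulated light and phase measurement in practice rather than timing a single pulse per pixel.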
At the center of all this technology is the ability to recognize gestures and motions. This technology is contactless for the most part. You can interact with handheld devices, but never have to touch the heart of the machine.
Touchless interfaces must keep practical considerations in mind. We can’t expect people to dance at a doorway to gain entry, for example. Full-body gesturing is not in itself an ideal way of securely and safely interacting with a handheld or stationary machine. But at close range, combined with a display and haptic feedback, hand gesturing could provide a contactless and secure means of entering data or shooting virtual bad guys (Figure 2). Three technologies must integrate to make this happen: precision hand tracking, a display, and haptic feedback.
Figure 2: Precision hand-tracking modules allow detailed maps of hand position and actions, including bones and joints’ location. (Source: Ultraleap)
These technologies are merging and are available today. If you face a dilemma about how to engineer your next-generation public machine user interface amid the new normal of pathogen-sensitive populations, Ultraleap is a technology company to consider.
Ultraleap is the coming together of two companies and their technologies: Leap Motion and Ultrahaptics joined forces in 2019 to combine precision tracking with haptic feedback. Ultraleap's hand-tracking module is a small (13mm x 80mm x 30mm), surface-mountable, dedicated camera-based device with a 120˚-x-150˚ field of view that extends up to 24 inches away. Its software can already track 27 distinct hand elements, including bones and joints, along with their positions and motions.
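Once per-joint 3-D positions like these are available, a simple interface gesture such as a pinch reduces to a distance threshold between two fingertips. The sketch below illustrates the idea with hypothetical fingertip coordinates and an assumed threshold; it is not the Ultraleap API, just the geometry any such tracker enables.

```python
import math

PINCH_THRESHOLD_MM = 25.0  # assumed threshold; tune per application

def is_pinching(thumb_tip_mm, index_tip_mm):
    """Detect a pinch from two fingertip positions in millimeters."""
    return math.dist(thumb_tip_mm, index_tip_mm) < PINCH_THRESHOLD_MM

# Hypothetical fingertip samples in the tracker's coordinate frame.
print(is_pinching((10, 50, 120), (18, 55, 125)))  # True: tips ~11 mm apart
print(is_pinching((10, 50, 120), (80, 55, 125)))  # False: tips ~70 mm apart
```

In a real interface the same comparison would run on every tracking frame, with hysteresis on the threshold to avoid flicker at the boundary.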
Couple this with the ultrasonic Stratos Inspire haptic feedback system, and you will literally feel your actions. An array of ultrasonic transmitters is triggered at precisely staggered times to create a wavefront whose superposed waveform can be felt (Figure 3).
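The timing trick behind this focusing is that each transmitter fires with a delay chosen so every wave arrives at the focal point at the same instant. The farthest emitter fires first, and nearer ones wait out the difference in travel time. The following is a minimal sketch of that delay calculation under simplified assumptions (point-source emitters, still air at about 20 °C); the function name and array geometry are illustrative, not Ultraleap's implementation.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def focus_delays(emitters, focal_point):
    """Per-emitter trigger delays (seconds) so every ultrasonic wave
    arrives at the focal point at the same instant, superposing into
    a pressure peak that can be felt on the skin."""
    dists = [math.dist(e, focal_point) for e in emitters]
    d_max = max(dists)
    # Farthest emitter fires at t = 0; nearer emitters wait.
    return [(d_max - d) / SPEED_OF_SOUND for d in dists]

# Hypothetical 3-emitter line array focusing 20 cm above its center.
emitters = [(-0.05, 0.0, 0.0), (0.0, 0.0, 0.0), (0.05, 0.0, 0.0)]
print(focus_delays(emitters, (0.0, 0.0, 0.20)))
```

Sweeping the focal point over time is what lets such an array trace shapes and textures onto an open palm.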
Development kits for the Stratos Inspire and Stratos Explore are available now to begin your development. A development kit for hand tracking is also available from Ultraleap.
Figure 3: Ultrasonic waves triggered at specific times superimpose and interfere to create a sensation of touch or a physical feedback force: a small-scale version of acoustic levitation. (Source: Ultraleap)
Combining these technologies allows a new and safe way for machines and humans to interact without the spread of pathogens. It is also a prime technology to integrate with virtual reality and augmented reality for local and remote machine control and gaming. The company also provides a VR Developers mount and has a cool video showing how it can be used for gaming and other interesting applications.
In the modern world, the fear of spreading pathogens compels machine-makers to rethink how we humans interact with machines. Integrating touchless technology—such as facial and gesture recognition, motion detection and tracking, and haptic feedback—into machine interfaces provides an effective method for humans to interact with machines.
After completing his studies in electrical engineering, Jon Gabay has worked with defense, commercial, industrial, consumer, energy, and medical companies as a design engineer, firmware coder, system designer, research scientist, and product developer. As an alternative energy researcher and inventor, he has been involved with automation technology since he founded and ran Dedicated Devices Corp. up until 2004. Since then, he has been doing research and development, writing articles, and developing technologies for next-generation engineers and students.
Copyright ©2021 Mouser Electronics, Inc.