Using our hands and bodies has been the dominant means of control since the dawn of time. This is natural for activities like farming and hunting, and it carried over as service businesses emerged. As our machines grew more intricate and complex, touch remained the obvious choice for mechanical devices, and interestingly, even with the rise of electronics, buttons, knobs, and other finger-based controls have stayed pervasive.
When I started Sensory 25 years ago, I thought the time had come for sensory functions (gestures, voice, and computer vision) powered by AI to replace all the touch controls. My logic was simple: our consumer electronics had become so full of features and capabilities that we didn’t even know how to access them. One of Sensory’s early customers was called “Flashing 12.” I loved the name because it accurately captured the problem with VCRs: a lot of them had flashing 12s because nobody knew how to set the time.
Of course, I was right, but about 20 years too early. Over the last decade we have seen the rise of voice interfaces and the value they create at home, in cars, and even on personal devices like mobile phones. Yet so many of our shared devices still use touch!