Image by SIMON LEE
By antoniodellomo

The future of input devices

Over the past 12-15 years, we have witnessed the so-called digital evolution: the shift from mechanical and analog electronic technology to digital electronics. This shift has reshaped nearly every part of daily life, and the remarkable advances in technology have let us radically change our relationship with computers. A few years ago, the words “input device” evoked only two specific objects: the keyboard and the mouse.


In fact, in the early 1980s, when human-computer interaction (HCI) emerged as a field, the keyboard and the mouse were the main instruments used to provide data to a personal computer. In a high-tech world where change is rapid and constant, the keyboard has been the undisputed monarch of input devices from the very beginning. Today, with the evolution of computers, we have a large set of input devices that have changed the way we interact with them. Continuing innovations in areas like speech recognition and gesture control are merging with even newer technologies to open up new possibilities.


Touch screens

Yes, touch displays have been around for nearly half a century, but only with the advent of smartphones and the mobile internet did the touch screen become the most common way of entering information into a computer. Today, nearly any surface can be made interactive: there are touch-based phones, computers, televisions, cash registers, information kiosks, and gaming consoles. Even young children are so exposed to touchscreen technology that they adapt to it almost instinctively. Touch screens are easy to use, easy to learn, require no extra workspace, have no moving parts, and are durable. They provide very direct interaction, and their ease of learning makes them ideal for virtually everyone.


The most common touchscreen gestures are tap, double-tap, touch and hold, drag, swipe, and pinch. As a consequence, fingers can now be considered input devices in their own right.
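To make the idea concrete, here is a minimal sketch of how a web page might tell a tap apart from a swipe using the standard Pointer Events API; the element ID and the 10 px / 300 ms thresholds are illustrative assumptions, not values from any specification.

```typescript
// Minimal sketch: classifying a single-finger tap vs. swipe with Pointer Events.
// Thresholds (10 px, 300 ms) and the "touch-surface" element are assumptions.
const TAP_MOVE_LIMIT = 10;   // px
const TAP_TIME_LIMIT = 300;  // ms

let start: { x: number; y: number; t: number } | null = null;

const surface = document.getElementById("touch-surface")!;

surface.addEventListener("pointerdown", (e: PointerEvent) => {
  start = { x: e.clientX, y: e.clientY, t: performance.now() };
});

surface.addEventListener("pointerup", (e: PointerEvent) => {
  if (!start) return;
  const dx = e.clientX - start.x;
  const dy = e.clientY - start.y;
  const dt = performance.now() - start.t;
  const moved = Math.hypot(dx, dy);

  if (moved < TAP_MOVE_LIMIT && dt < TAP_TIME_LIMIT) {
    console.log("tap");
  } else if (moved >= TAP_MOVE_LIMIT) {
    // The dominant axis of movement gives a crude swipe classification.
    console.log(Math.abs(dx) > Math.abs(dy) ? "horizontal swipe" : "vertical swipe");
  }
  start = null;
});
```

A pinch could be detected in a similar way, by tracking the changing distance between two simultaneously active pointers.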



Speech recognition

Many speech recognition applications and devices are available, but the more advanced solutions use AI and machine learning. They integrate grammar, syntax, structure, and the composition of audio and voice signals to understand and process human speech, and ideally they learn as they go, evolving their responses with each interaction. Speech recognition has become increasingly popular in recent years, and virtual assistants are ever more integrated into our daily lives, particularly on mobile devices. Businesses that provide customer service use the technology to improve self-service in a way that enriches the customer experience and reduces organizational costs. As with any automated tool, however, there are limits to how well it can work. Consider these typical concerns when using speech recognition tools (a minimal browser-based sketch of how such a tool is invoked follows the list):

  • Accuracy is always imperfect.

  • The software may not understand specialized jargon or unusual phrasing.

  • Some voices and accents are not recognized well.
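As a rough illustration of how little code it takes to reach such an engine in a browser, the sketch below uses the Web Speech API's SpeechRecognition interface (prefixed as webkitSpeechRecognition in Chromium-based browsers and not yet part of the standard TypeScript DOM typings, hence the casts); the per-result confidence score is one way to surface the accuracy concern listed above.

```typescript
// Minimal sketch: browser speech recognition via the Web Speech API.
// SpeechRecognition is not in the standard DOM typings, so we cast through any.
const SpeechRecognitionImpl =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

if (SpeechRecognitionImpl) {
  const recognizer = new SpeechRecognitionImpl();
  recognizer.lang = "en-US";
  recognizer.interimResults = false;

  recognizer.onresult = (event: any) => {
    // The engine returns alternatives ranked by confidence; take the best one.
    const transcript = event.results[0][0].transcript;
    const confidence = event.results[0][0].confidence;
    console.log(`Heard: "${transcript}" (confidence ${confidence.toFixed(2)})`);
  };

  recognizer.onerror = (event: any) => console.warn("Recognition error:", event.error);

  recognizer.start(); // Prompts for microphone permission, then listens for one utterance.
} else {
  console.warn("Speech recognition is not supported in this browser.");
}
```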


Gesture Control

If talking to our machines is the most natural way to input information, gestures are probably a close second. The Kinect, Leap Motion, and HTC Vive controllers all use gestures to help the user interact with a virtual world on screen or in VR. Hundreds of companies and research labs around the world are working on gesture-control systems, and interestingly, the automotive industry is taking a leading role: BMW's gesture control feature lets drivers operate select iDrive functions with hand gestures captured by a 3D camera. It is remarkable that computers can recognize common human gestures and translate them into commands.
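As a simplified illustration of what such a system does internally, the sketch below turns hand-landmark coordinates into a "pinch" command; the landmark source (a depth camera or hand-tracking SDK) and the 3 cm threshold are assumptions made for the example.

```typescript
// Minimal sketch: turning hand-landmark data into a "pinch" command.
// The landmark source is assumed; only the geometric test is shown here.
interface Point3D { x: number; y: number; z: number; }

interface HandFrame {
  thumbTip: Point3D;
  indexTip: Point3D;
}

const PINCH_THRESHOLD = 0.03; // metres; illustrative value, tune per sensor

function distance(a: Point3D, b: Point3D): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

function detectPinch(frame: HandFrame): boolean {
  // A pinch is simply the thumb and index fingertips coming close together.
  return distance(frame.thumbTip, frame.indexTip) < PINCH_THRESHOLD;
}

// Example: one frame from a hypothetical tracker.
const frame: HandFrame = {
  thumbTip: { x: 0.10, y: 0.20, z: 0.50 },
  indexTip: { x: 0.11, y: 0.21, z: 0.50 },
};
console.log(detectPinch(frame) ? "pinch → trigger command" : "no gesture");
```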



Virtual Reality (VR)

At the moment, VR is probably the most talked-about way to provide input. It allows free movement, using sensors on headsets and motion controllers. The user is immersed in the environment, and the brain is essentially tricked into believing that what it sees in the virtual world is real. The list of industries already adopting VR grows every day, and this is something that could really shape our future. Although there's understandably some nervousness over whether VR can live up to its hype, I believe that well-executed VR has enormous potential to enhance events, build brand loyalty through immersive content, and make virtual experiences as powerful as real-life ones.
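For a sense of how headset sensors feed pose data to an application, here is a minimal sketch using the WebXR Device API (the call must normally be triggered by a user gesture, drawing is omitted, and the type annotations assume the WebXR type definitions are available).

```typescript
// Minimal sketch: starting an immersive VR session with the WebXR Device API.
// Call from a user gesture (e.g. a button click); rendering is left out.
async function enterVR(): Promise<void> {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-vr"))) {
    console.warn("Immersive VR is not available on this device or browser.");
    return;
  }

  const session = await navigator.xr.requestSession("immersive-vr");

  // The XR compositor needs a WebGL layer, even though nothing is drawn here.
  const gl = document.createElement("canvas").getContext("webgl", { xrCompatible: true })!;
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });

  const refSpace = await session.requestReferenceSpace("local");

  session.requestAnimationFrame(function onFrame(_time, frame) {
    // The headset's sensors report the viewer's pose every frame;
    // position and orientation determine what the user sees.
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      const { x, y, z } = pose.transform.position;
      console.log(`Head position: ${x.toFixed(2)}, ${y.toFixed(2)}, ${z.toFixed(2)}`);
    }
    session.requestAnimationFrame(onFrame);
  });
}
```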



Handwriting recognition

Handwriting recognition is the technique by which a computer system recognizes characters and other symbols written by hand in natural handwriting. The technology is used for identification and on devices such as PDAs and tablet PCs, where a stylus is used to write on the screen and the computer then turns the handwriting into digital text.
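A minimal sketch of the capture side: Pointer Events can distinguish a stylus from a finger, so strokes can be collected and handed to a recognizer. The recognizeInk function and the canvas element ID are hypothetical stand-ins, not a real API.

```typescript
// Minimal sketch: capturing stylus strokes with Pointer Events.
// recognizeInk is a hypothetical stand-in for a handwriting-recognition backend.
type Stroke = { x: number; y: number; pressure: number }[];

const strokes: Stroke[] = [];
let current: Stroke | null = null;

const canvas = document.getElementById("ink-canvas")!; // assumed element

canvas.addEventListener("pointerdown", (e: PointerEvent) => {
  if (e.pointerType !== "pen") return;                 // only react to a stylus
  current = [{ x: e.offsetX, y: e.offsetY, pressure: e.pressure }];
});

canvas.addEventListener("pointermove", (e: PointerEvent) => {
  if (current && e.pointerType === "pen") {
    current.push({ x: e.offsetX, y: e.offsetY, pressure: e.pressure });
  }
});

canvas.addEventListener("pointerup", () => {
  if (current) {
    strokes.push(current);   // one completed stroke = one pen-down to pen-up
    current = null;
  }
});

// Hypothetical: hand the collected strokes to a recognition service or model.
declare function recognizeInk(strokes: Stroke[]): Promise<string>;
// recognizeInk(strokes).then(text => console.log("Recognized:", text));
```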



Eye-tracking

Computers can't tell what you are thinking. Not yet, anyway. But they can tell where you are looking. The technology known as eye-tracking does just that, monitoring the position and movement of your eyes to determine what you are looking at, and for how long. It allows for unique insights into human behavior and enables natural user interfaces in a broad range of devices. The ability to control a computer with the eyes is also vital for people who are unable to speak or use their hands. By treating the eyes as a “pointer” on the screen, eye tracking makes it possible to interact with computers and other devices when the user cannot, or does not wish to, use their hands. Combining eye tracking with other input modalities allows you to create new user experiences and innovative interfaces that are more intuitive, natural, engaging, and efficient than conventional ones.
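As an illustration of the “eyes as a pointer” idea, the sketch below maps gaze samples to screen coordinates and uses dwell time as a selection mechanism; getGazeSample and the 800 ms dwell threshold are hypothetical, since browsers expose no standard eye-tracking API.

```typescript
// Minimal sketch: using gaze as a pointer with dwell-based selection.
// getGazeSample() is a hypothetical stand-in for an eye-tracker SDK that
// reports where on screen the user is looking, normalized to 0..1.
interface GazeSample { x: number; y: number; }

declare function getGazeSample(): GazeSample;  // assumed tracker API

const DWELL_MS = 800;              // look this long at a target to "click" (illustrative)
let dwellTarget: Element | null = null;
let dwellStart = 0;

setInterval(() => {
  const gaze = getGazeSample();
  const el = document.elementFromPoint(
    gaze.x * window.innerWidth,
    gaze.y * window.innerHeight,
  );

  if (el !== dwellTarget) {
    dwellTarget = el;                  // gaze moved to a new element: restart the timer
    dwellStart = performance.now();
  } else if (el && performance.now() - dwellStart > DWELL_MS) {
    (el as HTMLElement).click();       // sustained gaze acts as a selection
    dwellStart = performance.now();    // avoid repeated clicks during the same dwell
  }
}, 50);
```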








